Setting up a private cloud – whatever the industry, and whichever data centre it runs in – requires a whole set of hardware, software and networking elements to run consistently at peak performance.
You can test components while they are up and running but, ideally, you should have tested every single part of your infrastructure ‘from cold’ and have a set of benchmarks in place from the start.
We think Erwan Velu’s recent article for Cloud Computing Intelligence will set you on the right track when it comes to considering this area, whether from a theoretical standpoint or for practical, day-to-day measurement and control. Rest assured, though, that the team at Datanet will take care of this for you, or work in partnership to provide what’s needed.
As Erwan, a Senior Engineer at French developers eNovance, explains, “once your hardware is properly set up and cabled, it’s really important to take some time to do a complete analysis of your platform performance. The following components should be verified…”
- CPU computing power
- Memory bandwidth
- Storage IOPS & Bandwidth
- Network bandwidth
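To make the checklist concrete, here is a minimal sketch of what a ‘from cold’ benchmark pass might look like. This is purely illustrative: the function names and loop sizes are our own assumptions, and in practice you would reach for dedicated tools for each component (for example sysbench for CPU, STREAM for memory bandwidth, fio for storage IOPS and iperf3 for network throughput) rather than hand-rolled loops.

```python
import time


def cpu_benchmark(n: int = 200_000) -> float:
    """Crude CPU proxy: integer operations per second for a tight loop."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    elapsed = time.perf_counter() - start
    return n / elapsed


def memory_benchmark(size_mb: int = 64) -> float:
    """Crude memory-bandwidth proxy: MB/s for one bulk copy of a buffer."""
    buf = bytearray(size_mb * 1024 * 1024)
    start = time.perf_counter()
    _copy = bytes(buf)  # one full read + write pass over the buffer
    elapsed = time.perf_counter() - start
    return (2 * size_mb) / elapsed  # count both read and write traffic


if __name__ == "__main__":
    # Record the figures so later runs can be compared against this baseline.
    results = {
        "cpu_ops_per_s": cpu_benchmark(),
        "mem_mb_per_s": memory_benchmark(),
    }
    for name, value in results.items():
        print(f"{name}: {value:,.0f}")
```

The point is less the numbers themselves than having a repeatable script that produces comparable figures every time the platform is rebuilt.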
“To help reduce inconsistencies in your systems,” he says, “you need to embed your testing tools and automation scripts into the operating system level, that way you get a clean operating system with minimal dependencies.”
The bottom line? “The benchmark methodology should be strict, reproducible and eradicate any possible source of doubt.” Only then will your private cloud be primed and ready for peak performance. In fact, only then will you have a yardstick as to what peak performance actually is!
Cloud Computing Intelligence is at http://cloudcomputingintelligence.com/, eNovance are at http://www.enovance.com/ and you can read an overview of our own cloud proposition here: https://www.datanet.co.uk/cloud/.