Benchmarking Your IT Environment Is One Thing – Simulating It Is Quite Another

By Len Rosenthal, CMO

[Image: pw = f(wl) equation]

Much new IT infrastructure today, as in the past, is deployed in production environments before IT has exercised any meaningful due diligence to determine its suitability for the job. That’s unfortunate, but it may come as no surprise.

In the not-too-distant past, that was excusable, because the available benchmarking and workload modeling tools were rudimentary. Take storage, for example. For more than a decade, a multitude of approaches for benchmarking storage has been available. That’s a critical need in an industry where 25 to 30 percent of IT budgets are spent on improving and growing storage.

The good news is that benchmarking can indicate that current storage capacity could be used more efficiently, staving off the need for new capacity for a year or more. And that single realization could reduce CapEx significantly.

But could benchmarking deliver such a finding with any certainty? Most would say no. That’s likely because those first-generation tools fell short of measuring anticipated loads and changing workload behavior in the IT environment. “If you’re running any of the standard off-the-shelf benchmarking tools, you’re really not simulating a real-world workload,” said industry analyst George Crump of Storage Switzerland.

Today, more comprehensive testing is available through workload modeling and simulation: you can examine your application workload I/O profiles to know with certainty whether new storage is needed, what type is needed, whether additional efficiencies can be gained, or whether the solution would be best deployed in the cloud.
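
To make the idea of an I/O profile concrete, here is a minimal Python sketch of what such a profile might look like as data, and how combined peak demand could be checked against an array’s rated capability. The profile fields, example numbers and the 80,000-IOPS array rating are all hypothetical, chosen only for illustration; they are not the format or figures of any particular product.

    # Hypothetical sketch: describing an application's I/O behavior as data so it
    # can drive a simulation rather than a generic benchmark. All fields, numbers
    # and the array rating below are illustrative assumptions, not real measurements.
    from dataclasses import dataclass

    @dataclass
    class IOProfile:
        name: str
        read_pct: float       # fraction of operations that are reads
        block_size_kib: int   # dominant I/O size
        random_pct: float     # fraction of random (vs. sequential) accesses
        peak_iops: int        # observed peak operations per second

    # Example profiles of the kind you might extract from production monitoring.
    profiles = [
        IOProfile("oltp-db",   read_pct=0.70, block_size_kib=8,   random_pct=0.95, peak_iops=45_000),
        IOProfile("backup",    read_pct=0.05, block_size_kib=256, random_pct=0.10, peak_iops=3_000),
        IOProfile("analytics", read_pct=0.90, block_size_kib=128, random_pct=0.30, peak_iops=12_000),
    ]

    ARRAY_RATED_IOPS = 80_000  # assumed rating of the existing array

    total_peak = sum(p.peak_iops for p in profiles)
    headroom = 1 - total_peak / ARRAY_RATED_IOPS
    print(f"Combined peak demand: {total_peak:,} IOPS "
          f"({headroom:.0%} headroom on the existing array)")

Captured this way, profiles taken from production monitoring can feed a simulator instead of a canned benchmark.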

With these tools, such as our Load DynamiX Enterprise storage performance analytics solution, you can simulate workloads in a test environment under a variety of conditions and visualize how changing workloads will affect your infrastructure performance.
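
To show the shape of such a what-if exercise, the sketch below sweeps a workload multiplier against a toy queueing model and prints the modeled latency at each load level, reusing the illustrative numbers from the previous sketch. The service rate, base latency and the 1 / (1 - utilization) approximation are simplifying assumptions for illustration only, not how Load DynamiX Enterprise calculates results.

    # Toy what-if sweep: increase the offered load and watch modeled latency climb
    # as the array approaches saturation. The queueing approximation and all
    # constants are simplifying assumptions, not a vendor's latency model.
    BASE_IOPS = 60_000       # today's combined peak demand (from the sketch above)
    SERVICE_RATE = 80_000    # assumed array capability, in IOPS
    BASE_LATENCY_MS = 0.5    # assumed per-I/O service time at low load

    for multiplier in (1.0, 1.1, 1.2, 1.3):
        offered = BASE_IOPS * multiplier
        utilization = offered / SERVICE_RATE
        if utilization >= 1.0:
            print(f"x{multiplier:.1f}: {offered:,.0f} IOPS -> saturated, latency unbounded")
            continue
        latency_ms = BASE_LATENCY_MS / (1 - utilization)  # grows sharply near saturation
        print(f"x{multiplier:.1f}: {offered:,.0f} IOPS -> ~{latency_ms:.1f} ms modeled latency")

Even this crude model makes the point: a 30 percent increase in offered load can multiply latency many times over once utilization nears the array’s limit, which is exactly the kind of effect you want to see in a test environment rather than in production.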

But be cautious with third-party benchmarks. For one thing, they typically run under ideal network conditions and therefore deliver idealized results – not what you would see in a real-world IT environment. They are also limited because they don’t – and can’t – represent your specific application workloads.

Today’s more comprehensive storage simulation solutions are a breath of fresh air for IT managers. Here’s why: every time an application changes, a switch is updated or a storage system is modified, performance can suffer. If a software update is introduced into production prematurely, for example, it can degrade or, worse, bring down a key business application. Being able to automate testing and validate the latency impact of application upgrades, configuration changes and firmware updates on the storage infrastructure, before putting them into production, allows IT teams to mitigate risk, ensure consistent performance and save time troubleshooting initial deployments.
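
Here is a sketch of what such an automated pre-production check might look like: run the same modeled workload before and after a change, compare 95th-percentile latencies and fail the change if the regression exceeds a threshold. The example latencies and the 20 percent threshold are fabricated placeholders you would replace with your own data and policy.

    # Illustrative pre/post-change validation: replay the same modeled workload
    # before and after an upgrade, compare 95th-percentile latencies and fail the
    # change if the regression exceeds a threshold. Latency values and the 20%
    # threshold are fabricated placeholders.
    from statistics import quantiles

    def p95(latencies_ms):
        """95th-percentile latency (ms) from a list of per-I/O latencies."""
        return quantiles(latencies_ms, n=100)[94]

    baseline_run    = [0.8, 0.9, 1.1, 1.0, 0.9, 1.2, 0.8, 1.0, 0.9, 1.1]
    post_change_run = [0.9, 1.3, 1.5, 1.2, 1.4, 1.6, 1.1, 1.3, 1.2, 1.5]

    REGRESSION_THRESHOLD = 1.20  # fail if p95 latency grows by more than 20%

    ratio = p95(post_change_run) / p95(baseline_run)
    verdict = "PASS" if ratio <= REGRESSION_THRESHOLD else "FAIL"
    print(f"p95 latency ratio: {ratio:.2f} -> {verdict}")

Wired into a change-management workflow, a check like this turns “we think the firmware update is safe” into a pass/fail result produced before anything touches production.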

The ideal simulation tool supports both pre-defined and customized storage workload modeling and load generation, which together enable emulation of real-world application workload behavior. In short, the simulation tool should make it easy to model workloads, create test cases and evaluate results.

Simulation is all about enabling IT managers, storage engineers and architects to make intelligent deployment decisions regarding their storage infrastructure. And that may mean using the storage capacity they already have at their disposal more wisely – which often removes the need for new purchases. For years, you could not make that determination with any certainty – but today you can.

To learn more about Load DynamiX and our app-centric IPM solutions, drop us a line or connect with us on Twitter – we’re @Virtual_Inst.