How Software-Defined Storage (SDS) Affects Storage Testing

By Jim Bahn, Senior Director of Product Marketing

Software-defined storage (SDS) is an evolving concept in which server-based storage software handles policy-based provisioning and management of data storage independently of the underlying hardware. Definitions of SDS typically include some form of storage virtualization to separate the storage hardware from the software that manages the storage infrastructure. The operating software enabling an SDS environment may also provide policy management for feature options such as deduplication, replication, thin provisioning, snapshots and backup. SDS may be implemented via appliances over a traditional storage area network (SAN), as part of a scale-out network-attached storage (NAS) solution, or as the basis of an object-based storage solution.
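To make "policy-based provisioning" concrete, here is a minimal, hypothetical sketch in Python. The `StoragePolicy` type and `provision_volume` function are illustrative inventions, not any particular SDS product's API; the point is only that the administrator declares what the application needs and the software layer decides how hardware satisfies it.

```python
from dataclasses import dataclass

@dataclass
class StoragePolicy:
    """Hypothetical policy object: declares what the application needs,
    not which array or LUN provides it."""
    name: str
    replicas: int = 2             # synchronous copies kept by the software layer
    thin_provisioned: bool = True
    deduplication: bool = False
    snapshot_schedule: str = "daily"

def provision_volume(policy: StoragePolicy, size_gb: int) -> dict:
    """Illustrative stand-in for an SDS control-plane call: software, not
    the administrator, chooses which commodity nodes back the volume."""
    return {"volume": f"{policy.name}-{size_gb}g",
            "size_gb": size_gb,
            "policy": policy.name}

# Storage is requested by policy, not by pointing at specific hardware:
gold = StoragePolicy(name="gold", replicas=3, deduplication=True)
print(provision_volume(gold, size_gb=500))
```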

In today’s enterprise data center you often find silos of proprietary storage segregated by vendor, because each vendor’s control plane contains tools that usually don’t play nicely with the others’. Efforts to bridge those silos with storage resource management solutions met with only modest success because the proprietary interfaces were never made openly available. SDS benefits from open source initiatives, which promise easier management and lower overall costs.

Proponents claim that, freed from the constraints of a physical system, a storage resource can be used more efficiently and its administration can be simplified through automated policy-based management. Potentially, a single software interface could manage a shared storage pool running on commodity hardware. What separates this promise from “hardware-defined storage” is the claim of big cost savings from commodity x86-based hardware. It’s no secret that the gross selling margins on proprietary storage hardware and software have been pretty healthy. If the same data availability, performance and management functionality that the classic storage vendors have spent decades refining can be obtained on low-margin commodity hardware, the cost savings are potentially huge. There’s the rub.

The total cost of storage ownership isn’t just about commodity versus proprietary hardware. It includes the value of all the software, as well as the integration and testing work. In the SDS model, the customer acts as its own general contractor or integrator and must do the testing that storage vendors do. Today, storage vendors do a substantial amount of testing before they release new products (a simplified sketch of one such test follows the list), including:

  • Limits finding – determining the workload conditions that drive performance below minimal thresholds, and documenting storage behavior at the failure point
  • Functional testing – the investigation under a simulated load of various functions of the storage system (e.g., backup)
  • Error injection – the investigation under a simulated load of specific failure scenarios (e.g., failover when a drive fails)
  • Soak testing – the observation of the storage system under a load sustained over significant time (e.g., two days, one week)
  • Compatibility testing – determining that the interaction of storage hardware, software and networking is compatible with other major subsystems (e.g., virtualizers and database systems)
  • Regression testing – ensuring that new releases don’t break things that used to work; it can be the single largest QA testing job and requires either massive manual effort or highly automated, well-scripted test beds
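As a concrete illustration of the first item, limits finding, here is a minimal sketch built on the open-source fio load generator rather than a purpose-built product like Load DynamiX Enterprise. The target path, latency threshold and step duration are assumptions for illustration, and fio 3.0 or later is assumed for the `clat_ns` JSON field.

```python
import json
import subprocess

# Hypothetical limits-finding harness: step up queue depth until mean
# read latency crosses a threshold, then record the knee point.
TARGET = "/mnt/sds_pool/testfile"   # assumed file on the SDS-backed volume
LATENCY_LIMIT_MS = 5.0              # assumed service-level threshold
RUNTIME_SECS = 60                   # short step; a soak test would run for days

def run_fio(iodepth: int) -> float:
    """Run one fixed-duration 4K random-read step and return mean
    completion latency in milliseconds (fio >= 3.0 reports clat_ns)."""
    cmd = [
        "fio", "--name=limits", f"--filename={TARGET}", "--size=1g",
        "--rw=randread", "--bs=4k", "--direct=1", "--ioengine=libaio",
        f"--iodepth={iodepth}", f"--runtime={RUNTIME_SECS}",
        "--time_based", "--output-format=json",
    ]
    out = subprocess.run(cmd, capture_output=True, check=True, text=True)
    job = json.loads(out.stdout)["jobs"][0]
    return job["read"]["clat_ns"]["mean"] / 1e6  # ns -> ms

for depth in (1, 2, 4, 8, 16, 32, 64, 128):
    latency_ms = run_fio(depth)
    print(f"iodepth={depth:4d}  mean latency={latency_ms:.2f} ms")
    if latency_ms > LATENCY_LIMIT_MS:
        print(f"Limit found: latency exceeds {LATENCY_LIMIT_MS} ms "
              f"at queue depth {depth}")
        break
```

A soak test is essentially the same loop with one queue depth held for days instead of a minute, and regression testing amounts to running a library of such scripts, with pass/fail thresholds, against every release candidate.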

And then there are the intangibles. When – not if – a serious problem pops up, storage admins are used to calling their vendor’s support team to help them triage, find and fix it. In the SDS model, the customer’s IT staff plays the part of integrator and support engineer. Instead of calling your storage or switch vendor to unsnarl a sticky issue, you’re potentially calling multiple vendors and posting in user forums to solicit help from the open source community – with no guarantee of a timely response. Or worse, some critical component is supplied by a small startup that’s about to be gobbled up by a larger legacy supplier desperate to acquire innovation. Usually, the larger company lacks deep technical skills on the new products, so it takes forever to get support issues resolved.

The point is that if you’re moving to SDS, you have to ensure that someone is doing all that testing to head off as many problems as possible, because finding and resolving them is only going to be harder. Fortunately, some of the same testing tools and methodologies that the vendors trust, including Load DynamiX Enterprise, are available to the IT customer. You simply need to include them when you factor in your total cost of implementing SDS.

For an example of how Load DynamiX Enterprise can be used to validate SDS performance characteristics, check out this benchmark report.