IT has traditionally worked out its storage requirements by looking at data growth and making a projection. Increases have generally been paid for by replacing old arrays as part of an asset management life cycle, with new costs attached to specific projects – for example, folding the storage cost into a new business process, or even sneaking it in. But as the African market moves to software defined storage, businesses should have one storage pool, one storage budget, unlimited scale and a different conversation with suppliers.
Enterprise Storage Cost: It’s Complicated
To anyone but a storage architect, calculating the total cost of ownership for enterprise storage is a substantial challenge. There are endless major and minor differences between appliance types and architectures, the software running on them, the demands of the applications they support, the associated support costs and service level agreements, even the cost of plugging into the main systems. Then there is data tiering: keeping critical data that is used daily close to the application on high-performance, high-cost equipment, while keeping the less useful but often legally required data somewhere as cheap as possible. There is also the data the business cannot live without, sitting on systems that cannot tolerate downtime and therefore replicated in real time. No matter how well managed, de-duplicated or tiered it is, enterprise data has a life of its own.
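To see why tiering alone complicates the cost picture, here is a minimal sketch of a blended cost-per-TB calculation for a simple two-tier layout. The prices and the 20/80 split below are invented purely for illustration, not vendor quotes.

```python
# Illustrative blended cost per TB for a two-tier layout: a small slice of
# "hot" data on high-performance storage and the rest on a cheap archive tier.
# The shares and prices are assumptions made up for this example.
tiers = {
    "performance": {"share": 0.20, "cost_per_tb": 2000.0},  # hot, daily-use data
    "archive":     {"share": 0.80, "cost_per_tb": 150.0},   # rarely read, kept for compliance
}

blended = sum(t["share"] * t["cost_per_tb"] for t in tiers.values())
print(f"Blended cost: ${blended:,.0f} per TB")  # $520 per TB with these assumptions
```

Even in this toy version, moving a few percent of data between tiers changes the blended figure noticeably, which is part of what makes the real calculation so hard to pin down.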
Ordinary consumers could be forgiven for being baffled by enterprise storage pricing. The natural comparison is cost per TB, as understood from a visit to a PC retailer or a quick search on Google. However, enterprise storage costs per TB are higher by an order of magnitude. The reason is that with enterprise data there is no such thing as a single instance of the data.
Imagine for a moment you have one TB of data, and you need to be careful with it. Because that data is important and “failure is not an option,” you make a mirror copy of it. One TB just became two TB. Then, to protect against the chance of a node failure, your data is synchronised to a second node. Two TB just became four TB. So far so good, but what if there is a problem? You need at least one on-site backup, so you take point-in-time copies a couple of times a week. Over a month or so that adds roughly another three TB, taking the total to seven TB. Next you must deal with the risk of more serious outages, so you make a separate backup at a different site; even if you do that just once a month, you are adding another one TB, bringing the total to eight TB. Now, of course, you are going to perform de-duplication to reduce the volume of data, and you are going to be as clever as you can with tiering. However, you are supporting a series of complex processes with critical data, and that makes you conservative. “If it ain’t broke, don’t fix it” sums up the attitude. It wouldn’t matter so much if the business gave you a chance to build the systems you really need, but more often than not, storage architecture is as much a product of short-term requirements as of long-term planning.
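The arithmetic above can be captured in a few lines. This is a rough, illustrative sketch: the parameter names are invented, and the per-snapshot figure is chosen simply so that eight point-in-time copies add roughly the three TB described above.

```python
# Rough estimate of the raw capacity consumed by 1 TB of "single instance"
# data under the protection scheme sketched above. All figures are
# illustrative assumptions, not measurements of any particular product.
def raw_capacity_tb(logical_tb=1.0,
                    local_mirror=True,        # mirror copy on the primary system
                    synced_nodes=1,           # extra nodes holding a synchronous copy
                    snapshots_per_month=8,    # on-site point-in-time copies kept
                    snapshot_delta=0.375,     # assumed fraction of the data set per snapshot
                    offsite_copies=1):        # monthly full backup at a second site
    primary = logical_tb * (2 if local_mirror else 1)            # 1 TB -> 2 TB
    total = primary * (1 + synced_nodes)                         # 2 TB -> 4 TB
    total += logical_tb * snapshots_per_month * snapshot_delta   # ~3 TB of snapshots -> 7 TB
    total += logical_tb * offsite_copies                         # +1 TB off site -> 8 TB
    return total

print(raw_capacity_tb())  # ~8.0 TB of raw capacity for 1 TB of data
```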
It All Seemed Like Such a Good Idea at the Time
Change happens in business. Companies grow by mergers and acquisitions. New products are launched; old products are retired. New laws mandate new compliance obligations on what data can and can’t be stored, for how long, in what format and with what level of security.
Trying to make a long-term plan in these circumstances is difficult, like trying to shoot a moving target while riding a roller coaster. In an ideal world, enterprise storage systems would be perfectly configured to serve the needs of the business. In the real world this is seldom the case, not because IT is making bad decisions, but because the criteria for those decisions are constantly in flux, influenced by uncontrollable external events and often driven by immediate requirements: the storage you need for a new business process, for a new product launch, or for compliance with a new law.
Taken one by one, storage decisions look rational. Looked at as a whole, the result of these storage decisions is an environment that verges on the chaotic.
Data Volumes Are Only Going to Get Bigger
Amidst all this change, there is one certainty: the volume of data will steadily grow. Where once we dealt in megabytes, we now deal in gigabytes, terabytes, petabytes and exabytes. Since analyst Doug Laney (then at META Group, later Gartner) coined the defining three Vs of big data – variety, volume and velocity – in 2001, the growth of data has become as certain as death and taxes. And data no longer grows by percentages; it grows by orders of magnitude.
By following best practice, using best-of-breed hardware and software for current requirements, and often taking the advice of analysts and consultants, IT teams have built storage piecemeal into an environment that is extraordinarily complex in its architecture, its upkeep and its financial liability. It is hard to budget for, hard to maintain and impossible to do without.
IT teams have a choice. They can carry on as they always have, adding new storage as current circumstances demand and the business drives them. Alternatively, they can use new technology – software defined storage – to re-architect storage to be smarter, less complicated and infinitely scalable. New choices are available that can meet the needs of tomorrow, cost less and eliminate proprietary software and hardware vendor lock-in. With IDC predicting that data growth will run in excess of 40% every year for the next decade, the current costs associated with that growth are unsustainable.
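To put that projection in perspective, the compounding effect of 40% annual growth over a decade can be worked out in a few lines. The 100 TB starting point below is an arbitrary assumption chosen for the sake of the example.

```python
# Illustrative compounding of a 40% annual data growth rate over ten years.
# The 100 TB starting capacity is an assumption, not a benchmark figure.
capacity_tb = 100.0
growth_rate = 0.40

for year in range(1, 11):
    capacity_tb *= 1 + growth_rate
    print(f"Year {year:2d}: {capacity_tb:8.1f} TB")

# After 10 years at 40% growth, 100 TB becomes roughly 2,900 TB,
# around a 29x increase in the capacity that has to be budgeted for.
```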
Enter Software Defined Storage
Software defined storage separates the physical storage hardware (the data plane) from the software that manages it (the control plane). This approach eliminates the need for proprietary hardware and can generate cost savings compared to traditional arrays and appliances. It scales from a single storage appliance to a cost-effective cloud solution and is portable across different cloud providers. Add in the open source element – SUSE Enterprise Storage, powered by Ceph, for example – and your storage becomes flexible, highly reliable and easy to manage. What’s more, it is significantly less expensive to manage and administer than proprietary systems. It will enable you to manage even a projected data growth rate of 40–50% in your organisation without exceeding your established IT budget.
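As a small taste of what working with such a pool looks like, here is a minimal sketch of storing and reading back an object in a Ceph cluster through the python-rados bindings that ship with Ceph. It assumes a reachable cluster, a valid /etc/ceph/ceph.conf and an existing pool; the pool and object names are hypothetical examples.

```python
# Minimal sketch: write and read one object in a Ceph cluster via python-rados.
# Assumes the bindings are installed and a cluster is reachable; the pool name
# "example-pool" and the object name are illustrative, not real defaults.
import rados

cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
cluster.connect()
try:
    ioctx = cluster.open_ioctx('example-pool')   # hypothetical pool name
    try:
        ioctx.write_full('hello-object', b'Stored in software defined storage')
        print(ioctx.read('hello-object'))
    finally:
        ioctx.close()
finally:
    cluster.shutdown()
```

The point is that the application talks to one logical pool; where the bytes physically live, on which commodity nodes and at which tier, is decided by the software layer rather than by a proprietary appliance.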
As African businesses tighten operational costs and look for ways to be leaner and more innovative, software defined storage will change how you budget for storage and how you take on the big data challenges of today.