The amount of data businesses create and store is growing exponentially, with annual data growth estimated to increase by 4,300% by 2020. Companies are generating and storing huge volumes of information, yet still expect to access it in the blink of an eye. We want it fast. We simply don’t have the time or inclination to wait. We expect the impossible.
As the volume of data we store grows, and with it our urgency to access it, so do the costs. IT managers face constant pressure to store as much as possible and keep systems performing well enough to meet the demands of the services they provide, while keeping costs to a minimum. It’s a conundrum that has them scratching their heads.
We remember the days when data was stored locally on servers and a traditional tape recovery took days. The arrival of virtualisation saw our data leave the server and move to storage devices accessed via the SAN or NAS. IT systems came to be regarded as a commodity, expected to function 24/7, and our intolerance of delay escalated. The growing data problem intensified, becoming an ever-larger chore on the IT manager’s list.
Gone are the days when it was acceptable simply to throw extra disks at the problem until the storage reached capacity. That is money down the drain. Data storage features have advanced significantly over the last couple of years, and optimising storage functionality and performance has become essential to meeting end users’ growing demands. Storage tiering is now common practice within enterprises, regarded as a cost-effective way of managing a company’s storage requirements.
At the forefront of IT managers’ frustrations over cost and performance is the debate between Flash storage, typically in the form of SSDs (solid-state drives), and HDDs (hard disk drives). IT managers understand that Flash offers extremely fast read and write speeds, but they know only too well that they cannot necessarily service the entire infrastructure with it. Sure, the end user would enjoy consistently high performance and availability, but Flash is a more expensive storage option than a traditional hard disk.
Faced with uncontrollable data growth, the IT manager must identify the highest-priority systems, typically business-critical operational systems, and place them on the primary Flash tier, while lower-priority data is relegated to cheaper tiers. Flash can then be deployed intelligently within the infrastructure, where it is needed most: for the operational data in high demand on a day-to-day basis.
A combination of Flash alongside higher-capacity spinning drives with slower read/write speeds, in other words a hybrid system, is the ideal solution for most enterprises. Of course, data has different value to the business at different times, so the storage system should be designed to accommodate these changes. The enterprise ends up with a bespoke system, tailored to its ever-changing requirements.
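A tiering policy along these lines can be sketched in a few lines of code. The tier names, access metrics and thresholds below are hypothetical, chosen purely to illustrate the decision logic rather than to reflect any vendor’s defaults:

```python
from dataclasses import dataclass


@dataclass
class Volume:
    name: str
    reads_per_day: int      # hypothetical access metric
    days_since_access: int  # hypothetical idle-time metric


def choose_tier(vol: Volume, hot_reads: int = 1000, cold_days: int = 120) -> str:
    """Assign a volume to a storage tier.

    Illustrative thresholds only: heavily read volumes go to flash,
    volumes untouched for months go to high-capacity disk, and
    everything else lands on a middle 'hybrid' tier.
    """
    if vol.reads_per_day >= hot_reads:
        return "flash"      # SSD tier: operational, high-demand data
    if vol.days_since_access >= cold_days:
        return "capacity"   # high-capacity spinning disk / archive tier
    return "hybrid"         # warm data sits between the two


volumes = [
    Volume("erp-db", reads_per_day=50_000, days_since_access=0),
    Volume("2019-archive", reads_per_day=0, days_since_access=400),
    Volume("team-share", reads_per_day=200, days_since_access=3),
]
for v in volumes:
    print(v.name, "->", choose_tier(v))
```

Real tiering engines make this decision continuously and per block or per file rather than per volume, but the principle is the same: the hottest data earns the most expensive tier.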
As storage systems have developed and matured over the past few years, they have created a highly competitive marketplace. There is no end of providers touting ‘holy grail’ solutions ‘to solve all your storage needs’, but the days of committing to expensive technology without knowing how effective it would be are gone. Analysis tools can now run as a background task on your servers and play out ‘what if?’ scenarios.
What if we stub all files of a certain type that haven’t been opened for four months? The resulting reports show how much disk space you would save, the rate of data growth to expect if you maintain these policies, and the volume of data you are storing that hasn’t been accessed for long periods.
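As a rough sketch of how such a ‘what if?’ rule might work under the hood, the function below (hypothetical, not any particular vendor’s tool) walks a directory tree and estimates the space that would be reclaimed by stubbing idle files of given types. It relies on the file access time, which can be stale on volumes mounted with `noatime` or `relatime`, so the result is an estimate:

```python
import os
import time


def what_if_stub(root, extensions, months_idle=4):
    """Estimate savings from stubbing files of the given types
    that have not been accessed for `months_idle` months.

    `extensions` is a tuple of lowercase suffixes, e.g. (".pdf", ".psd").
    Returns (total_matching_files, idle_files, idle_bytes).
    """
    cutoff = time.time() - months_idle * 30 * 86400  # approximate months
    total = idle = idle_bytes = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.lower().endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files we cannot read
            total += 1
            if st.st_atime < cutoff:  # not accessed since the cutoff
                idle += 1
                idle_bytes += st.st_size
    return total, idle, idle_bytes
```

Run against a file share, e.g. `what_if_stub("/srv/share", (".pdf", ".psd"))`, the report tells you how many candidate files exist, how many are idle, and how many bytes a stubbing policy would free, before you commit to any technology.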
Armed with this level of analysis, businesses can make informed decisions and ultimately save money. IT managers can better assess the needs of their business and design the best solution, weighing functionality against cost. Too many storage decisions are made through fear, and that attitude needs to change. Data will continue to grow, but the industry needs to turn this around sooner rather than later: admit the scope for greater efficiency, and be more realistic about how much of its data a business actually needs to access.