Data continues to grow exponentially, driven by enterprise applications, cloud and mobile broadband, and the expanding use of mobile video. In fact, IDC states that the amount of data we generate is currently doubling every 18 months. What is more, this data needs to be stored and managed effectively to extract its maximum value.
While real-time access to data is often critical, current hard disk drive storage solutions are struggling to keep up with the demands placed on them by the rapidly growing number of enterprise applications that bind that data together and offer users real-time insight.
Solid-state disk (SSD) and flash memory are nothing new. Anyone who uses USB sticks, or owns a smartphone or tablet, uses flash memory all the time. However, while flash memory is fairly widespread at the consumer level, adoption among enterprises and business customers has grown relatively slowly.
In a recent survey of 400 datacentre managers, 70% of those responding identified network and storage access as the two primary areas of infrastructure bottlenecks. Application performance suffers as storage and network infrastructures struggle to keep pace with the demands placed upon them. Whilst 93% recognise the importance of application performance to their business, three quarters admit applications are failing to deliver the required performance.
The scale of this issue was highlighted by 40% of respondents who stated that they want their core applications to run twice as fast as they do today, almost a third want applications to run 50% faster and 15% want them to be four times faster. They recognise that the consequences of underperforming applications are significant, making their business less competitive (40%), leading to lost revenue (26%) or lost customers (16.6%).
This problem is exacerbated by the “data deluge gap”, the fact that IT budgets cannot keep pace with the amount of data and traffic being created. The challenge for data centre managers is to address this gap, ensuring there are no major bottlenecks slowing networks and maintaining a quality of service consistent with user expectations.
So what’s to be done? By bringing acceleration to systems, customers can meet the challenge of maintaining speed even as data volumes increase exponentially. The focus is on driving out latency, because there is huge value to be gained in accelerating access to data, especially in areas like the financial sector.
This acceleration can be achieved at the storage level by incorporating flash-based SSDs into the infrastructure. SSD technology is becoming popular because it breaks the performance bottleneck of traditional mechanical hard disks and achieves extremely high storage performance. This means SSDs can greatly improve transaction processing performance and shorten response times.
However, this is expensive to do and not everyone can afford to store all of their data on flash storage, a point reiterated by the survey findings. Many of those surveyed recognised that SSDs would accelerate database throughput and response times, as well as deliver higher application availability, but were deterred from investing in the technology by its cost. Over 90% said perceived costs were holding back their adoption of SSDs, citing acquisition costs and total cost of ownership as important factors in any decision to adopt the technology.
The key is to use the right amount of flash in the right place, so that cost is kept under control and applications themselves can be left largely untouched. There are now solutions available that are redefining the economics and deployment models of flash storage so that it can be widely deployed, targeting the right amount and providing the best value.
For example, software run at the server level can keep a copy of the most frequently accessed data on a PCIe flash card, reducing latency from milliseconds to microseconds. In this way a small amount of flash storage put in front of existing hard disk drives can bring significant gains.
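To make the idea concrete, the caching behaviour described above can be modelled as a small, fast tier in front of a slower backing store. This is a minimal sketch, not any vendor's actual caching software; the class and method names are illustrative assumptions, and the LRU policy stands in for whatever heuristics a real product would use:

```python
from collections import OrderedDict

class FlashCache:
    """Toy model of server-side flash caching: a small LRU tier holding
    copies of the most recently accessed blocks, backed by a larger,
    slower store standing in for the HDD array. Names are hypothetical."""

    def __init__(self, capacity, backing_store):
        self.capacity = capacity            # blocks the flash tier can hold
        self.backing_store = backing_store  # dict simulating the HDD tier
        self.cache = OrderedDict()          # LRU order: oldest entry first
        self.hits = 0
        self.misses = 0

    def read(self, block_id):
        if block_id in self.cache:
            self.hits += 1                  # served from flash (microseconds)
            self.cache.move_to_end(block_id)
            return self.cache[block_id]
        self.misses += 1                    # served from disk (milliseconds)
        value = self.backing_store[block_id]
        self.cache[block_id] = value        # copy the hot block into flash
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used
        return value
```

In a workload where a small fraction of blocks receives most of the reads, repeated accesses to those hot blocks are absorbed by the flash tier, which is why a modest amount of flash in front of existing drives can yield an outsized improvement.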
So adding intelligence is critical to improving application performance and providing faster and more efficient networks. This can be done via software solutions or at a hardware level through silicon-based products. A blended approach of flash and traditional HDDs is a smart and cost-effective way of making data centres and networks more intelligent, enabling them to prioritise data and recognise what needs to be accessed most frequently and quickly.
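The tiering decision at the heart of a blended flash/HDD approach can be sketched as a simple policy that ranks blocks by access frequency and places only the hottest ones on the small flash tier. This is an illustrative assumption, not a real tiering engine (which would also weight recency, I/O size and write patterns); the function name is hypothetical:

```python
from collections import Counter

def hot_blocks(access_log, flash_slots):
    """Toy tiering policy: given a log of block accesses, return the set
    of most frequently accessed blocks to place on the flash tier;
    everything else stays on HDD."""
    counts = Counter(access_log)
    return {block for block, _ in counts.most_common(flash_slots)}
```

For an access log of `[1, 1, 1, 2, 2, 3]` and two flash slots, the policy selects blocks 1 and 2, leaving block 3 on the HDD tier.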
To put this in context, three years ago a UCLA study published in the Journal of Neuroscience confirmed that the smarter a person is, the faster he or she processes information. It should come as no surprise that, with the aid of acceleration and intelligence, the same principle can apply to customer data centres and networks.
So whilst it may seem difficult to grasp how underlying data and network infrastructures must be architected around when and how users need to access information, building intelligence into the infrastructure with smart silicon is the answer to managing massive data growth.