This shift has resulted in huge volumes of machine-generated data, known as big data. Email and phone-call habits, online browsing patterns, even your favourite brand of fizzy drink: all this data and more is out there, and according to Gartner the volume of global information is growing by at least 59 per cent a year.
Despite the hype, and the fact that data analysis has been around for years, big data is not just a buzzword; this new scale of data is the biggest game-changer to hit IT since Sir Tim Berners-Lee invented the World Wide Web. The concept of big data is creating a new industry and disrupting many others.
Data-driven future
Imagine walking down a high street and, as you pass each shop or restaurant, receiving a discount voucher or special offer from that business to tempt you inside. This is where retail stands to benefit most, by integrating the in-store customer experience with online tools.
It is also how bricks-and-mortar shops can challenge their online competitors. Or imagine a future in which scientists all over the world collaborate on medical data in real time to pin down the causes of a disease like cancer. Scenarios like these are only the logical next step from targeted online ads or working together on Google Docs; that future is just around the corner.
So it is not surprising that, according to a report from the McKinsey Global Institute, the use of big data is becoming a key way for leading companies to outperform their peers. McKinsey estimates, for example, that a retailer embracing big data could increase its operating margin by more than 60 per cent.
Avoiding the virtual hop
The obstacle we need to overcome to truly realise the power of big data is the time it takes to derive useful information from it. Analysing more data requires more compute power, and if that power is not immediately available it adds a 'virtual hop' to the network path; in other words, it introduces a delay into the analysis of that data.
For big data to deliver on its promise, that delay must shrink: the information has to reach the end user's mobile device in as near to real time as possible, in a format that is timely and relevant. There is no point, for example, in receiving an offer for a restaurant you walked past five minutes ago. This velocity of data, the speed at which it must be processed to meet demand, is the real challenge.
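The "restaurant you walked past five minutes ago" problem above amounts to a freshness check before delivery. A minimal sketch of that check follows; all names here (Offer, MAX_OFFER_AGE, the 30-second threshold) are illustrative assumptions, not taken from any real system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical threshold: an offer older than this is 'just-too-late'.
MAX_OFFER_AGE = timedelta(seconds=30)

@dataclass
class Offer:
    shop: str
    discount_pct: int
    generated_at: datetime

def relevant_offers(offers, now):
    """Keep only offers fresh enough to still matter to a passer-by."""
    return [o for o in offers if now - o.generated_at <= MAX_OFFER_AGE]

now = datetime.now(timezone.utc)
offers = [
    Offer("Corner Cafe", 20, now - timedelta(seconds=10)),   # still fresh
    Offer("Bistro", 15, now - timedelta(minutes=5)),         # stale, dropped
]
print([o.shop for o in relevant_offers(offers, now)])  # ['Corner Cafe']
```

The point of the sketch is that whatever the analysis pipeline produces, anything arriving outside the freshness window is simply discarded, which is why reducing the 'virtual hop' matters more than raw analytical power alone.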
The need for speed
Highly connected colocation data centres can shrink the problematic 'virtual hop' by removing an element of that transmission time, an element that could be the critical difference between just-in-time and just-too-late. This is not a new problem: the need for low latency is something the financial world knows only too well. In trading, milliseconds are money, and keeping the 'virtual hop' as small as possible is crucial. To ensure that big data delivers, then, we can look to the data centres that already give the financial markets the speed they need.
One way to meet the demands of big data is to use network service providers that have developed products and services combining consistently low latency with high bandwidth.
Organisations looking to deliver high-speed application performance while driving down bandwidth and other associated costs can use a highly connected, carrier-neutral data centre to provide the level of connectivity big data applications require. Such data centres protect the user experience and add network resilience for the cloud-based services and platforms that process large data sets.
The humming of a data centre is the sound of the future and, as data volumes continue to grow, it will be the world’s data centres that are the unseen core of modern life.