The use of Big Data analytics to glean valuable business insight has grown significantly in recent years. Per Gartner, by 2015 over 30% of analytics projects will provide insights based on both structured and unstructured data. Challenges arise when trying to map those business insights – and the IT changes they imply – back into operational IT management. How can IT more positively impact business ROI while maintaining an ongoing balance between infrastructure cost, service risk, and end-user service performance?
Many are aware that the use of highly sophisticated predictive analytics predates big data by decades, for use cases such as capacity planning. An understanding is now evolving of how these traditional approaches work alongside big data analytic approaches.
The two approaches are highly synergistic across both the technical and business value spectrums. There is also growing awareness of an analytic use-case maturity curve, starting with descriptive analytics (what do we have and what is it doing), through diagnostic (why is it happening) and predictive (what will it do in the future, when, and why), and ultimately prescriptive analytics – technology to recommend (and perhaps automate) specific courses of action.
More mature predictive and prescriptive analytics are appealing to IT leaders because they add a highly proactive “future view” of mission critical processes and resources – not only potential issues, but also choices and expected results.
Speaking at a TeamQuest ITSO Summit, Mark Gallagher of CMS Motor Sports described how Formula One data analysts analyse data not only to ensure the safety of team drivers but also to win races. “In 2014 Formula One, any one of these data analyst engineers can call a halt to the race if they see a fundamental problem developing with the system like a catastrophic failure around the corner. It comes down to the engineers looking for anomalies. 99% of the information we get, everything is fine,” Gallagher said. “We’re looking for the data that tells us there’s a problem or that tells us there’s an opportunity.”
Building on Formula One, an overarching theme is apparent: exception-based, predictive and prescriptive analytics, and the game-changing nature of measuring and analysing what matters. In Formula One, it’s about winning races by getting proactively ahead of technical problems with race cars. In team sports, it’s about winning games by making the best use of each member’s talents. A workable summarisation is: “good (the right) metrics + powerful analytics = best possible results.”
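The exception-based idea – ignore the 99% that is fine and surface only the readings that signal a problem or an opportunity – can be sketched in a few lines. This is a minimal illustration using a rolling z-score on a hypothetical metric stream; the readings, window, and threshold are illustrative assumptions, not from any real telemetry system.

```python
# Minimal sketch of exception-based monitoring via a rolling z-score.
# The readings, window size, and threshold are illustrative assumptions.
from statistics import mean, stdev

def find_anomalies(samples, window=10, threshold=3.0):
    """Flag points deviating more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append((i, samples[i]))
    return anomalies

# 99% of the readings look fine; only the spike is worth an engineer's attention.
readings = [100.0 + (i % 5) * 0.5 for i in range(30)]
readings[25] = 140.0
print(find_anomalies(readings))
```

The point is not the statistics but the workflow: rather than eyeballing every sample, the analyst is handed only the exceptions.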
Progressing along the spectrum from descriptive to diagnostic to predictive analytics, business intelligence can be used to see ahead, plan, and make decisions when there are too many variables to evaluate the best course of action without help from advanced technology.
Prescriptive analytics tools develop business outcome recommendations by combining historical data, business rules and objectives, mathematical models, variables, and machine-learning algorithms. This enables virtual experimentation to significantly reduce both risk and cost. Beyond insight, prescriptive analysis can foresee possible consequences of action choices, and conversely, recommend the best course of action for the desired outcome(s).
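The “virtual experimentation” described above can be illustrated with a toy decision model. The candidate actions, costs, breach probabilities, and penalty below are all hypothetical, and real prescriptive tools combine far richer rules and machine-learning models; this sketch only shows the shape of the idea – score each possible action against a business objective, then recommend the best one.

```python
# Sketch of prescriptive analysis: virtually "experiment" with candidate
# actions and recommend the one that best meets a business objective.
# All actions, costs, and risk figures are hypothetical.
ACTIONS = [
    {"name": "add 2 VMs",           "monthly_cost": 400, "risk_of_breach": 0.02},
    {"name": "do nothing",          "monthly_cost": 0,   "risk_of_breach": 0.30},
    {"name": "move to larger host", "monthly_cost": 900, "risk_of_breach": 0.01},
]

BREACH_PENALTY = 5000  # assumed business rule: cost of an SLA breach

def expected_cost(action):
    # Combine the hard cost with the risk-weighted penalty.
    return action["monthly_cost"] + action["risk_of_breach"] * BREACH_PENALTY

def recommend(actions):
    return min(actions, key=expected_cost)

best = recommend(ACTIONS)
print(best["name"], expected_cost(best))
```

Here “do nothing” is cheapest up front but carries the highest expected cost once risk is priced in – exactly the kind of consequence-aware recommendation the paragraph describes.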
Adoption of more mature levels of advanced analytics is still nascent; Gartner surveys show most organizations are assessing current and past performance (descriptive analytics), with only 13 percent making extensive use of predictive analytics. Under 3 percent use prescriptive capabilities. Growth is certain. Gartner analyst Rita Sallam claims, “Those that can do advanced analytics on top of Big Data will grow 20 percent more than their peers.”
It seems clear that a comprehensive understanding of application/service performance – as experienced by end users – must be deeply integrated with data centre technology management tools, processes, and measurement data in order to ensure that automatic provisioning of resources is simultaneously cost-effective and service-risk minimising.
Automated provisioning of storage, bandwidth, and computing power is indeed one of the primary benefits of virtualisation and a powerful feature of highly virtualised and dynamic, hybrid-cloud environments. But without integrated business intelligence as a decision driver, all that is likely to happen is that sub-optimal decisions will be automatically implemented, more quickly than ever – with no assurance of continuous, acceptable service performance, let alone optimised cost (CapEx and OpEx).
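One way to make provisioning decisions intelligence-driven rather than merely fast is to let a forecast, not a reactive threshold, trigger the action. The sketch below uses a naive least-squares trend projection over hypothetical CPU utilisation samples; the figures, the 80% ceiling, and the look-ahead horizon are all illustrative assumptions.

```python
# Sketch: a simple forecast driving a provisioning decision, instead of
# reacting only after utilisation has already breached a threshold.
# The utilisation samples, 80% ceiling, and horizon are illustrative.

def linear_forecast(history, steps_ahead):
    """Fit a straight line through the samples by least squares
    and extrapolate `steps_ahead` intervals into the future."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + steps_ahead)

cpu_util = [40, 43, 45, 49, 52, 55, 58, 61]  # % utilisation per interval
projected = linear_forecast(cpu_util, steps_ahead=8)
if projected > 80:  # provision *before* the breach, not after it
    print(f"provision ahead of predicted {projected:.0f}% utilisation")
```

A real environment would feed richer models and business context into the same decision point; the design choice being illustrated is simply that the automation acts on a predicted future state rather than a current snapshot.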
IT teams will be more successful if they can look at the right data in combination with powerful analytics. To succeed, IT must understand what is important to the business and deliver accurate, strategic advice – sometimes in a matter of seconds. The “descriptive, diagnostic, predictive, prescriptive” spectrum of analytic approaches reinforces that it is not just about getting good information; it is about knowing what to do with that information, when, and importantly, why.
This journey can start at whatever level of tool, process, and skill maturity already exists within the IT environment, and still yield immediate, “game-changing” results towards complete Data Centre Optimisation. So-called “rip and replace” approaches are not required to get started and achieve rapid results. Simple first steps include:
- Inventory existing IT management tools, processes and metric data.
- Map to asset and service catalogue information to ensure context.
- Apply the spectrum of analytics to the data – in the context of the applications and services.
- Identify “low hanging fruit” projects, doing initial analysis manually.
- Ultimately, select appropriate targets for more automated solutions that scale with the business.