Technology development has long followed a familiar pattern: Only large companies could afford the capital investment needed to create new hardware or software, and only similarly large companies could afford to buy it. Even the advent of broadband Internet technology didn’t solve this problem; while the cost to develop software dropped, for example, the cost to host applications did not.
The rise of business-class cloud computing deployments, however, has largely negated this initial investment requirement, effectively leveling the playing field and changing the role of business IT.
A consumerised world
Two separate cloud-based trends have conspired to alter the IT landscape. First is the democratisation of technology; it’s now possible for employees to access resources with capabilities similar to those available to IT admins, but without IT permissions. Second is consumerisation. Everything – from computing resources to storage space to server infrastructure – is available on-demand, and for a price. The result is a market where IT pros can pick and choose what they need, how much of it, and when.
Consumerisation has rapidly become a buzzword for providers, who use it as a springboard to sell their version of cloud-based technologies. Five years ago this bred a market full of competing solutions which refused to play well together; now, providers recognise the benefits of getting along – with so many choices, IT professionals won’t settle for a solution that doesn’t provide data portability or isn’t clear about pricing structure. In short, the cloud has removed common barriers to application and technology development.
Devices versus applications
Much noise is made about bring your own device (BYOD). Some of this comes from employees, determined to use their personal smartphone or tablet at work, and some comes from IT professionals, understandably concerned that allowing a multitude of consumer devices through the server door will ultimately burn down the house. But despite the hype, this drive for increased mobility is a secondary effect of the cloud: Bring your own application (BYOA) is its true strength.
Lowered costs to both develop and purchase applications mean business users can easily find – or create – what they need, instead of waiting months to get their hands on a working version. Public cloud servers now support thousands of concurrently running apps, letting companies pick and choose which particular set works best for their needs. Accounting applications, for example, don’t have to come from the same provider as supply chain apps or customer relationship management (CRM) solutions; the line between established provider and startup company is beginning to blur.
For IT professionals, the multitude of apps necessitates a change in perspective. No longer do they act as gatekeepers, turning away applications which don’t align with department goals or may prove more trouble than they’re worth. Instead, IT pros are now consumers themselves, working alongside employees and executives to ensure their network remains secure.
The Internet of Things
Consumerisation also represents a major step forward in what Kevin Ashton (way back in 1999) termed the “Internet of Things.” The underlying concept here is that when every object connected to a network – including hardware, software and people – is uniquely identified, the result is transformative. Supply and demand could be precisely controlled, and resources could be managed in such a way as to prevent over- or under-use. The Internet of Things has been linked with the generation and processing of big data; mining these identified resources offers real benefits, for example predictive decision making.
But critical to the concept is adding more “things,” and finding ways to monitor them. The distributed, elastic resources offered by cloud computing are perfect for this task, as they greatly enhance the ability of companies to both add devices to the network and keep track of what they’ve added. When only big-money players had access to new technology, they controlled which “things” were added, and growth was slow. Now, broad access to applications by IT pros and employees alike has changed the landscape and helped make it infinitely more complex.
A consumer-driven future?
What’s the bottom line for a consumer-driven IT future? That cost and speed are critical. Much app development software is now free (or nearly so), and the time from concept to development to deployment has shortened significantly. There is some understandable resistance on the part of IT admins, concerned that their jobs are suddenly in peril as employees gain the ability to access and perform similarly high-level functions.
But the cloud is more than just an easy way onto networks for users, or a fast-track to application deployment. It represents a step forward in the development of the Internet of Things, an Internet which fairly cries out for IT oversight. Big data, predictive analytics and employee education – these are the new roles of IT administrators, all enabled by the cloud. Being a consumer has its perks, after all.