If you were a fly on the wall at any IT department’s strategy planning meeting for 2011, chances are that virtual desktop infrastructure (VDI) would be on the agenda.
While the financial arguments rage (quite correctly) about costs and benefits, it is the good old foot soldiers of the IT department who will be rapidly pushed from the trenches into no-man’s land: setting up the new system under fire, trying to ensure that it works on day one, scales up rapidly from day two and then keeps running smoothly thereafter. Why all the interest and pressure? To put it simply, in today’s economic environment there has never been greater emphasis on “getting it right first time”.
Why? Well many organisations are going to need increased flexibility from their staff in order to meet a combination of cost reductions and changes to working patterns. Savings can be made through hot-desking, reducing premises costs, and by making staff more efficient by reliably delivering them applications and data securely at their point of use. More flexible working applies both to when and from where employees work, and how easily new users can be given the tools they need to do their job.
Technology that empowers users to work remotely (as long as it is simple, quick and secure) will also improve the standing of the IT department. Indeed, the perception of IT itself can be transformed from restrictive to progressive – but only as long as supply stays that one step ahead of demand.
At the risk of generating sympathy for the IT department, the cross-fire is becoming more intense. Having successfully held off upgrading desktop hardware to Vista, the IT department now finds the long-promised desktop refresh under fire both from budget reductions and from the growing range of options around virtualisation and cloud computing. As if that wasn’t enough, the carbon emission reduction targets are still there, governance and information security remain on the CIO’s agenda, and the executive team have all taken the consumerisation debate so seriously that they’ve decided to try out iPads.
There’s a temptation to wave a white flag and simply shove everything up into the cloud, but that’s a bit like shoving everything up into the attic when selling your house. It all looks neat and tidy at first, but you quickly find that essential items are hard to reach…
As a consequence many organisations are looking for a journey with a number of stages rather than a single destination. So the future might be desktops on demand with streamed subscription apps and centralised data, all re-charged on a usage model, but the reality for most organisations is a desktop and application transformation that simply lifts the computing environment away from the device.
If this can make the user’s working environment less costly and more flexible (and also deliver high-speed WAN performance, multimedia and support for all their peripherals) it might just get our beleaguered troops over that last mile of mud and into the warm. So what do organisations need to do to achieve what the industry increasingly calls “abstraction”?
Virtualising and centralising need to be at the heart of any strategy; allowing the organisation to deliver software and desktops as a service. But effective automation and systems management are required to ensure that every virtual machine is built to a known standard and that the standard can be updated in one place with changes distributed automatically.
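To make the “known standard” idea concrete, here is a minimal sketch (in Python, with an invented inventory and build definition) of the kind of check an automation layer performs: every virtual machine is compared against a single golden build, so a change made in one place can be verified – or pushed – across the whole estate.

```python
# Hypothetical sketch: the golden build and VM inventory below are
# invented for illustration; a real systems-management tool would
# pull this data from its own agents or APIs.

GOLDEN_BUILD = {
    "os": "Windows 7 SP1",
    "agent_version": "4.2",
    "antivirus": "enabled",
}

vm_inventory = {
    "vdi-001": {"os": "Windows 7 SP1", "agent_version": "4.2", "antivirus": "enabled"},
    "vdi-002": {"os": "Windows 7 SP1", "agent_version": "4.1", "antivirus": "enabled"},
}

def drift_report(inventory, standard):
    """Return, per VM, the settings that differ from the golden build."""
    report = {}
    for name, config in inventory.items():
        drift = {key: (config.get(key), expected)
                 for key, expected in standard.items()
                 if config.get(key) != expected}
        if drift:
            report[name] = drift
    return report

# vdi-002 is flagged because its agent is behind the standard
print(drift_report(vm_inventory, GOLDEN_BUILD))
```

The point is the single source of truth: updating `GOLDEN_BUILD` in one place is enough for every non-conforming machine to show up (or be rebuilt) automatically.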
Personalisation is another attractive abstraction target. Separating the individual user’s settings from their device and operating system can have significant performance and end-user experience benefits, particularly when users may access their desktops in a number of ways, such as by laptop, desktop and thin client.
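The separation of user settings from device and operating system can be pictured as a simple layering, sketched below in Python. The setting names are invented for illustration; real personalisation products manage this far more granularly.

```python
# Minimal sketch of the personalisation idea: the user's settings are
# held as a separate roaming layer and composed onto whichever desktop
# (laptop, VDI session, thin client) they happen to log into.

machine_defaults = {"wallpaper": "corporate", "printer": "floor-2", "locale": "en-GB"}
user_layer = {"printer": "home-office", "signature": "Best regards, Sam"}

def compose_desktop(defaults, user_settings):
    """Apply the roaming user layer on top of the machine's defaults."""
    desktop = dict(defaults)
    desktop.update(user_settings)   # the user's layer always wins
    return desktop

session = compose_desktop(machine_defaults, user_layer)
print(session["printer"])   # the user's own choice follows them
```

Because the user layer is small and travels independently of the operating system build, the same settings follow the user across very different endpoints.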
Not surprisingly, storage is one of the most urgent considerations. It’s well known that the business case for many early VDI projects fell foul once the cost of scaling storage from pilot to production became known: desktop virtualisation at any scale requires a different profile of storage, and simply transferring disk images from cheap SATA drives in PCs to expensive SAN fabric is not an option for most organisations.
What’s more surprising is the range of answers to the question “how do I get this storage cost under control?”. Thin provisioning, cloning, de-duplication, PAM cards, caching and a range of products and upgrades can all be applied – at a cost. But for most customers, asking their storage vendor which one is right is a little too much like opening your wallet and closing your eyes at the same time. In an emerging market it’s important to get advice on the range of choices, implications and costs.
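De-duplication is worth a brief illustration, since it explains why hundreds of near-identical desktop images need not cost hundreds of images’ worth of disk. The toy sketch below (block size and images invented for illustration) stores each unique block once and keeps only per-image pointers.

```python
# Illustrative block-level de-duplication: two desktop images that
# share most of their blocks are stored as one set of unique blocks
# plus per-image lists of block hashes.

import hashlib

BLOCK_SIZE = 4  # toy value; real systems use e.g. 4 KB blocks

def dedupe(images):
    """Store each unique block once; return the block store and
    per-image lists of block hashes (the 'pointers')."""
    store = {}
    layouts = {}
    for name, data in images.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)
            hashes.append(digest)
        layouts[name] = hashes
    return store, layouts

# Two desktop images that differ only in their final block.
images = {
    "desktop-a": b"WINDOWSBASEIMAGEUSR1",
    "desktop-b": b"WINDOWSBASEIMAGEUSR2",
}
store, layouts = dedupe(images)
raw = sum(len(d) for d in images.values())      # 40 bytes raw
stored = sum(len(b) for b in store.values())    # 24 bytes after dedup
print(raw, stored)
```

The saving grows with the number of desktops: the shared base-image blocks are stored once no matter how many users clone them, which is why de-duplication features so heavily in VDI storage discussions.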
Finally, at the client end, customers increasingly turn their attention to the costs and environmental issues of power consumption, cooling, disposal and technology refresh; any move towards a thinner client has significant appeal, particularly as the local multimedia capability of the devices increases.
Only with a good understanding of the available technologies, together with existing investments and strategies, can an organisation choose the right solution with any degree of confidence. Finding the perfect solution for individual companies is not straightforward, although the simple rule of ‘you get what you pay for’ tends to apply. So what are the options?
Centralis is a long-term advocate of Citrix, and I make no apology for highlighting the vendor’s advantage: the scope of its integrated suite allows a mix of delivery models under a single XenDesktop licence, together with a range of networking products optimised for security, performance and WAN delivery.
Having pretty much created the commercial virtualisation market, VMware continues to be the vendor of choice for most enterprise-scale server virtualisation projects. With a long-established record in high-availability, site recovery and fault tolerance, VMware has earned its place in the corporate data centre.
The VMware View desktop virtualisation offering has seen a slower adoption, however. From both a product and services perspective the difference between “back end server” and “application and desktop” virtualisation has led to a steep learning curve, but VMware now has a product very much focussed on the VDI space and a new protocol (PCoIP) on which to focus innovation and development. VMware’s strength is its “pure-play” VDI focus, and for now the cost effectiveness of VMware View lies principally in this area.
Of course, anyone reckoning without Microsoft as a major player in any corner of the infrastructure market is likely to be making a mistake. Although Microsoft has allied itself very strongly with Citrix on desktop virtualisation in all but the smallest accounts, its Hyper-V hypervisor is being adopted in many organisations.
In addition, System Center is seeing a rapid uptake in medium through to enterprise-sized organisations as the most effective cross-platform management tool. Recent changes in Microsoft’s client operating system and client access licensing with regard to virtual desktops have removed one of the biggest barriers to adoption, although organisations can still fall foul of application software licensing if not advised correctly.
Microsoft’s acquisitions around highly graphical content suggest that they may well have a long-term interest in developing their own end-to-end solution, but for now the full weight of Redmond is behind the Citrix XenDesktop offering, which is another compelling endorsement.
A range of application virtualisation options is also available, including App-V from Microsoft, ThinApp from VMware and Citrix streaming. Virtualising the applications allows them to be streamed into the virtual machines as they are provisioned, reducing the number of builds needed and reducing resource requirements and overheads. Application virtualisation can also eliminate application conflicts and allow multiple versions of an application (e.g. Office 2003 & 2007) to be delivered to the same device.
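The streaming model behind these products can be sketched very simply: rather than installing the whole package up front, blocks are fetched on first access, so a freshly provisioned VM can launch an application before the full package has arrived. The class and block names below are invented for illustration.

```python
# Toy sketch of on-demand application streaming: the package is pulled
# block-by-block on first use and cached locally thereafter.

class StreamedPackage:
    def __init__(self, remote_blocks):
        self.remote = remote_blocks   # stands in for the streaming server
        self.cache = {}               # blocks fetched so far

    def read(self, index):
        """Fetch a block on first access; serve from local cache after."""
        if index not in self.cache:
            self.cache[index] = self.remote[index]
        return self.cache[index]

package = StreamedPackage({0: b"launcher", 1: b"help-files", 2: b"clip-art"})
package.read(0)                       # only the launcher is pulled down
print(len(package.cache), "of", len(package.remote), "blocks resident")
```

Because only the blocks a user actually touches are transferred, provisioning is fast and the per-VM footprint stays small – the same property that lets one packaged build serve many differently configured desktops.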
Any organisation looking to become more agile needs a robust, scalable personalisation solution – particularly if they are planning a move to Windows 7 from Windows XP and seeking to retain users’ personal settings.
The latest thin clients from WYSE and IGEL, optimised for desktop virtualisation, are attractive to any client looking for low-energy solutions to reduce their carbon emissions. These devices consume about as much power as a Christmas tree bulb and, having no moving parts, are silent and very durable.
Seamlessly combining these with the power of a full XP or Windows 7 desktop is a compelling proposition. The emergence of client-side virtualisation for mobile devices later this year will add a further dimension to strategic virtualisation and may finally cure IT of one of its enduring headaches: the cost and complexity of laptop support.
But I would be guilty of being disingenuous if I did not flag up the potential hazards of virtualisation as an abstraction tool. Without a clear end goal and strategic vision the journey is difficult to plan and success is hard to quantify. Increased pressure and scarce resources mean that planning and preparation must be perfect, or else performance and productivity will suffer. A small budget put aside for analysis can prevent costly mistakes.
Of course there can be a temptation to go fast and cheap, but for many users abstraction will touch every part of their working experience and skimping on product, strategy, analysis or design is likely to deliver some very unhappy abstracted users.
The technology to reduce costs and increase flexibility is currently available on the market. The next step is to convince your financial controller to read from the same menu and invest in that technology. Maybe ask him just after his computer has crashed for the fifth time…