With the April 2014 end-of-life deadline for Windows XP looming large on the horizon, 2012 was the year that saw the majority of organisations on the migration journey to Windows 7. Certainly for those that have yet to start, time is now of the essence.
The move to XP took larger organisations at least a couple of years to complete, because it was a much richer environment and introduced a whole new set of features. Despite the perception in some quarters that the move from XP to Windows 7 will be quicker, there are several reasons why it will follow a similar timescale.
Firstly, because many organisations will have been using XP for the best part of a decade, they will most likely have a vast collection of disparate legacy applications to port over and an increasingly diverse set of users to cater for. Secondly, there has been a whole host of additional pressures building in the form of increased demand for remote access, more flexible ways of working, and the emergence of tablet PCs and BYOD (bring your own device).
These trends gathered real momentum in 2012, meaning organisations now face a knowledge gap not just in terms of how they are going to accommodate their current applications and user profiles within the new Windows 7 environment, but also in terms of the level of complexity that their migration will entail.
Thirdly, there are many more possibilities emerging as cloud computing and virtualisation technologies mature, but there is yet to be a clear leader in terms of solutions or desktop delivery model. So any transformation at the desktop remains a question of matching the immediate needs of the business to long-term strategic goals. This is why it makes sense for organisations to enlist an experienced third party to help shape and manage their migration.
Workspaces and work-styles
One of the major changes witnessed in 2012 was the explosion in the number of devices entering the workspace and the continued move to more flexible ways of working. The adoption of tablet PCs and smart phones has gained incredible speed and brought about a significant change in the way employees access information.
Business users now want mobile access to their applications and data wherever they are, and on whatever device they have to hand – even within their own office. It’s now commonplace for users to take their tablets into meetings so that they can access information on-the-fly.
In addition, the instant-on, pick-up/put-down tablet experience has meant end users are much more flexible about how and when they work. Much of this can be attributed to the social media wave, with tablets and smart phones enabling users to tweet and post updates as often as they like. This constant ‘snacking’ is reflected in the form of more flexible work styles, as it’s now a lot easier for employees to dip in and out of work and personal life with these instant-on devices.
Given that traditional desktops and laptops take time to boot up, which inhibits the ‘snacking’ style of work, businesses must accommodate tablets and smart phones within their desktop strategy – especially if they want to be able to recruit and retain the next generation of talent.
The 2013 watershed
The fact that the average user often has three devices (smart phone, tablet, desktop PC) and that work styles are evolving makes 2013 a watershed for the desktop environment. That we are arriving at this watershed was validated by Microsoft’s move into the tablet PC market and the launch of the Windows 8 operating system with its touch-based user interface.
This was driven as much by the success of Microsoft’s rivals as it was by the need to address the changes in user demand. As Apple has so successfully demonstrated, the only way to innovate in this area today is to do so at hardware, operating system, and application level.
Although the majority of enterprise applications today are still based on the traditional-style Windows user interface, a growing number will move to a touch-based mode of operation as the adoption of Windows 8 grows. Again, this change will be driven by user demand and ultimately will influence corporate desktop strategy.
The key challenge moving forward will be delivering applications and data in a way that fits users’ needs. This requires a firm understanding of the applications deployed within an organisation and the various types of user profile and devices being supported. So not only do organisations now have to implement a new operating system in the form of Windows 7, they also have to transform the way they deliver their applications and data.
In effect, this will mean moving away from fixed-desktop XP installations, to a more centralised model of delivery. At the same time, it means finding the most cost-effective way to enable this transformation, which is where the growing range of virtualisation and cloud technology options comes into play.
A question of delivery
Organisations that haven’t yet started to implement a shared/hosted desktop delivery model via terminal services/RDS could well be tempted by the desktop virtualisation or virtual desktop infrastructure (VDI) route, given that these technologies deliver a familiar desktop-based operating system. However, the major barrier so far has been the cost of storage, and the fact that the density of users that can be achieved via desktop virtualisation is significantly less than with a terminal services/RDS-based solution.
Nevertheless, desktop virtualisation or VDI makes it possible to offer a lot more flexibility to end users in terms of enabling them to install their own applications – something that is not possible with terminal services/RDS. Major hypervisor vendors have also made considerable progress in reducing storage requirements by adding caching to their solutions, while several niche vendors are targeting storage optimisation using SSDs or intelligent software.
While the VDI versus terminal services/RDS versus desktop virtualisation debate will continue to rumble on, the growth in the number of enterprise tablet users has served to muddy the waters further. Here, VDI could be perceived as the easier desktop delivery option when looking to ensure application compatibility, as it means organisations can use the 32-bit version of Windows (RDS is 64-bit only) and are able to publish applications to tablets.
Vendors such as Citrix already support publishing applications from the desktop to tablets, and moving forward there will be more focus on enabling this capability, rather than necessarily trying to deliver the desktop in its entirety.
Back to the future
Virtualisation technology is also seeing on-going development in the area of persistent desktops and imaging techniques. By separating the user changes from the base image using layering techniques it is possible to have the flexibility of implementing a persistent desktop whilst retaining the benefits of a single gold image in terms of storage, caching and management. A substantial number of organisations are still adopting the persistent model, as it enables them to deliver a similar experience and level of control to traditional desktops that users are familiar with.
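The layering idea described above can be sketched in a few lines. This is an illustrative model only – not any vendor’s actual API – showing how per-user writes shadow a shared, read-only gold image, copy-on-write style, so one base image can serve every user while each still keeps a persistent desktop:

```python
# Illustrative sketch (hypothetical, not a real hypervisor API): a persistent
# desktop modelled as a read-only gold image plus a per-user write layer.
class LayeredDesktop:
    def __init__(self, gold_image, user_layer=None):
        self.gold = gold_image          # shared, read-only base image
        self.user = user_layer or {}    # per-user delta: all writes land here

    def read(self, path):
        # User changes shadow the base image, like a copy-on-write overlay.
        if path in self.user:
            return self.user[path]
        return self.gold[path]

    def write(self, path, data):
        # Writes never touch the gold image, so one base serves all users.
        self.user[path] = data


gold = {"C:/Windows/system32": "base OS files", "C:/apps/office": "v1.0"}
alice = LayeredDesktop(gold)
alice.write("C:/apps/office", "v1.0 + alice's macros")

print(alice.read("C:/apps/office"))   # alice's customised copy
print(gold["C:/apps/office"])         # shared image untouched: "v1.0"
```

Because the gold image never changes, storage, caching and patch management only ever deal with one base, while each user’s delta stays small and personal.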
However, in terms of overall efficiency, it’s much better to have pools of desktops – i.e. non-persistent desktops or terminal services/RDS. Moreover, one of the issues with persistent desktop delivery is that organisations can end up either managing multiple images, or simply providing power users with a customised desktop that, in effect, means they are back to managing desktops on a one-to-one basis.
Although both forms of virtualisation are in essence quite similar, the benefit of the terminal services/RDS model is that there is only one instance of that operating system code serving many users, thus it is inherently going to deliver a higher density of users at potentially less cost.
VDI could, in the long term, become the preferred option for centralised desktop delivery, provided the cost of doing so moves closer to that of the hosted/shared services model. In the short term however, it seems likely that organisations will continue to employ a mixture of virtualisation and conventional technologies to centralise desktop delivery. Laptops for example, will always need a degree of local operation.
Enterprise app stores
As desktop delivery becomes increasingly centralised within the data centre (whether on-premise or in the cloud), we will start to see many more apps being delivered to devices. This includes more native apps, those delivered under the software as a service (SaaS) model, and those enabled via web browser.
The issue today is that traditional Windows enterprise applications are not designed for a touch-based user interface and do not translate well as apps for the various different platforms. This is why organisations will be looking to develop more native apps and use HTML5 to enable browser-based delivery of business information to the increasingly large installed base of Android and iOS-based tablets and smart phones.
The demand for more ‘appified’ access to business information will see a growing number of organisations develop app stores of their own, as opposed to using those of Apple or Google. Here, the thinking is that end-users will be able to subscribe to the apps they require for their specific platform and the type of information they wish to access.
However, enterprise app stores will need to be built with more intelligence and control than consumer-focused storefronts. Corporate IT will need to be able to identify the device and the platform on which it is based – as well as provide the level of security required for compliance. Citrix is working towards enabling this type of enterprise storefront model with its CloudGateway solution. In essence, this type of solution can be contextually aware with regards to device type, apps supported and the data being delivered.
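The contextual awareness described here amounts to a policy decision per device. A minimal sketch, with entirely hypothetical app names and policy fields, of how an enterprise storefront might filter the catalogue by platform and compliance posture before offering apps for subscription:

```python
# Hypothetical policy table for an enterprise app store: which platforms
# each app supports, and whether the device must be encrypted to receive it.
POLICIES = {
    "crm":    {"platforms": {"ios", "android", "windows"}, "needs_encryption": True},
    "wiki":   {"platforms": {"ios", "android", "windows"}, "needs_encryption": False},
    "ledger": {"platforms": {"windows"}, "needs_encryption": True},
}

def apps_for(device):
    """Return the apps this device may subscribe to, given its context."""
    allowed = []
    for app, policy in POLICIES.items():
        if device["platform"] not in policy["platforms"]:
            continue  # app not built for this platform
        if policy["needs_encryption"] and not device["encrypted"]:
            continue  # device fails the compliance requirement
        allowed.append(app)
    return sorted(allowed)

tablet = {"platform": "ios", "encrypted": False}
laptop = {"platform": "windows", "encrypted": True}
print(apps_for(tablet))  # ['wiki'] - unencrypted, so no CRM
print(apps_for(laptop))  # ['crm', 'ledger', 'wiki']
```

A real storefront would of course draw the device posture from mobile device management rather than a dictionary, but the shape of the decision is the same.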
Having an app store and centralised data enables organisations to start to move into areas such as identity management, dynamic provisioning of accounts, and delivery of web and SaaS-based apps. Users will not only have instant access, but single sign-on, and self-service selection of applications with automated provisioning of their accounts too. Workflows will be managed by IT to allow the business to provide approvals that enable accounts to be provisioned and removed in an on-demand fashion.
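The approval-then-provision workflow above can be sketched as a small state machine. The states and method names here are assumptions for illustration; a real system would persist each transition and call the directory or SaaS provider’s API when an account is created or removed:

```python
# Minimal sketch of an approval-gated provisioning workflow (hypothetical
# states; real systems persist these and call provider APIs on transition).
class AccountRequest:
    def __init__(self, user, app):
        self.user, self.app = user, app
        self.state = "requested"

    def approve(self, manager):
        # The business grants approval before IT provisions anything.
        if self.state != "requested":
            raise ValueError(f"cannot approve from state {self.state!r}")
        self.approver = manager
        self.state = "approved"

    def provision(self):
        if self.state != "approved":
            raise ValueError("approval required before provisioning")
        # Here automation would create the account via the app's API.
        self.state = "provisioned"

    def deprovision(self):
        # Removal is on-demand, e.g. when the user leaves the business.
        self.state = "deprovisioned"


req = AccountRequest("alice", "crm")
req.approve(manager="bob")
req.provision()
print(req.state)  # 'provisioned'
```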
Intelligent data and personal cloud
Finally, one of the challenges brought about by the consumerisation of IT and BYOD is the fact that an increasing proportion of devices entering the workspace today are owned by the user rather than the organisation. The conventional approach of locking a device down and controlling, via enforceable policies, the applications users are allowed to run will no longer be applicable.
We are therefore going to see a move away from IT managing the apps and the device, towards controlling user access and, more importantly, the data being delivered. That’s why I believe we are going to see the emergence of an ‘intelligent data layer’ – data encapsulated by a layer of encryption that enables corporate IT to control through policy how, when and where it is accessed, given that it is not always possible to know the type of device, or even the applications, being used.
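The core of such a layer is that the data carries its own policy and is only unwrapped when the access context satisfies it. A toy sketch of that envelope idea – the field names are invented, and base64 here is merely a stand-in for real encryption and key management:

```python
import base64
import json

# Toy sketch of an 'intelligent data layer': the document travels inside an
# envelope carrying its own access policy, and is only unwrapped when the
# policy allows. base64 is a placeholder for real encryption.
def wrap(payload, policy):
    envelope = {
        "policy": policy,
        "body": base64.b64encode(payload.encode()).decode(),
    }
    return json.dumps(envelope)

def unwrap(blob, context):
    envelope = json.loads(blob)
    policy = envelope["policy"]
    if context["location"] not in policy["allowed_locations"]:
        raise PermissionError("policy forbids access from this location")
    lo, hi = policy["allowed_hours"]
    if not lo <= context["hour"] < hi:
        raise PermissionError("policy forbids access at this time")
    return base64.b64decode(envelope["body"]).decode()


blob = wrap("Q3 forecast", {"allowed_locations": ["office", "vpn"],
                            "allowed_hours": (7, 20)})
print(unwrap(blob, {"location": "vpn", "hour": 10}))  # 'Q3 forecast'
```

The point is that the policy check happens wherever the data lands, so IT retains control even on a device it has never seen.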
And with most end users now equipped with three devices, we are also going to see a lot more of what is touted as the ‘personal cloud’. This enables individuals to sync data between their devices – and not just for email, but personalisation of their desktop environment (passwords, favourites etc.), and documents and files too. Citrix is enabling this with ShareFile, as is Microsoft with SkyDrive and Windows 8. In the future, enterprise app stores will also embrace this data-syncing functionality to be able to deliver the data as well as the applications.
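At its simplest, this kind of sync is a merge in which the most recently modified copy of each item wins and is pushed to every device. A hypothetical sketch (real services also handle conflicts, deletes and partial transfers):

```python
# Hypothetical last-writer-wins sync across a user's devices.
# devices: {device_name: {item: (version, mtime)}}
def sync(devices):
    """Merge the newest copy of every item onto every device."""
    latest = {}
    for store in devices.values():
        for item, (version, mtime) in store.items():
            if item not in latest or mtime > latest[item][1]:
                latest[item] = (version, mtime)
    for store in devices.values():
        store.update(latest)  # every device ends up with the newest copies


devices = {
    "phone":  {"favourites": ("v2", 200)},
    "tablet": {"favourites": ("v1", 100), "report.doc": ("draft", 150)},
}
sync(devices)
print(devices["phone"]["report.doc"])   # ('draft', 150) - now on the phone
print(devices["tablet"]["favourites"])  # ('v2', 200) - newer copy wins
```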
Certainly what we have seen with the public cloud is much more automation in the desktop lifecycle. And in 2013, I believe this ‘orchestration’ capability will be coming to the enterprise, enabling IT to spin up and manage desktops more efficiently and in accordance with end user demand.