VDI Savings Can Kick In Now and Later

Virtual Desktop Infrastructure (VDI) isn’t right for every organization or for every user. Many users, teams, and entire companies rely heavily on powerful desktop or mobile hardware for content creation, CAD, analysis, number crunching, and more. However, where the focus is on mainstream computing needs, productivity applications, Internet access, and even deployment of specific client-server applications, VDI can save money both in the short term and over the complete hardware life cycle.

The beauty of VDI is two-fold:

  1. Inexpensive hardware can be used to deliver a customized desktop experience to many users quickly and easily.
  2. Management of the computing experience is centralized to the data center where IT staff can focus on security, reliability, and efficiency.

The end user hardware can come in many forms: so-called “zero clients” that simply drive a display and accept user input, thin clients that handle a small degree of local processing, or user-owned devices in a BYOD scenario that can still present standard desktops or applications regardless of the underlying operating system.

In the first two cases, there is little incentive for theft because the clients have little or no value on their own. The BYOD scenario eliminates application-compatibility problems on user devices and even supports repurposing aging hardware.

To a large extent, VDI transfers hardware acquisition costs to the data center, where relatively powerful servers or clusters are required to deliver satisfactory experiences to users in the form of virtual desktops or virtualized applications.
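As a rough illustration of what that backend sizing looks like, here is a minimal back-of-envelope sketch. Every figure in it (per-desktop RAM and vCPU allocations, host specs, overcommit ratio, user count) is an assumption chosen for illustration, not a vendor recommendation:

```python
# Back-of-envelope VDI host sizing. All numbers are illustrative
# assumptions, not vendor guidance -- size against real workloads.

HOST_RAM_GB = 512          # assumed RAM per virtualization host
HOST_CORES = 48            # assumed physical cores per host
DESKTOP_RAM_GB = 4         # assumed RAM per virtual desktop
DESKTOP_VCPUS = 2          # assumed vCPUs per virtual desktop
VCPU_OVERCOMMIT = 4        # assumed vCPU:core ratio for office workloads

# A host runs out of whichever resource is exhausted first.
ram_limited = HOST_RAM_GB // DESKTOP_RAM_GB
cpu_limited = (HOST_CORES * VCPU_OVERCOMMIT) // DESKTOP_VCPUS
desktops_per_host = min(ram_limited, cpu_limited)

users = 500
hosts = -(-users // desktops_per_host)  # ceiling division
print(f"~{desktops_per_host} desktops per host; {hosts} hosts for {users} users")
# Budget at least one spare host (N+1) for redundancy before quoting costs.
```

With these assumed numbers, a 500-user deployment needs roughly half a dozen hosts plus a spare, which is the acquisition cost that gets weighed against hundreds of individual desktops.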

However, even if initial cost savings are only marginal, long-term savings can be considerable. Thin-client hardware, for example, tends to have a much longer lifespan than traditional desktops or laptops and draws dramatically less power, and a handful of servers consume far less electricity than the hundreds of desktops they replace.
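To make the power claim concrete, here is a minimal sketch of the annual electricity difference. The wattages, hours, electricity rate, and server count are all assumptions for illustration:

```python
# Rough annual power-cost comparison: thin clients + shared servers
# vs. traditional desktops. All figures are illustrative assumptions.

USERS = 200
HOURS_PER_YEAR = 2000          # assumed ~8 h/day, 250 days/year
RATE_PER_KWH = 0.12            # assumed electricity rate, USD

DESKTOP_WATTS = 150            # assumed typical desktop draw
THIN_CLIENT_WATTS = 15         # assumed typical thin-client draw
SERVER_WATTS = 750             # assumed draw per VDI host
SERVERS = 3                    # assumed hosts for this population

def annual_cost(watts, units, hours=HOURS_PER_YEAR):
    """Yearly electricity cost in dollars for `units` devices."""
    return watts * units * hours / 1000 * RATE_PER_KWH

desktops = annual_cost(DESKTOP_WATTS, USERS)
# Servers run around the clock, so bill them for all 8,760 hours.
vdi = (annual_cost(THIN_CLIENT_WATTS, USERS)
       + annual_cost(SERVER_WATTS, SERVERS, hours=8760))
print(f"desktops: ${desktops:,.0f}/yr  VDI: ${vdi:,.0f}/yr  "
      f"savings: ${desktops - vdi:,.0f}/yr")
```

Even under these modest assumptions the power bill falls by more than half, and the gap widens as the deployment grows.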

Similarly, the ability to centrally manage and deploy desktop images is a major time and labor saver for IT staff, even for shops that previously relied on standard desktop imaging software. Software licensing can also yield savings: in application virtualization scenarios, only the applications a user actually needs are presented, and only those need to be licensed. If only a fraction of users need a costly specialty application, for example, only those seats require licenses rather than the entire fleet.

IT administrators looking at VDI as a possible solution, however, need to answer several questions in the affirmative before undertaking the move to virtual desktops:

  • Is the network infrastructure sufficiently robust to handle the constant traffic of delivering a desktop experience to end users? Even with modern compression and desktop-streaming technologies, the demands on a network will increase over standard desktop and laptop deployments (see the bandwidth sketch after this list).
  • Do decision-makers and stakeholders understand the need for over-engineered, redundant backend systems? At first blush, the hardware acquisition costs can look extraordinary, and naive managers may jump at the chance to cut them. Everyone involved needs to understand that long-term savings matter more than short-term gains produced by skimping on server hardware.
  • Do end users realize that their experience is going to change, in some cases drastically? Even with full desktop virtualization, centralized image management means users will most likely have far less control over their applications and settings. In application virtualization, the concept of a “desktop” may disappear entirely.
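On the first question, a minimal sketch of the aggregate bandwidth math follows. The per-session figures are assumptions for illustration and will vary widely by remoting protocol and workload:

```python
# Rough aggregate-bandwidth check for desktop streaming. Per-session
# figures are illustrative assumptions; measure your own protocol mix.

CONCURRENT_USERS = 200
AVG_KBPS_PER_SESSION = 300     # assumed office-work average
PEAK_KBPS_PER_SESSION = 2000   # assumed bursts (video, fast scrolling)

avg_mbps = CONCURRENT_USERS * AVG_KBPS_PER_SESSION / 1000
peak_mbps = CONCURRENT_USERS * PEAK_KBPS_PER_SESSION / 1000
print(f"average: ~{avg_mbps:.0f} Mbps  worst-case peak: ~{peak_mbps:.0f} Mbps")
# Compare these numbers against the uplinks between access switches and
# the data center, not just total core bandwidth.
```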

That said, properly designed and deployed VDI should increase overall reliability and uptime.

Done right, for the right users, VDI can be a real boon for both IT and end users; done wrong, for the wrong users, the cries to return to desktop computers could become very loud indeed.

Chris Dawson is a research analyst and writer for Ziff Davis, among others.

Tags: Technology, Virtualization