Security, Simplicity and Control Ease Make Desktop Virtualization Ready for Enterprise Uptake

Updated: February 02, 2010

Listen to the podcast on iTunes, read a full transcript, or download a copy. Sponsor: Hewlett-Packard.

Here to help us learn more about the role and outlook for desktop virtualization, we're joined by Jeff Groudan, vice president of Thin Computing Solutions at HP. The BriefingsDirect interview is conducted by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Groudan: There certainly are some things in the market that are driving a potential inflection point [for client virtualization]. Coming out of the recession, market conditions are prompting a lot of customers to take a fresh look at deployments they may have delayed or specific IT projects they had put on hold.

Just to put it into context, there was recently some data from Gartner. They feel like there are well over 600 million desktop PCs in offices today. Their belief is that over the next five years, upwards of 15 percent of those could be replaced by thin clients. So that's quite a number of redeployments and quite an inflection point for client virtualization.
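As a rough back-of-the-envelope check on the scale implied by the Gartner figures quoted above (the 600 million installed base and 15 percent replacement share are taken as quoted, not independently verified):

```python
# Figures as cited in the interview; treat them as the interview's claims,
# not independently verified market data.
installed_desktops = 600_000_000   # "well over 600 million desktop PCs in offices today"
replacement_share = 0.15           # "upwards of 15 percent ... replaced by thin clients"

thin_client_seats = int(installed_desktops * replacement_share)
print(f"{thin_client_seats:,} potential thin-client redeployments")  # 90,000,000
```

Even at the low end of "well over 600 million," that works out to roughly 90 million seats over five years, which is why the interview describes it as an inflection point.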

In addition, there has been an ongoing desire to increase security, along with a lot of new compliance requirements that customers have to address. And, as they look for ways to save on costs, they are constantly looking for more efficient ways to manage their distributed PC environments. All of these things are driving the high level of interest in virtualizing PCs.

One of the key benefits of client virtualization is the ability to keep all the data behind the firewall in the data center and deploy thin clients to the edge of the network. Those thin clients, by design, don't have any local data.

You're also seeing better performance on the hardware side and the infrastructure side. That's helping bring the cost per seat of a client virtualization deployment down into ranges that are a lot more interesting for large deployments. Last, and near and dear to my heart, you're seeing more powerful, yet cost-effective, thin clients that you can put on the desk and that really ensure end-users get the experience you want them to get.

Not an IT panacea

Our general coaching to customers is that client virtualization is not necessarily for everyone, for every user group, or for every application set. But, certainly, it fits environments where you need more manageability and more flexibility.

When you think about the cost savings of client virtualization, the savings usually come from long-term operational costs rather than from up-front acquisition costs.

You need higher degrees of automation to manage a large number of distributed PCs, and the benefits come from centralized control, reduced labor costs, and the ability to manage remote or hard-to-reach locations -- things like branches, where you don't have local IT staff. Those are great targets for early client virtualization deployments.

All of a sudden, the data-center guys need to be thinking about the end-user. The end-user guys need to be thinking about the data center. Roles and responsibilities need to be hammered out. How do you charge the capital expense versus operational expense? What gets budgeted where? My advice is: as you're thinking about the technical architecture and all of the savings end-to-end, you need to also be thinking about the internal business processes.

We look at this market in two ways: in the context of client virtualization and in the broader context of thin computing. Zeroing in on client virtualization, at HP we call it client virtualization; others call it desktop virtualization. It's the same animal.

We look at it as a specific set of technologies and architectures that disaggregate the elements of a PC, which allows customers to more easily manage and secure their environment. What we're really doing is taking advantage of a lot of the software capabilities that matured on the server side, from a server virtualization and utilization perspective. We're now able to deploy some of those technologies -- hypervisors and protocols -- on the client side.

The first point is that you don't want customers having to figure out how to architect this stuff on their own. If you think about PCs 20-25 years ago, customers didn't know how to architect a distributed PC environment. Over those 25 years, everybody has gotten good at it. We're still at the early stages with client virtualization.

Our specific objective is figuring out how to simplify virtualization, so that customers get past the technology, and really start to deliver the full benefit of virtualization, without all the complexity.

So our focus is to deliver more complete, integrated solutions, end to end from the desktop to the data center, and to lay it all out in reference designs so customers can very comfortably understand how to build out a deployment. They certainly may want to customize it, but we want to get them 80-90 percent of the way there just by sharing what we have learned.
