Understanding Virtual Desktop Infrastructure

VDI (or Virtual Desktop Infrastructure) has been all the rage in IT circles for the past couple of years. Once the virtualization of servers became the norm, looking to desktops as the next frontier in virtualization was natural. Unlike servers, however, desktops are affected by several major factors that need to be addressed before we simply jump on the VDI bandwagon. VDI is a great and wonderful technology but, like any technology, it has its place and needs to be considered carefully.

First we have to look at an important concept that affects VDI – the idea of shared computing. With servers we naturally assume that the services they provide are not for one-to-one consumption but will be utilized by many users. The same concept has always existed in the world of desktops, where it is generally referred to as terminal services. Terminal servers are the server world’s answer to the need for centralized desktop resources and have been standard since before the desktop PC even existed.

It is an interesting bit of computing history that explains how the Windows world interacts with the concept of terminal servers. Most operating systems, and nearly all that remain in use today, were designed from the ground up as multi-user systems. The idea that one user would sit in front of the computer as the “main” user while others were second-class citizens, or did not exist at all, was foreign; all users were considered equal. Windows, unlike the UNIX family for example, came from a world of single-user systems, originating with DOS and DOS/Windows, which were not multi-user. When Windows NT was developed as a multi-user system, a great many software developers treated it as they always had, producing software that would not function well, or often not at all, in a multi-user mode.

This software ecosystem is unique to Windows; it could effectively never exist on Linux, for example, because such software would simply be seen as broken given the way that ecosystem and its deployments work. The result is an interesting problem: some software and some tasks can be addressed easily with terminal servers, exactly as they would be tackled on any UNIX OS, while many other applications cannot be hosted on a terminal server and require a dedicated operating system instance for every user.

It is this historical factor, leading to a significant difference in software ecosystems, that created the fundamental need for VDI and explains why VDI arose as a unique phenomenon within the Windows world and remains, for all intents and purposes, exclusive to it. It is very important to understand that VDI arose conceptually as a means of addressing a need created by a deficiency in third party applications, not because of anything intrinsic to Windows itself in its current form, and not because VDI was a better approach to virtualizing or hosting end user desktop services. In fact, we could even look at VDI as an unfortunate kludge, needed only in situations where we want to virtualize or centralize desktop resources and some of the software that must run on those systems cannot run in a multi-user mode. VDI is a fallback mechanism for special situations, not a desired approach to virtualized or centralized end user services.

It is important to note that, due to the widespread use of and necessity for VDI, the investment in supporting technologies around it has reached the point where in many cases VDI might actually outperform terminal servers, even though architecturally this seems almost impossible. Basically, the incredible amount of research and development going into the hypervisor layer may be outpacing the equivalent components in the operating system itself, making for potentially better CPU and memory management and resource sharing. This is completely dependent on the situation, of course; every OS, hypervisor, and set of VDI tools is unique, as are the workloads being tested, so mileage will vary significantly.

Also of serious consideration, because of the aforementioned Windows-centric nature of the VDI concept, is licensing. If we were to look at VDI from the Linux perspective we would have little to no licensing concern and VDI would be weighed against traditional terminal services on technical merits alone, but this is effectively never the case. The biggest single factor in VDI decision making is Microsoft licensing.

VDI licensing is both expensive and complex. Companies wanting to consider the virtualization of Windows desktop resources have to carefully weigh the benefits against both the large cost of appropriate licensing and the potentially large overhead of license management. Moving into VDI will likely mean a great deal of expensive IT time dedicated to license research, monitoring, and training – an often overlooked aspect of licensing costs.
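To make that weighing exercise concrete, a back-of-the-envelope model can help. The Python sketch below is purely illustrative: every dollar figure and hour estimate is a hypothetical placeholder, not actual Microsoft pricing, which changes over time and varies by agreement.

    # Back-of-the-envelope VDI licensing cost model.
    # Every figure is a hypothetical placeholder, not real Microsoft
    # pricing; verify current terms against your own agreement.

    users = 100
    years = 3

    vdi_license_per_user_year = 100.0  # hypothetical per-user VDI subscription
    ts_cal_per_user = 120.0            # hypothetical one-time terminal services access license

    # The often overlooked soft cost: IT hours spent on license research,
    # monitoring and training, at a hypothetical loaded hourly rate.
    admin_hours_per_year = 40
    it_hourly_rate = 75.0

    vdi_total = (users * vdi_license_per_user_year * years
                 + admin_hours_per_year * it_hourly_rate * years)
    # Assume, for illustration, terminal services needs half the admin time.
    ts_total = (users * ts_cal_per_user
                + (admin_hours_per_year / 2) * it_hourly_rate * years)

    print(f"Hypothetical {years}-year VDI cost: ${vdi_total:,.0f}")
    print(f"Hypothetical {years}-year terminal services cost: ${ts_total:,.0f}")

Even with generous assumptions, the license management line item alone can rival the licenses themselves, which is exactly the hidden cost described above.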

VDI is a somewhat difficult concept to speak about in generalities because it is a slightly amorphous topic. If we virtualize a desktop, does it not become a server? If we use an operating system intended for server use, does that change what is and is not VDI? Is VDI based around use cases, licensing or product categories?

The real answer lies in the fact that to the industry VDI is technically one thing, but in practical terms to Microsoft, the only key licensing player in the space, it means something somewhat different. Technically, VDI is the virtualization of one-to-one “graphical end user” instances – a single virtual machine being used by a single user, much as a traditional physical desktop or laptop would be used. To Microsoft, whose concerns are slightly different from those of the industry, the term refers to the virtualization of Windows “desktop class” operating systems. If you virtualize Windows “server class” operating systems, Microsoft does not view you as doing VDI. We have to understand both views of the concept to keep from becoming confused. In fact, using Windows Server OSes to sidestep the VDI licensing that Windows desktop OSes require has become very standard and common. However, we have to remember the kludge nature of VDI: while this approach solves the failure of software to be multi-user in nature, it does not address the very real possibility that software was written with the expectation of a desktop-branded operating system. We are somewhat likely to find end user software that is either locked (intentionally or unintentionally) to desktop operating systems or is licensed only on those platforms.

The last major consideration in VDI decision making is that, unlike a server, which when virtualized is completely virtualized, a desktop cannot be treated the same way because there is always a physical component to it. The end user will always need a monitor to look at, a keyboard to type on, speakers to listen to, and so on. So when we are looking to move to VDI we must take care not to overlook the fact that we are not eliminating the need to purchase and maintain desktops; we are simply moving where the operating system will reside. We may redeploy older hardware to be used for remote access, move to thin clients or the newly termed and mostly meaningless zero clients, or have existing “fat clients” pull double duty, acting as remote access clients while also providing their own desktop services.

Certainly virtualizing the desktop offers many great opportunities and much value if we do it for the right reasons and understand the hows, whys and whens of VDI. Sadly, like so many technology trends, moving to VDI has become a knee-jerk reaction, undertaken without performing proper evaluations or developing a clear picture of how VDI will fit into our own environments. If we lack a clear reason for choosing VDI, it is very unlikely that we will deploy it in a positive manner.

Finally, it is very important that we consider the skill sets that will be required to properly move to VDI. From a purely technical standpoint, throwing a Windows 10 VM onto Hyper-V constitutes VDI, but from a practical perspective this is not how effective VDI will be designed. VDI not only requires the special licensing knowledge mentioned above but will typically involve rather unique knowledge of modern and very specialized VDI toolsets and products, shared storage as it applies to VDI, remote access protocols, thin clients or zero clients, and more. VDI deployments tend to be among the most technical and unique components of an infrastructure, leading to a great number of unknowns and challenges for any organization.

Linux Virtualization Deployment Advantage

As more and more businesses deploy virtualization broadly, we must step back and reconsider the opportunities presented by this shift in datacenter architecture.  Virtualization comes with new challenges and with potential not only for cost savings but for aggressive project implementation.  Small businesses especially, when using virtualization, find themselves prepared for projects that they could never have envisioned in the era of physical-only servers.

The big winners in this space of emerging virtualization opportunity are the open source operating systems such as Linux, OpenSolaris and FreeBSD.  These operating systems have unique opportunities that Windows and Mac OS X do not because of the way that they are, or can be, licensed.  Each of them has an option by which it is available completely for free – something that cannot be said of Windows or Mac OS X.

Traditionally, when purchasing a new server, a business would price out expensive hardware with relatively inexpensive software.  An enterprise operating system, such as Windows, would typically represent a relatively small percentage of the cost of a new server.  Even a small server would cost a few thousand dollars, while Windows Server can easily be purchased for less than one thousand dollars.  In this scenario a business looking to purchase a new server would see only a very small cost savings in opting for a “free” operating system, since introducing a new OS has its own risks and the bulk of the cost of the new server is in the hardware, which would still need to be purchased.

Given that equation, only a rare small business would consider the purchase of a non-Windows-based server.  The opportunity for failure is too high given the risk of change, and the cost savings are too small.  Today, though, virtualization is commonplace and becoming more ubiquitous every day.  Businesses virtualizing their infrastructure typically have excess capacity on their servers that is going unused.  As these businesses and their IT departments look to utilize this spare capacity, they will increasingly find that the cost of deploying a virtualized Windows Server remains high while the cost of deploying a virtualized Linux or OpenSolaris server is nominal – generally nothing more than the effort to do so, with no capital expenditure or its associated risk.
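The shift in that equation is easy to see with rough numbers drawn from the figures above (the dollar amounts are illustrative placeholders, not quotes):

    # Illustrative figures only, loosely based on the numbers in the text.
    hardware_cost = 3000.0   # "a few thousand dollars" for a small server
    windows_license = 900.0  # "less than one thousand dollars"

    # Physical era: the OS license is a small slice of each new server.
    os_share = windows_license / (hardware_cost + windows_license)
    print(f"OS license share of a physical server: {os_share:.0%}")  # about 23%

    # Virtual era: spare capacity means a new guest needs no new hardware,
    # so the OS license becomes effectively the entire marginal cost.
    print(f"Marginal cost of a new Windows guest: ${windows_license:,.0f}")
    print("Marginal cost of a new Linux guest: $0")

In the physical era a free operating system shaved perhaps a quarter off the price of a server; once spare virtualization capacity exists, it eliminates the marginal cost of a new server entirely.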

The ability to deploy new servers, at any time, without any cost is a significant advantage that companies have not begun to truly comprehend.  If a business wants a new web server, for instance, it can have one provisioned and built in thirty minutes without buying any licenses.  Having redundant virtualization hardware means that a redundant web server can be had as well – again without any capital cost.  Unlike with Windows (or other commercial operating systems), there is no need to purchase a second license just to have a backup server.

This means that for the first time many businesses can begin to consider clusters as well.  Typically the cost of licensing software for clustering has been prohibitive, but when that licensing becomes free, clusters suddenly become very attractive options.

Of course, as open source proponents will point out, the low cost of Linux and other free and open source solutions has long been a reason to move to these platforms, but this discounts the incredible shift in pricing structure that occurs only when spare usable capacity meets the already-free licenses.  It is only because so many businesses have already implemented virtualization strategies, or are in the process of doing so, that this new opportunity truly presents itself.

The first challenge will be getting businesses to begin to think of operating systems and application platforms as being free.  How businesses may take advantage of this has yet to be seen.  Businesses are so used to being hamstrung by the need to buy new hardware and expensive server software licenses for every new system deployment that the ability to spin up new server instances for free is quite novel indeed.

Of course, as with many new technology changes, it is the small and medium business space where the greatest change will likely take place.  Large enterprises are already doing datacenter consolidation and do not necessarily have spare capacity available to them as their capacity plan already takes into account virtualization.  But in the smaller business space where capacity planning is a practically non-existent practice we see a different type of opportunity.

What we typically see in small businesses moving to virtualization is an over-purchasing of hardware.  This generally comes from a misunderstanding of how capacity planning and virtual guest interaction work in a virtualized environment, but also from a desire to err on the side of overpowered rather than underpowered, and from the nature of virtualization capacity planning being a bit of a “black art”.  Because of this, many small businesses have server resources sitting idle.  It is not uncommon to see a powerful server virtualizing just two server instances when there is capacity to virtualize a dozen or more.
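A rough headroom estimate makes the scale of that idle capacity obvious.  The sketch below uses made-up example figures, not measurements, and the 2:1 CPU oversubscription ratio is only a common rule-of-thumb assumption:

    # Rough capacity-headroom estimate for an overprovisioned host.
    # All sizes are made-up examples, not measurements.

    host_cores, host_ram_gb = 16, 96
    cpu_oversub = 2  # assume a modest 2:1 vCPU-to-core oversubscription

    # The two existing guests from the scenario described above.
    guests = [
        {"name": "file-server", "vcpus": 2, "ram_gb": 8},
        {"name": "app-server", "vcpus": 4, "ram_gb": 16},
    ]

    used_vcpus = sum(g["vcpus"] for g in guests)
    used_ram = sum(g["ram_gb"] for g in guests)

    # A typical small utility or web-app guest (hypothetical sizing).
    new_vcpus, new_ram = 2, 4

    headroom = min((host_cores * cpu_oversub - used_vcpus) // new_vcpus,
                   (host_ram_gb - used_ram) // new_ram)
    print(f"Room for roughly {headroom} more small guests")

Even with conservative assumptions, the host in this example could absorb roughly a dozen additional small guests – precisely the idle capacity that can be repurposed for growth.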

It is this overprovisioning of hardware that offers unique opportunity.  Many small businesses, and even medium sized businesses, may manage to effectively virtualize their entire existing server infrastructure leaving no further opportunity for cost savings through consolidation.  At this point the spare capacity of the existing servers offers no further cost savings and can now be viewed as capacity for growth instead.

This raises the question: what new deployments do these opportunities make possible?  The question is difficult to answer, as it will be different for nearly every business, but we can look at some commonalities to build a rough picture of where new value may present itself.

The most obvious new opportunity is in new web applications.  Small businesses often would like to take advantage of free web-based applications but do not want to risk deploying new, low-priority applications to their existing Windows-based web server, or do not even have a server available on which to do so.  Creating one or more open source application servers is incredibly simple.  Deploying a wiki, corporate web portal, blogging engine or news site, bug or incident tracking application, microblogging platform (a la laconi.ca), CRM, ERP or any of thousands of similar applications can be done quickly and easily with minimal cost, using only “spare” time from the existing IT resources.  Any number of internal applications such as these could bring value to the company while producing very little impact on a virtualization platform, so many could be deployed using only a small amount of excess capacity.

Beyond obvious web apps there are more feature-rich systems that could be deployed for no cost.  A great example is the OpenFire instant messaging and presence server.  Companies can suddenly roll out a complete enterprise-class, secure, internal instant messaging platform at no cost whatsoever.  Another example is monitoring systems such as Nagios, Zenoss or Zabbix – all of which are available for free and represent a real benefit for companies that currently have no such system.  Enterprise monitoring, completely for free.

Beyond new applications there is also an “environmental” benefit to be had.  In an enterprise environment, changes going into production pass through a series of testing stages.  Typically big businesses will maintain a development environment, a user acceptance testing environment and then the production environment.  For a small business to do this with Windows is extremely cost prohibitive, as the servers in each environment need to be licensed.  But with open source servers virtualized on spare capacity, deploying virtual servers for each of these environments is completely free.  This allows small businesses to test their own processes before making production changes, giving them a level of stability previously unaffordable to them.

After all of these growth benefits there is one additional benefit to consider – flexibility.  Because these new systems can be deployed and tested at no cost, small shops have a new opportunity to trial open source solutions that may replace the expensive Windows solutions they currently use.  This could include replacing Exchange with Zimbra, IIS with Apache, or Active Directory with an LDAP server.  A project like this would be risky and potentially costly if the hardware and software had to be purchased up front.  But if it can be done using only free time from the existing IT department, and run as a free proof of concept before moving to a pilot and then a full production replacement, then risk can be minimized and the entire project can be effectively free.

While a full architectural replacement may be very aggressive for the average small business, it also represents a very significant potential cost savings.  Moving completely to open source systems is not for everyone and should be evaluated carefully.  The ability to evaluate a project of this magnitude, for free, is very important, and small businesses should consider doing so to be sure that they are using the systems that make the most sense for their business model and needs, rather than simply using the solutions with which they are already familiar or which are already in place.

There are many additional ways in which free and open source products, deployed using existing, excess server capacity, can be used to expand the IT infrastructure of small businesses.  Learning to seek out opportunities rather than only seeking cost savings from IT is a new process for most small businesses and requires some relearning, but those that take the time to pursue these opportunities have much to gain.