
Understanding Virtual Desktop Infrastructure

VDI (or Virtual Desktop Infrastructure) has been all the rage in IT circles for the past couple of years. Once the virtualization of servers became the norm, looking to desktops as the next frontier in virtualization was natural. Unlike servers, however, desktops are affected by several major factors that need to be addressed and considered before we simply jump on the VDI bandwagon. VDI is a great and wonderful technology, but like any technology it has its place and needs to be considered carefully.

First we have to look at an important concept that affects VDI: the idea of shared computing. With servers we naturally assume that the servers and the services that they provide are not for one-to-one consumption but will be utilized by multiple users. This concept has always existed in the world of desktops as well, where it is generally referred to as terminal services. Terminal servers are the server world’s answer to the need for centralized desktop resources and have been standard since before desktops, as we know them, even existed.

It is an interesting bit of computing history that brings us to how the Windows world interacts with the concept of terminal servers. Most operating systems, and nearly all that remain in use today, were designed from the ground up as multi-user systems. The idea that one user would sit in front of the computer as the “main” user while others were second class citizens, or did not exist at all, was simply not part of the design; all users were considered equal. Windows, unlike the UNIX family for example, came from a world of single user systems originating with DOS and DOS/Windows, which were not multi-user. When Windows NT was developed as a multi-user system, a great many software developers treated it as they always had, making software that would not function well, or often at all, in a multi-user mode.

This software ecosystem that is unique to Windows (it could effectively never exist on Linux, for example, because such software would simply be seen as broken given how that ecosystem and its deployments work) has created an interesting problem. Some software and some tasks are easily addressed with terminal servers, exactly as they would be on any UNIX OS, while many other applications cannot be addressed with a terminal server and require a dedicated operating system instance for every user.

It is this historical factor, and the significant difference in software ecosystems that it created, that produced the fundamental need for VDI and explains why VDI arose as a unique phenomenon within the Windows world and remains, for all intents and purposes, exclusive to it. So it is very important to understand that VDI arose conceptually as a means of addressing a need created solely by a deficiency in third party applications, not because of an intrinsic nature of Windows itself in its current form or because VDI was a better approach to virtualizing or hosting end user desktop services. In fact, we could even look at VDI as an unfortunate kludge, needed only in situations where we want to virtualize or centralize desktop resources and some of the software that has to run on those systems cannot run in a multi-user mode. VDI is a fallback mechanism for special situations, not a desired approach to virtualized or centralized end user services.

It is important to note that the widespread use of, and necessity for, VDI has driven so much investment into supporting technologies that in many cases VDI might actually outperform terminal servers, even though architecturally this seems almost impossible. Basically this happens because the incredible amount of research and development going into the hypervisor layer may be outpacing the equivalent components in the operating system itself, making for potentially better CPU and memory management and resource sharing. This is completely dependent on the situation, of course; every OS, hypervisor and set of VDI tools is unique, as is every workload being tested, so mileage will vary significantly.

Also of serious consideration, because of the aforementioned Windows-centric nature of the VDI concept, is licensing. If we were to look at VDI from the Linux perspective we would have little to no licensing concerns and VDI would be up against traditional terminal services on technical merits alone, but this is effectively never the case. The single biggest factor in VDI decision making is Microsoft licensing.

VDI licensing is both expensive and complex. Companies wanting to consider the virtualization of Windows desktop resources have to carefully weigh the benefits against both the large cost of appropriate licensing and the potentially large overhead of license management. Moving into VDI will likely mean lots of expensive IT time dedicated to license research, monitoring and training, an often overlooked aspect of licensing costs.

VDI is a somewhat difficult concept to speak about in generalities because it is a slightly amorphous topic. If we virtualize a desktop, does it not become a server? If we use an operating system intended for server use, does that change what is and is not VDI? Is VDI based around use cases, licensing or product categories?

The real answer lies in the fact that to the industry VDI is technically one thing, but in practical terms to Microsoft, the only key licensing player in the space, it means something somewhat different. Technically, VDI is the virtualization of one-to-one “graphical end user” instances: a single virtual machine being used by a single user much as a traditional, physical desktop or laptop would be used. To Microsoft, whose concerns are slightly different than those of the industry, the term refers to the virtualization of Windows “desktop class” operating systems. If you virtualize Windows “server class” operating systems, Microsoft does not view you as doing VDI. We have to understand both views of the concept to keep from becoming confused. In fact, using Windows Server OSes to get around the VDI licensing needs of Windows desktops has become very standard and common. However, we have to remember the kludge nature of VDI: while this approach works around the failure to write software that is multi-user in nature, it does not address the very real potential that software was written with the expectation of a desktop-branded operating system, and we are somewhat likely to find end user software that is either locked (intentionally or unintentionally) to desktop operating systems or is licensed only on those platforms.

The last major consideration around VDI decision making is that, unlike servers which when virtualized are completely virtualized, a desktop cannot be treated in the same way because there is always a physical component to it. The end user will always need a monitor to look at, a keyboard to type on, speakers to listen to, and so on. So when we are looking to move to VDI we must take care not to overlook the fact that we are not eliminating the need to purchase and maintain desktops; we are simply moving where the operating system will reside. We may redeploy older hardware to be used for remote access, move to thin clients or the newly termed and mostly meaningless “zero clients,” or have existing “fat clients” pull double duty, acting as remote access clients while also providing their own desktop services.

Certainly virtualizing the desktop offers many great opportunities and much value if we are doing it for the right reasons and understand the hows, whys and whens of VDI. Sadly, like so many technology trends, moving to VDI has become a knee-jerk reaction taken without performing proper evaluations and without developing a clear picture of how VDI will fit into our own environments. If we lack a clear reason for choosing VDI it is very unlikely that we will deploy it in a positive manner.

Finally it is very important that we consider the skill sets that will be required in order to properly move to VDI. From a purely technical standpoint, throwing a Windows 10 VM onto Hyper-V constitutes VDI, but from a practical perspective this is not how effective VDI will be designed. VDI not only requires the special licensing knowledge that I mentioned above but will typically involve rather unique knowledge of modern and very specialized VDI toolsets and products, shared storage as it applies to VDI, remote access protocols, thin clients or zero clients, and more. VDI deployments tend to be one of the most technical and unique components of an infrastructure, leading to a great number of unknowns and challenges for any organization.
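
To make that “purely technical standpoint” concrete, here is a minimal sketch, assuming Python and the Hyper-V PowerShell module are available, of what that lowest bar looks like: a single desktop VM created by shelling out to PowerShell’s New-VM cmdlet. The VM name, switch name and paths are hypothetical placeholders, and this covers only raw instance creation, none of the brokering, remote access, image management or storage design that a real VDI deployment requires.

# Minimal sketch: creating one desktop VM on Hyper-V by calling PowerShell from
# Python.  Assumes the Hyper-V role and its PowerShell module are present; the
# VM name, switch name and paths below are hypothetical placeholders.
import subprocess

def create_desktop_vm(name, switch, vhd_path):
    ps_command = (
        f"New-VM -Name '{name}' -Generation 2 -MemoryStartupBytes 4GB "
        f"-NewVHDPath '{vhd_path}' -NewVHDSizeBytes 60GB -SwitchName '{switch}'"
    )
    # check=True raises an error if the PowerShell command fails
    subprocess.run(["powershell", "-NoProfile", "-Command", ps_command], check=True)

if __name__ == "__main__":
    create_desktop_vm("vdi-desktop-01", "External vSwitch", r"D:\VDI\vdi-desktop-01.vhdx")

Everything beyond this one command, brokering connections, managing images, licensing, profiles and storage, is where the specialized skill sets above come into play.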

The End of the GUI Era

We should set the stage by looking at some historical context around GUIs and their role within the world of systems administration.

In the “olden days” we did not have graphical user interfaces on any computers at all, let alone on our servers. Long after GUIs began to become popular on end user equipment, servers still did not have them. In the 1980s and 1990s the computational overhead necessary to produce a GUI was significant relative to the total computing capacity of a machine, and spending what little there was on a GUI was rather impractical, if not completely impossible. The world of systems administration grew up in this context, working from command lines because there was no other option available to us. It was not common to even desire GUIs for systems administration, perhaps because the idea had not yet occurred to anyone.

In the mid-1990s Microsoft, along with some others, began to introduce the idea of GUI-driven systems administration for the entry level server market. At first the approach was not that popular as it did not match how experienced administrators were working. But slowly, as new Windows administrators, and to some degree Novell Netware administrators, began to “grow up” with access to GUI-based administration tools, an accepted place in the server market emerged for these systems. In the mid to late 1990s UNIX and other non-Windows servers completely dominated the market. Even VMS was still a major player, and on the small business and commodity server side Novell Netware was the dominant player mid-decade and still a very serious contender late in the decade. Netware offered a GUI experience, but one that was very light and should probably be considered only “semi-GUI” in comparison to the rich GUI experience Windows NT offered by at least 1996, and to some degree earlier with the NT 3.x family, although Windows NT was only just finding its place in the world before NT 4’s release.

Even at the time, the GUI-driven administration market remained primarily a backwater. Microsoft and Windows still had no major place on the server side but were beginning to make inroads via the small business market, where their low cost and easy to use products made a lot of sense. But it was truly the late 1990s panic and market expansion, brought on by the combination of the Y2K scare, the dotcom market bubble and excellent product development and marketing by Microsoft, that drove significant growth in, and the shift to, a GUI-driven administration market.

The massive expansion of the IT market in the late 1990s meant that there was not enough time or resources to train new people entering IT. The learning curve for many systems, including Solaris and Netware, was very steep, and the industry needed a truly epic number of people to go from zero to “competent IT professional” faster than was possible with the existing platforms of the day. The market growth was explosive and there was so much money to be made working in IT that there were few resources available to effectively train the new people who needed to be coming into the field; anyone qualified to handle educational duties could earn so much more working in the industry than working in education. As the market grew, the value of mature, experienced professionals became extremely high, as they were more and more rare in the ever expanding field as a whole.

The market responded to this need in many ways, but one of the biggest was to fundamentally change how IT was approached. Instead of pushing IT professionals to overcome the traditional learning curves and develop the skills needed to effectively manage the systems on the market at the time, the market changed the tools being used to accommodate less experienced and less knowledgeable IT staff. Simpler and often more expensive tools, frequently with GUI interfaces, began to flood the market, allowing those with less training and experience to be useful and productive almost immediately, even without ever having seen a product previously.

This change coincided with the natural advancement of computer hardware performance. It was during this era that, for the first time, the power of many systems was such that while the GUI still had a rather significant impact on performance, the lower cost of support staff and the speed at which systems could be deployed and managed generally offset the loss of computing capacity taken by the GUI. The GUI rapidly became a standard addition to systems that just a few years before would never have seen one.

To improve the capabilities of these new IT professionals and to rush them into the marketplace, the industry also shifted heavily towards certifications, more or less an innovation at the time, which allowed new IT pros, often with no hands on experience of any kind, to establish some degree of competence, and to do so commonly without needing any significant interaction or investment from existing IT professionals the way university programs would require. Both the GUI-based administration market and the certification industry boomed, and the face of IT significantly changed.

The result was certainly a flood of new, untrained or lightly trained IT professionals entering the market at a record pace. In the short term this change worked for the industry. The field went from dramatically understaffed to relatively well staffed years faster than it could have otherwise. But it did not take long before the penalties for this rapid uptake of new people began to appear.

One of the biggest impacts to the industry was an industry-wide “baby boom” with all of the growing pains that that would entail. An entire generation of IT professionals grew up in the boot camps and rapid “certification training” programs of the late 1990s. The long term effect was that the rules of thumb and general approaches common in that era often became codified to the point of near religious belief in a way that previous, as well as later, approaches would not. Because education was done quickly and shallowly, many concepts had to be learned by rote without an understanding of the fundamentals behind them. As the “Class of 1998” grew into the senior IT professionals of their companies over time, they became the mentors of new generations, and that old rote learning has very visibly trickled down through similar approaches in the years since, even long after the knowledge has become outdated or impractical; in many cases it has been interpreted incorrectly and is wrong in predictable ways even for the era from which it sprang.

Part of the learning of the era was a general acceptance that GUIs were not just acceptable but practical and expected. The baby boom effect meant that there was little mentorship from the former era, and previously established practices and norms were often swept away; the industry did not so much reinvent itself as simply invent itself. Even the concept of Information Technology as an industry unto itself took its current form and took hold in the public consciousness during this changing of the guard. Instead of being a vestige of other departments or disciplines, IT came into its own; but it did so without the maturing and continuity of practices that would have existed with more organic growth, leaving the industry in possibly a worse position than it might have been had it developed in a continuous fashion.

The lingering impact of the late 1990s IT boom will be felt for a very long time, as it will take many generations for the trends, beliefs and assumptions of that time period to finally be swept away. Slowly, new concepts and approaches are taking hold, often only when old technologies disappear and new ones are introduced, breaking the stranglehold of tradition. One of these is the notion of the GUI as the dominant method by which systems administration is accomplished.

As we pointed out before, the GUI at its inception was a point of differentiation between old systems and the new world of the late 1990s. But since that time GUI administration tools have become ubiquitous. Every significant platform has, and has long had, graphical administration options, so the GUI no longer sets any platform apart in a significant way. This means that there is no longer any vendor with a clear agenda driving them to push the concept of the GUI; the marketing value of the GUI is effectively gone. Likewise, not only did systems that previously lacked a strong GUI nearly all develop one (or more), but the GUI-based systems that lacked strong command line tools went back and developed those as well, and built new professional ecosystems around them. The tide most certainly turned.

Furthermore, over the past nearly two decades the rhetoric of the non-GUI world has begun to take hold. System administrators working from a position of a mastery of the command line, on any platform, generally outperform their counterparts leading to more career opportunities, more challenging roles and higher incomes. Companies focused on command line administration find themselves with more skilled workers and a higher administration density which, in turn, lowers overall cost.

This alone was enough to make the position of the GUI begin to falter. But there was always the old argument that GUIs, even in the late 1990s, used only a small amount of system resources and added only a very small amount of additional attack surface. Even if they were not going to be used, why not have them installed “just in case”? As CPUs got faster, memory got larger, storage got cheaper and system design improved, the impact of the GUI became less and less, so this argument for having GUIs available got stronger. Especially strong was the proposal that GUIs allowed junior staff to do tasks as well, making them more useful. But it was far too common for senior staff to retain the GUI as a crutch in these circumstances.

With the advent of virtualization in the commodity server space, this all began to change. The cost of a GUI suddenly became noticeable again. A system running twenty virtual machines would use twenty times the CPU resources, twenty times the memory and twenty times the storage capacity of a single GUI instance. As virtual machine densities began to climb, so did the relative impact of the GUI.
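
As a rough, back-of-the-envelope illustration of that multiplication, consider the sketch below. The per-instance overhead figures are assumptions picked purely for illustration, not measurements; substitute numbers observed in your own environment.

# Back-of-the-envelope sketch of how per-instance GUI overhead multiplies on a
# virtualization host.  The overhead figures are illustrative assumptions only.
GUI_RAM_OVERHEAD_GB = 0.5    # assumed extra memory per instance for the GUI stack
GUI_DISK_OVERHEAD_GB = 4.0   # assumed extra storage per instance for GUI components

def aggregate_gui_overhead(vm_count):
    """Total resources spent on GUIs across all instances on one host."""
    return vm_count * GUI_RAM_OVERHEAD_GB, vm_count * GUI_DISK_OVERHEAD_GB

if __name__ == "__main__":
    for vms in (1, 20, 100):
        ram, disk = aggregate_gui_overhead(vms)
        print(f"{vms:>3} VMs: ~{ram:.0f} GB of RAM and ~{disk:.0f} GB of disk spent on GUIs")

At one instance the overhead is easy to ignore; at twenty or a hundred instances it becomes a real line item.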

Virtualization gave rise to cloud computing. Cloud computing increased virtual machine deployment densities and exposed other performance impacts of GUIs, mostly in terms of longer instance build times and more complex remote console access. Systems requiring a GUI began to noticeably lag behind their GUI-less counterparts in adoption and capabilities.

But the far bigger factor was an artifact of cloud computing’s standard billing methodologies. Because cloud computing typically exposes per-instance costs in a raw, fully visible way, IT departments had no means of fudging or overlooking the costs of GUI deployments, whose additional overhead could often even double the cost of a single cloud instance. Accounting would very clearly see bills for GUI systems costing far more than their GUI-less counterparts. Even non-technical teams could see the cost of GUIs adding up, even before considering the cost of management.

This cost continues to increase as we move towards container technologies, where the scale of individual instances becomes smaller and smaller and the relative overhead of the GUI becomes ever more significant.

But the real impact, possibly the biggest exposure of the issues around GUI-driven systems, is the industry’s move towards DevOps system automation models. Today only a relatively small percentage of companies are actively moving to a fully cloud-enabled, elastically scalable DevOps model of system management, but the trend is there and the model leaves GUI administrators and their systems completely behind. With DevOps models, direct access to machines is no longer a standard mode of management, and systems have gone beyond being worked on solely from the command line to being built completely in code. This means that systems administrators working in the DevOps world must not only interact with their systems at a command line but must do so programmatically.
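
To illustrate what “built completely in code” means in practice, here is a toy sketch of the declarative, idempotent pattern that configuration automation is built around. It is not how Puppet, Chef or Ansible are actually implemented, and the package and service names are hypothetical; it only shows the shape of the approach: desired state declared as data, and a routine that converges each host toward that state without redoing work that is already done.

# Toy illustration of the "infrastructure as code" pattern: desired state is
# declared as data and an idempotent routine converges each host toward it.
# A simplified sketch only; not how Puppet, Chef or Ansible really work.
desired_state = {
    "packages": ["openssh-server", "ntp"],
    "services": ["sshd", "ntpd"],
}

def converge(host, current_state):
    """Return the actions needed to bring one host to the desired state."""
    actions = []
    for pkg in desired_state["packages"]:
        if pkg not in current_state.get("packages", []):
            actions.append(f"{host}: install {pkg}")
    for svc in desired_state["services"]:
        if svc not in current_state.get("running", []):
            actions.append(f"{host}: start {svc}")
    return actions  # an empty list means the host already matches; nothing is done twice

if __name__ == "__main__":
    inventory = {
        "web01": {"packages": ["openssh-server"], "running": ["sshd"]},
        "web02": {"packages": ["openssh-server", "ntp"], "running": ["sshd", "ntpd"]},
    }
    for host, state in inventory.items():
        print(converge(host, state) or f"{host}: already converged")

Because the definition lives in code, the same declaration scales from two hosts to two thousand, which is exactly what leaves the GUI-driven model behind.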

The market is rapidly moving towards fewer, more highly skilled systems administrators working with many, many more servers “per admin” than in any previous era. The idea that a single systems administrator can only manage a few dozen servers, a common belief in the GUI world, has long been challenged even in traditional “snowflake” command line systems administration, with numbers easily climbing into the few hundred range. The DevOps model and similar automation models take those numbers into the thousands of servers per administrator. The overhead of GUIs is becoming more and more obvious.

As new technologies like cloud, containers and DevOps automation models become pervasive, so does the natural “sprawl” of workloads. This means that companies of all sizes are seeing an increase in the number of workloads that need to be managed. Companies that traditionally had just two or three servers may today have ten or twenty virtual instances! The number of companies that need only one or two virtual machines is dwindling.

This all hardly means that GUI administration is going to go away in the near, or even the distant, future. The need for “one off” systems administration will remain. But the ratio of administrators able to work in a GUI administration “one off” mode versus those that need to work through the command line and specifically through scripted or even fully automated systems (a la Puppet, Chef, Ansible) is already tipping incredibly rapidly towards non-GUI system administration and DevOps practices.

What does all of this mean for those of us in the trenches of the real world? It means that even roles, such as small business Windows administration, that traditionally have had little or no need to work at the command line must reconsider their dependence on the local server GUI. Command line tools and processes are becoming increasingly powerful and well known, and they are how we are expected to work. In the UNIX world the command line has always remained, and the need to rely on GUI tools would almost always be seen as a major handicap. This same impression is beginning to apply to the Windows world as well. Slowly, those that rely exclusively on GUI tools are being seen as second class citizens and are increasingly relegated to more junior roles and smaller organizations.
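
As a small example of what working programmatically against Windows servers can look like, here is a hedged sketch that asks a handful of machines for their stopped services over WinRM. It assumes PowerShell remoting (WinRM) is enabled on the targets and that the third-party pywinrm package is installed; the host names and credentials are hypothetical placeholders.

# Sketch: querying several Windows servers from code instead of a local GUI
# session.  Assumes WinRM is enabled on the targets and the third-party
# "pywinrm" package is installed; hosts and credentials are placeholders.
import winrm

SERVERS = ["srv-app01.example.local", "srv-app02.example.local"]

def stopped_services(host, user, password):
    """Return the names of services that are not running on the given host."""
    session = winrm.Session(host, auth=(user, password), transport="ntlm")
    result = session.run_ps(
        "Get-Service | Where-Object {$_.Status -ne 'Running'} "
        "| Select-Object -ExpandProperty Name"
    )
    return result.std_out.decode(errors="replace").splitlines()

if __name__ == "__main__":
    for server in SERVERS:
        print(server, stopped_services(server, "DOMAIN\\audit-user", "example-password"))

The point is not the specific query but the workflow: the same few lines work against two servers or two hundred, which is where the command line and automation advantage comes from.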

The improvement in scripting and automation tools also means that the value of scale is increasing, so the per-workload cost of administering small numbers of servers is becoming very high. This puts heavy pressure on smaller companies to look towards management consolidation through outside vendors who specialize in large scale systems management and who leverage scripting and automation techniques to bring their costs more in line with those of larger businesses. The ability to use outside vendors to establish scale, or an approximation of it, will be very important over time for smaller businesses that want to remain cost competitive in their IT needs while still getting the same style of computing advantages that larger businesses are beginning to experience today.

It should be noted that happening in tandem with this industry shift towards the command line and automation tools is the move to more modern, powerful and principally remote GUIs. This is a far less dramatic shift but one that should not be overlooked. Tools like Microsoft’s RSAT and Server Manager provide a GUI view that leverages command line and API interfaces under the hood. Likewise, Canonical’s Ubuntu world now has Landscape. These tools are less popular in the enterprise but are beginning to allow the larger SMB market to maintain a GUI dependency while managing a larger set of server instances. The advancement of these types of GUI tools may be the strongest force slowing the adoption of command line tools across the board.

Whether we are interested in the move from the command line, to GUIs, and back to the command line as an interesting artifact of the history of Information Technology as an industry, or we are looking at it as a means of understanding how systems administration is evolving as a career path or business approach, it is good for us to appreciate the factors that caused it to occur and why the ebb and flow of the industry is now taking us back out to the sea of the command line once again. By understanding these forces we can more practically assess where the future will take us, when the tide may again change, and how best to approach our own careers or to choose both technology and human talent for our organizations.

The Desktop Revolution Is Upon Us

With the pending end of support for Windows XP looming just around the proverbial corner, it is time to take stock of the desktop landscape and make hard decisions. Windows XP has dominated the desktop landscape in both home and business for more than a decade. While Windows 7, and to some small degree Windows 8, have widely replaced it, there is still a huge Windows XP install base, and many companies have failed to define their long term strategy for the post-XP world and are still floundering to find their footing.

A little context is, I feel, pretty important. Today it may seem a foregone conclusion that Microsoft will “own” the business desktop space with Mac OSX fighting for a little piece of the action that Microsoft barely notices. This status quo has been in place for a very long time, longer than the typical memory of an industry that experiences such a high degree of change. But things have not actually been this way for as long as it may seem.

Let’s look instead at the landscape of 1995. Microsoft had a powerful home user product, Windows 95, and was beginning to be taken seriously in the business space. But its place there, outside of DOS, was relatively new and Windows 3.11 remained its primary product. Microsoft had strong competition on many fronts including Mac OS and OS/2 plus many smaller niche players. UNIX was making itself known in high end workstations. Linux existed but had not yet entered the business lexicon.

The Microsoft business desktop revolution happened in 1996 with the landmark release of Windows NT 4.0 Workstation. Windows NT 4 was such a dramatic improvement in desktop experience, architecture, stability and networking capability that it almost instantly redefined the industry. It was Windows NT 4 that created the momentum that made Microsoft ubiquitous in the workplace. It was NT 4 that defined much of what we think of as modern computing. NT 4 displaced all other competitors, relegating Mac OS to the most niche of positions and effectively eliminating OS/2 and many other products. It was in the NT 4 era that the concepts of the Microsoft Certified Professional and the MCSE began and where much of the corpus of rote knowledge of the industry was created. NT 4 brought pure 32-bit computing to the mainstream of the x86 architectural space. It was the first mainstream operating system built with networking as a core focus.

Windows NT 4 grew from interesting newcomer to dominant force in the desktop space between 1996 and 2001. In the interim Windows 2000 Pro was released but, like Vista, this was really a sidelined and marginalized technology preview that did little to displace the incumbent desktop product. It was not until 2001, with the release of Windows XP, that Windows NT 4 had a worthy successor: a product of extreme stability with enough new features and additional gloss to warrant a widespread move from the old platform to the new. NT 4 would linger on for many more years but would slowly fade away as users demanded newer features and access to newer hardware.

Windows NT 4 and Windows XP had a lot in common. Both were designed around stability and usability, not as platforms for introducing broad change to the OS itself. Both were incremental improvements over what was already available. Both received more large scale updates (Service Packs in Microsoft terms) than other OSes before or after them, with NT 4 having seven (or even eight depending on how you count them) and XP having three. Each was the key vanguard of a new processor architecture, NT 4 with the 32-bit x86 platform and XP being the first to offer an option for the 64-bit AMD64 architecture. Both were the terminal releases of their major kernel versions. Windows NT 4 and Windows XP together held unique places in the desktop ecosystem, with penetration numbers that might never be seen again by any product in that category.

After nearly eighteen years, that dominance is waning. Windows 7 is a worthy successor to the crown, but it failed to achieve the same iconic status as Windows XP, and it was rapidly followed by the dramatically changed Windows 8 and now Windows 8.1, both built on the same fundamental kernel as Windows 7 (and Vista too).

The field is different today. Mobile devices (phones, tablets and the like) have introduced us to new operating system options and paradigms. The desktop platform is no longer a foregone conclusion as the business platform of choice. Nor is the Intel/AMD processor architecture a given, as ARM has begun to make serious inroads and looks to be a major player in every space where Intel and AMD have held sway these last two decades.

This puts businesses into the position of needing to decide how they will focus their end user support energy in the coming years. There are numerous strategies to be considered.

The obvious approach, the one that I assume nearly all businesses will take if for no other reason than to maintain the status quo, is to settle into a “wait and see” plan: implement Windows 7 today and hope either that the new interface and style of Windows 8 goes away or that an alternative will be found between now and when Windows 7 support ends. This strategy suffers from focusing on the past and triggering an earlier than necessary upgrade cycle down the road while leaving businesses behind on technology today. It is not a strategy that I would generally recommend, but it is very likely the most common one, as it allows for the least “pain today,” a common trend in IT. Going with Windows 7 represents an accumulation of technical debt.

Those businesses willing to really embrace the Microsoft ecosystem will look to move to Windows 8 and 8.1 to get the latest features, greatest code maturity and to have the longest support cycle available to them. This, I feel, is more forward thinking and embraces some low threshold pain today in order to experience productivity gains tomorrow. This is, in my opinion, the best investment strategy for companies that truly wish to stick with the Microsoft ecosystem.

However, outside of the Microsoft world, other options are now open to us that, realistically, were not available when Windows NT 4 released. Most obvious is Apple’s Mac OSX Mavericks. Apple knows that Microsoft is especially vulnerable in 2014, with Windows XP support ending and users fearing the changes of Windows 8, and is being very aggressive in its technical strategy, both on the hardware side with the release of a dramatic new desktop device, the black, cylindrical Mac Pro, and with the free release (for those on Apple’s hardware, of course) of Mac OSX 10.9. Apple is pushing hard to get non-Mac users interested in its platform and to get existing users updated and using the latest features. Apple has made huge inroads into Windows territory over the last several years and knows full well that 2014 is its biggest opportunity to take a sizable market chunk all at once. Apple has made the Mac platform a serious contender in the office desktop space and it is worth serious consideration. More and more companies are either adding Macs to their strategy or switching to Mac altogether.

The other big player in the room is, of course, Linux. It is easy to proclaim that 2014 will be the “Year of the Linux Desktop,” which, of course, it is not. However, Linux is a powerful, mature option for the business desktop, and with the industry’s steady move to enterprise web-based applications the previous prohibitions against Linux have significantly faded. Linux is a strong contender today if you can get it in the door. Cost effective and easy to support, Linux’s chink in the armor is the large number of confusing distros and desktop options. Linux is hardly going to take the desktop world by storm, but the next five months do offer one of the best time periods to demo and trial some Linux options to see if they are viable in your business. In preparation for the likely market surge, most of the key Linux desktop players, Suse, Ubuntu and Mint, have released big updates in the last several weeks, giving those looking to discover Linux for the first time (or for the first time in a long time) something especially tempting to discover. The Mint project has especially taken the bull by the horns in recent years with its Mate and Cinnamon desktops, which are especially appealing to users looking for a Windows 7-esque desktop experience with a forward looking agenda.

Also in the Linux family but decidedly its own animal, Google’s ChromeOS is an interesting consideration for a company interested in a change. ChromeOS is most likely the most niche of the desktop options, but a very special one. ChromeOS takes the tack that a business can run completely via web interfaces, with all applications written to be accessed in this manner. Indeed, many businesses are approaching this point today, but few have made it completely. ChromeOS requires a dramatic rethinking of security and application architectures for a normal business and so will not see heavy adoption, but for those unique businesses capable of leveraging it, it can be a powerful and extremely cost effective option.

Of course, an entirely new category of options has appeared in recent years as well: the mobile platforms. These existed when Windows XP released, but they were not ready to replace existing desktops in any way. During the Windows XP era, though, mobile platforms grew significantly in computational power, and the operating systems that power them, predominantly Apple iOS and Google Android, came into existence and became the most important players in the end user device space.

iOS and Android, and to a lesser extent Windows Phone and Windows RT, have reinvented the mobile platform into a key communications, productivity and entertainment platform rivaling the traditional desktop. Larger mobile devices, such as the iPad, are widely displacing laptops and, while different, often provide overlapping functionality. It is becoming more and more common to see an iOS or Android device being used for non-intensive computing applications that traditionally belonged to desktop or laptop devices. It is hard to imagine mobile platforms being the sole computing platform of a business over the next few years, but it is possible that we will see this begin to happen in fringe case businesses during this product cycle.

Of course, any talk of the desktop future must take into account changes not just in products but in architectures. Marketing around VDI (Virtual Desktop Infrastructure) has propelled virtualized and centralized computing architectures into the forefront, along with the concept of hosted or “cloud” desktop offerings (including Desktop as a Service). While still nascent, the category of “pay by the hour” utility desktop computing will likely grow over the next several years.

Of course, with so many changes coming there is a different problem that will be facing businesses. For the past two decades just about any business could safely assume that nearly all of its employees would have a Windows computer at home where they would become accustomed to the current interface and possibly much of the software that they would use on a day to day basis. But this has changed. Increasingly, iOS and Android are the only devices that people have at home, and for those with traditional computers, keeping current with Windows is less and less common while Mac OSX and Linux are on the rise. One of the key driving forces making Windows cost effective, the lack of training necessary, may swing from being in its favor to working actively against it.

Perhaps the biggest change that I anticipate in the next desktop cycle is not that of a new desktop choice but of a move to more heterogeneous desktop networks where many different OSes, processor architectures and deployment styles co-exist. As BYOD proliferates, as support for different device types becomes necessary, as user experience changes and as business apps move to web platforms, a disparate “choose the device for the task or user” strategy will become more and more common. Businesses will be free to explore their options and choose more freely based on their unique needs.

The era of desktop lock-in is over. Whether because of market momentum, existing user experience or application limitations, the reasons that kept businesses tightly coupled to the Windows platform are fading quickly. The future offers a landscape of choices both in what we deploy and in how we deploy it.

Is it Time to Move to Windows 8?

Microsoft’s latest desktop reboot is out in the wild and lots of people are getting their hands on it and using it today.  Is it time to consider moving to Windows 8?  Absolutely.

That doesn’t mean that Windows 8 should be your main desktop later this afternoon, but considering a move to Windows 8 is important to do early.  It is a popular approach to hold off on new software updates until systems have been in production use for months or years and there is value to this concept – allowing others to vet, test and uncover issues while you sit back and remain stable on existing, well known software.  But there is a reason why so many businesses forge ahead and that is because using software early delivers the latest features and advantages as early as possible.

Unlike software coming from a small company with limited support and testing resources, Microsoft’s software is incredibly well tested both internally and by the community before it is available to end users.  Little software is more heavily vetted prior to release.  That doesn’t mean that release day rollouts are wise, but beginning to evaluate new products early can have major advantages: the newest features become available as soon as possible to those that decide to use the new product, and those that decide to migrate away have the most time to find an alternative.  Early decision making is important to success.

The reality is that while many businesses should take the time to evaluate Windows 8 versus alternative solutions (a practice that should be done regularly, regardless of new features or changes to environments, to ensure that traditional choices remain the best current choices), nearly all businesses today will be migrating to Windows 8 and remaining in the Microsoft ecosystem for quite some time to come.

This means that many companies should be looking to make the jump to Windows 8 sooner rather than later.  Windows 8, while seemingly shockingly new and innovative, is based on the same Windows NT 6 family kernel that began with Windows Vista and Windows Server 2008, continued through the Windows 7 and Windows Server 2008 R2 era and is shared with Windows Server 2012.  This kernel is mature and robust, and the vast majority of the code and features in Windows 8, user interface aside, are well tested and extremely stable.  Windows 8 uses fewer resources, on the same hardware, than Windows 7 which, in turn, was lighter and faster than Windows Vista.  The sooner that you move to Windows 8, the sooner you get more performance out of your existing hardware and the longer you have to leverage that advantage.

Windows 8 brings some big changes that will impact end users, without a doubt.  These changes can be, in some cases, quite disruptive, but with proper training and preparation most users should return to regular productivity levels in a reasonable amount of time and often will be more productive once they are comfortable with the new environment and features.  Those that do not fall into one of these two categories are the smaller, niche user group that are prime candidates for moving to a completely different ecosystem where their needs can be more easily met.

If you are an organization destined to be running Windows 8, or its successors, “someday” then most likely you should be running Windows 8 today to start leveraging its advantages as soon as possible so that you can use them as long as possible.  If Windows truly is the platform that is best for you, you should embrace it and accept the “hit” of transitioning to Windows 8 now, swallow that bitter pill and be done with it.  Then, for the next several years, while your competitors are whining about having to move to Windows 8 “someday,” you will be happily leveraging your older hardware, your more efficient workflows and your more modern systems day after day, reaping the benefits of an early migration to a stable platform.

It is common for IT departments to take a “wait and see” approach to new system migrations.  I am convinced that this is created by a culture of hoping that IT staff will leave their current positions before a migration occurs and land a new position elsewhere where the migration has already happened.  Or perhaps they hope to avoid the migration completely by awaiting a later version of Windows.  This second argument does carry some weight, as many shops skip operating system revisions, but doing so often brings extra overhead in security exposure, application compatibility effort and other areas.

Windows 8 is unique in that it is the third release of the Windows NT 6 kernel series, so it comes as a rare, very stable late release member of its family (the NT 6 family is sometimes called the “Vista Family.”)  Windows 8’s NT designation is 6.2.  The only other Microsoft NT line to reach x.2 status was NT 5.2, the kernel behind Windows XP Professional x64 Edition and Windows Server 2003 and 2003 R2, a part of the Windows 2000 family.  Late release kernels are important because they tend to deliver the utmost in reliability and represent an excellent point at which to invest in a very long term deployment strategy that can last for nearly a decade.
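
For anyone who wants to see where a given machine sits in this kernel lineage, here is a small sketch that reports the underlying NT version from a script (6.1 for Windows 7, 6.2 for Windows 8).  Note that on later releases compatibility behavior can affect what an application is shown, so treat the output as indicative rather than authoritative.

# Small sketch: report the underlying NT kernel version on a Windows machine.
# Compatibility shims on newer releases can affect what is reported, so treat
# the result as indicative.
import platform
import sys

if __name__ == "__main__":
    if sys.platform == "win32":
        ver = sys.getwindowsversion()
        print(f"NT kernel version: {ver.major}.{ver.minor} (build {ver.build})")
        print("platform.win32_ver():", platform.win32_ver())
    else:
        print("This check only applies to Windows systems.")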

Whether or not you agree with Microsoft’s unified platform vision or the radical approach to user interface included in Windows 8, you need to decide if you are continuing down the path of the Microsoft platform and, if so, embrace it rather than fight it and begin evaluating whether a move to Windows 8 and, by extension, Windows Server 2012 is right for you.  Don’t avoid Windows 8; it isn’t going to go away.  For most shops, making the decision to move today will sow the seeds of long term benefits that you can reap for years and years to come.