{"id":98,"date":"2010-08-28T00:39:26","date_gmt":"2010-08-28T05:39:26","guid":{"rendered":"http:\/\/www.smbitjournal.com\/?p=98"},"modified":"2017-02-18T09:46:54","modified_gmt":"2017-02-18T14:46:54","slug":"state-of-thin-clients","status":"publish","type":"post","link":"https:\/\/smbitjournal.com\/2010\/08\/state-of-thin-clients\/","title":{"rendered":"State of Thin Clients"},"content":{"rendered":"
The IT world loves to swing back and forth between pushing processing out to the user via fat clients and pulling it back to the server, leaving users with thin clients. The battle is a long-running one that began with the first multiuser computer systems several decades ago, continues to this day, and will likely go on for a very long time to come.
When I began working in IT, thin clients were simple text terminals attached to a single, central server via serial connections. Limited to very basic text input, these terminals served their purpose at the time, providing relatively low-cost computing to a large number of users. The system wasn't pretty or glamorous, but it was quite functional.
These ancient terminals gave way to the personal computer, and computing power shifted from the datacenter to the desktop, allowing users to run powerful applications like Lotus 1-2-3 and WordPerfect. Responsive graphical applications were a powerful draw for decentralized processing, and users were enthralled with the new usability. The text terminal went into very rapid decline.
Eventually centralized power became available in such quantities and at such a low price point that graphical applications could be run from the server with almost as much responsiveness, while clients could be "thin," needing just a shim of an operating system, enough to provide remote access back to the server. Thin computing became the darling of the industry again; the term itself arose in this era, and the move toward centralized processing came back into vogue.
Administrators love the central computing model because data and configuration remain in one place, making backups and management a breeze. The idea, at least in theory, is that desktop support becomes a non-issue: desktop clients are nothing more than commodity components that can be swapped at any time for completely interchangeable parts. Since nothing is stored or configured on the desktop, there is nothing to support there.
In the initial swings of the "thin computing pendulum" the market movement was dramatic. When text-terminal computing first became available it was practically the only model used in the real world; the value was so compelling that no one could really justify doing anything else. When the PC was introduced, the move to the fat client was so widespread that many younger IT professionals today have never actually seen a text terminal in use, even though the move to fat "PC" clients was not as all-encompassing as the move to text terminals had been one pendulum swing earlier.
The PC model was generally better for end users because it mimicked how they used computers at home (those who had computers at home, at least). It also gave them more options for customization and, for better or for worse, the opportunity to begin installing software of their own rather than only the software preconfigured for them on a central server.
Over time both camps have seen developments that give each more and more of the other's advantages. Central domain services such as Microsoft's Active Directory allow central management to extend out to fat clients, bringing control and administration more in line with traditional thin computing models. Likewise, companies like Citrix have worked very hard on new technologies that let thin clients behave much more like robust fat clients, making their use as seamless as possible for end users and even making offline use possible for laptop users.
Most shops today have adopted hybrid models: fat clients where they make sense, and thin clients for certain categories of users, for remote workers, and for business continuity scenarios.
Over the past decade we have seen a shift in the way business applications are created and deployed. Today almost all new business applications are web-based and have no client platform dependency. This affords today's IT departments a new opportunity: to shift from a traditional thin client platform, which requires remote graphical access, to the browser as the new thin client platform.
The move to web apps has happened slowly; most businesses depend heavily on a rather large legacy codebase that cannot easily be migrated to a web architecture, and some applications simply are not good candidates for it. But by and large the majority of new business applications are web-based, written most often in Java or .NET, and these apps are prime candidates for a new thin computing model.
If our custom business apps are available via the browser, the only commonly used applications still holding us back are the traditional productivity apps, chiefly the office suites used by nearly all staff who have a computer at all. Very few other desktop apps are truly pervasive. Increasingly we are seeing browser-based alternatives to the traditional office suites: everyone is aware of Google Apps as a pioneer in this area, and Microsoft now offers online MS Office as well. But the popular offerings making consumer news headlines require businesses to completely rethink long-term strategies built around keeping critical business data within their own walls, and they are not likely to be highly disruptive to the enterprise for quite some time.
What does pose a threat to the status quo are alternative products such as ThinkFree Office, which is installed within the organization and used and secured internally just like any other business application. This category of "traditionally installed internal web applications" will allow enterprise IT departments to begin reconsidering their end users' platforms without having to reevaluate their entire concept of IT. The biggest barriers today are lingering legacy business applications and power users who depend on specific desktop apps that cannot be encapsulated within a browser.
One of the great advantages of the browser as the new thin client, however, is how simple it is to mix browser-based apps with traditional ones. The transition is transparent, and most large businesses are moving in this direction today even without an overarching strategy to do so; the market momentum behind developing all new apps for the web is causing it to happen naturally.
Another key advantage of a completely web-based architectural model is the ease with which it can be exposed to users outside the corporate network. Instead of relying on cumbersome VPN clients and company laptops, employees can open any web browser, sign in to the company network, and have secure business applications delivered to them anywhere.
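To make the idea concrete, here is a minimal sketch of that delivery model: a hypothetical internal web application served over HTTPS behind a simple session-based sign-in, so it can be reached from any browser without a VPN client. It assumes Node with the Express and express-session packages; the routes, credential check, and certificate paths are illustrative placeholders rather than a production design.

```typescript
// Minimal sketch only: a hypothetical internal web app exposed over HTTPS
// behind a session sign-in, so it is reachable from any browser without a VPN.
// Package choices, routes, and file paths are illustrative assumptions.
import * as fs from "fs";
import * as https from "https";
import express from "express";
import session from "express-session";

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: "change-me", resave: false, saveUninitialized: false }));

// Sign-in form; in a real deployment this would be the corporate SSO page.
app.get("/login", (_req, res) => {
  res.send(
    '<form method="post" action="/login">' +
      '<input name="user"/><input name="pass" type="password"/>' +
      "<button>Sign in</button></form>"
  );
});

// Stub credential check; a real system would validate against the company directory.
app.post("/login", (req, res) => {
  if (req.body.user === "demo" && req.body.pass === "demo") {
    (req.session as any).user = req.body.user;
    return res.redirect("/app");
  }
  res.status(401).send("Invalid credentials");
});

// Everything under /app requires an authenticated session.
app.use("/app", (req, res, next) => {
  if ((req.session as any).user) return next();
  res.redirect("/login");
});

// Placeholder for the internal business application itself.
app.get("/app", (_req, res) => {
  res.send("Internal business application, delivered to any browser.");
});

// TLS termination so no VPN is needed; certificate file names are placeholders.
https
  .createServer(
    { key: fs.readFileSync("server.key"), cert: fs.readFileSync("server.crt") },
    app
  )
  .listen(443);
```

In practice the same effect is often achieved with a reverse proxy or gateway placed in front of existing applications rather than code inside the application itself; the point is simply that a browser and a sign-in are the only things the end user needs.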
Bringing this almost unnoticed shift into sharp relief today are a handful of, of all things, consumer devices: Apple's iPhone and iPad and Google's Android and ChromeOS platforms. What these devices have in common is a focus on being primarily thin web appliances, thin clients for consumers. With the majority of consumer computing centered on web connectivity, the need for anything else from a platform is nearly non-existent in the consumer market. This means that users who once brought their home PC experience to the office as their expectation of a computing environment will, within a very short time, begin bringing web-based thin computing as their new expectation.
When this shift happens, IT departments will need to rethink their internal application delivery strategy. The change does not have to be dramatic if current development trends are followed and legacy systems are routinely updated. In fact, one of the great benefits of this new model is that traditional fat clients function very well as browser platforms and will most likely continue to do so for a long time to come. Companies adopting this model will likely be able to slow desktop purchasing cycles and then either buy some form of traditional thin client with an embedded browser or move to a business version of the nettop trend that we are beginning to see emerge in the consumer space. Some businesses may even attempt the rather risky path of using consumer devices, but the lack of management and security features will likely keep this from becoming popular in all but rare instances.
I believe, though, that this swing of the pendulum will not be as dramatic as the last one, just as the last was not as dramatic as the swing before it. It will be an important trend, but IT departments increasingly understand that no technological shift is a silver bullet and that each new opportunity brings new challenges. Most IT departments will need to implement some degree of browser-based thin computing over the next few years, but most will retain a majority user base of fat clients. Hybrid environments, as we have seen for many years with more traditional models, will continue, with each technology used in the target areas where it makes the most sense.
The area where thin clients continue to be challenged most is mobile computing, where disconnected users end up digitally marooned away from their company networks, unable to continue working until connectivity is reestablished. This is a significant issue for power users who travel extensively and need to keep working regardless of their current connectivity. In the traditional thin client arena this is being solved by companies like Citrix, who continue to advance the state of the art in thin application delivery.
In the browser-based arena we have had to turn to technologies like Google Gears and Adobe AIR to make this possible, but these saw poor market penetration. Coming down the pike, however, is the HTML 5 offline API, which is set to redefine how the web works for users who need to go "off the grid" from time to time. With HTML 5 incorporating offline capabilities and a richer feature set into the specification of the web itself, we can expect broad and rapid adoption from all of the leading vendors, most likely even before the draft standard is finalized. While still some ways away, this new standard will lay the groundwork for a significant shift toward the browser as a ubiquitous, standard, and robust platform.
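As a rough illustration of the pattern this enables, the sketch below uses browser facilities associated with the HTML 5 offline work (the online/offline events, navigator.onLine, and Web Storage) to queue work locally while disconnected and replay it once connectivity returns. The endpoint, storage key, and function names are assumptions made for illustration, not part of any particular product or of the specification itself.

```typescript
// Minimal sketch of a browser-side offline pattern: detect connectivity, queue
// work in local storage while offline, and flush the queue when back online.
// The "/api/edits" endpoint and "pendingEdits" key are hypothetical.

function saveEdit(edit: object): void {
  if (navigator.onLine) {
    // Connected: send the change straight to the server.
    void fetch("/api/edits", { method: "POST", body: JSON.stringify(edit) });
  } else {
    // Disconnected: park the change locally until the network returns.
    const queue = JSON.parse(localStorage.getItem("pendingEdits") ?? "[]");
    queue.push(edit);
    localStorage.setItem("pendingEdits", JSON.stringify(queue));
  }
}

// When connectivity returns, replay anything queued while off the grid.
window.addEventListener("online", () => {
  const queue = JSON.parse(localStorage.getItem("pendingEdits") ?? "[]");
  for (const edit of queue) {
    void fetch("/api/edits", { method: "POST", body: JSON.stringify(edit) });
  }
  localStorage.removeItem("pendingEdits");
});

// Let the user know the application remains usable without a network.
window.addEventListener("offline", () => {
  console.log("Offline: edits will be queued locally and synced later.");
});
```

The other half of the offline story, the application cache manifest, lets the browser keep the application's own pages and scripts available while disconnected, so the two pieces together cover both the code and the data.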
The future of thin computing looks incredibly promising in the enterprise and, for the first time, in the consumer arena as well. Adoption of thin computing models will be spurred on by the current movement toward Software as a Service, and SaaS adoption will in turn be encouraged by the widespread presence of thin computing devices. In many ways browser-based thin computing represents the technology side of the SaaS arena that is now maturing, while SaaS itself matures in social acceptance rather than technical feasibility.