
Choosing a University for IT Education

In previous articles I have tackled the questions around approaching university education and selecting a degree program but, thus far, I have not provided any guidance in selecting an institution at which to study.  That will be rectified now.

There are basically five categories of universities in the United States that we need to consider.  These types of educational institutions are:

  • Unaccredited schools
  • Accredited Trade schools
  • Accredited Online schools
  • Accredited Brick and Mortar Private schools
  • Accredited Brick and Mortar Public schools

There are more types of schools than these, but we can basically lump all schools into one of these categories, as these are the general categories into which a hiring manager will sort the schools on a candidate’s resume.  University education has two key benefits.  The first is in broadening thought processes and introducing students to many topics through liberal studies.  The second is in providing beneficial resume line items, and for this second benefit we need a university that elicits a positive reaction.

So, assuming that we are concerned with putting our degrees and education onto our resumes, we need to consider carefully how our choices of educational institution will reflect on us.  You will notice that I carefully did not say that universities provide skill training to prepare workers for the jobs that they will do.  This I have covered in other articles; the university system is neither intended for, nor generally capable of, training people directly for work.  There is no mandate to do this, no expectation, and little potential capacity, especially when we are considering highly technical or quickly changing career fields.  IT may be among the most extreme of these kinds of fields, but this issue applies across the board.

Because such a huge portion of the value of a degree comes from how that degree is perceived by a hiring manager, we have to consider that impression very carefully.  And this produces what I would consider “the dead line” in selecting educational institutions.

For a large percentage of hiring managers, and much of the population, only certain types of universities are considered valid.  This is not a judgment call, only an observation of hiring reality.  Whether the quality of education, rigors of study and such are valuable or not, certain categories of schools are considered non-valid in enough of the marketplace that we must effectively discount them from consideration.

From the list that I have provided, any school that is unaccredited, purely online or a tech/trade school should be completely avoided.  These three categories are routinely viewed as such a strong negative that in a great many cases a candidate will be eliminated based on this one factor alone.  It is commonly said that hiring managers will see one of these schools and throw a resume directly out without any further consideration, but in reality, in many cases, an HR filter will do this before any human even sees the resume.  The same logic that says that we use degrees to get past human resource gatekeepers, putting our resumes in front of hiring managers based on “black and white” filter requirements, also tells us that we must avoid schools that would be considered to be on a “black list.”

This leaves only two categories of schools for any serious consideration: private, accredited brick and mortar schools and public, accredited brick and mortar schools.  Now, it must be noted that just because a school is brick and mortar does not mean that it does not also offer online or alternative classes.  And at no point has it been suggested that it is necessary to attend a school in person.  What is critical is simply that the school be perceived as a valid, traditional educational institution.  In many cases, online classes are the best option as they provide more flexibility and better use of time, avoiding time wasted in commuting, moving between classrooms and such.

Of these two remaining categories, public schools fare far better than private ones because the lower cost of attendance lowers, quite dramatically, the risk inherent in spending time and money on education: the less money spent, the less risk taken.  In only rare cases are private schools any better than public ones and in very many cases, they are worse.  The risk/reward calculation on most public schools is simply far better in the majority of cases.

With any school choice, reputation matters.  Schools with a good reputation are best, especially those that are broadly known.  Schools that have no reputation can be fine, as long as they truly are unknown and fall into good categories.  Schools can get a bad reputation regionally or globally, however, and this poses a risk that is difficult to predict or to avoid.  What is a top ranked school today can be poorly viewed tomorrow, and vice versa.  Large schools have the advantage of increasing the chances that someone on a hiring team will have attended that school, increasing personal affinity.

There is no simple answer to selecting the right school.  Whether the school benefits you through education, reputation or association (with people who will help you later in your career) is unique to each person and school combination.  But the universal guideline to follow is to stick to accredited, broadly well respected, brick and mortar, public or private not-for-profit schools and consider cost carefully.  Avoid online and/or for-profit schools or any school that lacks proper accreditation.

As a modern side note: many schools, even sometimes good ones, that advertise heavily, especially on television or radio, often earn a bad reputation simply because of the medium through which they attempt to lure students.  If you have seen a school because of its marketing campaign, assume that a hiring manager has as well; while some good schools do advertise this way, it may not matter.

Finding A Job, or Finding THE Job

Nearly everyone overlooks this incredibly basic question and yet nearly everyone has to face it when thinking about their career decision making and their future. This applies to middle school students, those preparing for university, university grads and even mid-career professionals making key decisions about life goals.  Is our goal in our career and career preparation to land a job, meaning any job more or less (at least within our field); or is our goal to push our careers higher and higher, looking for “the” job, the one that pays great, satisfies us, challenges us and fulfills us?  Everyone has to answer this question and nearly everyone does, even if they fail to admit it to themselves or anyone else.

Our answer to this question plays a part in effectively every decision that we make around our careers, and by extension in our lives. It affects what careers we choose to pursue, how we pursue them, what education we get, when we get it, which job offers we accept, to which jobs we submit our resume, when we start hunting for the next promotion or change, lateral shift or external opportunity, when we relocate, when we buy a home, if we take a consulting position or standard employment, what certificates we get, what books we read, what communities we participate in, when or if we decide to get married, when or if we decide to have children and how we interact with our colleagues among many, many other things.  And yet, with all of these things not just being influenced by this decision, but often being almost solely governed by it, few people really sit down and take the time to evaluate their personal career goals to determine how the decisions that they make and planning that they do will determine what kind of jobs they are likely to be able to pursue.  One of the most critical and defining choices of our lives is often given little thought and is treated as being practically a casual, trivial background decision.

People rarely want to talk about questions like this because the harsh reality is that most people, in fact nearly all people, cannot realistically achieve “the” job.  Their dream job or a top career position is likely out of their reach – at least while trying to maintain any kind of work/life balance, have a family, rear children or whatever.  No one wants to admit that they are the “majority” and are really just looking for “a” job, and even fewer want someone to look at them and point out that this is the case.  But it is something that we should do (for ourselves, not pointing at others).  We have to determine what matters for us, where our own priorities lie.

To our ears, going after any old job sounds horrible while seeking the pinnacle of the field sounds like a perfect goal, a natural one.  This is, to some non-trivial degree, an extension of a problem that we have all been talking about for a generation – the glorification of the trivial, rewarding everyone as if average life events were something special (like graduation parties for moving from second to third grade, or awards for attendance, as if “just showing up” were worth an award).

Life is not that simple, though, for several reasons.  First is statistics.  Realistically amazing jobs make up only something like 0.1% of all available jobs in the world.  That means that 99.9% of all workers have to go after less-than-apex jobs.  Even if we broaden the scope to say that “great” jobs represent just 2% of available jobs and 98% of people have to go after more mundane jobs, we still have the same situation: the chance that you are in the 0.1 – 2% is quite low.  Almost certainly, statistically speaking, you are in the 98%.  The numbers are not as terribly bad as they may seem because awesome jobs are not necessarily apex jobs; that is just one possibility.  The perfect job for you might be based on location, flexibility, benefit to humanity, ability to do rewarding work or compensation.  There are many possible factors; the idea of “the” job is not that it is purely about title or salary, but those are reasonable aspects to consider.

The second reason is the other prices that need to be paid.  Attempting to go after “the” job generally relies on a lot of things such as being a good self starter, thinking outside of the box (career-wise), relocating, working longer hours, studying more, challenging others, self promotion, putting in long hours away from the office to improve faster than others, starting your career sooner, being more aggressive, etc.  None of these factors is strictly required, but commonly these and many others will play an important role.  Going after the dream job or apex role means taking more risks, pushing harder and setting ourselves apart.  It requires, on average, far more work and has a much less defined path from start to finish, making it scarier, more ambiguous and more risky.  High school guidance counselors cannot tell you how to get from point A to point B when talking about “the” job; they lack the knowledge, exposure and resources to help you with that.  When going after “the” job you are almost certainly forging your own path.  Everyone is unique and everyone’s perfect job is unique, and often no one knows what that perfect job is exactly until they arrive at it, often after many years of hard work and seeking it.

These two mindsets change everything that we do.  One: we design our careers around optimum performance while accepting a high chance of failure.  Two: we design our careers around risk mitigation and hedge our bets, sacrificing the potential for big payoffs (salary, position, benefits, whatever) in exchange for a more well defined job and career path with better stability and less chance of finding ourselves floundering or, worse, out of work completely and maybe even unemployable.

If you spend a lot of time talking to people about their career goals you will often see these two mindsets at work, under the surface, but essentially no one will verbalize them directly.  But if you listen you can hear them being mulled over from time to time.  People will talk about priorities such as being able to live in the same house, town or region and their willingness to give up career options in exchange for this.  This is an important life decision, and a common one, where most people will choose to control where they live over where and how they work.  Another place you hear this in the undertone of conversation is when people are contemplating their next career move – do they focus on the potential for opportunity or do they focus on the risks caused by instability and the unknown?

A major area in which these kinds of thoughts are often expressed, in one way or another, is around education and certification.  In IT especially we see people often approach their educational choices from a position of risk mitigation, rather than seized opportunity.  Very few people look to their education as “the path to this one, specific dream position” but instead generally speak about their education’s “ability to get them more interviews and job offers at more companies.”  It’s about a volume of offers, which is all about risk mitigation, rather than about getting the one offer that really matters to them.  Each person only needs one job, or at least one job at a time, so increasing the volume of potential jobs is not, realistically, a chance for greater achievement but rather simply a means of decreasing the risk around job loss and unemployment.

This is especially true when people discuss the necessity of certain educational factors for certain types of low paying, more entry level jobs – even people focused on getting “a” job may be shocked at how often people target rather significant education levels for the express purpose of getting very low paying, low mobility, low reward jobs, but ones that are perceived as being more stable (often those in the public sector).  This is mirrored in many certification processes.  Certifications are an extension of education in this way and many people go after common certifications, often in many different areas of study, in order to hedge against job loss in the future or to prepare for a change of direction at their current job or similar.  Education and certification are not generally seen as tools for success, but as attempts to hedge against failure.

You may recognize this behavior expressed when people talk about creating a resume or CV designed to “get past HR filters.”  This makes total sense as a huge percentage (whether this is 5% or 80% does not matter) of jobs in the marketplace are gate-kept by non-technical human resources staff who may eliminate people based on their own prejudices or misunderstandings before qualified technical resources ever get a chance to evaluate the candidates.  So by targeting factors that help us to successfully pass the HR filter we get many more opportunities for a technical hiring manager to review our candidacy.

Of course, nearly everyone recognizes that an HR filtering process like this is horrific and will eliminate incredibly competent people, possibly the best people, right off the bat.  There is no question that this is not even remotely useful for hiring the best potential employees.  And yet most everyone still attempts to get past these HR departments in the hopes of being hired by firms that have no interest, even at the most basic level, in hiring great people, but rather are looking mostly to eliminate the worst people.  Why do we do this so reliably?  Because the goal here is not to get the best possible job, but rather to have as many opportunities as possible to get, more or less, “a” job.

If we were seeking the best possible jobs we would actually be challenged in the opposite direction.  Rather than hoping to get past the HR filters, we might be more interested in being intentionally caught and removed by them.  When looking for the “perfect” career opportunity we care more about eliminating the “noise” of the interviewing process than about increasing the “hits.”  It is a completely different thought process.  In the “any job” case, we want as many opportunities as we can get so that we have one to take.  But in the “the job” case, we want less rewarding jobs (however this is defined for the individual) to filter themselves out of the picture, as we would otherwise have them potentially wasting our time or, worse, have them appear like a great opportunity that we might accidentally accept when we would not have done so had we known more about them up front.

When going after “a” job we expect people to accept jobs quickly and give them up reluctantly.  Those in the opposite position generally do exactly the opposite, giving a lot of thought and time to choosing the next career move but having little concern as to remaining at their last “stepping stone” position.

Somewhat counter-intuitively, we may find that those willing to take job offers more quickly may actually find themselves with fewer useful career opportunities in the long run.  The appearance of stability is not always what it seems and the market pressures are not always highly visible.  There are a couple of factors at play here.  One is that the path to the most common jobs is well trodden and the competition for those jobs can be fierce.  So even though perhaps 90% of all jobs would be seen as falling into this category, perhaps 95% of all people are attempting to get those jobs.  The approach taken to get “a” job generally results in a lack of market differentiation for the potential worker (and for the job as well), making it difficult to stand out in a field so full of competition.

On the other hand, those that have worked hard to pursue their goals and have taken unique paths may be presented with technically fewer options, but those that they are presented with are usually far better and have a drastically smaller pool of competition vying for those positions.  This can mean that actually getting “the” job might be more likely than it would otherwise seem, even to the point of being potentially easier than getting “a” job, at least through traditional means and approaches.  By taking the path less traveled, the candidate working extremely hard to reach a dream position may find ways to bypass otherwise stringent job requirements, for example, or may simply leverage favorable statistical situations.

Also working in the favor of those seeking “the” job is that they tend to advance in their careers and develop powerful repertoires much more quickly.  This alone can be a major factor in mitigating the risk of going this route.  Powerful resumes, broad experience and deep skill sets will often allow them to command higher salaries and get into jobs in a variety of categories across more fields.  This flexibility from a capability and experience perspective can heavily offset the inherent risks that this path can appear to present.

At the end of the day, we have to evaluate our own needs on a personal level and determine what makes sense for us or for our families.  And this is something that everyone, even middle school students, should begin to think about and prepare for.  It requires much self reflection and a strong evaluation of our goals and priorities to determine what makes sense for us.  Because factors like high school classes, high school age internships and projects, university decisions, and more happen so early in life and are so heavily dependent on this realization of intention, we can all benefit greatly by promoting this self evaluation as early as possible.

And this information, this self evaluation, should be seen as a critical factor in any and all job and career discussions.  Understanding what matters to us individually will make our own decisions and the advice from others so much more meaningful and useful.  We so often depend on assumptions, often wrong, about whether we are looking for the chance to climb the ladder to a dream job or whether we are looking for a lifetime of safety and security, and few, if any, are willing to outright state what factors are driving their assumptions and how those assumptions drive decisions.

How about you?  Are you looking at every career decision as “how does this get me to the best, most amazing position possible” or are you thinking “how will this put me at risk in the future?”  What are your priorities?  Are you looking for a job; or are you looking for the job?

Business: The Context of IT

I would estimate that the vast majority of people working in the IT field come to it out of an interest in or even a passion for computers. Working in IT lets them play with many big, fast, powerful computers, networks, storage devices and more.  It’s fun.  It’s exciting.  We tend to love gadgets and technical toys.  We love overseeing the roaring server room or datacenter.  This is, almost universally, true of IT people everywhere in the industry.

Because of this somewhat unnatural means by which people are introduced to IT as a career, we are left with some issues that are not exactly unique to IT but that are, at the very least, relatively extreme in it.  Primarily, the issue that we face, as an industry and especially within the SMB portion of the industry, is a lack of business context within our view of IT.

IT exists only within a business context; this is crucial for understanding all aspects of IT.  Without a business to support, IT would not be IT at all but would just be “playing with computers.”  Other departments that are directly tied to business support, such as finance, accounting, human resources, legal, etc., have far more typical business involvement and less “inwardly focused interest,” so they tend not to lose focus on their role in supporting the business environment in everything that they do.  But IT is often so far removed from the business itself, at least mentally, that it is easy to begin to think that IT exists for its own sake.  But it does not.

More so than nearly any other department, IT is and must be an integral part of the business.  IT has some of the deepest and broadest insight into the business and is invaluable as a partner with management in this respect.  Everything that happens in IT must be considered within the context of, and in regards to the needs of, the business.

Of course there are roles within IT, as within any department, that can function essentially without understanding the context of the business that they are supporting.  Job roles that are highly structured and rely on procedure rather than decision making can often get away without even knowing what the business does, let alone considering its needs.  But once any role in IT moves into an advisory or decision making one, the business is the core focus.  In reality, the business is the only focus.  IT is an enabler of business; if it is not enabling the business, what is it doing?  Because of this we must remain ever cognizant of the business reasoning behind all decision making and planning.

This cannot be overstated: The primary role of IT is a business one, not a technical one.

IT needs to think about the business at every turn.  Every decision should be made with a keen sense of how it impacts the business in efficiency, cost effectiveness, etc. It is so easy, especially when working with other IT staff from other companies, to lose this perspective and begin to think that there are stock answers, that there are accepted “it should be done this way” approaches, that IT should dictate what is best for the business from an IT perspective.

These concepts become especially poignant when we talk about areas of risk.  It is commonly an IT perspective to think of risk as something that must be overcome, but a business perspective is to balance risk against the cost of mitigation.  If left to run on their own without oversight, most IT departments would see the business as so critical that any amount of money should be spent on a “better” IT infrastructure in order to make sure that downtime could never happen.  But this is completely wrong.  “Better” should never be associated with uptime; it should be associated with “what best serves the goals of the business.”  Perhaps that is uptime, perhaps it is a lowering of capital expenses: it depends on the unique business scenario. Often what is best for the business is not what is perceived as being best for IT.

Concepts such as “the business cannot go down” or “cost is no object” have no place in a business, and therefore cannot have one in IT.  Every business has a cost of uptime threshold past which it is more cost effective to simply be down.  No IT project has cost as no object; in a business, cost is always an object.

What IT needs to do is learn to think differently.  The needs of the business should be at the forefront of IT concepts of what is good and what is applicable.  The idea that there is a “proper or best level of protection” for a system should never even occur to IT decision makers.  Instead, IT should immediately think about value to the business, cost of downtime, cost of risk mitigation and make decisions based around the value to the business.
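To make that concrete, the underlying arithmetic is simple expected-value math.  Here is a minimal sketch, with entirely invented numbers, of the comparison a business-minded IT decision maker runs before buying any risk mitigation:

```python
# A minimal sketch of the business-first risk calculation described above.
# Every figure here is hypothetical; substitute your own business's numbers.

outage_probability_per_year = 0.10   # estimated chance of a serious outage in a year
cost_per_outage = 50_000             # revenue and productivity lost per outage
mitigation_cost_per_year = 20_000    # e.g. clustering or redundant hardware

# Expected annual loss from accepting the risk as-is.
expected_annual_loss = outage_probability_per_year * cost_per_outage  # 5,000

# The business question is not "does this reduce downtime?" but
# "does the mitigation cost less than the loss that it prevents?"
if mitigation_cost_per_year < expected_annual_loss:
    print("Mitigation is justified by the numbers.")
else:
    print("Accepting the risk is cheaper than mitigating it.")
```

In this invented example the mitigation costs four times the expected loss, so “more uptime” would actually be a bad business decision; this is exactly the kind of result that a purely technical perspective tends to miss.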

Thinking about “business first” or really “business only” can be a struggle for IT staff that come to IT from a technology perspective instead of from a business one, but it is a critical skill and will fundamentally change the approach and effectiveness of an IT department.

Businesses need to look for IT staff, in decision making and guidance roles, who have a firm understanding of and interest in business and can consistently maintain their IT work within that perspective.

The End of the GUI Era

We should set the stage by looking at some historical context around GUIs and their role within the world of systems administration.

In the “olden days” we did not have graphical user interfaces on any computers at all, let alone on our servers. Long after GUIs began to become popular on end user equipment, servers still did not have them. In the 1980s and 1990s the computational overhead necessary to produce a GUI was significant in terms of the total computing capacity of a machine, and using what little there was to produce a GUI was rather impractical, if not often completely impossible. The world of systems administration grew up in this context, working from command lines because there was no other option available to us. It was not common for people to desire GUIs for systems administration, perhaps because the idea had not yet occurred to people.

In the mid-1990s Microsoft, along with some others, began to introduce the idea of GUI-driven systems administration for the entry level server market. At first the approach was not that popular as it did not match how experienced administrators were working in the market. But slowly, as new Windows administrators, and to some degree Novell Netware administrators, began to “grow up” with access to GUI-based administration tools, there began to be an accepted place in the server market for these systems. In the mid to late 1990s UNIX and other non-Windows servers completely dominated the market. Even VMS was still a major player, and on the small business and commodity server side Novell Netware was the dominant player mid-decade and still a very serious contender late in the decade. Netware offered a GUI experience, but one that was very light and should probably be considered only “semi-GUI” in comparison to the rich GUI experience Windows NT offered by at least 1996, and to some degree earlier with the NT 3.x family, although Windows NT was only just finding its place in the world before NT 4’s release.

Even at the time, the GUI-driven administration market remained primarily a backwater. Microsoft and Windows still had no major place on the server side but were beginning to make inroads via the small business market, where their low cost and easy to use products made a lot of sense. But it was truly the late 1990s panic and market expansion, brought on by the combination of the Y2K scare, the dotcom market bubble and excellent product development and marketing by Microsoft, that drove significant growth and a shift to a GUI-driven administration market.

The massive expansion of the IT market in the late 1990s meant that there was not enough time or resources to train new people entering IT. The learning curve for many systems, including Solaris and Netware, was very steep and the industry needed a truly epic number of people to go from zero to “competent IT professional” faster than was possible with the existing platforms of the day. The market growth was explosive and there was so much money to be made working in IT that there were no resources available to effectively train the new people who needed to be coming into IT: anyone qualified to handle educational duties was able to earn so much more working in the industry than working in education. As the market grew, the value of mature, experienced professionals became extremely high, as they were more and more rare in the ever expanding field as a whole.

The market responded to this need in many ways, but one of the biggest was to fundamentally change how IT was approached. Instead of pushing IT professionals to overcome the traditional learning curves and develop the skills needed to effectively manage the systems that were on the market at the time, the market changed the tools being used to accommodate less experienced and less knowledgeable IT staff. Simpler, and often more expensive, tools, often with GUI interfaces, began to flood the market, allowing those with less training and experience to be useful and productive almost immediately, even without ever having seen a product previously.

This change coincided with the natural advancement of computer hardware performance. It was during this era that, for the first time, the power of many systems was such that while the GUI still made a rather significant impact on performance, the lower cost of support staff and the speed at which systems could be deployed and managed generally offset the loss of computing capacity taken by the GUI. The GUI rapidly became a standard addition to systems that just a few years before would never have seen one.

To improve the capabilities of these new IT professionals and to rush them into the marketplace, the industry also shifted heavily towards certifications, more or less a new innovation at the time, which allowed new IT pros, often with no hands on experience of any kind, to establish some degree of competence, and to do so commonly without the significant interaction or investment from existing IT professionals that university programs would require. Both the GUI-based administration market and the certification industry boomed; and the face of IT significantly changed.

The result certainly was a flood of new, untrained or lightly trained IT professionals entering the market at a record pace. In the short term this change worked for the industry. The field went from dramatically understaffed to relatively well staffed years faster than it could have otherwise. But it did not take long before the penalties for this rapid uptake of new people began to appear.

One of the biggest impacts on the industry was an industry-wide “baby boom,” with all of the growing pains that that entails. An entire generation of IT professionals grew up in the boot camps and rapid “certification training” programs of the late 1990s. The long term effect was that the rules of thumb and general approaches common in that era often became codified to the point of near religious belief, in a way that previous, as well as later, approaches would not. Often, because education was done quickly and shallowly, many concepts had to be learned by rote without an understanding of the fundamentals behind them. As the “Class of 1998” grew into the senior IT professionals at their companies, they became the mentors of new generations, and that old rote learning has very visibly trickled down through similar approaches in the years since, even long after the knowledge is outdated or impractical; in many cases it has been interpreted incorrectly and is wrong in predictable ways even for the era from which it sprang.

Part of the learning of the era was a general acceptance that GUIs were not just acceptable but practical and expected. The baby boom effect meant that there was little mentorship from the former era, and previously established practices and norms were often swept away. The industry did not so much reinvent itself as simply invent itself. Even the concept of Information Technology as a specific industry unto itself took its current form, and took hold in the public consciousness, during this changing of the guard. Instead of being a vestige of other departments or disciplines, IT came into its own; but it did so without the maturing and continuity of practices that would have existed with more organic growth, leaving the industry in possibly a worse position than it might have been had it developed in a continuous fashion.

The lingering impact of the late 1990s IT boom will be felt for a very long time as it will take many generations for the trends, beliefs and assumptions of that time period to finally be swept away. Slowly, new concepts and approaches are taking hold, often only when old technologies disappear and new ones are introduced, breaking the stranglehold of tradition. One of these is the notion of the GUI being the dominant method by which systems administration is accomplished.

As we pointed out before, the GUI at its inception was a point of differentiation between old systems and the new world of the late 1990s. But since that time GUI administration tools have become ubiquitous. Every significant platform has, and has long had, graphical administration options, so the GUI no longer sets any platform apart in a significant way. This means that there is no longer any vendor with a clear agenda driving them to push the concept of the GUI. The marketing value of the GUI is effectively gone. Likewise, not only did systems that previously lacked a strong GUI nearly all develop one (or more), but the GUI-based systems that lacked strong command line tools went back and developed those as well, along with new professional ecosystems around them. The tide most certainly turned.

Furthermore, over the past nearly two decades the rhetoric of the non-GUI world has begun to take hold. System administrators working from a position of a mastery of the command line, on any platform, generally outperform their counterparts leading to more career opportunities, more challenging roles and higher incomes. Companies focused on command line administration find themselves with more skilled workers and a higher administration density which, in turn, lowers overall cost.

This alone was enough to make the position of the GUI begin to falter. But there was always the old argument that GUIs, even in the late 1990s, used only a small amount of system resources and added only a very small amount of additional attack surface. Even if they were not going to be used, why not have them installed “just in case”?  As CPUs got faster, memory got larger, storage got cheaper and system design improved, the impact of the GUI became less and less, so this argument for having GUIs available got stronger. Especially strong was the proposal that GUIs allowed junior staff to do tasks as well, making them more useful. But it was far too common for senior staff to retain the GUI as a crutch in these circumstances.

With the advent of virtualization in the commodity server space, this all began to change. The cost of a GUI suddenly became noticeable again. A system running twenty virtual machines, each carrying its own GUI, would suddenly use twenty times the CPU resources, twenty times the memory and twenty times the storage capacity of a single GUI instance. The footprint of the GUI was noticeable again. As virtual machine densities began to climb, so did the relative impact of the GUI.

Virtualization gave rise to cloud computing. Cloud computing increased virtual machine deployment densities and exposed other performance impacts of GUIs, mostly in terms of longer instance build times and more complex remote console access. Systems requiring a GUI began to noticeably lag behind their GUI-less counterparts in adoption and capabilities.

But the far bigger factor was an artifact of cloud computing’s standard billing methodologies. Because cloud computing typically exposes per-instance costs in a raw, fully visible way, IT departments had no means of fudging or overlooking the costs of GUI deployments, whose additional overhead would often even double the cost of a single cloud instance. Accounting would very clearly see bills for GUI systems costing far more than their GUI-less counterparts. Even non-technical teams could see the cost of GUIs adding up, even before considering the cost of management.
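A toy calculation shows how visible this becomes on an invoice. The prices below are invented round numbers for illustration, not any vendor’s actual rates:

```python
# Hypothetical illustration of per-instance cloud billing exposing GUI overhead.
# All prices are invented for the example; real rates vary by provider and size.

headless_monthly = 20.0   # assumed price of an instance sized for a CLI-only OS
gui_monthly = 40.0        # assumed price of the larger instance a GUI OS needs
fleet_size = 50           # number of server instances in the fleet

headless_total = headless_monthly * fleet_size   # 1,000 per month
gui_total = gui_monthly * fleet_size             # 2,000 per month

print(f"Headless fleet: ${headless_total:,.0f}/month")
print(f"GUI fleet:      ${gui_total:,.0f}/month")
print(f"GUI premium:    ${gui_total - headless_total:,.0f}/month, before management costs")
```

On a traditional capital budget that premium hides inside oversized hardware purchases; on a cloud invoice it shows up as its own line items, month after month, where accounting cannot miss it.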

This cost continues to increase as we move towards container technologies, where the scale of individual instances becomes smaller and smaller, meaning that the relative overhead of the GUI becomes ever more significant.

But the real impact, possibly the biggest exposure of the issues around GUI driven systems, is the industry’s move towards DevOps system automation models. Today only a relatively small percentage of companies are actively moving to a full cloud-enabled, elastically scalable DevOps model of system management, but the trend is there, and the model leaves GUI administrators and their systems completely behind. With DevOps models, direct access to machines is no longer a standard mode of management, and systems have gone even farther than being managed solely from the command line to being built completely in code, meaning that system administrators working in the DevOps world must not only interact with their systems at a command line but do so programmatically.

The market is rapidly moving towards fewer, more highly skilled systems administrators working with many, many more servers “per admin” than in any previous era. The idea that a single systems administrator can only manage a few dozen servers, a common belief in the GUI world, has long been challenged even in traditional “snowflake” command line systems administration with numbers easily climbing into the few hundred range. But the DevOps model or similar automation models take those numbers into the thousands of servers per administrator. The overhead of GUIs is becoming more and more obvious.

As new technologies like cloud, containers and DevOps automation models become pervasive so does the natural “sprawl” of workloads. This means that companies of all sizes are seeing an increase in the numbers of workloads that need to be managed. Companies that traditionally had just two or three servers today may have ten or twenty virtual instances! The number of companies that need only one or two virtual machines is dwindling.

This all hardly means that GUI administration is going to go away in the near, or even the distant, future. The need for “one off” systems administration will remain. But the ratio of administrators able to work in a GUI administration “one off” mode versus those that need to work through the command line and specifically through scripted or even fully automated systems (a la Puppet, Chef, Ansible) is already tipping incredibly rapidly towards non-GUI system administration and DevOps practices.
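For readers who have not seen what “systems built in code” looks like, here is a rough, generic sketch of the declare-and-converge idea that underpins tools like Puppet, Chef and Ansible. This is a hypothetical illustration of the pattern, not any real tool’s API, and it assumes a Debian-style target with dpkg/apt available:

```python
# A generic, hypothetical sketch of declarative configuration management:
# declare the desired state, then converge the machine to it idempotently.
# Not the API of Puppet, Chef or Ansible; assumes dpkg/apt on the target.
import subprocess

DESIRED_PACKAGES = ["nginx", "chrony"]  # declared state, kept in version control

def is_installed(package: str) -> bool:
    # Query the package database; return code 0 means already installed.
    result = subprocess.run(["dpkg", "-s", package], capture_output=True)
    return result.returncode == 0

def converge(packages: list[str]) -> None:
    # Idempotent: only acts where actual state differs from declared state,
    # so running this a hundred times is as safe as running it once.
    for package in packages:
        if not is_installed(package):
            subprocess.run(["apt-get", "install", "-y", package], check=True)

if __name__ == "__main__":
    converge(DESIRED_PACKAGES)
```

Because the desired state lives in a versioned file rather than in an administrator’s memory of GUI clicks, it can be reviewed, repeated and applied to a thousand servers as easily as to one, which is exactly where the servers-per-admin numbers discussed above come from.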

What does all of this mean for us in the trenches of the real world? It means that even roles, such as small business Windows administration, that traditionally have had little or no need to work at the command line need to reconsider their dependence on the local server GUI. Command line tools and processes are becoming increasingly powerful and well known, and are increasingly how we are expected to work. In the UNIX world the command line has always remained, and the need to rely on GUI tools would almost always be seen as a major handicap. This same impression is beginning to apply to the Windows world as well. Slowly, those that rely exclusively on GUI tools are being seen as second class citizens and increasingly relegated to more junior roles and smaller organizations.

The improvement in scripting and automation tools also means that the value of scale keeps growing: the cost to administer small numbers of servers is becoming very high on a per workload basis. This creates very heavy encouragement for smaller companies to look towards management consolidation through the use of outside vendors who are able to specialize in large scale systems management and leverage scripting and automation techniques to bring their costs more in line with larger businesses’ costs. The ability to use outside vendors to establish scale, or an approximation of it, will be very important, over time, for smaller businesses to remain cost competitive in their IT needs while still getting the same style of computing advantages that larger businesses are beginning to experience today.

It should be noted that happening in tandem with this industry shift towards the command line and automation tools is a move to more modern, powerful and principally remote GUIs. This is a far less dramatic shift but one that should not be overlooked. Tools like Microsoft’s RSAT and Server Manager provide a GUI view that leverages command line and API interfaces under the hood. Likewise, Canonical’s Ubuntu world now has Landscape. These tools are less popular in the enterprise but are beginning to allow the larger SMB market to maintain a GUI dependency while managing a larger set of server instances. The advancement of these types of GUI tools may be the strongest force slowing the adoption of command line tools across the board.

Whether we view the move from the command line, to GUIs, and back to the command line as an interesting artifact of the history of Information Technology as an industry, or we look at it as a means of understanding how systems administration is evolving as a career path or business approach, it is good for us to appreciate the factors that caused it to occur and why the ebb and flow of the industry is now taking us back out to the sea of the command line once again. By understanding these forces we can more practically assess where the future will take us, when the tide may again change, and how best to approach our own careers or decide on both technology and human talent for our organizations.