The History Series 1 – On-Premise Service Desk Solutions

The on-premise/cloud argument is one of enterprise IT’s oldest rivalries. Even today, in the age of the cloud, the argument rages on between factions almost fanatically devoted to each side. And like all big arguments, each side has something going for it. But over time, as technology has improved and organizations have evolved, on-premise software has been driven to the edge of obsolescence. This isn’t entirely its own fault; its time has come and gone, and the cloud has stepped in as an easier, more flexible, and more advanced successor.

I wrote about this earlier on the blog and declared that post to be our last word on the subject, but to truly understand why this happened, I realized we need to go back. Way back.

Which is why we are kicking off a three-part blog series on the rise of the cloud – we’ll start with the history of on-premise solutions, which of course includes the history of computing itself, then move on to the history of the cloud. Lastly, we’ll concentrate on the SaaS revolution, and what it means for business and enterprise software.

Here we go!

The history of on-premise service desk solutions

The history of on-premise software is inextricably linked to the history of modern computing. In hindsight, this is the only way it could have been. One of the first significant events in modern computing came in 1942, when Professor John Vincent Atanasoff and graduate student Clifford Berry of Iowa State College completed the Atanasoff–Berry Computer – a machine a federal court would later, in 1973, rule to be unpatentable prior art, effectively ‘open-to-all’ technology.

This was a landmark ruling, as it ensured that computing as a concept was recognized as something special, something that should not be locked down by patents and intellectual property claims.

The consequences of this judgement would change our world, ensuring that computing moved from university enclaves to individual enthusiasts, the obsessives who would rewrite the map of the digital world.

In 1947, the first literal ‘bug’ was found in a computer – a moth that had gotten caught in one of the relays of Harvard’s Mark II, and was famously taped into the machine’s logbook.

Several advancements ensued, but it was only in 1951 that computing went really commercial, with Remington Rand’s UNIVAC I, the first commercially produced computer in the United States. IBM responded in 1952 with the IBM 701, its own commercial machine.

This back-and-forth probably signified the beginning of the great tech battles of our age, variations of which, bigger or smaller in scale, last to this day.

Until then, individuals or companies that needed computing had to lease time from the universities or military-funded tech laboratories that usually housed the machines. But what the computers themselves could accomplish was still limited. This changed at the end of the decade, as companies came to understand the commercial implications of faster computing.

In 1959, IBM introduced the IBM 1401 Data Processing System, which would go on to become the first computer in the world to reach 10,000 units in sales. Advances in performance and speed followed, with Control Data’s CDC 6600 – designed by Seymour Cray – arriving on the stage in 1964 as the world’s first supercomputer, with superior processing and unheard-of speed. IBM answered with its System/360 family; its high-end Model 91, delivered in 1967, pushed speeds further still. 1969 also saw the birth of UNIX at Bell Labs, the operating system that became popular among computer scientists, academics and developers alike, who by now had understood the value of having a standardised operating system to build applications on.

The hegemony of the enterprise

The pace at which the computing world was moving hid one basic failing – until the early 1970s, everything was aimed at and used by the enterprise. The software developed invariably tended to be business applications, and since the only ones who could afford all this were big enterprises with heavy data processing needs, the SMB and personal segments remained neglected. Companies like SAP, and later Oracle and Baan, made database, ERP, and accounting software for the giants that wanted it.

These were huge, cumbersome operations run on the company’s own servers, requiring a mini-army of standing manpower and a horde of techies specialising in their intricate workings. But at the time, this was state-of-the-art technology, and even operating like this, these software applications saved the organizations that could afford them significant amounts of time and money. Classic on-premise software may be derided now for all its shortcomings, but once upon a time, it was all we had, and it did what it was supposed to as best it could.

But the market for the neglected layman, the individual consumer, was about to come into its own; this would take the rise of one of the most iconic, most successful technology companies of all time.

The ‘consumer’ companies

1975 saw the rise of Microsoft. Starting out developing software for the Altair 8800 microcomputer, Microsoft gradually upped its game, snapping up the rights to what became MS-DOS and licensing it to IBM for its PCs.

The slew of new computers continued, costs coming down with each iteration – 1977 saw the release of the Apple II, Commodore’s PET and, of all people, Radio Shack’s TRS-80. The personal computer revolution was well and truly here.

It was at this juncture, in the 1980s, that innovation threw open what had been a closed field, and several things began to happen in software application development. Microsoft’s MS-DOS was becoming the standard operating system for personal computing, and when you have a standard, the things built to run on top of it find a market; software was about to become very, very lucrative.

Businesses started shifting to the Windows-based client-server environment, and LANs became ubiquitous. As storage got cheaper and data processing sped up, companies started to demand applications for specific functions. Though CRM, human resource management, ERP, financial management and other big-ticket software ruled the day, the fragmentation of software into more specialized, specific functions had taken its baby steps. Another notable name in the history of on-premise at this time was Siebel, whose sales force automation software would come to define the CRM category.

There’s a storm coming

By this time the PC industry was in full swing, with several companies making different models and Dell slowly winning the volume game with a super-successful supply chain management model that is still taught in business schools across the world. And with computers making their way into more and more American households, business adoption of software was about to skyrocket. Enterprise software, which had for so long dominated the world of business applications, was about to undergo a radical transformation.

The year was 1990, and an Englishman called Tim Berners-Lee had just achieved the first successful communication between a computer and a server over a network, using something called a web browser.

The internet was coming.