From The Network Computer to Cloud Computing – The Long and Winding Road

“Everything old is new again” is a popular expression that usually applies to trends and fashion. Surprisingly, it can also be an apt way to describe many current trends in computing. In particular, the emergence of cloud computing is a classic example of how an old idea that failed and died can be reborn and given a brand new lease on life when conditions change. Sometimes timing is everything. To put this in perspective requires a short recap of computing history from the mid-nineties.

Does anyone remember the Network Computer? The Network Computer, or NC, was the brainchild of Larry Ellison of Oracle Corporation, who announced it shortly after the launch of Windows 95 in August 1995. The NC was to be a classic “thin” client – a computer with very little local software and intelligence, unlike the traditional personal computer, which was considered a “fat” client. The NC would function by connecting over a network, or the internet, to a central, more powerful computer or server that contained all the software and intelligence to make it work. This was very close to the way Unix and Linux systems traditionally worked with centralized servers and thin terminals, and also similar to the centralized computing model of the mainframes and minicomputers sold in earlier eras by the likes of IBM and Digital. Ellison was inspired by the emergence of the internet and the world wide web in the early nineties, which he believed provided the perfect, ubiquitous network for his NC to function.

The PC was the antithesis of the NC: all or most of the computing was done locally on a processor (usually made by Intel) on a user’s desktop, by running programs (usually from Microsoft) stored on the PC’s hard drive. This allowed the creation of a wide variety of rich, interactive applications, ranging from word processors and spreadsheets to computing-intensive, fast-moving, real-time games such as The Sims, or Microsoft Flight Simulator – the first realistic desktop simulation of flying a single-engine aircraft, which took full advantage of the computing power of the PC’s processor. The computer revolution of the eighties and nineties was fueled mainly by the mass adoption of hundreds of these software applications running on PCs, which empowered users worldwide to become more productive, work smarter, or simply have a little fun. By a rare combination of business acumen, smarts, and being in the right place at the right time, Bill Gates and his company, Microsoft, came to provide not only the basic operating system (DOS and later Windows), but subsequently also many of the software applications – including Microsoft Word, Excel, and others – that powered the PC. By 1995, Microsoft dominated computer desktops with its software, along with Intel – the manufacturer of the CPU powering most personal computers worldwide – leading to what has since been called the “Wintel” (Windows + Intel) duopoly.

The Network Computer was designed from the start with the ambitious goal of disrupting the Wintel duopoly by fundamentally altering the prevailing computing paradigm. Moving the intelligence from the desktop to the network would make the desktop irrelevant. There would be no need to purchase individual licenses for Microsoft’s latest operating system or software – it would all run off the network using just a web browser and an internet connection. At least, that was the theory behind the genesis of the Network Computer. A number of Microsoft’s biggest competitors were, of course, thrilled with the concept of the Network Computer and quickly came forward to support it. A consortium comprising Oracle, Sun, and IBM was formed to provide the software and hardware to make the NC a reality.

However, in spite of hundreds of millions of dollars of investment and the considerable efforts of Oracle, Sun, IBM, and others, the Network Computer flopped quite miserably (IBM sold a total of only 10,000 NCs) and eventually disappeared from the marketplace by 1999 or 2000. There were several reasons for the demise of the NC, but in retrospect it is clear that it was the right idea at the wrong time.

Firstly, the NC required a persistent online connection with sufficient bandwidth to deliver all the software and applications across the internet. Unfortunately, the state of the art in connectivity in 1996 was a “screaming fast” 56K dial-up modem, and DSL and cable modems were just glints in the eyes of the carriers.

Secondly, during this time (1996 to 1999), most websites were designed so that each time a user clicked on a link or submitted a form, the web browser had to fetch a new page from the server or reload the current page with new information. This round trip to the server introduced latency – made worse by the slow internet connectivity – and resulted in a poor user experience that was no match for the rich, interactive experience of software running on a PC desktop. The Network Computer, which relied on a web browser and an internet connection, was plagued by the same problem. And so, by 1999, the Network Computer was dead, and Ellison and his alliance of cohorts reluctantly gave up on the grand vision to transform computing and unseat Microsoft. But although the NC itself was a failure, it firmly planted the seeds of the promise and potential of network computing, to be fully realized in the future.

Several key developments contributed to the rebirth of the idea of the Network Computer. First, the explosive growth of consumer internet usage in the late nineties led to massive investment by telecoms in fiber and broadband connections spanning the globe. In the span of less than a decade, the world went from 28K dial-up modem connections to always-on cable and DSL modems offering bandwidths we could once only dream of and salivate about. We take this for granted today – but it is pretty remarkable. Consumers now routinely surf the web at download speeds of 1 to 10 Mbps for a cost ranging from $20 to $50 per month. The only way to come close to this experience in 1999 was to install an expensive, dedicated 1.5 Mbps T-1 line, which would have set you back at least a thousand dollars a month. The high-bandwidth connectivity meant a faster, more responsive web – content, pictures, and videos on web pages that once took tens of seconds to load now loaded in fractions of a second. This was a bit closer to running software on a local desktop PC.
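A rough back-of-the-envelope illustration of the difference (ignoring protocol overhead, purely for scale): a 1 MB web page is about 8 million bits, so at 56 kbps it takes roughly 8,000,000 / 56,000 ≈ 143 seconds – well over two minutes – to arrive, while at 10 Mbps the same page downloads in about 8 / 10 ≈ 0.8 seconds.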

Second, certain innovations in web browser design gradually brought the user experience closer to that of a desktop PC. One of the key elements in making web pages more interactive and responsive is the ability to send requests to and receive updates from a server in the background – also known as asynchronous transfer. Although this was possible from the very first version of Java in 1995 using applets, it was not widely used. Adobe’s Flash was another technology that made it possible to run interactive applications – but it was a closed, proprietary technology. Ironically, it was Microsoft that made asynchronous transfers on web pages more popular, with the introduction of the IFrame HTML element in Internet Explorer in 1996. And in 1999, it was again Microsoft that shipped an ActiveX control in Internet Explorer 5 that exposed an XMLHTTP object for communicating asynchronously with a server. This made it possible to design rich, interactive, PC-like web applications running inside a browser. The ActiveX approach, however, only worked within Internet Explorer. It was adopted and ported as a native JavaScript XMLHttpRequest object by other browsers, including Mozilla, Safari, and Opera. (Microsoft itself adopted the native XMLHttpRequest model beginning with Internet Explorer 7.)
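In practice, web developers of that era had to paper over these browser differences themselves. The sketch below shows the classic cross-browser pattern for obtaining a request object – a minimal illustration of the technique, not production code:

```javascript
// Classic Ajax-era helper: obtain a request object that works both in
// browsers with a native XMLHttpRequest and in older Internet Explorer,
// where the equivalent functionality is exposed as an ActiveX control.
function createRequest() {
  if (window.XMLHttpRequest) {
    // Mozilla, Safari, Opera, and Internet Explorer 7+
    return new XMLHttpRequest();
  }
  if (window.ActiveXObject) {
    // Internet Explorer 5 and 6
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // very old browsers: no asynchronous requests available
}
```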

However, the real impact of background, asynchronous HTTP requests to the server was not fully realized until they started appearing in full-scale web applications such as Outlook Web Access in 2000 – again from Microsoft. This is of course ironic, considering it was an early innovation that eventually led to the web applications now threatening Microsoft’s desktop application dominance. Several other web applications began to use this approach, including Oddpost in 2002. But it really took off and became mainstream when Google used the technology extensively to make Gmail (in 2004) and Google Maps (in 2005) fast, responsive, and interactive – just like a local PC application. This technology – or rather, group of technologies – was also finally given a name: Ajax, short for Asynchronous JavaScript and XML. Ajax has since become an integral part of the web, finally enabling the migration of many hitherto desktop-bound applications – spreadsheets, word processing, interactive forms, databases, and more. The prime example is Google Docs and Spreadsheets, a suite of applications that mimics most of the functionality of Microsoft’s flagship Office. An internet-connected PC running Google Docs and Spreadsheets in a browser looks very close indeed to how a Network Computer was supposed to work. And that is what Google’s Chrome Operating System aims to do. It is also interesting to note that Google’s CEO Eric Schmidt was an early evangelist of Ellison’s NC vision as the CTO of Sun Microsystems in the late nineties.
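What makes an Ajax application feel like local software is that only a small part of the page changes in response to a server round trip. A minimal sketch of the pattern, reusing the helper above – the "/inbox/unread" URL and the "unread-count" element are hypothetical stand-ins, not any real application’s API:

```javascript
// Fetch fresh data in the background and update one element in place –
// no full page reload, so the rest of the page never flickers.
var request = createRequest(); // cross-browser helper from the earlier sketch
request.open("GET", "/inbox/unread", true); // third argument: asynchronous
request.onreadystatechange = function () {
  // readyState 4 means the response is complete; status 200 means success
  if (request.readyState === 4 && request.status === 200) {
    document.getElementById("unread-count").innerHTML = request.responseText;
  }
};
request.send(null);
```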

The always-on, high-bandwidth connectivity, combined with Ajax, browser innovations, and other helper technologies, set the stage for the next step in the evolution of cloud-based services. The internet had become the ubiquitous network, and the browser the universal client. Suddenly, the applications, the operating system, and even the specific hardware powering your local desktop computer did not matter. You could find and run most of the applications you needed from a remote server somewhere on the internet – or from the “cloud” (a term that arose from the nebulous graphic used to represent the internet on network diagrams).

Cloud Computing

And your experience would be nearly identical whether you were running a PC powered by Intel and Microsoft, a RISC CPU running a variant of Linux, or a MacBook Air running OS X. This, of course, is the ultimate vision of the proponents of cloud computing. Amazon, Google, and many others have embraced this vision completely – and committed major resources to it.

While the PC is far from dead – Microsoft recently sold 250 million copies of Windows 7, its latest operating system – the company recognizes it must adapt to the new reality of a post-PC world. Ray Ozzie, Microsoft’s Chief Software Architect, sees a future that involves “continuous services” delivered via the cloud on “connected devices” available in a “breathtaking number of shapes and sizes, tuned for a broad variety of communications, creation and consumption tasks.” Steve Ballmer, Microsoft’s CEO, also recently announced that “For the cloud, we’re all in” – a major commitment to this paradigm shift in computing from the desktop to the cloud.

While the Network Computer is relegated to the dust heap of failed products and technologies, it may just have been an idea a bit ahead of its time. John Gage of Sun popularized the phrase “The Network is the Computer” – a tagline widely used in advertising campaigns for the NC. It would make a great slogan today to market cloud computing services. “Everything old is new again” indeed!

Update (Nov 25, 2010): An article on TechCrunch about Google’s cloud-based Chrome OS confirms that Google has Microsoft Windows directly in its sights. Linus Upson, Google’s V.P. of Engineering for Chrome, says:

“… 60 percent of businesses could immediately replace their Windows machines with Chrome OS machines.”

Chrome OS is, of course, the Network Computer reborn – and while the NC failed before, Google has a real chance of success in its mission to unseat Microsoft with Chrome OS. Chrome OS appears to be the destination at the end of the long and winding road that started with the Network Computer. Borrowing a concept from the book Where Good Ideas Come From by Steven Johnson, we could say that cloud computing was simply not in the “adjacent possible” for the Network Computer in the late nineties – but it certainly is in 2010. Sometimes timing is indeed everything for an idea to be successful.
