The History of Computer Technology

This post was a team effort, special thanks to James Kilroe & Justin Swart for their contributions. On to the post!

Technology has never been as prevalent in daily life as it is today, and its significance is only set to grow. It is therefore worth looking back at how technology has evolved, both from a technical perspective and in how it is consumed. In roughly 60 years, computer technology has gone from requiring an entire room to operate, to the desk, to the pocket, and now to running largely in the Cloud. This drastic growth in technical capability has led to an equally drastic change in how computer technology is consumed. The original general-purpose computers acted largely as mathematical calculators. Today, considerable computational power can be found in everything from our smartphones to our watches and fridges.

Perhaps the more interesting trend, for us at Newtown Partners at least, is the shift in business models throughout the development of computer technology. The history of computer technology has gone through various epochs, which I detail in this post. In short, new upper segments of the technology stack extract value from segments lower down the stack. The new technology layer creates and captures a significant portion of value (‘exploring’ the new technological paradigm) and subsequently commoditizes the layers lower in the stack (‘exploiting’ the infrastructure), repeating the cycle through the various epochs. This post, along with subsequent posts, will use this lens to try to postulate where technology is heading, and in particular what successful business models will look like in these new paradigms.

Where Are We In The History Of Computer Technology?

1940’s and 1950’s: First General-purpose computers

The ENIAC (Electronic Numerical Integrator And Computer) was the first large-scale computer to run at electronic speed without being slowed by any mechanical parts. The ENIAC used some 18,000 vacuum tubes and was rather sizeable, requiring a 15-by-9-meter space.

At the time this was frontier technology and during the 1950s, early commercialization of the technology became viable.

1950’s and 1960’s: Transistors

Transistors brought about smaller and more reliable computers that disrupted the more expensive general-purpose computer. Decreased costs and smaller space requirements brought about increased demand for computers. However, the transistor computer remained inaccessible as it was too expensive and large for general households, so demand was still largely driven by firms.

The first transistor-only computer, the IBM 608, cost $1,760 a month to rent. In comparison, it cost $3,200 a month to rent the first vacuum-tube ‘personal’ computer, the IBM 650.

The 1950s to ’60s, therefore, marked the start of the modern computer industry, which eventually consolidated around IBM.

1970’s and 1980’s: Microprocessor

Intel introduced the first commercial microprocessor, the Intel 4004, in 1971. However, it was a descendant of the 8008 (introduced in 1972), the Intel 8088, that would go on to power the next generation of computers, including the IBM PC.

Microprocessors drastically lowered the cost of producing computers, enabling the mass production of complete CPU systems. The microprocessor enabled the minicomputer, the PC, the laptop and eventually the mobile phone, all of which challenged IBM’s larger transistor computers.

During the 1970s, value remained within hardware production but started to migrate towards the microprocessor producers. Thanks to its first-mover advantage with the 4004 and 8008, Intel initially controlled 100% of the microprocessor market segment. Motorola eventually introduced the 6800 in 1974, which began the microprocessor ‘wars’ between Intel and Motorola.

1980’s and 1990’s: The PC

The introduction of microprocessors brought about greater competition within the hardware layer of computers. Value was no longer created (for the most part) through hardware advancements, as the majority of the hardware foundations had been laid, but was instead generated through margin enhancement and performance improvements on existing architectures. The largest hardware manufacturers of this era were Intel, Zilog, Motorola, and MOS Technology. Competition became even tougher when Japanese chips from Hitachi, NEC, Fujitsu, and Toshiba came to market.

The mass production of computers brought about increased demand by users for a common operating system (OS) that ran standardized software. Microsoft identified this opportunity and captured the value by designing a proprietary operating system and securing distribution rights with computer manufacturers. Microsoft focused on winning over developers, bringing all applications into one operating system. Customers were drawn to the Windows platform as it had the largest number of compatible applications. As a result, Microsoft’s operating system was running on 97% of all computing devices by the late ’90s. Value generation thus transitioned up the technology stack to the software layer, which opened up a new, near-unbounded design space.

Despite the increased hardware competition, Intel continued to dominate the microprocessor market. Intel’s Pentium chips combined with Microsoft’s Windows 3.x operating system rapidly expanded the PC market. Windows began supporting non-Intel processors only with the Windows NT line. Despite this support, the Intel/Windows alliance, dubbed “Wintel”, continued to shape the PC market. This ecosystem lock-in was fundamental to the success of the two entities and deserves a separate discussion altogether.

2000 to 2010: the Web

The introduction of Linux offered a free, open-source operating system, and the Web (built on HTTP) provided a free distribution network. Together, Linux and the Web brought about two key shifts in the computing market:

  • Internet browsers (via HTTP and other open-source protocols) enabled cross-OS access to users, thus ending the application lock-in advantage of Windows.
  • Computing consumption started shifting towards mobile — a phenomenon that was largely missed by Microsoft.

Cross-OS access via the browser, together with Android (the open-source mobile OS), commoditized the OS layer that Windows had dominated and strengthened through its application lock-in. Value capture once again shifted up the technology stack to the application layer, unbundling Microsoft’s grip on the computing market. By 2012, Microsoft’s computing market share had dropped to 20%, and Android, the Linux-based mobile operating system, controlled 85% of the mobile computing market.

2010 to today: the Network and the Cloud

As connectivity and latency improved, and hardware costs fell, the running of applications and the storage of user data began moving from on-device to the Cloud. The Cloud is now a key focus of many of the tech giants, as evidenced by Google, Amazon and Microsoft’s competing cloud offerings for users and enterprises. User data was aggregated en masse; this, coupled with the significantly reduced running costs of technology businesses and increased competition among software providers, ultimately led the generalized internet-based business model to shift towards offering users free software and networks (platforms) with the intent of monetizing user data.

As a result, market consolidation has happened around Facebook, Apple, Amazon, Netflix, and Google (FAANG). FAANG companies have built up networks that ‘lock’ users into their platforms through powerful network effects. The result is that FAANG companies now have massive monopolized user data silos, resulting in trillions of dollars in market cap.

Apple, however, differs from the other FAANG companies in that it came to dominate through the rapid proliferation of mobile technology. Apple has locked developers into its platform via the App Store and takes a 30% fee on transactions. The App Store is the only portal through which developers can access Apple’s massive and valuable user ecosystem, leaving them with little option but to pay the fee.

Lastly, on the hardware front, intense competition and the pursuit of incremental gains by squeezing margins continued. However, it has been posited that the next wave of hardware design will be geared towards application-specific (see ASICs) or function-specific architectures that deliver incremental performance improvements on narrowly focused jobs.
