World Domination 201: Breaking Windows

Eric Steven Raymond

Rob Landley

Revision History
Revision 1.0    13 May 2006    esr
Initial version.

Table of Contents

Executive Summary
Punctuated Equilibrium: How the Dominant OS Changes
Network Effects and Natural Monopolies
Learning From History
Timing the Transitions
The Coming Equilibrium
How Long If We Fail?
The Deadline For Dominance

Executive Summary

Those who cannot learn from history are doomed to repeat it.

--George Santayana

The transition to 64-bit hardware is opening a critical window of vulnerability over the next two years, during which the dominant operating system of the next three decades will be chosen. A historical analysis based on Moore's Law indicates that the new 64-bit desktop operating system will become entrenched by the end of 2008, and that this is a hard deadline. The last such transition was in 1990; the next one is scheduled for 2056.

The three contenders for the new 64-bit standard are Windows-64, MacOS X, and Linux. The winner will be determined by desktop market share among non-technical end users. If Windows or MacOS X becomes the new standard, we can look forward to decades of steadily increasing DRM, locked-down video and wireless hardware, escalating patent claims, and other forms of attack as yet unimagined.

Part 1: Breaking Windows establishes why 2008 is a hard deadline. Part 2: Dispatches From The OS Wars examines the current state of the three major contenders vying to become the new 64-bit standard. Part 3: Facing the Music addresses the major issues blocking Linux's desktop acceptance.

Punctuated Equilibrium: How the Dominant OS Changes

Network effects in software, especially mass-market operating systems, make it highly advantageous to be using the same software as everyone else. Migrations en masse happen only when there is no alternative. Accordingly, the dominant operating system is normally only vulnerable to being knocked off its perch when it is obsolesced by a change away from the hardware platform it is tied to.

The pattern has been consistent since the beginning of the personal-computer industry in the mid-1970s: software platform transitions follow hardware platform transitions. Long periods of monopolistic stability are interrupted by brief episodes of rapid change, in a pattern resembling the "punctuated equilibrium" of evolutionary biology.

8-bit computers were displaced by DOS during the 8-to-16-bit transition, DOS was displaced by Windows during the 16-to-32-bit transition, and Win-32 will be displaced during the 32-to-64-bit transition. But the window of opportunity won't last long; once the new dominant system is chosen for the 64-bit era, competition will effectively end for a long time — perhaps decades.

Network Effects and Natural Monopolies

Users buy the platform that has the most software, and developers write software for the platform that has the most seats. This positive feedback loop establishes a standard, and if that standard is owned it creates a natural monopoly.

Network effects cause the value of a connected set of nodes to grow faster than linearly in the number of nodes (anywhere from n log(n) to n²). This allows even slightly larger networks to become "the obvious choice" for new installations and new product introductions, and eventually to take established market share from smaller networks. In a networked market, the largest player tends to get larger, and once past 50% market share usually continues up the S-curve to monopoly status.
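To make those growth rates concrete, here is a minimal sketch. The model names and node counts are illustrative assumptions, not from this paper; it simply compares how much more valuable a 10%-larger network looks under linear, n log(n), and n² valuations:

```python
import math

# Three common estimates of a network's value as a function of its
# node count n (illustrative models, not from the text).
def network_value(n, model):
    return {"linear": n,
            "nlogn": n * math.log(n),
            "quadratic": n * n}[model]

# A network just 10% larger pulls disproportionately ahead under the
# superlinear models, which is the feedback loop described above:
for model in ("linear", "nlogn", "quadratic"):
    ratio = network_value(1100, model) / network_value(1000, model)
    print(f"{model}: {ratio:.3f}")
```

Under the quadratic model a 10% lead in seats translates into a 21% lead in value, which is why the feedback loop is so hard to reverse once a leader emerges.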

A related factor is amortization of fixed costs. In the computer industry, one-time start-up costs (such as authoring software or building a microchip fabrication facility) are dominant, and incremental per-unit costs are minor in comparison. Thus pricing is almost entirely a matter of unit volume, and whoever ships the most units will have a better price to performance ratio as well as more research and development capital for the next iteration.[1]

The desktop matters because entrenched monopolies defend themselves. Minority players face closed hardware (3D graphics, 802.11g), patented data file formats (Quicktime, MP3), exclusive vendor relationships (Apple's iTunes), locked-down systems (Microsoft's Xbox), and an increasingly hostile legal environment (the DMCA, the broadcast flag). These problems can only be addressed with lots of warm bodies: a critical mass of end users who care enough about these issues to make attempts to corner the market financially and politically painful. Only the desktop market has this level of influence.

Learning From History

Now, from the general to the specific. In the next two subsections, we'll look at the actual history of punctuated equilibrium in the personal computer industry. The trend curves we extract will enable us to make a fairly exact prediction of the length of desktop Linux's market window during the 32-bit-to-64-bit transition, and the length of the next equilibrium period after that.

Timing the Transitions

Hardware platform transitions are determined by changes in the size of addressable memory, demand for which doubles every 18 months on a curve roughly predicted by Moore's Law. For the past 30 years, the low end of retail "desktop system" memory has been about 2^((year-1975)/1.5) kilobytes, and the high end about 4 times that.[2]

Year   Low End   High End   System
1975   1K        4K         Altair (Apr 1975: 1K)
1978   4K        16K        Apple II (Jun 1977: 4K)
1981   16K       64K        PC (Aug 1981: 16K), Commodore 64 (1982)
1984   64K       256K       Macintosh (Jan 1984: 128K), Amiga 1000 (1985: 256K)
1987   256K      1M         Amiga 500 (1987: 512K), Deskpro, PS/2
1990   1M        4M         Windows 3.0 (1990), 3.1 (1992), Linux 0.01 (1991)[a]
1993   4M        16M        OS/2 3.0 (Nov 1994), Linux 1.0 (Mar 1994)
1996   16M       64M        Win 95 (Aug 1995), Linux 2.0 (Jun 1996)
1999   64M       256M       Win 2000 (Feb 2000), XP (2001), Linux 2.2 (Jan 1999)
2002   256M      1G         Linux 2.4 (Jan 2001), MacOS X (Mar 2001)
2005   1G        4G         Linux 2.6 (Jan 2003), Win x86-64 (Apr 2005)
2008   4G        16G        The new 64-bit desktop

[a] Linus Torvalds had 4 megabytes in 1991, but implemented swapping to support people with 2-megabyte systems within a few months.

Every three years, the high end became the new low end. Systems did sell outside the above range, but this is where "the desktop" was.
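The memory curve is easy to reproduce. This short sketch (an illustration of the formula stated above, not part of the original) regenerates the low-end and high-end columns of the table:

```python
# Low-end desktop memory is roughly 2**((year - 1975) / 1.5) kilobytes,
# and the high end about four times that, per the formula in the text.
def low_end_kb(year):
    return 2 ** ((year - 1975) / 1.5)

# Reproduce the table rows at three-year intervals:
for year in range(1975, 2009, 3):
    low = low_end_kb(year)
    print(f"{year}: low {low:,.0f}K  high {low * 4:,.0f}K")
```

The 2008 row comes out to 4,194,304K, i.e. 4 gigabytes at the low end, which is exactly the 32-bit address limit: the mechanism behind the transition deadline.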

New platforms are introduced when the memory limit of the old platform hits the high end, and new sales of the old platform tail off when its memory limit falls off the low end. 64K was the high end in 1981, at the introduction of the IBM PC. 64K fell off the low end after 1984, so Apple II and Commodore 64 sales dried up, prompting the ouster of Steve Jobs and Jack Tramiel. One megabyte hit the high end in 1987, and 386 chips showed up in the Compaq Deskpro 386 and IBM's PS/2. One megabyte fell off the low end after 1990, and even though DOS limped on with expanded/extended/extruded memory and DPMI, Windows 3.0 (desperately poor though it was) became the new standard.[3]

The doubling time of Moore's Law varies by component — hard drives double slightly faster than average and video resolution doubles more slowly — but the oft-quoted 18 month figure was derived from Intel's original core business, DRAM. It remains surprisingly accurate in part because memory and motherboard vendors schedule new product releases around it to avoid either falling behind the competition or cannibalizing the existing market unnecessarily.

With an 18 month doubling time, a few months make a significant difference. The Altair started shipping in April, and the PC in August. These figures are thus roughly mid-year, but a fudge factor of plus or minus six months is reasonable for regional variations, distribution channel delays, ambient humidity, etc.

The original mid-1981 launch of the IBM PC offered 16-64K of RAM, and that wasn't just IBM's guess: it was based on market research into what people would buy at the time. In mid-1975, "mainstream" demand for the MITS Altair was approximately 1-4K: the system shipped with 1K, but running Micro-soft BASIC took 4K.[4]

The Coming Equilibrium

The next hardware platform transition is occurring now. Four gigabytes hit the high end in 2005, when market forces pushed even Intel to ship AMD's x86-64 instruction set rather than be left behind,[5] and 64-bit laptops showed up in stock at Fry's. The volume ramp is underway, and since Apple abandoned the PowerPC for x86-64 there is no longer any question: we know what the new hardware platform will be. The historical record indicates that the PC distribution channel will finish the transition to 64-bit hardware by the end of 2008.

As the new 64-bit systems hit the low end, the S-curve of 64-bit adoption will go nearly vertical. This is the last chance to capture 50% market share, as the wave of new adopters decides on a software platform to go with their new hardware.[6] After the S-curve flattens out again, gaining market share from undecided new users will no longer be a significant influence, and the largest platform will leverage its network-effect advantage to consolidate the market. After 32-bit systems fall off the low end, the new 64-bit software platform will be entrenched.

If the new 64-bit software platform established in 2008 is not Linux, displacing it will be extremely difficult, perhaps impossible, for decades. To understand why, it's helpful to look back at the history of the first two equilibrium periods after 1975, the ages of the 8- and 16-bit machines.

The first mover in the 8-bit microcomputer world, Micro Instrumentation and Telemetry Systems (which manufactured the Altair), entered the computer industry in a near-bankrupt state and had trouble scaling to meet demand. The MITS Altair was eventually cloned to produce the first commodity hardware platform (CP/M machines based on the S-100 bus), but the delay allowed another company, Apple, to enter the market with venture-capital funding, scale production rapidly, and leapfrog into first place with more than 50% market share, leading to a billion-dollar IPO in 1980.

Many smaller players emerged (Commodore, TI, Tandy), but the number-two position was collectively held by the commodity S-100 machines running CP/M, and Apple stayed focused on its largest rival rather than the smaller targets. The proprietary Apple II hardware couldn't take market share away from the commodity S-100 hardware, preventing Apple's climb up the S-curve to monopoly. But the S-100 machines made only glacial headway commoditizing Apple's dominant market share.[7]

Years later, Apple found its position reversed against IBM's PC. Apple's Macintosh was a clearly superior product (a 32-bit GUI in a 16-bit command-line world), but the PC's three-year head start in the 16-bit space put it beyond 50% market share, and once PC hardware was cloned it gained commodity status as well.[8] By uniting commodity hardware with superior market share, the PC swept the field against the Macintosh, Amiga, Atari ST, and others. After 1990, later battles (Windows vs. OS/2, Windows vs. Java, Windows vs. Linux) would primarily be fought within the context of PC hardware.

The current Linux vs. Windows battle mirrors the earlier CP/M vs. Apple II standoff. Linux is now 15 years old, but preinstalling it is still not an option on most laptops purchased today. Commodity status can be an effective defense against superior market share, but by itself it is not an effective offense.[9]

Another data point is that Linux rose to prominence in the server market with the rise of the web; Linux and the web grew up together. Linux leveraged PC hardware to displace the initially dominant Sun servers while Sun was distracted by the loss of its workstation market, both of which were tied to expensive proprietary hardware. But the important point is that web servers created a huge number of new server installations, which Linux captured at their creation to gain dominant market share.

Microsoft was late to this party: it didn't notice the internet until 1995. Microsoft has tried very hard to leverage its desktop monopoly to take over the internet server market, but in this space it faces the same challenge Linux does on the desktop: it's up against entrenched Unix systems, a network in which Linux could participate (and take share from smaller players) but Windows could not.

The Netcraft web server survey shows that the open source program Apache became the dominant web server in January 1996, replacing the abandoned NCSA httpd. Microsoft's IIS wasn't even introduced until Apache had almost a third of the market, and although Microsoft's billions of dollars and repeated attempts to leverage its monopoly (see Halloween I) have kept IIS in the game, they have not dragged Apache back below the critical 50% market share point. In a way, the web server market is replaying the stalemate between the Apple II and the Altair clones: Windows won't go away, with Microsoft's billions of dollars and desktop leverage behind it, but it makes only glacial progress against the market leader. Linux's server share keeps its development strong, but the desktop remains with Windows.

How Long If We Fail?

It took 50 years to exhaust the first 32 bits, from the Univac to 2005, which roughly matches Moore's Law's estimate of 48 years. It took 18 years (1987 to 2005) to go from 16 bits to 32 bits. Using up the next 32 bits, to exhaust the 64-bit address space, can thus be expected to take anywhere from 36 to 50 years.
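As a sanity check on that arithmetic, here is a small sketch using only the figures stated above (the 2008 anchor year comes from the earlier sections):

```python
# Projecting when the 64-bit address space runs out, from the pacing
# figures stated in the text.
years_per_bit = 1.5                    # classic 18-month Moore's Law pace
moores_estimate = 32 * years_per_bit   # 48 years for the next 32 bits
observed_fast = 32 * (18 / 16)         # 36 years at the 1987-2005 pace
observed_slow = 50                     # the first 32 bits took 50 years

print(f"Moore's Law estimate: {2008 + moores_estimate:.0f}")        # 2056
print(f"range: {2008 + observed_fast:.0f}-{2008 + observed_slow}")  # 2044-2058
```

The Moore's Law midpoint of 2056 is where the "next transition" date in the executive summary comes from.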

If Linux fails to take the desktop this time around, we'll probably have to wait at least three decades for another window to open.

The Deadline For Dominance

The three contenders to replace 32-bit Windows as the new desktop standard are Windows-64, 64-bit Linux, and MacOS X. Once the choice is made, the history we've just examined suggests it will persist for many years.

Linux on the desktop is on a deadline. If it doesn't get its act together soon, the opportunity will be missed. The remaining two papers in this series will examine what Linux must do over the next two years to take advantage of this window.



[1] This even applies to open source software, although in that case the costs and scale advantages are in terms of volunteer resources rather than the financial resources to buy those things. Open source projects may not need money, but they need development and testing man-hours, end-user feedback, word of mouth advertising, donated bandwidth, access to technical specifications for hardware and data file formats, users lobbying for a friendly legal environment without software patents/encryption export regulations/broadcast flags...

[2] Obviously, systems did sell outside this range for other purposes, from embedded systems to mainframes. But that had little or no effect on the desktop market.

[3] A perceptible market vacuum lingered until the release of Windows 95, but once the user base had migrated to a new standard with Windows 3.0, it waited for that platform to improve incrementally over the next five years. It really is a hard deadline, folks.

[4] The Altair was advertised at the start of 1975 with 256 bytes of memory but was widely considered underpowered, and by mid-year 1K was standard. The first version of Micro-soft BASIC took 7K, but was stripped down to 4K because 7K was considered an unreasonable amount of memory to expect people to buy at the time.

[5] Intel's lukewarm adoption of x86-64 may explain why net profits declined 38% over the past year, as it "continues to lose share to archrival Advanced Micro Devices", the inventor of x86-64.

[6] This doesn't mean a platform can wait until the last minute to get its act together, either. Network effects will already be strongly influencing adoption decisions by the time 64-bit "crosses the chasm" and acquires the high volume to be found at the low end; that crossing simply consolidates the market's decision. It's probably already too late to introduce a new contender into the 64-bit race: the players are Win64, MacOS X, and Linux for x86-64.

[7] The Apple II actually outlived CP/M. The Apple IIc Plus was released in September 1988, after Digital Research had lost a lawsuit with Apple over the GEM GUI and retooled CP/M into the DOS clone DR-DOS.

[8] PC hardware was unusually vulnerable to cloning because it was designed as a 16 bit successor to Apple's largest competitor, the commodity CP/M machines.

[9] Indeed, in some important ways Linux would have more difficulty displacing a dominant 64-bit OS than the historical record suggests. Laws like the DMCA, the prospect of hardware DRM, and closed-source drivers for critical hardware like graphics and wireless will give an incumbent more formidable defenses than CP/M faced and failed to overcome.