Copyright © 2006 Eric S. Raymond
Copyright © 2006 Rob Landley
Revision History

| Revision | Date | Author | Notes |
|---|---|---|---|
| 1.0 | 13 May 2006 | esr | Initial version. |
Those who cannot learn from history are doomed to repeat it.
In the 1990s Linus Torvalds used to give a talk called World Domination 101 on the early steps he believed Linux would need to take to achieve "world domination — fast" [1]. We've made a lot of progress since then — in fact, with the rise of KDE and GNOME and OpenOffice much of the program Linus originally laid out, focused on GUIs and conventional productivity tools, has been fulfilled. World Domination 201 is about how to go the rest of the way.
The transition to 64-bit hardware is opening a critical window of vulnerability during the next two years, during which the dominant operating system of the next three decades will be chosen. A historical analysis of Moore's Law indicates that the new 64-bit desktop operating system will become entrenched by the end of 2008, and that this is a hard deadline. The last such transition was in 1990; the next cannot be expected before 2050.
The three contenders for the new 64 bit standard are Windows-64, MacOS X, and Linux. The winner will be determined by desktop market share among non-technical end users. If Windows or MacOS X becomes the new standard, we can look forward to decades of steadily increasing DRM, locked-down video and wireless hardware, escalating patent claims, and other forms of attack as yet unimagined.
Part 1: Breaking Windows establishes why 2008 is a hard deadline. Part 2: Dispatches From The OS Wars examines the current state of the three major contenders vying to become the new 64 bit standard. Part 3: Facing the Music examines the major blocking issues to Linux's desktop acceptance.
Network effects in software, especially mass-market operating systems, make it highly advantageous to be using the same software as everyone else. Migrations en masse happen only when there is no alternative. Accordingly, the dominant operating system is normally only vulnerable to being knocked off its perch when it is obsolesced by a change away from the hardware platform it is tied to.
The pattern is consistent since the beginning of the personal-computer industry in the mid-1970s: software platform transitions follow hardware platform transitions. Long periods of monopolistic stability are interrupted by episodes of brief rapid change, in a pattern resembling the "punctuated equilibrium" of evolutionary biology.
8-bit computers were displaced by DOS during the 8-to-16-bit transition, DOS was displaced by Windows during the 16-to-32-bit transition, and Win-32 will be displaced during the 32-to-64-bit transition. But the window of opportunity won't last long; once the new dominant system is chosen for the 64-bit era, competition will effectively end for a long time — perhaps decades.
It is tempting for open-source boosters to assume that we have enough of a technical advantage to ignore the 32-to-64-bit transition and win during the 64-bit equilibrium by doing what we have been doing. But it isn't necessarily so; the historical example of CP/M vs the Apple II even shows that commodity platforms do not necessarily replace proprietary ones in the absence of a hardware platform transition. Even the incumbency advantage isn't enough to steer the market: Microsoft itself spent the 1980s trying and failing to replace DOS with Xenix, OS/2, and early versions of Windows before the 16-to-32-bit transition opened the way for Windows 3.0. Linux's glacial progress on the desktop over the last decade is not an aberration; between transitions the dominant platform is nearly unassailable.
Users buy the platform that has the most software, and developers write software for the platform that has the most seats. This positive feedback loop establishes a standard, and if that standard is owned it creates a natural monopoly.
Network effects cause the value of connected sets of nodes to grow faster than the linear increase in nodes (anywhere from n log(n) to n²). This allows even slightly larger networks to become "the obvious choice" for new installations and new product introductions, and eventually to take established market share from smaller networks. In a networked market, the largest player tends to get larger, and once past 50% market share usually continues up the S-curve to monopoly status.
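To make the superlinear-value claim concrete, here is a minimal illustrative sketch (not part of the original argument) comparing two hypothetical networks under linear, n log(n), and n² value models; the installed-base figures are invented for illustration only.

```python
# Illustrative sketch: how a modest size lead becomes a large value lead
# under superlinear network-value models. The numbers are hypothetical.
import math

def network_value(n, model):
    if model == "linear":
        return n
    if model == "nlogn":
        return n * math.log(n)
    if model == "quadratic":
        return n ** 2
    raise ValueError(model)

big, small = 600_000, 400_000   # hypothetical installed bases (a 60/40 split)
for model in ("linear", "nlogn", "quadratic"):
    ratio = network_value(big, model) / network_value(small, model)
    print(f"{model:>9}: larger network is {ratio:.2f}x more valuable")
# linear -> 1.50x, nlogn -> ~1.55x, quadratic -> 2.25x
```

Under any superlinear model the larger network's value advantage exceeds its size advantage, which is why new installations keep flowing to the leader.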
A related factor is amortization of fixed costs. In the computer industry, one-time start-up costs (such as authoring software or building a microchip fabrication facility) are dominant, and incremental per-unit costs are minor in comparison. Thus pricing is almost entirely a matter of unit volume, and whoever ships the most units will have a better price to performance ratio as well as more research and development capital for the next iteration.[2]
The desktop also matters because entrenched monopolies defend themselves. Minority players face closed hardware (3D graphics, 802.11g), patented data file formats (Quicktime, mp3), exclusive vendor relationships (Apple's iTunes), locked-down systems (Microsoft's Xbox), and an increasingly hostile legal environment (DMCA, the broadcast flag). These problems can only be addressed with lots of warm bodies, a critical mass of end-users who care enough about these issues to make attempts to corner the market financially and politically painful. Only the desktop market has this level of influence.
Trend curves we can extract from history will enable us to make a fairly exact prediction of the length of desktop Linux's market window during the 32-bit-to-64-bit transition, and the length of the next equilibrium period after that.
Hardware platform transitions are determined by changes in the size of addressable memory, demand for which doubles every 18 months on a curve closely predicted by Moore's Law. For the past 30 years, the low end of retail "desktop system" memory has been about 2^((year-1975)/1.5) kilobytes, and the high end about 4 times that.[3] Every three years, the high end became the new low end, and although systems did sell outside this range, this is where "the desktop" was.
| Year | Low End | High End | System |
|---|---|---|---|
| 1975 | 1K | 4K | Altair (Apr 1975: 1K) |
| 1978 | 4K | 16K | Apple II (Jun 1977: 4K) |
| 1981 | 16K | 64K | PC (Aug 1981: 16K), Commodore 64 (1982) |
| 1984 | 64K | 256K | Macintosh (Jan 1984: 128K), Amiga 1000 (1985: 256K) |
| 1987 | 256K | 1M | Amiga 500 (1987: 512K), Deskpro, PS/2 |
| 1990 | 1M | 4M | Windows 3.0 (1990), 3.1 (1992), Linux 0.01 (1991)[a] |
| 1993 | 4M | 16M | OS/2 3.0 (Nov 1994), Linux 1.0 (Mar 1994) |
| 1996 | 16M | 64M | Win 95 (Aug 1995), Linux 2.0 (Jun 1996) |
| 1999 | 64M | 256M | Win 2k (Feb 2000), XP (2001), Linux 2.2 (Jan 1999) |
| 2002 | 256M | 1G | Linux 2.4 (Jan 2001), MacOS X (Mar 2001) |
| 2005 | 1G | 4G | Linux 2.6 (Jan 2003), Win x86-64 (Apr 2005) |
| 2008 | 4G | 16G | The new 64-bit desktop. |
[a] Linus Torvalds had 4 megabytes in 1991, but implemented swapping within a few months to support people with 2 megabyte systems.
With an 18 month doubling time, a few months make a significant difference. The Altair started shipping in April and the PC in August, so these figures are roughly mid-year, but a fudge factor of plus or minus three months is reasonable for regional variations, distribution channel delays, ambient humidity, etc.
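As a sanity check on the trend curve, here is a minimal sketch (not from the original paper) that reproduces the low-end and high-end figures in the table above from the 2^((year-1975)/1.5)-kilobyte formula; the helper names are ours, and the output is the idealized curve rather than any particular shipping system.

```python
# Sketch of the trend curve described above: low end ~ 2**((year-1975)/1.5)
# kilobytes, high end ~ 4x that.  Values are idealized; real systems cluster
# around them with the +/- few months of slop noted above.
def desktop_memory_range(year):
    low_kb = 2 ** ((year - 1975) / 1.5)   # low end, in kilobytes
    return low_kb, low_kb * 4             # high end is roughly 4x the low end

def pretty(kb):
    for unit in ("K", "M", "G", "T"):
        if kb < 1024:
            return f"{kb:.0f}{unit}"
        kb /= 1024
    return f"{kb:.0f}P"

for year in range(1975, 2009, 3):
    low, high = desktop_memory_range(year)
    print(year, pretty(low), "-", pretty(high))
# 1975 1K-4K ... 1990 1M-4M ... 2005 1G-4G ... 2008 4G-16G
```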
The interesting points were where a CPU architecture had to be abandoned, and when the software platform for the new architecture became ubiquitous. When the old architecture's memory limit hit the high end of the range, early adopters had to start buying a whole new type of machine, and their choices determined which one would start a volume ramp-up. When that limit fell to the low end of the range, nobody was willing to buy the old system anymore because they needed more memory. New hardware required new software, and the new software standard generally emerged as the winner about the time the old platform stopped selling.
In 1975 the Altair came with 1K of memory but required 4K to run Micro-Soft BASIC.[4] In 1981, IBM's market research led it to introduce the PC with the ability to install anywhere from 16K to 64K of memory. In 1987, as 1 megabyte hit the high end and highlighted the limit of 16-bit systems, the Compaq Deskpro 386 beat IBM's PS/2 to market as the first 386-based PC. These were the decision points for the new hardware platforms.
Although the new 16 bit hardware platform (the IBM PC) was introduced in 1981, it wasn't until 1984 that 64k fell off the low end, at which point the market for 8-bit systems like the Apple II and Commodore 64 suddenly dried up and blew away. The installed base of 8-bit systems lasted for a few more years, but new sales and software development both tailed off severely enough to oust Apple's Steve Jobs and Commodore's Jack Tramiel from the companies they had created. By this time DOS 3.0 had become the PC standard, eclipsing early challengers like ROM-Basic, CP/M-86, MP/M, and Xenix. The emergence of DOS as the standard PC OS wasn't a sudden thing, just a gradual reduction of other competitors to irrelevance.
The wall for 16-bit systems was one megabyte, which hit the high end in 1987, but it wasn't until three years later (1990) that 1 megabyte fell off the low end and DOS could no longer take full advantage of even the cheapest machines. Into this market vacuum Microsoft shipped Windows 3.0, an all but forgotten version that was not just unbelievably bad by modern standards but universally criticized at the time. But it succeeded because users were ready to abandon DOS for anything that could use a larger address space.[5] Although IBM shipped OS/2 2.0 two years later (March 31 1992, a few days before Windows 3.1's April 6 1992 release), it was just too late. Windows had already picked up enough users from the pathetic but timely version to become the new standard.[6]
Even after 30 years, these long-term trend curves have continued to be predictive. The doubling time of Moore's Law varies by component — hard drives double slightly faster than average and video resolution doubles more slowly — but the oft-quoted 18 month figure was derived from Intel's original core business, DRAM. It remains surprisingly accurate in part because memory and motherboard vendors schedule new product releases around it to avoid either falling behind the competition or cannibalizing the existing market unnecessarily.
Thus in 2005 x86-64 systems showed up at retail outlets like Fry's, and started up the S-curve. Even Apple switched from PowerPC to x86-64, in part to take advantage of the economies of scale from the new standard PC platform.[7]
This implies that 32-bit x86 systems will become too obsolete to sell into the desktop market three years later, in 2008. The obsolescence of the hardware platform will render Win-32 obsolete as well, forcing the selection of its successor.
The next hardware platform transition is occurring now. When 4 gigabytes hit the high end in 2005, market pressure forced even Intel to ship AMD's x86-64 instruction set rather than be left behind,[8] and 64-bit laptops showed up in stock at Fry's. The volume ramp is underway, and since Apple abandoned the PowerPC for x86-64 there's no longer any question: we know what the new hardware platform will be. The historical record indicates that the PC distribution channel will finish the transition to 64-bit hardware by the end of 2008.
In theory, 4 gigabytes will fall off the low end three years after it hit the high end, in 2008, and 32-bit x86 systems will no longer be viable outside the embedded market. Eventually, new systems will no longer regularly be sold with less than 4 gigabytes preinstalled. At that point, the Win-32 API becomes a legacy system with little or no new software developed for it.[9]
As the new 64-bit systems hit the low end, the S-curve of 64-bit adoption will go close to vertical. This is the last chance to capture 50% market share, when the largest surge of new users decides upon a software platform to go with their new hardware. This doesn't mean a platform can wait until the last minute to get its act together; this is when the decision becomes irreversible. Network effects will already be strongly influencing adoption decisions by the time 64-bit "crosses the chasm" and acquires the high volume to be found at the low end. This simply consolidates the market's decision. It's already too late to introduce a new contender into the 64-bit race.
After the S-curve flattens out again, gaining market share from undecided new users will no longer be a significant influence, and the largest platform will leverage its network-effect advantage to consolidate the market. After 32-bit systems fall off the low end, the new 64-bit software platform will be entrenched.
The obsolescence of 32-bit hardware won't determine the new software standard; it will merely lock it in place. The decision is happening right now. And whatever the new platform is, history indicates we'll be stuck with it for a long time.
It took us roughly 50 years to exhaust the first 32 bits, from the Univac to 2005, which roughly matches Moore's Law's estimate of 48 years. It took 18 years (1987 to 2005) to go from the 16-bit wall to the 32-bit wall. Using up the next 32 bits of address space, to exhaust 64 bits, can thus be expected to take anywhere from 36 to 50 years.
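For readers who want the arithmetic spelled out, here is a quick sketch of those figures, assuming the 18-month doubling time used throughout (each doubling of memory demand consumes one more address bit).

```python
# Back-of-the-envelope check of the figures above, assuming one address bit
# is consumed every 18 months.
years_per_bit = 1.5
print(32 * years_per_bit)         # 48 years to consume 32 address bits (roughly Univac-era to 2005)
print((32 - 20) * years_per_bit)  # 18 years from the 1 MB (2^20) wall in 1987 to the 4 GB (2^32) wall in 2005
print((64 - 32) * years_per_bit)  # ~48 years to consume the next 32 bits, within the 36-50 year range above
```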
If Linux fails to take the desktop this time around, we'll have to wait at least three decades for another such window to open.
The three contenders to replace 32 bit Windows as the new desktop standard are Windows 64, Linux 64, and Mac OS X. Once the choice is made, the history we've just examined suggests it will persist for many years.
Linux on the desktop is on a deadline. If it doesn't get its act together soon, the opportunity will be missed. The remaining two papers in this series will examine the possible outcomes, and what Linux must do over the next two years to take advantage of this window.
The point of this paper was that 2008 is a hard deadline. If you doubt this, consider that our trend curve of address-space growth nailed 2005 as the introduction date for 64-bit systems, working from a thirty-year successful retrodiction that includes the entire history of the microcomputer back to the Altair.[10] Consider also that the obsolescence of 32-bit hardware won't determine the new software standard; it will merely lock it in place. The decision is happening right now, as each new x86-64 system gets installed. And whatever the new software platform is, history indicates we'll be stuck with it for a long time. The forces at work are deep, inexorable, and predictable.
The first mover in the 8-bit microcomputer world, Micro Instrumentation and Telemetry Systems (which manufactured the Altair), entered the computer industry in 1975 in a near-bankrupt state and had trouble scaling to meet demand. The MITS Altair was eventually cloned to produce the first commodity hardware platform (CP/M machines based on the S-100 bus), but the delay allowed another company (Apple) to enter the market in 1977 with venture capital funding, scale production rapidly, and leapfrog into first place with 50% market share by the time of its one billion dollar IPO in 1980.
Many smaller players emerged (Commodore, TI, Tandy), but the number two position was collectively held by the commodity S-100 machines running CP/M, and Apple stayed focused on its largest rival rather than the smaller targets. The proprietary Apple II hardware couldn't take market share away from the commodity S-100 hardware, preventing Apple's climb up the S-curve to monopoly. But the S-100 machines made only glacial headway commoditizing Apple's dominant market share.[11]
Years later Apple found its position reversed against IBM's PC. Apple's Macintosh was a clearly superior product (a 32-bit GUI in a 16-bit command-line world), but the PC's 3-year head start in the 16-bit space put it beyond 50% market share, and once PC hardware was cloned it gained commodity status as well.[12] By uniting commodity hardware with superior market share, the PC swept the field against the Macintosh, Amiga, Atari ST, and others. After 1990, later battles (Windows vs OS/2, Windows vs Java, Windows vs Linux) would primarily be fought within the context of PC hardware.
The current Linux vs Windows battle mirrors the earlier CP/M vs Apple II standoff. Linux is now 15 years old, but it still isn't a preinstall option on most laptops sold today. Commodity status can be an effective defense against superior market share, but by itself it is not an effective offense against it.[13]
Lesson 2: Adjacent markets can be distinct.
The PC server market is adjacent to the PC desktop market, but also distinct from it. Success in one does not automatically translate to success in the other.
Linux rose to prominence in the server market with the rise of the web; Linux and the web grew up together. Linux leveraged PC hardware to displace the initially dominant Sun servers while Sun was distracted by the loss of its own workstation market, both of which were tied to expensive proprietary hardware. But an important point is that web servers created a huge number of new server installations, which Linux captured at their creation to gain dominant market share. Linux didn't have to displace existing systems; it got in on the server market when the web created its own S-curve of new deployments.
Microsoft was late to this party: they didn't notice the internet until 1995. Microsoft has tried very hard to leverage its desktop monopoly to take over the internet server market, but in this space it faces the same challenges Linux does on the desktop: it's up against entrenched Unix systems, a network in which Linux could participate (and take share from smaller players) but Windows could not.
The Netcraft web server survey shows that the open source program Apache became the dominant web server in January 1996, replacing the abandoned NCSA httpd. Microsoft's IIS wasn't even introduced until Apache had almost 1/3 of the market, and although Microsoft's billions of dollars and repeated attempts to leverage its monopoly (see Halloween I) have kept it in the game, they have not dragged Apache back below the critical 50% market share point. In a way, the web server market is replaying the stalemate between the Apple II and the Altair clones. Windows won't go away, with Microsoft's billions of dollars and desktop leverage behind it, but it makes only glacial progress against the market leader, Linux. Linux's server share keeps its development strong, but the desktop remains with Windows.
Success in an adjacent market provides at least a linear advantage; the linear number of deployments provides resources (monetary or otherwise) that can be poured into the new area. But network effects don't seem to cross into adjacent markets.
The 16-bit and 32-bit markets were adjacent in this way when Microsoft captured the transition from DOS to Windows. Microsoft managed this because its offering was there on time, and caught the volume ramp in the new space. Microsoft's dominance on the desktop has left its server offering in CP/M's position: a strong number two with the power to hold on to its established market share, but not enough to make significant gains.
[1] For a more detailed history of this meme, see Joe Barr's Perceptions of world domination
[2] This even applies to open source software, although in that case the costs and scale advantages are in terms of volunteer resources rather than the financial resources to buy those things. Open source projects may not need money, but they need development and testing man-hours, end-user feedback, word of mouth advertising, donated bandwidth, access to technical specifications for hardware and data file formats, users lobbying for a friendly legal environment without software patents/encryption export regulations/broadcast flags...
[3] Obviously, systems did sell outside this range for other purposes, from embedded systems to mainframes. But that had little or no effect on the desktop market.
[4] The Altair was advertised at the start of 1975 with 256 bytes of memory but was widely considered underpowered, and by mid-year 1K was the standard entry-level configuration. The first version of Micro-Soft BASIC needed 7K, but was stripped down to run in 4K because 7K was considered an unreasonable amount of memory to expect people to buy in 1975. MITS and Micro-Soft didn't dictate market demand, they adjusted to it.
[5] The sudden success of Windows 3.0 came as a surprise to Microsoft. The 1 megabyte memory barrier was broken by a programmer named David Weise, who got Windows running in protected mode and gave existing 16-bit Windows programs the ability to use much more memory without even needing to be recompiled. This wasn't a Microsoft corporate initiative; it was one guy who broke the 640k barrier and changed history. The only interesting thing Windows 3.0 could do was allow its programs to use more than 1 megabyte of memory, but at the time that was the most important thing it could possibly do.
[6] Since Windows is the current 32-bit standard, its introduction deserves a closer look. Windows 3.0 wasn't Microsoft's first attempt to replace DOS. Microsoft pushed OS/2 1.0 pretty hard circa 1988, and before that they'd pushed Xenix hard circa 1983. In between they beat people over the head with Windows for years before anyone showed any interest, and Windows 3.0 was unbelievably bad not just by modern standards but by the standards of the time. Once DOS was established as the standard, Microsoft itself couldn't budge it until the hardware platform it ran on became obsolete.
Despite perceptible discontent prior to the release of Windows 95, once the user base had migrated to the new standard it simply waited for that platform to improve incrementally over the next five years. In fact, once users were on the new standard, Microsoft again had trouble convincing them to upgrade from 3.1 to NT, or from Windows 98 to XP. For over a decade now, the biggest competitor to each new release of Windows has been the installed base of previous versions of Windows.
[7] The Cell processor is interesting, and its use in videogame consoles gives it the potential production volume and economies of scale to have been a viable desktop platform, but as with OS/2 IBM was simply too late to the party. The hardware decision has already been made. 64-bit laptops have been available for a year now from the largest vendors to the smallest, and AMD already has a new generation of the technology (Turion) focused on low power consumption. Ever since notebook sales passed desktop volume in May 2005, the power-to-performance ratio has been almost as important as price-to-performance. (Intel continues to try to commit suicide by making the first Pentium M to support EM64T a dual-core space heater, but then they're still beating their heads against Itanic, too.) The Cell's 8 built-in DSP slaves may be great, but if nobody can get a Cell laptop then who is going to write code for it? The 64-bit hardware race is over: x86-64 won.
[8] Intel's lukewarm adoption of x86-64 may explain why net profits declined 38% over the past year, as it "continues to lose share to archrival Advanced Micro Devices", the inventor of x86-64.
[9] Every platform grows memory extensions as it nears its end of life. The 8-bit systems had a technique called "bank switching", the DOS machines limped along with extended/expanded/extruded memory, and these days we have Intel's Physical Address Extension (PAE). It doesn't make much difference; rewriting software to jump through these hoops is as much work as porting the software to a new platform where accessing more memory is natural and easy.
[10] In retrospect, one of the many reasons for the abject failure of Intel's 64-bit Itanium processor was that 1998 was just too early.
[11] The Apple II actually outlived CP/M. The Apple IIc Plus was released in September 1988, after Digital Research had lost a lawsuit with Apple over the GEM GUI and retooled CP/M into the DOS clone DR-DOS.
[12] PC hardware was unusually vulnerable to cloning because it was designed as a 16 bit successor to Apple's largest competitor, the commodity CP/M machines.
[13] Indeed, in some important ways Linux would have more difficulty displacing a dominant 64-bit OS than the historical record suggests. Laws like the DMCA, the prospect of hardware DRM, and closed-source drivers for critical hardware like graphics and wireless will give an incumbent more formidable defenses than CP/M faced and failed to overcome. The idea of a commodity market for computers is no longer new, surprising, or easy to dismiss as radical.