World Domination 201: Breaking Windows

Eric Steven Raymond

Rob Landley

Revision History
Revision 1.0    4 Jan 2006    esr
Initial version.

Table of Contents

Executive Summary
Punctuated Equilibrium: How the Dominant OS Changes
Network Effects and Natural Monopolies
Why the Desktop Matters
Learning From History
Timing the Transitions
The Coming Equilibrium
How Long If We Fail?
The Deadline For Dominance
Title to be supplied
Fire up the crystal ball
The contenders
Windows x86-64
Mac OS X
Conclusion

Executive Summary

Those who cannot learn from history are doomed to repeat it.

--George Santayana

The transition to 64-bit hardware is opening a critical window of vulnerability over the next two years, during which the dominant operating system of the next three decades will be chosen. A historical analysis based on Moore's Law indicates that the new 64-bit desktop operating system will be entrenched by the end of 2008, and that this is a hard deadline. The last such transition was in 1990; the next is not due until roughly 2056.

The three contenders for the new 64 bit standard are Windows-64, MacOS X, and Linux. The winner will be determined by desktop market share among non-technical end users. If Windows or MacOS X becomes the new standard, we can look forward to decades of steadily increasing DRM, locked-down video and wireless hardware, escalating patent claims, and other forms of attack as yet unimagined.

Part 1 of this paper establishes why 2008 is a hard deadline. Part 2 examines the current state of the three major contenders vying to become the new 64 bit standard. Part 3 examines the major blocking issues to Linux's desktop acceptance.

Punctuated Equilibrium: How the Dominant OS Changes

Network effects in software, especially mass-market operating systems, make it highly advantageous to be using the same software as everyone else. Migrations en masse happen only when there is no alternative. Accordingly, the dominant operating system is normally only vulnerable to being knocked off its perch when it is obsolesced by a change away from the hardware platform it is tied to.

The pattern has been consistent since the beginning of the personal-computer industry in the mid-1970s: software platform transitions follow hardware platform transitions. Long periods of monopolistic stability are interrupted by brief episodes of rapid change, in a pattern resembling the "punctuated equilibrium" of evolutionary biology.

8-bit computers were displaced by DOS during the 8-to-16-bit transition, DOS was displaced by Windows during the 16-to-32-bit transition, and Win-32 will be displaced during the 32-to-64-bit transition. But the window of opportunity won't last long; once the new dominant system is chosen for the 64-bit era, competition will effectively end for a long time — perhaps decades.

It is tempting for open-source boosters to assume that we have enough of a technical advantage to ignore the 32-to-64-bit transition and win during the 64-bit equilibrium by doing what we have been doing. But it isn't necessarily so; the historical example of CP/M vs. the Apple II shows that commodity platforms do not necessarily replace proprietary ones in the absence of a hardware platform transition. Even the incumbency advantage isn't enough to steer the market: Microsoft itself spent the 1980s trying and failing to replace DOS with Xenix, OS/2, and early versions of Windows, before the 16-to-32-bit transition opened the way for Windows 3.0. Linux's glacial progress on the desktop over the last decade is not an aberration; between transitions the dominant platform is nearly unassailable.

Network Effects and Natural Monopolies

Users buy the platform that has the most software, and developers write software for the platform that has the most seats. This positive feedback loop establishes a standard, and if that standard is owned it creates a natural monopoly.

Network effects cause the value of a connected set of nodes to grow faster than the linear increase in nodes (estimates range from n log(n) to n^2). This allows even slightly larger networks to become "the obvious choice" for new installations and new product introductions, and eventually to take established market share from smaller networks. In a networked market, the largest player tends to get larger, and once past 50% market share usually continues up the S-curve to monopoly status.
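The arithmetic behind this is worth making concrete. A back-of-the-envelope sketch (in Python; the node counts and the 20% size gap are made-up illustration values, not data from this paper) shows how a modest size lead becomes a larger value lead under the superlinear models:

```python
import math

def network_value(n, model):
    # Three common models for the value of an n-node network:
    #   "sarnoff"  ~ n          (broadcast audience)
    #   "tipping"  ~ n*log(n)   (Odlyzko/Tilly estimate)
    #   "metcalfe" ~ n**2       (all pairwise connections)
    return {"sarnoff": n,
            "tipping": n * math.log(n),
            "metcalfe": n ** 2}[model]

# Platform B has only a 20% size lead over platform A...
a, b = 1_000_000, 1_200_000
for model in ("sarnoff", "tipping", "metcalfe"):
    ratio = network_value(b, model) / network_value(a, model)
    print(f"{model:8s} value ratio: {ratio:.2f}")
# ...yet its value lead is ~1.22x under n*log(n) and 1.44x under
# n**2: superlinear value is what pulls new buyers to the leader.
```

The stronger the superlinearity, the harder the positive feedback loop bites, which is why the paper treats crossing 50% share as effectively irreversible.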

A related factor is amortization of fixed costs. In the computer industry, one-time start-up costs (such as authoring software or building a microchip fabrication facility) are dominant, and incremental per-unit costs are minor in comparison. Thus pricing is almost entirely a matter of unit volume, and whoever ships the most units will have a better price to performance ratio as well as more research and development capital for the next iteration.[1]

This is why replacing the market leader, even with a clearly superior product, is extremely difficult.

Why the Desktop Matters

Another temptation for open-source advocates is to believe that winning the desktop is not really important — that we can fort up in the back room, hunker down around our servers and our open-source idealism, and survive a hostile desktop monopoly, eventually even triumphing over it by sheer technical merit.

Under today's conditions this is a dangerous illusion. A symptom of the problem is that in 2006 we have exactly zero 3D-accelerated graphics cards with open-source drivers. The state of open-source drivers for wireless hardware is nearly as dire. But much worse could be in the offing — hardware DRM built into motherboards (as in Microsoft's Xbox) remains a threat that could lock open-source software out of new hardware entirely. Similar locks on content, as exemplified by Apple's iTunes and the various locking schemes being mooted for e-books, threaten to exclude Linux systems from legal use in critically important markets.

More generally, the techno-optimist position ignores a large pattern in the evolution of technology. New technologies reshape law and society in their early, emergent phases — but as technologies mature, the constraints on them tend to be less and less about the limits of technology itself, more and more about the regulations and laws that surround them. Open source is no longer a novel, emergent technology, and the trend in the laws surrounding it, exemplified by the Digital Millennium Copyright Act and recent developments in patent law, is not a positive one.

To prevent hardware lockdowns, the open-source community needs market power. To prevent the legal environment from becoming even more hostile, open-source users need to be enough of an identifiable constituency that attempts to legislate open source out of existence will cause pain for politicians. Both of these requirements mean we need more user/stakeholders, and lots of them.

We're not going to find those users in server rooms; that's a high-margin but low-unit-volume business. A quarter-century of experience tells us that commodity-hardware manufacturers and ISVs treat anything less than 10% of the consumer-desktop market as statistical noise. (It is instructive in this respect to consider the history of what did, and didn't, get supported on Apple computers with about 5% market share.)

It also behooves us to remember that open source has active and malevolent opponents. Chief of these is Microsoft, but the music and movie industries also see us as a threat to their business models, and various opportunists (from patent trolls to spammers) are happy to exploit any opening for profit. Against such pressure there is safety only in numbers — a user base large enough to generate a shift in aggregate market demand.

We need to be big in the consumer market — and given the tendency of network effects to squeeze out minority players, this means we need to dominate it. Otherwise legal and regulatory restraints may combine with the expected timing of the next platform transition to lock in a hostile monopoly for a very long time.

Learning From History

Now, from the general to the specific. In the next two subsections, we'll look at the actual history of punctuated equilibrium in the personal computer industry. The trend curves we extract will enable us to make a fairly exact prediction of the length of desktop Linux's market window during the 32-bit-to-64-bit transition, and the length of the next equilibrium period after that.

Timing the Transitions

Hardware platform transitions are determined by changes in the size of addressable memory, demand for which doubles every 18 months on a curve roughly predicted by Moore's Law. For the past 25 years, the low end of retail "desktop system" memory has been about 2^((year-1975)/1.5) kilobytes, and the high end about 4 times that.[2]

Year   Low End   High End   System
1975   1K        4K         Altair (Apr 1975: 1K)
1978   4K        16K        Apple II (Jun 1977: 4K)
1981   16K       64K        PC (Aug 1981: 16K), Commodore 64 (1982)
1984   64K       256K       Macintosh (Jan 1984: 128K), Amiga 1000 (1985: 256K)
1987   256K      1M         Amiga 500 (1987: 512K), Deskpro, PS/2
1990   1M        4M         Windows 3.0 (1990), 3.1 (1992), Linux 0.01 (1991)[a]
1993   4M        16M        OS/2 3.0 (Nov 1994), Linux 1.0 (Mar 1994)
1996   16M       64M        Win 95 (Aug 1995), Linux 2.0 (Jun 1996)
1999   64M       256M       Win 2k (Feb 2000), XP (2001), Linux 2.2 (Jan 1999)
2002   256M      1G         Linux 2.4 (Jan 2001), Mac OS X (Mar 2001)
2005   1G        4G         Linux 2.6 (Jan 2003), Win x86-64 (Apr 2005)
2008   4G        16G        The new 64-bit desktop.

[a] Linus Torvalds had 4 megabytes in 1991, but implemented swapping within a few months to support people with 2-megabyte systems.

Every three years, the high end became the new low end. Systems did sell outside this range, but this band is where "the desktop" lived.
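The rule of thumb is easy to sanity-check. A minimal sketch (Python; the function name is ours, the formula is the one given above: low end = 2^((year-1975)/1.5) kilobytes, high end 4x that) reproduces the table's memory columns:

```python
def desktop_memory_kb(year):
    # Low end of retail desktop memory, per the rule of thumb:
    # 2^((year - 1975) / 1.5) kilobytes; the high end is 4x the low end.
    low = 2 ** ((year - 1975) / 1.5)
    return low, 4 * low

for year in (1975, 1981, 1990, 2005, 2008):
    low, high = desktop_memory_kb(year)
    print(year, f"{low:,.0f}K - {high:,.0f}K")
# Matches the table: 1981 gives 16K-64K (the IBM PC's launch range),
# 1990 gives 1M-4M (Windows 3.0), 2005 gives 1G-4G, 2008 gives 4G-16G.
```

Note how tight the fit is: one doubling every 18 months lands each platform transition within a year of the historical record.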

New platforms are introduced when the memory limit of the old platform hits the high end, and new sales of the old platform tail off when its memory limit falls off the low end. 64K was high end in 1981, at the introduction of the IBM PC. 64K fell off the low end after 1984, so Apple II and Commodore 64 sales dried up, prompting the ousters of Steve Jobs and Jack Tramiel. 1 megabyte hit the high end in 1987, and 386 chips showed up in the Compaq Deskpro 386 and IBM's PS/2. 1 megabyte fell off the low end after 1990, and even though DOS limped on with expanded/extended/extruded memory and DPMI, Windows 3.0 (desperately poor though it was) became the new standard.[3]

The doubling time of Moore's Law varies by component — hard drives double slightly faster than average and video resolution doubles more slowly — but the oft-quoted 18 month figure was derived from Intel's original core business, DRAM. It remains surprisingly accurate in part because memory and motherboard vendors schedule new product releases around it to avoid either falling behind the competition or cannibalizing the existing market unnecessarily.

With an 18 month doubling time, a few months make a significant difference. The Altair started shipping in April, and the PC in August. These figures are thus roughly mid-year, but a fudge factor of plus or minus six months is reasonable for regional variations, distribution channel delays, ambient humidity, etc.

The original mid-1981 launch of the IBM PC offered 16-64K of RAM, and that wasn't just IBM's guess but the result of market research into what people would buy at the time. In mid-1975, "mainstream" demand for the MITS Altair was approximately 1-4K: the system shipped with 1K, but running Micro-Soft BASIC took 4K.[4]

The Coming Equilibrium

The next hardware platform transition is occurring now. 4 gigabytes hit the high end in 2005, when market forces forced even Intel to ship AMD's x86-64 instruction set rather than be left behind[5] and 64-bit laptops showed up in stock at Fry's. The volume ramp is underway, and since Apple abandoned the PowerPC for x86-64 there's no longer any question: we know what the new hardware platform will be. The historical record indicates that the PC distribution channel will finish the transition to 64 bit hardware by the end of 2008.

As the new 64-bit systems hit the low end, the S-curve of 64-bit adoption will go close to vertical. This is the last chance to capture 50% market share, as the flood of new adopters decides on a software platform to go with their new hardware.[6] After the S-curve flattens out again, gaining market share from undecided new users will no longer be a significant influence, and the largest platform will leverage its network-effect advantage to consolidate the market. After 32-bit systems fall off the low end, the new 64-bit software platform will be entrenched.

If the new 64-bit software platform established in 2008 is not Linux, displacing it will be extremely difficult, perhaps impossible, for decades. To understand why, it's helpful to look back at the history of the first two equilibrium periods after 1975, the ages of the 8- and 16-bit machines.

The first mover in the 8-bit microcomputer world, Micro Instrumentation and Telemetry Systems (which manufactured the Altair), entered the computer industry in a near-bankrupt state and had trouble scaling to meet demand. The MITS Altair was eventually cloned to produce the first commodity hardware platform (CP/M machines based on the S-100 bus), but the delay allowed another company (Apple) to enter the market with venture-capital funding, scale production rapidly, and leapfrog into first place with more than 50% market share, leading to a billion-dollar IPO in 1980.

Many smaller players emerged (Commodore, TI, Tandy), but the number-two position was collectively held by the commodity S-100 machines running CP/M, and Apple stayed focused on its largest rival rather than the smaller targets. The proprietary Apple II hardware couldn't take market share away from the commodity S-100 hardware, preventing Apple's climb up the S-curve to monopoly. But the S-100 machines made only glacial headway commoditizing Apple's dominant market share.[7]

Years later Apple found its position reversed against IBM's PC. Apple's Macintosh was a clearly superior product (a 32-bit GUI in a 16-bit command-line world), but the PC's three-year head start in the 16-bit space put it beyond 50% market share, and once PC hardware was cloned it gained commodity status as well.[8] By uniting commodity hardware with superior market share, the PC swept the field against the Macintosh, Amiga, Atari ST, and others. After 1990, later battles (Windows vs OS/2, Windows vs Java, Windows vs Linux) would primarily be fought within the context of PC hardware.

The current Linux vs Windows battle mirrors the earlier CP/M vs Apple II. Linux is now 15 years old, but preinstalling it is not an option on most laptops purchased today. Commodity status can be an effective defense against superior market share, but by itself is not an effective offense against it.[9]

Another data point is that Linux rose to prominence in the server market with the rise of the web. Linux and the web grew up together. Linux leveraged PC hardware to displace the initially dominant Sun servers while Sun was distracted by the loss of its own workstation market, both of which were tied to expensive proprietary hardware. But the important point is that web servers created a huge number of new server installations, which Linux captured at their creation to gain dominant market share.

Microsoft was late to this party: they didn't notice the internet until 1995. Microsoft has tried very hard to leverage its desktop monopoly to take over the internet server market, but in this space it faces the same challenges Linux does on the desktop: it's up against entrenched Unix systems, a network in which Linux could participate (and take share from smaller players), but Windows could not.

The Netcraft web server survey shows that the open-source program Apache became the dominant web server in January 1996, replacing the abandoned NCSA httpd. Microsoft's IIS wasn't even introduced until Apache had almost 1/3 of the market, and although Microsoft's billions of dollars and repeated attempts to leverage its monopoly (see Halloween I) have kept it in the game, they have not dragged Apache back below the critical 50% market share point. In a way, the web server market is replaying the stalemate between the Apple II and the Altair clones. Windows won't go away, with Microsoft's billions of dollars and desktop leverage behind it, but it makes only glacial progress against the market leader, Apache. Linux's server share keeps its development strong, but the desktop remains with Windows.

How Long If We Fail?

It took us 50 years to exhaust the first 32 bits, from the Univac to 2005, which roughly matches the 48 years Moore's Law predicts (32 doublings at 18 months each). It took 18 years (1987 to 2005) to go from 16 bits to 32 bits. Using up the next 32 bits, to exhaust a 64-bit address space, can thus be expected to take anywhere from 36 to 50 years.
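The arithmetic is simple enough to write down (a sketch; the only assumption is the one the paper already makes, that each Moore's Law doubling consumes one additional address bit):

```python
def years_to_exhaust(bits, doubling_time=1.5):
    # Each doubling of memory demand consumes one more address bit,
    # so n extra bits last roughly n * doubling_time years.
    return bits * doubling_time

print(years_to_exhaust(32))         # 48.0: close to the ~50 observed
print(2008 + years_to_exhaust(32))  # 2056.0: next expected transition
```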

If Linux fails to take the desktop this time around, we'll probably have to wait at least three decades for another window to open.

The Deadline For Dominance

The three contenders to replace 32 bit Windows as the new desktop standard are Windows 64, Linux 64, and Mac OS X. Once the choice is made, the history we've just examined suggests it will persist for many years.

Linux on the desktop is on a deadline. If it doesn't get its act together soon, the opportunity will be missed. Our next paper will examine what Linux must do over the next two years to take advantage of this window.

Title to be supplied

The interesting points are where a CPU architecture has to be abandoned. When the old system's memory limit hits the high end, early adopters have to start buying a whole new type of machine, and their purchases determine which replacement starts a volume ramp-up. When the old system's memory limit falls to the low end, nobody is willing to buy the old system anymore, because they need more memory.

So in 1981, IBM introduced a new 16-bit machine as 64K hit the high end and demand for something that could use more memory started to materialize. But it wasn't until mid-1984 that 64K fell off the low end and suddenly the market for the Apple II and Commodore 64 and CP/M machines dried up and blew away. The installed base lasted for a few more years, but new sales tailed off pretty quickly.

The next wall was 1987, when 1 megabyte hit the high end. Compaq stopped waiting for IBM and introduced the Deskpro 386 because early adopters were demanding it, but it wasn't until three years later (1990) that 1 megabyte hit the low end and a replacement for DOS suddenly became an immediate problem. (Microsoft shipped a crappy Windows 3.0, but shipped it into a market vacuum that was ready to abandon DOS for anything that could use a larger address space.) By the time IBM shipped OS/2 2.0 (in 1992) it was just too late; Windows had already picked up enough users to become the new standard.

Windows 3.0 wasn't Microsoft's first attempt to replace DOS. Microsoft pushed OS/2 1.0 pretty hard circa 1988, and before that they'd pushed Xenix hard circa 1983. They beat people over the head with Windows for five years before anyone showed any interest, and the 3.0 that people finally picked up on was a darn crappy version of it, so it wasn't that Microsoft suddenly fixed the thing's problems.[10] Once DOS was established as the standard, Microsoft _itself_ couldn't budge it until the hardware platform it ran on became obsolete. OS transitions happen when hardware platforms transition.

Six months into 2005, x86-64 systems showed up at Fry's and started up the S-curve. Even Apple switched from PowerPC to x86-64, in part to take advantage of the economies of scale of the new standard PC platform.[11]

A few years later, 32 bit systems will drop off the low end and stop selling at all (outside of the embedded market). And that is when the new desktop software standard gets locked in.

Fire up the crystal ball

In theory, three years later 4-gigabyte systems will fall off the low end and 32-bit x86 systems will no longer be viable outside of the embedded market. Eventually, new systems will no longer regularly be sold with less than 4 gigabytes preinstalled. At that point, the Win32 API becomes a legacy system with little or no new software developed for it.

In practice, predictions aren't so reliable without the benefit of hindsight. Right now the computer industry's memory range is running a bit behind the historical curve.[12] This seems to be due to a widespread shift to notebooks, which have historically been a generation behind desktop systems. In May 2005 notebook sales passed desktops,[13] and the resulting intense price pressure on workstations has actually pushed the desktop memory range below that of new notebooks. Notebook volumes have overtaken desktop volumes to the point that the economies of scale are reversing, and traditional systems constructed out of notebook components ("blade servers" and "blade workstations") grow increasingly common.

So when do 32-bit notebooks and desktops go away? Major hardware manufacturers, following Moore's Law, tentatively expect this to happen around mid-2008, and they're the ones who will make it happen. It pays to be ready on their schedule.

But the important point is that the obsolescence of 32-bit hardware doesn't determine the new software standard; it merely locks it in place. The decision is happening right now. And whatever the new platform is, history indicates we'll be stuck with it for a long time.

The contenders

Win64, Linux-64, and MacOS X could all become the new standard for 64 bit desktop software. All three have problems.

Windows x86-64

Microsoft shipped x86-64 versions of Windows XP Professional and Server 2003 on April 25, 2005. They didn't work.

If Microsoft can't even ship a 32-bit Vista, they're not likely to capture the 64-bit transition. Microsoft has undergone a severe brain drain in the past decade. First the antitrust trial hit morale hard circa 1998, then the permatemps decision (Vizcaino v. Microsoft) cost them lots of experienced people. The stock plateaued in 2000, which took away the incentive for anyone who was there for the money. Now Google is hiring away anyone with a brain who's left.[14] There are strong organizational reasons they haven't done anything new in six years; it's not just bad luck.

On the other hand, they still made $3.6 billion (net) last quarter alone, and have enough cash on hand to buy Home Depot outright. Their exclusive distribution contracts with computer resellers still make it difficult to buy a laptop without Windows preinstalled. Don't count them out yet.

Linux-64

I've been using Linux exclusively as my desktop and laptop operating system since 1998, so when I say we suck I say it with conviction.

When somebody with a degree in finance or architecture can grab a Linux laptop and watch episodes of The Daily Show off of Comedy Central's website without an established Linux geek walking them through it, maybe we'll be relevant again.

But it's hard to forget that when the call went out for a big company to stand up and fight for DeCSS or open source 3D drivers, Red Hat yanked MP3 playback support and slunk off into the server market. The chicken and egg problem with hardware support is why Linux needs to win the desktop in the first place, but you can't win if you don't even try.

Novell was trying for a while there. Gnome developers drank the .NET kool-aid that even Microsoft has since set aside, and KDE developers don't listen to anybody, ever, about anything, but Novell was getting Gnome and KDE to talk to each other... before Chris Stone and Hubert Mantel were sacrificed on the altar of Jack Messman's ego.

Between the two of them, that pretty much knocks us out of contention for now. Even our friend IBM is unlikely to preinstall any of the smaller players on its laptops and workstations before it's too late to do any good.

As for the smaller players, Knoppix would be promising if installing the contents of the CD onto the hard drive didn't confuse it so badly. Ubuntu may get over its teething troubles, but time is running out. Mandriva: it's hard to see an alliance between France and Brazil coming to the rescue, but stranger things have happened. Gentoo could still be repackaged into something usable by mere mortals the way Debian was by Knoppix and Ubuntu. People are trying, but time is running out.*

* Footnote: There's a reason I didn't mention Lindows or Xandros.

But I'm not bitter, really.

Mac OS X

Steve Jobs brought Apple back from the dead. Mac OS X is Unix under the covers, has third party software support (from Quicken to World of Warcraft), and handles video and audio just fine. It has its act together, is nice and shiny, and has traction now. Their unit volumes are surprising: Apple literally could not manufacture the Mac Mini fast enough. Aunt Tillie already has a Mac.

Apple is moving closer to PC hardware, switching from PowerPC to x86-64. The question is, will Apple agree to license Mac OS X for Dells, HPaqs, and ThinkPads? Dell isn't going away. If Apple doesn't let Dell ship systems that run Mac OS X, then Dell has no choice but to ship a competitor to Mac OS X. No matter how good Apple is, they aren't going to be bigger than the entire commodity PC market, which includes white boxes and everything Taiwan can churn out. If Steve dreams of decommoditizing PC hardware, then there's still a chance for another OS.

The other question is whether Steve Jobs will be distracted by his new entanglement with Disney. (Now that Disney has bought Pixar, Steve is Disney's largest individual shareholder and sits on its board.) And is the iPod tail wagging the Macintosh dog?

Steve Jobs has to know that 2008 is the time to deliver the coup de grace and make Mac OS X the new standard OS for the PC, but is he willing to let Mac OS X move beyond Macintosh hardware? Given the option to unseat Microsoft and become the new monopoly OS vendor, is he neurotic enough to pass that up because of his built-in prejudices a second time?

Conclusion

Win32 is going away soon. That's a given. If Vista is 32-bit, it's completely and totally irrelevant. If Linux doesn't get its act together on the desktop within a year or two, we'll be facing Mac OS X as the new standard desktop OS, built on an open-source (BSD) base with the best GUI anyone's ever seen (Aqua) on top. Not just pretty, but Don-Norman usable, low latency, supporting all the data formats anyone could care about; and they've already got Quicken and games and such shipping for it.

Mac OS X as the new desktop standard wouldn't exactly be bad, but it does mean we'd have to live with binary-only drivers for the foreseeable future. It also means our new OS would be run by a man now deeply entangled with Disney. (Jobs has already taken action against people who tried to copy Aqua's "look and feel"; how do you think he'd respond to a Wine-style cloning effort?)

We in the Linux world have gotten complacent, hoping to replace Windows on the desktop. We haven't been watching Apple. And because of that, we may be shut out of the desktop for the next 20 years, which would suck.



[1] This even applies to open source software, although in that case the costs and scale advantages are in terms of volunteer resources (development and testing man-hours, donated bandwidth, access to technical specifications for hardware or data file formats, users lobbying for a friendly legal environment without software patents/encryption export regulations/broadcast flags) rather than the financial resources to buy those things.

[2] Obviously, systems did sell outside this range for other purposes, from embedded systems to mainframes. But that had little or no effect on the desktop market.

[3] A perceptible market vacuum lingered until the release of Windows 95, but once the user base migrated to a new standard with Windows 3.0 it waited for that platform to incrementally improve over the next 5 years. It really is a hard deadline, folks.

[4] The Altair was advertised at the start of 1975 with 256 bytes of memory but was widely considered underpowered, and by mid-year 1K was standard. The first version of Micro-Soft BASIC took 7K, but was stripped down to 4K because 7K was considered an unreasonable amount of memory to expect people to buy at that time.

[5] Intel's lukewarm adoption of x86-64 may explain why net profits declined 38% over the past year, as it "continues to lose share to archrival Advanced Micro Devices", the inventor of x86-64.

[6] This doesn't mean a platform can wait until the last minute to get its act together, either. Network effects will already be strongly influencing adoption decisions when 64 bit "crosses the chasm" and acquires the high volume to be found at the low end. This simply consolidates the market's decision. It's probably already too late to introduce a new contender into the 64 bit race, the players are Win64, MacOS X, and Linux for x86-64.

[7] The Apple II actually outlived CP/M. The Apple IIc Plus was released in September 1988, after Digital Research had lost a lawsuit with Apple over the GEM GUI and retooled CP/M into the DOS clone DR-DOS.

[8] PC hardware was unusually vulnerable to cloning because it was designed as a 16 bit successor to Apple's largest competitor, the commodity CP/M machines.

[9] Indeed, in some important ways Linux would have more difficulty displacing a dominant 64-bit OS than the historical record suggests. Laws like the DMCA, the prospect of hardware DRM, and closed-source drivers for critical hardware like graphics and wireless will give an incumbent more formidable defenses than CP/M faced and failed to overcome.

[10] footnote: Why did Windows 3.0 rip Microsoft's attention away from OS/2? Because David Weise invented thunking and gave existing 16-bit Windows programs access to memory beyond the 640K barrier. It wasn't a Microsoft corporate initiative; it was one guy breaking the 640K barrier, see: http://blogs.msdn.com/larryosterman/archive/2005/02/02/365635.aspx

[11] footnote: The Cell processor is interesting, and its use in videogame consoles gives it the production volume and economies of scale to have been a potentially viable desktop platform, but as with OS/2, IBM is too late to the party. The hardware decision has already been made. Laptops with x86-64 processors have been for sale at Fry's for over six months, and AMD already has a new generation of the technology (Turion) focused on low power consumption. (The power-to-performance ratio is now almost as important as price-to-performance. Intel continues to try to commit suicide by making the first Pentium M to support EM64T a dual-core space heater, but it took them a while to stop beating their heads against Itanic, too.) The Cell's 8 built-in DSP slaves may be great, but if nobody can get a Cell laptop, then who is going to write code for it?

[12] footnote: On the other hand, desktop video cards now come with up to half a gigabyte of their own memory, so the situation isn't as clear-cut as it seems.

[13] http://news.com.com/PC+milestone--notebooks+outsell+desktops/2100-1047_3-5731417.html

[14] footnote: Anyone interested in following internal goings-on in Redmond should try http://minimsft.blogspot.com. Google for the BusinessWeek article on its author.