The Try Athlon
The Bull Argument

By Rob Landley (TMF Oak)

There are so many good things to say about Advanced Micro Devices (NYSE: AMD) that I'm not going to be able to fit half of them in this Duel. In brief, AMD has spent the past year firing on all cylinders, while Intel has stumbled badly. This has happened in the context of the greatest computer component shortage and highest demand in the history of the computer industry.

At the high end, AMD has virtually driven Intel from the field. As of the third quarter of 2000, over 90% of all Gigahertz (i.e., 1000 Megahertz) and faster x86-compatible processors sold have been manufactured by AMD. Nine of the ten largest computer manufacturers offer AMD chips. (The holdout, Dell, is still in negotiations with AMD.)

AMD's remarkable results started with its Athlon design, introduced in August of 1999 at faster clock speeds and higher manufacturing yields than even AMD's own engineers expected. The current Gigahertz chips are an upgraded Athlon called "Thunderbird," which adds extra ("level two") cache memory packaged on a second silicon chip inside the processor's ceramic case, with manufacturing migrated from a .25 micron process using aluminum wires to a .18 micron process using copper. (That's techie speak for "making faster chips that produce less heat.")

Intel's attempts to keep up with AMD at the high end have been a disaster. Intel optimized its current Pentium-class designs using "racy logic," meaning some parallel operations aren't properly synchronized but instead depend on the rate at which signals travel through circuits to ensure things happen in the right order. This is a short-term improvement, because you can design smaller circuits, but it makes moving to a new manufacturing process or a higher clock speed all but impossible: the new process can subtly change the timings the circuit depends on. Change the terrain the signals race through and they may cross the finish line out of order. The result is lower manufacturing yields, especially on new processes, which produce chips full of bugs the exact same design simply didn't exhibit on the old process.
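
For readers who program, here's a loose software analogy (purely illustrative, and not a description of Intel's actual circuitry): two threads in a C program share a value with no synchronization, so the answer depends on whichever one happens to run first. Change the machine the program runs on (the software equivalent of changing the manufacturing process) and the answer can change too.

    /* A software analogy to "racy logic" (illustration only, not real
       chip circuitry): two threads share a value with no synchronization,
       so the result depends purely on timing.
       Build with: gcc -pthread race.c -o race */
    #include <pthread.h>
    #include <stdio.h>

    static int signal_value = 0;    /* the shared "wire" both threads touch */

    static void *writer(void *arg)
    {
        (void)arg;
        signal_value = 42;          /* the "computation" that sets the result */
        return NULL;
    }

    static void *reader(void *arg)
    {
        (void)arg;
        /* May print 0 or 42: whichever thread the scheduler runs first wins.
           Change the timing (a faster machine, a different compiler, a
           heavier load) and the outcome can change with it. */
        printf("reader saw %d\n", signal_value);
        return NULL;
    }

    int main(void)
    {
        pthread_t w, r;
        pthread_create(&w, NULL, writer, NULL);
        pthread_create(&r, NULL, reader, NULL);
        pthread_join(w, NULL);
        pthread_join(r, NULL);
        return 0;
    }

AMD's explicitly synchronized approach corresponds to adding the locking that forces the two operations to happen in a known order no matter how fast either one runs.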

This is one reason Gigahertz Pentiums STILL aren't available in decent volumes. Intel announced the product only because AMD had announced its own three days earlier. To make it work, though, Intel will have to redesign the Pentiums so the signals running through the circuits race to completion in the right order.

AMD's Athlon circuitry has always been explicitly synchronized in a well-documented way, and its Gigahertz yields are great without redesigning the chip for each new manufacturing process. Not only that, AMD had been stockpiling Gigahertz chips for a month before announcing their availability, so it could actually ship them to meet orders. The difference has gotten so extreme that in its rush to match AMD's next step, the 1.1 Gigahertz chips, Intel shipped defective prototypes to reviewers, prototypes the chip giant later admitted simply didn't work and had to recall.

AMD is also cleaning up at the low end. Intel originally created the Celeron because Clayton Christensen (author of The Innovator's Dilemma) convinced Andy Grove that AMD's low-end chips were a threat to Intel's market dominance. As it turns out, it was already too late. Those low-end chips forced AMD's engineers to get as good as (if not better than) Intel's at manufacturing, because AMD's pre-Athlon designs (the K6 family) were, like Intel's current crop, designed to be fast rather than easy to manufacture. But the razor-thin margins AMD eked out of its K6 chips kept the company afloat while it financed the development of the Athlon, a design that was not optimized for any one specific manufacturing process but instead built to move to new manufacturing technology without much work. When AMD's battle-hardened engineers were given an actual scalable design to manufacture, AMD pretty much took over the high end of the market overnight.

Intel had trouble fighting back because the Celeron competed against Intel's own high-end processors. Rather than focus on AMD as the enemy, Intel repeatedly produced intentionally underperforming versions of the Celeron so as not to cannibalize its own high-end market. Once AMD clawed its way to the top, Intel's legendary 50% margins proved a fat target for a company used to living off razor-thin profits anyway. Intel could keep the Celeron from cutting into its high-end business, but that just gave AMD the opportunity to do it instead.

These days, AMD's Duron (a stripped-down Athlon with half as much L2 cache, and thus cheaper to manufacture) gives the best bang for the buck at the low end. It outperforms Intel's Celeron, especially for the floating-point operations critical to 3D games. The Duron is also cheaper to manufacture than the Celeron, because Intel's decision years ago to move from simple sockets to cartridge-based processor packaging adds noticeably to the cost of Intel's chips. (As far as I can tell, Intel's main reason for the switch was to up costs for competitors: It wanted a new interface to the motherboard it could patent, to keep competitors from plugging their chips into the same motherboards Intel used. Instead, Intel upped its own costs.)

As for the future, Intel plans to introduce the Pentium 4, which will require a heat sink so huge (over a pound) that it has to be bolted to the case so its weight doesn't break the motherboard during shipping. AMD sees no trouble scaling the Thunderbird design all the way to 2 Gigahertz, perhaps even beyond, with new manufacturing technology.

The two companies' 64-bit strategies differ as well. Intel's ia64 is a brand-new design that requires software to be completely rewritten, and it is intended only for a few high-end servers. AMD's x86-64 is an Athlon with 10% more circuitry to add 64-bit features, and it is aimed at desktop machines as well as servers. Intel's strategy turns 64-bit computing into an all-or-nothing niche market, while AMD treats it the way the 386 treated 32-bit computing years ago: something the chip provides more or less for free, that users grow into over time, and that in the meantime runs the old software faster than its predecessors could.

AMD is now selling more than a billion dollars' worth of chips each quarter. This year it expects to pay off $400 million of its $1.5 billion in long-term debt, and most of the rest is actually a loan from the German government at around 5% interest, a bribe (ahem, "incentive") to locate a manufacturing facility in an underdeveloped part of Germany. AMD would be stupid to pay that off in a hurry instead of using the money to build another fab.
