Features

The Birth Of The Microprocessor


An invention of major social and technological impact reaches its twentieth birthday

Federico Faggin


There are turning points in the history of technology when something new and major happens. Something unstoppable and irreversible arises (e.g., the automobile, the airplane, or the microprocessor) that becomes the catalyst for sweeping social and technological changes.


Such inventions don't come from new scientific principles but from the synthesis of existing principles. The new form expands the previous one in both predictable and unpredictable ways. Typically, the unexpected consequences are the most valuable.


Such inventions are frequently born out of a few believers' struggle with those who have something to lose from change, set in a background of indifference. Because these inventions have a certain inevitability about them, the real contribution lies in making them work.


You have to believe in the idea passionately enough to carry on the struggle, until it is firmly rooted in the world and has a life of its own. It is a work of intellect and love. On the twentieth anniversary of the introduction of the first microprocessor, the 4004, I would like to tell you the story of the early years.


In the Beginning


In 1969, Silicon Valley was the center of the semiconductor industry, and one-year-old Intel was one of the most prestigious spin-offs from Fairchild Semiconductor. Intel and a few other companies envisioned that semiconductor memories were the wave of the future and would replace the magnetic-core memories then in use.


Later that year, some people from Busicom, a young and enterprising Japanese calculator manufacturer, came to Intel looking for a custom-chip manufacturer. They wanted a set of approximately 10 custom circuits for the heart of a new low-cost, desktop-printing calculator.


Intel was in no position to bid for this totally custom contract. The company had no in-house expertise in random-logic design, and it would have taken too many engineers to do the work. But Ted Hoff, manager of the Application Research Department at Intel, thought there was a better way to handle this task.


In those days, there was a controversy about calculator design: standard versus custom. The proponents of custom design were in the majority. They argued that designing general-purpose calculator chips wasn't cost-effective: Standard chips would need to incorporate too many options and thus would be bigger and more expensive than custom-tailored ones.


Standard-design proponents argued that if you structured the calculator as a small programmable computer, it could be both versatile and cost-effective. Fairchild had already done pioneering work in this area, developing a 1-bit serial CPU architecture, as had Rockwell, where Michael Ebertin and his coworkers designed a more sophisticated CPU. The idea of a "CPU on a chip" had been around since the mid-1960s.


Since the invention of the IC in 1959, the semiconductor industry had doubled the number of components integrated into a single chip every year. In the early 1960s, small-scale integration (SSI) allowed a few tens of components to form simple logic gates. By the mid-1960s, medium-scale integration (MSI) enabled a few hundred components to function as counters, adders, multiplexers, and so on. Large-scale integration (LSI), capable of integrating a few thousand components on a single chip, would soon occur.


A few SSI devices had replaced printed circuit boards containing discrete components (e.g., transistors, diodes, and resistors). A few MSI devices had replaced printed circuit boards containing many tens of SSI devices. It was obvious that a few LSI devices could soon replace printed circuit boards containing many tens of MSI devices.


Engineers wondered what kind of a general-purpose function could possibly need that many components. The answer was already evident: semiconductor memories and CPUs for small computers. Such CPUs already needed one or more printed circuit boards that were full of SSI and MSI components. In the late 1960s, LSI arrived, and it was just a matter of time until a CPU on a chip appeared. Hoff saw in the Busicom need an opportunity to define a small set of standard components designed around this CPU-on-a-chip idea.


During the fall of 1969, Hoff, aided by Stan Mazor, an applications engineer at Intel, defined an architecture consisting of a 4-bit CPU, a ROM to store the program instructions, a RAM to store data, and several I/O ports to interface with external devices such as the keyboard, printer, switches, and lights. They also defined and verified the CPU instruction set with the help of Busicom engineers--in particular, Masatoshi Shima.
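

To make that partition concrete, here is a minimal sketch (in Python, purely illustrative): a toy 4-bit machine that fetches its program from a ROM, keeps data in a RAM, and drives output ports. The three-instruction encoding (LDI, STO, OUT) is hypothetical and is not the MCS-4 instruction set that Hoff, Mazor, and Shima defined.

    # Toy 4-bit machine: program in ROM, data in RAM, I/O through ports.
    # The LDI/STO/OUT encoding below is hypothetical, not the real MCS-4 set.
    ROM = [0x17, 0x35, 0x20]     # LDI 7; STO 5; OUT 0
    RAM = [0] * 16               # sixteen 4-bit data locations
    PORTS = [0] * 4              # four 4-bit output ports (printer, lights, ...)

    def run(rom, ram, ports):
        acc, pc = 0, 0                            # 4-bit accumulator, program counter
        while pc < len(rom):
            opcode, operand = rom[pc] >> 4, rom[pc] & 0x0F
            pc += 1
            if opcode == 0x1:                     # LDI n: accumulator <- n
                acc = operand
            elif opcode == 0x3:                   # STO a: RAM[a] <- accumulator
                ram[operand] = acc
            elif opcode == 0x2:                   # OUT p: port p <- accumulator
                ports[operand] = acc
            else:                                 # unknown opcode: halt
                break
        return acc, ram, ports

    run(ROM, RAM, PORTS)          # leaves 7 in RAM[5] and on port 0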


Enduring the Pains of Birth


While working at Fairchild in 1968, I developed silicon-gate technology, a new process technology for fabricating high-density, high-performance MOS ICs. Intel adopted this technology, allowing it to build high-performance memories and the microprocessor before the competition did. My desire to design complex ICs with silicon-gate technology led me to work for Intel.


So, in April 1970, my new job at Intel was to design a calculator chip set. Presumably, Hoff and Mazor had already completed the architecture and logic design of the chip set, and only some circuit design and chip layouts were left to do. However, that's not what I found when I started at Intel, nor is it what Shima found when he arrived from Japan.


Shima expected to review the logic design, confirming that Busicom could indeed produce its calculator, and then return to Japan. He was furious when he found out that no work had been done since his visit approximately six months earlier. He kept on saying in his broken English, "I came here to check. There is nothing to check. This is just idea." The schedule that was agreed on for his calculator had already been irreparably compromised.


Shima and I were in the same boat. Hoff was away on business and thought his job was finished. Mazor could not resolve the remaining architectural issues that Shima promptly brought up. There I was--behind before I had even begun. I worked furiously, 12 to 16 hours a day.


First, I resolved the remaining architectural issues, and then I laid down the foundation of the design style that I would use for the chip set. Finally, I started the logic and circuit design and then the layout of the four chips. I had to develop a new methodology for random-logic design with silicon-gate technology; it had never been done before.


To make the circuits small, I had to use bootstrap loads, which no one at Intel thought were possible with silicon-gate technology. When I demonstrated them, bootstrap loads were promptly put to work in the ongoing memory designs as well.


I called the chip set "the 4000 family." It consisted of four 16-pin devices: The 4001 was a 2-Kb ROM with a 4-bit mask-programmable I/O port; the 4002 was a 320-bit RAM with a 4-bit output port; the 4003 was a 10-bit serial-in, parallel-out shift register to be used as an I/O expander; and the 4004 was a 4-bit CPU.


The 4001 was the first chip designed and laid out. The first fabrication of the 4001 (called a run) came out in October 1970, and the circuit worked perfectly. In November, the 4002 came out with only one minor error, and the 4003, also completed, worked perfectly. Finally, the 4004 arrived a few days before the end of 1970. It was a major disappointment because one of the masking layers had been omitted in the wafer processing. The run was unusable.


Three weeks after that disappointment, a new run came. My hands were trembling as I loaded the 2-inch wafer into the probe station. It was late at night, and I was alone in the lab. I was praying for it to work well enough that I could find all the bugs so the next run could yield shippable devices. My excitement grew as I found various areas of the circuit working. By 3:00 a.m., I went home in a strange state of exhaustion and excitement.


Verification continued for a few more days. When the testing was finished, only a few minor errors had been found. I was elated. All that work had suddenly paid off in a moment of intense satisfaction.


In February 1971, the 4004 masks were corrected, and a new run was started. At about the same time, I received the ROM codes from Busicom so that I could tool the masks and make the production 4001s for the first calculator.


By mid-March 1971, I shipped full kits of components to Busicom, where Shima verified that his calculator worked properly. Each kit consisted of a 4004, two 4002s, four 4001s, and two 4003s. It took a little less than one year to go from the idea to a fully working product.


Now that the first microprocessor was a reality, I thought that the chip could be used for many other applications. Unfortunately, Intel's management disagreed, thinking that the 4000 family was good only for calculators. Furthermore, the 4000 family had been designed under an exclusive contract. It could not be announced or sold to anyone but Busicom.


The opportunity to prove that the 4000 family was good for other applications came when the need for a production tester arose. The tester was clearly not a calculator application, so I decided to use the 4004 as the tester's main controller. In that project, I gained considerable insight into what could and could not be done with the 4000 family. When the tester was successfully completed, I had additional ammunition to convincingly lobby for the 4000 family's introduction.


I urged Robert Noyce, then president of Intel, to market the 4004. I suggested that perhaps Intel could trade some price concessions for nonexclusivity. (I had heard from Shima that Busicom was hurting in the marketplace and needed a lower price to effectively compete.) Noyce succeeded in obtaining nonexclusivity from Busicom for the 4004 for applications other than calculators. Shortly after that, in mid-1971, Intel decided to market the 4000 family.


In November 1971, the 4000 family, now known as MCS-4 (for Microcomputer System 4-bit), was officially introduced with an advertisement in major trade publications. The main caption read, "Announcing a new era of integrated electronics"--a very prophetic ad.


A Younger but Brighter Sibling


In 1969, Computer Terminal Corp. (now Datapoint) visited Intel. Vic Poor, vice president of R&D at CTC, wanted to integrate the CPU (about 100 TTL components) of CTC's new intelligent terminal, the Datapoint 2200, into a few chips and reduce the cost and size of the electronics.


Hoff looked at the architecture, the instruction set, and the CTC logic design and estimated that Intel could integrate it all on a single chip, so Intel and CTC entered into a contract to develop the chip. The Datapoint CPU chip, internally called the 1201, was an 8-bit device. Intended for intelligent terminal applications, it was more complex than the 4004.


The 1201 looked like it would be the first microprocessor to come out, since its design was started first, and I had four chips to design, the CPU being the last. I was a bit disappointed, but I had enough to worry about. However, after a few months of work on the 1201, the designer, Hal Feeney, was asked to design a memory chip, and the CTC project was put on ice.


In the meantime, CTC had also commissioned Texas Instruments to do the same chip design as an alternative source. At the end of 1970, Intel resumed the 1201 project under my direction, and Feeney was reassigned to it.


Early in June 1971, TI ran an advertisement in Electronics describing its MOS LSI capabilities. A picture of a complex IC with the caption "CPU on a chip" accompanied a description of TI's custom circuit for the Datapoint 2200. The ad continued, "TI developed and is producing it for Computer Terminal Corp...." and gave the chip's vital statistics. The dimensions were 215 mils by 225 mils, a huge chip even for 1971 technology and 225 percent larger than Intel's estimate for the 1201.


The TI chip, however, never worked and was never marketed. It faded away, not to be heard from again until TI's current legal battles. Surprisingly, TI patented the architecture of the 1201, which was Datapoint's architecture with Intel's inputs, and now asserts broad rights on the microprocessor. TI might have been the first company to announce the microprocessor, but making it work was the trick.


An invention requires a reduction to practice, not just an idea. And in 1990, the U.S. Patent Office awarded a patent to Gilbert Hyatt for the invention of the microcomputer chip (about 20 years after his original filing date). News of the award took the industry by surprise because Patent Office proceedings are secret and Hyatt wasn't widely known. While Hyatt was said to have built a breadboard prototype implementation (using conventional components) of his microprocessor architecture, no single-chip implementation was ever produced. Again, this idea was not reduced to practice. For more information on this, see "Micro, Micro: Who Made the Micro?," January 1991 BYTE.


What Gilbert Hyatt, TI, and others failed to do, Intel did: It made the first microprocessor work--at a low cost and in volume production. It took vision, guts, and lots of work to bring to market a product that was different from all the others, a product that required lots of customer training, support, and groundwork. Intel did it, taking a big risk at a time when it was still small and could ill afford to fail.


Three critical tasks had to be performed before the idea of the microprocessor could take root. First, the production technology of the time had to be capable of economically implementing a useful architecture. Second, someone had to design, develop, and bring the chip to production with sufficiently low manufacturing costs. And third, the microprocessor had to be made available to the general market. This last task required a true belief in the device and its ability to transform hardware design.


During the summer of 1971, as work on the 1201 was progressing nicely, Datapoint decided that it didn't want the 1201 anymore. The economic recession of 1970 had brought the price of TTL down to where the 1201 was no longer attractive. However, because Seiko of Japan had expressed an interest in it, Intel decided to continue with the project. Datapoint agreed to let Intel use its architecture in exchange for canceling the development charges. Intel was free to commercialize the 1201 as a proprietary product.


Designed after the 4004, the 1201 was not too difficult a project. Architecturally, the 1201 was very similar to the 4004--despite the 1201's being an 8-bit CPU--and many of the design solutions used in the 4004 readily applied to the 1201. There was only one bad moment.


Intel was all set to introduce the 1201 (later renamed the 8008) when I discovered some intermittent failures. It took me a feverish week to solve the problem. It was a nasty one, at the crossroads of device physics, circuit design, and layout: The charge stored in the gate of the transistors in the register file was leaking away due to substrate injection. I had to modify the circuit and the layout to fix the problem.


Making the Sale


To use a microprocessor, you first had to visualize a problem as a computer program and then write and debug it in some kind of hardware-simulation environment before committing the program to ROM. Fortunately, Intel had just developed the 1701, the first EPROM to use a floating polysilicon gate as the storage element.


The 1701 was a 2-Kb device programmable with special hardware and erasable with ultraviolet light. Introduced six months earlier than the 1201, the 1701 was a solution looking for a problem. However, it made possible the development of a board that you could use to develop, run, and debug software for the MCS-4.


Microprocessors required much more marketing effort than conventional components. A typical component would have a 6- to 10-page data sheet, and that was all. The MCS-4 had the data sheets, a programming manual, applications notes showing how to use the components, a development board capable of implementing a functional prototype of the hardware, and a cross assembler (i.e., a program running on a minicomputer that converted instruction mnemonics into machine language).
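

As an illustrative aside (not Intel's actual tool), the cross-assembler idea is conceptually simple. The Python sketch below translates mnemonic source lines into the machine words that would be burned into ROM, using the same hypothetical toy encoding as the earlier sketch.

    # Toy cross assembler: mnemonic source lines -> machine words for ROM.
    # The opcode table is hypothetical, matching the earlier toy machine.
    OPCODES = {"LDI": 0x1, "OUT": 0x2, "STO": 0x3}

    def assemble(source):
        words = []
        for line in source.splitlines():
            line = line.split(";")[0].strip()     # drop comments and blank lines
            if not line:
                continue
            mnemonic, operand = line.split()
            words.append((OPCODES[mnemonic] << 4) | (int(operand) & 0x0F))
        return words

    program = """
        LDI 7    ; accumulator <- 7
        STO 5    ; RAM[5] <- accumulator
        OUT 0    ; port 0 <- accumulator
    """
    assemble(program)             # -> [0x17, 0x35, 0x20], the ROM of the earlier sketch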


All this paraphernalia involved far more knowledge, complexity, and cost than the semiconductor industry was prepared to handle. In addition, the engineers had to fundamentally change their approach to hardware design. With the microprocessor, you had to visualize problems in terms of software. This was the hardest obstacle of all.


In April 1972, Intel introduced the 8008, with a group of supporting chips, as a family of products called the MCS-8. The supporting chips were standard Intel products with the names changed. MCS-8 looked impressive, and market interest was high, but sales were slow.


Customers needed more than the simple design aids that Intel offered; they needed far more hardware and software tools, training, and applications support than had been anticipated. So Intel provided them with a variety of software and hardware design aids and fostered a massive engineer training program carried out by external consultants.


Then the idea of a development system arose, and Intel's management decided to commit the company in that direction. A development system is a self-contained computer specialized for developing and debugging microprocessor software. A year after the microprocessor's introduction, Intel was receiving more revenues from development systems than from microprocessor chips.


The Real Hotshot


Late in the summer of 1971, I went to Europe to give a series of technical seminars on the MCS-4 and the 8008 and to visit customers. It was an important experience. I received a fair amount of criticism--some of it valid--about the architecture and performance of the microprocessors. The more computer-oriented the company I visited was, the nastier people's comments were.


When I returned home, I had an idea of how to make a better 8-bit microprocessor than the 8008, incorporating many of the features that people wanted: most important, speed and ease of interfacing. Both could be greatly improved by using a 40-pin package instead of the 8008's 18-pin package and by integrating the functions of the support chips. Feeney and I had wanted to do that with the 1201, but Intel policy required 16-, 18-, and, on exception, 24-pin packages.


Using the new n-channel process being developed for 4-Kb DRAM would also improve speed and ease of interfacing. I also wanted to make several functional improvements: a better interrupt structure, more memory addressability, and additional instructions.


By early 1972, I started lobbying for the new chip. However, Intel management wanted to see how the market would respond to the MCS-4 and, later, to the MCS-8 introduction before committing more resources. I thought we were wasting time. I had already asked Shima to come to California from Japan to work for me, and visa formalities were under way.


In the summer of 1972, the decision came to go ahead with the new project. I finished the architecture and design feasibility so that my coworkers and I could go full steam when Shima arrived in November.


The first run of the new microprocessor, the 8080, came in December 1973. My coworkers and I corrected a few minor errors, and Intel introduced the product in March 1974. After that, Intel was clearly the leading microprocessor supplier, although other companies had competing products.


In 1972, Rockwell announced the PPS-4 (similar to the MCS-4 but packaged in 42-pin packages). The PPS-4 used four-phase design techniques and metal-gate MOS technology and achieved about the same speed as the MCS-4, thanks to more parallel operation. Rockwell engineers worked around the limitations of metal-gate MOS technology for a while, but the PPS-8, introduced after the 8080, was no match for it.


The only serious competition for Intel came from Motorola. Motorola's product, the 6800, used MOS silicon-gate technology and was introduced about six months after Intel's 8080. In many ways, the 6800 was a better product. However, the combination of timing, more aggressive marketing, availability of better software and hardware tools, and product manufacturability--the 8080 chip size was much smaller than the 6800's--gave Intel the lead.


The 8080 really created the microprocessor market. The 4004 and 8008 suggested it, but the 8080 made it real. For the first time, several applications that were not possible with prior microprocessors became practical. The 8080 was immediately used in hundreds of different products. The microprocessor had come of age.


A New Challenge


By the summer of 1974, I had grown restless. From the beginning, I had led all the microprocessor development activity at Intel, and, with time, I was responsible for all the MOS chip-design activity, except that on DRAMs. Intel had grown into a large company, and I found the environment stifling. So, with Ralph Ungermann, one of my managers, I decided to start a company that, unlike Intel, would be totally dedicated to the microprocessor market.


In November 1974, Zilog was founded, and a little more than a year later, the Z80 CPU, the first member of the Z80 family, was born. I had the idea for the Z80 in December 1974. It had to be a family of components designed to work seamlessly together and able to grow. It had to be totally compatible with the 8080 at the machine-instruction level and yet incorporate many more features, registers, and instructions.


After I completed the architecture and the design feasibility and after the financing was arranged, Shima joined Zilog to do the detailed design. By early 1976, the Z80 was a reality, and the demands of my job as president of Zilog had put an end to my engineering career. The Z80 was extremely successful, surpassing my wildest expectations, and Zilog became a major competitor of Intel.


The Z80 was a good product, but its timing was also lucky. The significance of the microprocessor was becoming evident. Computer clubs were sprouting up throughout the U.S. The number of young computer enthusiasts was increasing rapidly, and with them came an enormous amount of creative energy, enthusiasm, and exuberance. That milieu was the breeding ground of the personal computer, the product that popularized the microprocessor.


The personal computer is one example of the unpredictable consequences of a major new technology. Of course, we knew in 1971 that we could buy a little computer that would fit on a desk, but it is the personal computer as a socioeconomic phenomenon rather than as a feat of engineering that was a surprise to me.


By 1977, microprocessors were firmly planted in the world and were becoming part of the fabric of everyday technology. From that point on, it became a matter of building faster, bigger, better, less expensive microprocessors. And the industry has done just that. Fueling this process is the continuing improvement in semiconductor processing technology, the source of the microelectronics revolution.


Action Summary

While the microprocessor has made the personal computing revolution possible, the first single-chip CPUs were not greeted with enthusiasm. Engineers who designed the early microprocessors fought technical battles and management indifference. In hindsight, inventions that change the world seem to have a certain inevitability about them. But the real contribution--and risk--lie not in conceiving them but in making them work.


Federico Faggin conceived, designed, and codesigned many of the earliest microprocessors, including the Intel 4004, 8008, 4040, and 8080, as well as the Zilog Z80. He is cofounder and president of Synaptics (San Jose, CA), a company that is dedicated to the creation of hardware for neural networks and other machine-learning applications. You can reach him on BIX c/o "editors."


