Backgrounder

The History of Intel, 30 Years of Innovation

This year, Intel celebrates its 30th anniversary – 30 years filled with innovation and industry-leading technology. The development of this revolutionary company is a story of vision, willingness to embrace change, and just plain luck; a story that put Intel at the very heart of the Information Age.

Life Before the Microprocessor

The microprocessor has changed our lives in so many ways that it is difficult to recall how different things were before its invention. In the 1960s, computers filled entire rooms. Their expensive processing power was available only to a select few in government labs, research universities and large corporations. The mid-1960s development of the integrated circuit (co-invented by Intel founder Bob Noyce) had enabled the miniaturization of electronic circuitry onto a single silicon chip, but the world was still skeptical. The large scale integration of transistors onto silicon was still an emerging business.

At its founding on July 18, 1968, Intel had carved out a unique challenge: to make semiconductor memory practical. This was quite a stretch, considering that silicon memory was at least 100 times more expensive than magnetic core memory, the leading technology at the time. But Intel’s founders felt that semiconductor memory’s advantages -- smaller size, greater performance, reduced energy consumption -- would convince manufacturers to try the new technology.

General-Purpose Success

It started modestly, when Japanese manufacturer Busicom asked Intel to design a set of chips for a family of high-performance programmable calculators. At the time, all logic chips (which perform calculations and execute programs, as opposed to memory chips, which store instructions and data) were custom-designed for each customer’s product. By definition, this process limited the widespread application of any one logic chip.

That was all about to change. Busicom’s original design for their calculator called for at least 12 custom chips. But Intel engineer Ted Hoff rejected the unwieldy proposal and instead designed a single-chip, general-purpose logic device that retrieved its application instructions from semiconductor memory. As the core of a four-chip set, this central processing unit not only met Busicom’s calculator needs but could be plugged into a variety of applications without needing to be redesigned.

Buying Back the Cash Cow

There was only one problem with the new chip: Busicom owned the rights to it. Hoff and others knew that the product had almost limitless application, bringing intelligence to a host of "dumb" machines. They urged Intel to repurchase the rights to the product. While Intel founders Gordon Moore and Noyce championed the new chip, others within the company were concerned that the product would distract Intel from its memory mission. Finally, the doubters were convinced by the fact that every four-chip microcomputer set included two memory chips. As the director of marketing at the time recalled, "Originally, I think we saw it as a way to sell more memories, and we were willing to make the investment on that basis."

Intel offered to return Busicom’s $60,000 investment in exchange for the rights to the product. Struggling with financial troubles, the Japanese company agreed. At the time the Busicom deal hardly made a ripple at Intel or in the industry. But it paved the way for Intel’s developing vision of ubiquitous microprocessor-based computing.

The Microprocessor Hits the Market

The 4004 microcomputer set (the term "microprocessor" was not coined until later) was formally introduced at the end of 1971. Smaller than a thumbnail and packing 2,300 transistors, the $200 chip delivered as much computing power as the first electronic computer, ENIAC. By comparison, ENIAC relied on 18,000 vacuum tubes packed into 3,000 cubic feet when it was built in 1946. The 4004 executed 60,000 operations per second -- primitive by today’s standards, but a major breakthrough at the time.

Soon after the 4004, Intel introduced the 8008 microcomputer, which processed eight bits of information at a time, twice as much as the original chip. As anticipated, both devices began to open up new markets for Intel products. For the first time, affordable computing power was available to designers of all types of products -- and this potential sparked boundless creativity and innovation. The first digital scales appeared at local grocery stores -- the microcomputer converted weights to prices and operated a label printer for marking purchases. Traffic lights were able to detect waiting cars and control traffic more efficiently. The new microcomputer revolutionized everything from medical instruments to inventory computers for fast-food restaurants, airline reservations systems to gasoline pumps, even pinball games and slot machines. Take a look at our sidebar on The Ubiquitous Microprocessor for more information.

Still, neither Intel nor its customers anticipated every potential application for the new products. In one particularly ironic example, Intel Chairman Emeritus Moore remembers, "In the mid-1970s, someone came to me with an idea for what was basically the PC. The idea was that we could outfit an 8080 processor with a keyboard and a monitor and sell it in the home market. I asked, ‘What’s it good for?’ And the only answer was that a housewife could keep her recipes on it. I personally didn’t see anything useful in it, so we never gave it another thought."

Turning Point: The IBM PC

By 1981, Intel’s microprocessor family had grown to include the 16-bit 8086 and the 8088, a 16-bit processor with an eight-bit external bus. These two chips garnered an unprecedented 2,500 design wins in a single year. Among those designs was a product from IBM that was to become the first IBM PC.

Without knowing the details of the product, Intel sales engineers had to win IBM’s confidence, since "Big Blue" had never used an outside vendor for a key microprocessor before. As the Intel sales engineer who serviced the IBM account recalled, "Everything was very secretive. When we went in to provide technical support, they’d have our technical people on one side of a black curtain and theirs on the other side, with their prototype product. We’d ask questions, they’d tell us what was happening and we’d have to try to solve the problem literally in the dark. If we were lucky, they’d let us reach a hand through the curtain and grope around a bit to try to figure out what the problem was."

Eventually, Intel’s long-term commitment to the microprocessor product line and ability to manufacture in volume convinced IBM to choose the 8088 as the brains of its first PC. IBM’s decision proved a terrific coup for Intel, but again, it was an event whose significance was not evident at first. The Intel sales engineer who worked with IBM on the project recalled, "At the time, a great account was one that generated 10,000 units a year. Nobody comprehended that the scale of the PC business would grow to tens of millions of units every year."

In 1982, Intel introduced the 286 chip. With 134,000 transistors, it provided about three times the performance of other 16-bit processors of the time. Featuring on-chip memory management, the 286 was the first microprocessor that offered software compatibility with its predecessors. This revolutionary chip was first used in IBM’s benchmark PC-AT*.

The Microprocessor Machine

In 1985, the Intel386™ processor hit the streets. Sporting a new 32-bit architecture and a staggering 275,000 transistors, the chip could perform more than five million instructions per second (MIPS). Compaq’s DESKPRO* 386 was the first PC based on the new microprocessor.

Next out of the blocks was the Intel486™ processor in 1989. Accelerated product development was in full bloom, and the new chip showcased the results: 1.2 million transistors and the first built-in math coprocessor. The new chip was some 50 times faster than the original 4004, equaling the performance of powerful mainframe computers.

In 1993, Intel introduced the Pentium® processor, setting new performance standards with up to five times the performance of the Intel486 processor. The Pentium processor uses 3.1 million transistors to perform up to 90 MIPS -- about 1,500 times the speed of the original 4004.
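As a rough check of that figure -- a minimal Python sketch using only the numbers quoted in this backgrounder, and treating the 4004’s 60,000 operations per second as 0.06 MIPS -- the arithmetic works out to roughly 1,500:

```python
# Back-of-the-envelope comparison of the Pentium processor and the 4004,
# using only the figures cited in this backgrounder. MIPS comparisons
# across such different architectures are approximate at best.

ops_per_sec_4004 = 60_000            # 4004: 60,000 operations per second
mips_4004 = ops_per_sec_4004 / 1e6   # roughly 0.06 MIPS
mips_pentium = 90                    # Pentium processor: up to 90 MIPS

print(f"Pentium vs. 4004: about {mips_pentium / mips_4004:,.0f}x")  # ~1,500x
```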

Nineteen ninety-five saw the introduction of Intel’s first processor in the P6 family, the Pentium Pro processor. With 5.5 million transistors, it was the first to be packaged with a second die containing high-speed memory cache to accelerate performance. Capable of performing up to 300 MIPS, the Pentium Pro continues to be a popular choice for multiprocessor servers and high-performance workstations.

Intel started off 1997 with a bang, introducing its revolutionary MMX™ technology, a new set of instructions designed to enhance multimedia performance. All processors introduced since the Pentium processor with MMX technology have included this technology. At the same time, Intel introduced its popular BunnyPeople™ characters – the colorful dancing technicians who "put the fun into Intel processors." Intel’s BunnyPeople characters have been featured in commercials, at tradeshow keynotes, and as beanbag dolls. They have popped up in stores all over the globe where Intel processor-based PCs are sold.

As the overall market for desktop PCs grew, it evolved into three market segments differentiated by the computing needs of various customers -- Enthusiast/Professional, Performance, and Basic PC. Intel’s prior strategy was to design ever-more-powerful processors aimed at the top end of the market, with previous-generation chips migrating to the lower-end segments. Intel’s new strategy is to use one core technology as a foundation for developing a range of processor products tailored to meet the needs of multiple segments.

With this strategy in mind, Intel introduced the Pentium II processor in May 1997. Pentium II processors, with 7.5 million transistors packed into a unique Single Edge Contact Cartridge, deliver exceptional performance on today’s business applications, and provide headroom for upcoming software, such as more advanced operating systems and 3-D-based Web browsers. Consumer PCs based on Pentium II processors feature new technologies such as DVD players and AGP graphics, delivering the best home computing experience available. Intel also offers Pentium II processors for mobile PCs, bringing a new level of performance and computing capabilities not previously available to mobile PC users.

Introduced in April 1998, the Intel® Celeron® processor is the latest Intel processor optimized to meet the computing needs of Basic PC users. Intel Celeron processor-based PCs meet the core computing needs and affordability requirements of many new PC users, while also providing a balanced platform on which to run some of today’s standard business and home PC applications.

Servers and workstations also got a boost in 1998, with the recent introduction of the Pentium II Xeon™ processor. The newest addition to Intel’s Pentium II brand, it is Intel’s first microprocessor specifically designed for midrange and higher server and workstation platforms. Because server and workstation applications place heavy demands on a processor’s cache architecture, the Pentium II Xeon processor is available with large, fast Level 2 caches of 512 KB or 1 MB, integrated into the processor cartridge and running at the full operating speed of the processor core (400 MHz).

"What is different is that we plan on obsoleting our own product line, generation after generation with almost doubling performance every year," said Albert Yu, senior vice president and general manager of the Microprocessor Products Group.

In 1991, PCs based on the Intel486 processor cost about $225 per MIPS of performance. Today, the Pentium II processor delivers dramatically increased performance at a cost of only about $2 per MIPS. As Moore notes, "If the auto industry advanced as rapidly as the semiconductor industry, a Rolls Royce would get a half a million miles per gallon, and it would be cheaper to throw it away than to park it."
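As a rough illustration -- a minimal Python sketch using only the round numbers quoted above -- that works out to roughly a hundredfold drop in the cost of a MIPS in seven years:

```python
# Cost per MIPS, as quoted above: Intel486-based PCs in 1991 versus the
# Pentium II processor today. These are the article's round figures, not
# detailed pricing data.

cost_per_mips_1991 = 225.0   # dollars per MIPS (1991, Intel486-based PC)
cost_per_mips_1998 = 2.0     # dollars per MIPS (1998, Pentium II processor)

factor = cost_per_mips_1991 / cost_per_mips_1998
print(f"Cost per MIPS fell by a factor of about {factor:.1f}")  # 112.5
```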

The PC Revolution

That first PC sparked a computer revolution. Today the PC is everywhere, with over 200 million in use throughout the world. A child using a Pentium processor-based PC has more computing power than was available to mainframe computer operators just a decade ago, more power than the U.S. government first used to send men to the moon.

The PC has democratized computing around the world. Many people now believe that technological literacy will dictate opportunities for future generations. People’s livelihood will rest on their ability to gather, process and distribute information via increasingly powerful PCs. As microprocessor inventor Hoff reflects, "Information is power. I like the way the microprocessor has spread that power around."

Today, the PC’s emerging status as the linked communications device of choice is revolutionizing modern life yet again. PC-based video conferencing, internal networking and the Internet are standard business communications tools. And people everywhere are turning to their PCs to tap into the Internet, to connect, explore and create new worlds of entertainment, information and communication.

Designing the Modern Chip

Underlying the expanding microprocessor revolution is Intel’s ability to continually reduce the cost of high-performance processing power. What has made this dramatic pace of improvement possible? Much of the performance increase is due to Intel’s remarkable strides in designing leading-edge microarchitectures on the latest silicon technology and ramping products in huge volumes -- allowing the company to squeeze ever more transistors onto each chip and deliver more computing power on the desktop for less and less money.

The first microprocessor was developed by two engineers in nine months. Modern microprocessor design requires hundreds of people, clustered into teams dedicated to portions of each chip, working on various phases of design.

Today’s microprocessor designer still has to be concerned with everything that touches the chip. But, unlike the manual design process of yesteryear, today’s designers use sophisticated computer-aided design (CAD) programs running on high-powered workstations to create their complex "maps." Through the use of CAD programs and other tools, the productivity of design has improved enormously, but it is just barely keeping up with the increased complexity and performance. In designing microprocessors of the future, dependence on advanced computer-aided design tools will continue to soar.

Testing has also become a significant part of the design process. Yu explains: "There are literally billions and trillions of lines of code that Intel chips must be compatible with. We have to be completely compatible with earlier generations -- the entire line of Intel Architecture." This is accomplished by using extensive test suites, sophisticated instrumentation and exhaustive validation techniques to diagnose and cure problems.

In contrast, Intel’s 1971 testing was limited to an oscilloscope examination. "Engineers would build a breadboard -- a physical model of the chip -- and apply simple tests to verify the circuitry," says Yu. "That was about it." Other early methods included dropping parts on the floor to test their shock resistance and sealing them in everyday, off-the-shelf pressure cookers for six hours.

Manufacturing Miracles

Twenty-five years ago, manufacturing processes were relatively primitive. As Intel Chairman Andy Grove recalled, "The fab area looked like Willy Wonka’s factory, with hoses and wires and contraptions chugging along -- the semiconductor equivalent of the Wright Brothers’ jury-rigged airplane. It was state-of-the-art manufacturing at the time, but by today’s standards, it was unbelievably crude."

Fab fashions now and then: Intel technicians in the company’s early days wore short smocks over their street clothes. Today, technicians wear specialized "bunny suits" to maintain a sterile environment in the fabrication plants.

Most of the work was done by hand. Fab workers used tweezers to load silicon wafers (from which chips are cut) onto quartz "boats," then pushed the boats into red-hot furnaces. The operators then opened and closed valves by hand to expose the wafers to various gases for specified amounts of time. According to Gerry Parker, executive vice president and general manager of the New Business Group, "This process was fraught with potential for error. Many wafers came out of the oven looking like extra crispy potato chips."

As wafers grew larger and manufacturing processes needed much more precise control, machines, instead of people, were called on to handle the wafers. Today, microprocessor-controlled robots whisk the wafers between process stages, and operators are called on to keep the complex equipment working at peak efficiency. In addition to more consistent handling, automation has provided the added benefit of isolating workers from physical and chemical hazards.

As the silicon transistors shrank, it became increasingly important to keep damaging particles such as dust and skin flakes away from the developing wafers. In the first fabs, standards were lax: workers did not cover their hair and wore only a simple smock over their street clothes. Soon, special "bunny suits" were introduced to minimize contaminants and improve air purity. Today, workers wear suits of a non-linting, anti-static fabric with face masks, safety glasses, gloves, shoe coverings and even special breathing equipment. As a result, modern cleanrooms are 100 times cleaner than those of 25 years ago.

Intel’s rigorous quality-control efforts have paid off. In the mid-1980s, fewer than 50 percent of Intel’s chips were functional at the end of the manufacturing line. Today, the manufacturing process yields are dramatically higher.

Into the Future

In its first 30 years, Intel technology has enabled developments that were unimaginable a quarter century ago. We may be even more amazed at what emerges over the next 30 years. As microprocessors become faster and more powerful, an endless array of new applications will develop, and existing applications will spread to the far corners of the world. We will witness further integration of audio, video and conferencing capabilities with the Internet. At home, more and more people will be able to view and print family photos from their digital cameras, using intuitive photo editing software to remove red-eye, lighten dark backgrounds, and incorporate the photos into family newsletters and Web pages.

The increase in computing power will also be used to make computers easier to use. Voice and handwriting recognition, local control of complex Internet-based applications, and lifelike animation demand considerable computing power -- all available in the Intel microprocessor roadmap. Yu explains: "Our customers want us to design a microprocessor with millions of transistors that will be able to do 3-D animation on the desktop in real time. That’s a wonderful challenge! We plan to continue this incredible delivery of ever increasing performance at reasonable cost well into the next century."
