
Moore’s Law

by Arnold Thackray


Introduction

In the pantheon of technology pioneers who shaped the modern digital world, a few names stand out - Bill Gates, Steve Jobs, Alan Turing. But there's one figure who deserves to be at the very top of that list, even though he's far less of a household name: Gordon Moore.

As the co-founder of both Fairchild Semiconductor and Intel, Gordon Moore played an instrumental role in two of the most influential technology companies of the 20th century. But his most profound impact came from something known as Moore's Law - an uncannily accurate prediction made in the 1960s about the exponential growth of computing power that has shaped the entire trajectory of the digital revolution.

In "Moore's Law: The Life of Gordon Moore, Silicon Valley's Quiet Revolutionary," author Arnold Thackray takes us on a journey through the life and career of this unassuming yet enormously consequential figure. From his early passion for chemistry to his groundbreaking work in semiconductors and microprocessors, we see how Moore's analytical mind and visionary insights drove the relentless progress of computer technology over decades.

This is the story of how a quiet, reserved man became one of the primary architects of our modern technological world. It's a tale of scientific breakthroughs, business acumen, and a prescient understanding of where technology was headed. Through Moore's eyes, we witness the birth and explosive growth of Silicon Valley and the digital age.

A Young Chemist's Passion

Gordon Moore's journey to becoming a technology pioneer began with an early love of science, particularly chemistry. Born in San Francisco in 1929, Moore was a reserved but exceptionally focused child. His life changed forever in 1940 at the age of 11, when his best friend received a chemistry set. The two boys began experimenting, with a particular fondness for making explosives and blowing things up.

This hands-on approach to science captivated young Gordon. He found that chemistry suited his analytical mind better than pure mathematics, as he could see its tangible effects on the physical world. By the time he reached high school, Moore was far ahead of his classmates in chemistry knowledge and displayed remarkable confidence in his ideas.

Moore's passion for chemistry - and explosions - only grew as he got older. He conducted experiments with nitroglycerine at home and even started making firecrackers for his friends to use in neighborhood pranks. This early experimentation, while certainly risky, helped cultivate Moore's love of scientific inquiry and his willingness to push boundaries.

It was during his high school years that Moore also experienced another kind of chemistry - the romantic variety. In 1947, he met Betty Irene Whitaker, a vivacious journalism student who was in many ways Moore's opposite. While he was quiet and reserved, she was outgoing and headstrong. This contrast intrigued Moore, and Betty in turn was drawn to his quiet confidence. It was the beginning of a lifelong partnership.

Higher Education and New Horizons

Moore's talent for chemistry opened doors to some of the best universities in the country. In 1948, he was accepted to the University of California, Berkeley, aided by glowing recommendation letters from his professors at San Jose State.

Berkeley in the late 1940s was an exciting place for a young chemist. While the East Coast had long been the center of academic science in America, California's booming economy was shifting the balance westward. Moore had the opportunity to work with several influential figures in the field, including two professors who would go on to win Nobel Prizes.

One of Moore's most influential mentors at Berkeley was George Jura, an assistant professor in the physical chemistry department. Jura instilled in his students a healthy skepticism toward established scientific literature, urging them to challenge existing ideas through original research. This approach resonated with Moore's experimental nature and helped shape his future scientific endeavors.

Moore's talent and work ethic quickly gained notice. In 1950, he was accepted to the prestigious California Institute of Technology (Caltech) for graduate studies. By this time, his relationship with Betty Whitaker had deepened, and he invited her to join him in this new chapter of his life - an invitation that was essentially equivalent to a marriage proposal in those days. The couple married and embarked on their life together.

At Caltech, Moore found himself in the midst of a technological boom. The aerospace industry was rapidly advancing, relying heavily on electronics and early computer systems for complex calculations. It was a thrilling environment for a young scientist passionate about pushing the boundaries of what was possible.

Under the guidance of Professor Richard McLean Badger, Moore's work in experimental chemistry flourished. He conducted research on nitrogen compounds, work that was supported by the military due to its applications in explosives being used in the Korean War. Moore's earlier experiments with nitroglycerine made him well-suited for this research.

In 1951, at just 22 years old, Moore published his first scientific paper, on nitrous acid, in the Journal of Chemical Physics. He completed his Ph.D. in 1954, a remarkably rapid pace for graduate study. This swift progress demonstrated Moore's exceptional intellect and drive.

Entering the World of Industry

After completing his Ph.D., Moore initially hoped to secure a professorship at a prestigious university. However, unable to find a position that met his high standards, he began to consider other options. His mentor, Professor Badger, encouraged him to look into industry work, where there was high demand for his skills.

Moore was particular about finding a company that would give him the freedom to experiment independently. He eventually decided on the Applied Physics Laboratory, a Navy-funded research center at Johns Hopkins University. This decision meant leaving California, but Gordon and Betty felt ready for a change. They bought a Buick and set off across the country to start their new life.

However, Moore's time at the Applied Physics Laboratory would be short-lived. In 1955, a lecture by William Shockley, one of the inventors of the transistor at Bell Labs, caught Moore's attention. Shockley was working on developing a new kind of transistor using silicon as a semiconductor, and he needed a talented young chemist to assist him in his California laboratory.

Recognizing Moore's potential, Shockley offered him a job at the Shockley Semiconductor Laboratory. The opportunity was too good to pass up, so Gordon and Betty packed up their trusty Buick once again and headed back to California.

The Birth of the Silicon Transistor

Moore's work at Shockley Semiconductor Laboratory put him at the forefront of a technological revolution. The transistor, invented in 1947 at Bell Labs, was beginning to show its potential to change the world. Transistors could amplify and switch signals on and off like vacuum tubes, but they were smaller, more durable, and required less power.

By 1955, over half a million transistors were being produced monthly in the US alone. The transistor radio was becoming ubiquitous, and it was clear that this technology would enable a wide array of new devices.

Under Shockley's guidance, Moore and his colleagues worked towards creating a new kind of transistor that used silicon as a semiconductor. This was cutting-edge research, and the potential applications were enormous. Shortly after Moore joined the team, Shockley received the Nobel Prize for his work on the transistor, further validating the importance of their work.

However, despite initial progress, tensions began to mount at Shockley Semiconductor. After 18 months of work, the team was still far from achieving their goal of a silicon transistor. Shockley's partner, Arnold Beckman, was growing anxious about the lack of concrete results.

As pressure increased and conflicts arose, Moore and seven of his colleagues decided to leave the company and strike out on their own. This group became known as the "traitorous eight," with Moore as their de facto leader.

Fairchild Semiconductor and the Race for the Silicon Transistor

Moore and his colleagues' decision to leave Shockley Semiconductor marked the beginning of a new chapter in the history of computing. The group quickly found an investor in Sherman Fairchild, IBM's largest individual shareholder, and Fairchild Semiconductor was born.

The timing couldn't have been better. The same week that Fairchild Semiconductor was founded, the Soviet Union launched Sputnik, the world's first artificial satellite. Moore recognized that this event would create an even greater demand for the kind of fast-switching silicon transistor they were trying to develop.

The race was on. While Texas Instruments had already produced a small batch of silicon transistors for military use in 1954, these were slow-switching. The market was crying out for a fast-switching silicon transistor, particularly for use in advanced military applications like the B-70 Valkyrie bomber.

Moore and his team at Fairchild Semiconductor worked tirelessly to solve the remaining problems with their transistor design. They knew they were competing against well-funded giants like Texas Instruments and Bell Labs, so speed was of the essence.

In a remarkable achievement, just one year after founding the company, Fairchild Semiconductor succeeded in developing the world's first fast-switching silicon transistor. They called it the 2N696, and it was exactly what the rapidly advancing computer industry needed.

The team had been so focused on the technical challenges that they hadn't given much thought to how they would actually deliver their groundbreaking product to IBM. In a moment of improvisation that speaks to the scrappy nature of early Silicon Valley, Moore went to the grocery store and bought the nicest container he could find - a Brillo box - to ship their revolutionary transistors.

The Birth of Moore's Law

As silicon transistors became the industry standard, their applications multiplied rapidly. More and more devices - computers, radios, televisions - were produced using transistors instead of vacuum tubes. But perhaps the most significant development was the emergence of the microchip.

In the late 1950s, the prevailing wisdom was that it would be too expensive to put multiple components on a single chip. Most people assumed it would always be cheaper to wire individual components together. Moore, however, saw things differently.

Recognizing the potential of integrated circuits, Moore started Micrologic, a division of Fairchild dedicated to their development. He understood that integrated circuits would lead to a new generation of smaller and more complex microchips. His foresight was validated when NASA selected Fairchild's microchips for use in the onboard guidance computer of the Apollo program.

Moore's unique position gave him insight into the rapidly growing world of transistor-driven electronics, and he began to see patterns emerging. In April 1965, Electronics magazine published his article "Cramming More Components onto Integrated Circuits" (drafted under the title "The Future of Integrated Electronics"), which contained some startling predictions.

Moore observed that the complexity of microchips, measured by the number of components they contained, had been doubling every year since their invention. He predicted that this trend would continue, while the manufacturing cost would halve each year.

This exponential growth, Moore believed, would lead to an explosion of computational power. His prediction, which became known as Moore's Law, seemed outlandish at the time. He projected that by 1975, there would be 65,000 transistors on a single microchip - a number that seemed like science fiction when the most advanced chips of 1965 contained only 64 transistors.
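The arithmetic behind that projection is simple compounding. A minimal sketch (not from the book) of the doubling calculation, assuming 64 components in 1965 and one doubling per year:

```python
def projected_components(start_count: int, start_year: int, target_year: int) -> int:
    """Project a chip's component count under Moore's assumption of annual doubling."""
    doublings = target_year - start_year
    return start_count * 2 ** doublings

# Ten doublings between 1965 and 1975: 64 * 2**10 = 65,536,
# the roughly 65,000 figure Moore cited.
print(projected_components(64, 1965, 1975))
```

The same ten-year horizon that turns 64 into 65,536 is what made the prediction look like science fiction: exponential growth is modest year to year but staggering over a decade.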

Moore's Law turned out to be remarkably accurate, and it has continued to hold true (with some modifications) for decades. It became a driving force in the computer industry, setting the pace for innovation and progress.

The Founding of Intel

In 1968, Moore embarked on a new venture that would change the face of computing forever. Along with his business-savvy partner Bob Noyce, Moore founded Intel. The pair recognized that they couldn't compete with giants like Motorola and Bell Labs in established markets, so they needed to find a fresh niche.

Moore noticed that the expansion of computers and calculators was creating increased demand for greater memory-processing capabilities. At the time, data was still stored on punch cards, which were cumbersome and time-consuming to use. Moore saw an opportunity to revolutionize data storage.

To bring their vision to life, Moore and Noyce recruited Joel Karp, a talented microchip designer from General Microelectronics in Santa Clara. Karp went on to create Intel's 1101 memory microchip, which could hold 256 bits of data. This innovation propelled Intel to the forefront of the emerging memory market.

Intel's success in the memory market was further cemented when a Berkeley electrical engineer named Dov Frohman approached Moore with an innovative idea for a new kind of memory chip. Unlike earlier chips that could only store data when powered on, or those with fixed, physically printed memories, Frohman's design could retain data even when powered off and was reprogrammable.

Moore immediately recognized the potential of this technology and rushed it to market. These new microchips, called EPROMs (Erasable Programmable Read-Only Memory), were a massive success. Between 1972 and 1985, EPROM chip sales became Intel's primary source of revenue.

Learning from Mistakes: The Electronic Watch Debacle

Despite Intel's success in the memory chip market, Moore was always on the lookout for new opportunities to apply microchip technology. In the early 1970s, his attention was drawn to the emerging electronic watch market.

The world's first fully electronic wristwatch, the Hamilton Pulsar, was introduced in 1972 with a hefty price tag of $2,100 (equivalent to about $12,000 today). Moore saw potential in this market and began searching for partners to help Intel enter it.

Intel invested heavily in Microma, a small company that was just about to start shipping LCD wristwatches. However, this venture quickly ran into trouble. Microma's watches faced numerous technical problems, and competition from Texas Instruments rapidly drove down prices in the market. Microma simply couldn't keep up.

This experience taught Moore a valuable lesson about the differences between the semiconductor industry and consumer products. Success in consumer goods required more than just technical knowledge about microchips; it demanded a deep understanding of consumer preferences, marketing, and retail dynamics.

Moore maintained a sense of humor about the loss, jokingly referring to his own Microma watch (which he continued to wear long after the company's failure) as his "15 million dollar watch." This expensive lesson would inform his decision-making in the future, particularly when it came to potential forays into new markets.

The Personal Computer Revolution

As Intel was licking its wounds from the electronic watch debacle, a new market was emerging that would prove far more significant: personal computers. Enthusiasts across the country had begun building their own home computers, and in 1975, a small company called MITS sold the first "personal minicomputer" kit, the Altair 8800.

Bob Noyce, Moore's partner at Intel, saw an opportunity. He suggested that Intel could enter the personal computer industry by producing complete computer kits along with each new chip they developed.

Moore, still stinging from the Microma failure, initially rejected the idea outright. "We are not in the computer business," he declared. "We build computer development systems." This caution about entering new markets beyond their core competency of chip manufacturing would shape Intel's strategy in the coming years.

Despite Moore's reluctance to enter the personal computer market directly, Intel would play a crucial role in its development through its microprocessors. The company's focus on advancing chip technology would put it at the heart of the PC revolution.

The Rise of the Microprocessor

By the mid-1970s, Intel had become a hugely profitable company, with annual revenues exceeding half a billion dollars. However, the landscape was changing. Japanese manufacturers began producing memory chips more quickly and at a lower cost, putting pressure on Intel's profits in that sector.

Moore, with his characteristic foresight, recognized where the future lay: microprocessors. These versatile chips allowed customers to create their own software programs and store them in EPROM chips. Moore understood that microprocessors had the potential to become a universal part of computing.

The market for home computers was expanding at a phenomenal rate. In 1978, Apple sold 25,000 units of their Apple II computer. By 1980, worldwide PC sales had reached 750,000 units - and microprocessors sat at the heart of every one of them.

To maintain their competitive edge, Moore knew Intel would have to invest heavily in research and development. He committed $100 million to the development of Intel's next game-changer, the 386 microprocessor. With a total of 275,000 transistors, it offered unprecedented computational power for its time.

Intel's dominance in the microprocessor market was further cemented by its alignment with Microsoft. In 1986, Compaq released the Deskpro 386, a PC that ran Microsoft's software on Intel's new microprocessor. This pairing would dominate the PC world for the next quarter-century.

Intel's Market Dominance

As computers rapidly changed the world through the 1980s and 1990s, Intel's position at the heart of this revolution became increasingly secure. By 1990, the average US citizen was spending 40 percent of their time looking at screens, whether watching TV, playing video games, or using a computer. And the computers among those devices increasingly ran on Intel microprocessors.

Intel's early move to secure a dominant position in the microprocessor market in the 1980s paid off handsomely. Between 1981 and 1987, consumers spent billions of dollars on software and hardware for personal computers, much of it for machines built around Intel microprocessors. This success allowed Intel to exit the memory market entirely and focus solely on microprocessors.

Intel's dominance was partly due to the increasingly high costs associated with developing and manufacturing ever more complex microprocessors. The enormous budgets and sophisticated facilities required created a significant barrier to entry for potential competitors. By the mid-1990s, Intel had captured over 80 percent of the market share in PC microprocessors, a position it has largely maintained ever since.

The company's success was staggering. In 2000, the year before Moore retired from Intel's board at the age of 72, Intel's revenue reached $33.7 billion. Moore's Law had driven relentless progress in computing power, and Intel had ridden that wave to become one of the most valuable technology companies in the world.

Moore's Philanthropic Turn

As Moore became less involved in the day-to-day operations of Intel, he turned his attention increasingly to philanthropy. He made substantial contributions to his alma mater, Caltech, as well as other educational institutions. The Gordon and Betty Moore Foundation was established, joining the ranks of billionaires like Bill Gates and Warren Buffett who had committed to giving away large portions of their wealth.

Moore's philanthropic efforts were so significant that in 2005, Forbes magazine named him the year's most charitable person. His giving focused particularly on scientific research and education, reflecting his lifelong passion for advancing human knowledge and technological capabilities.

The Future of Moore's Law

As Moore's career in the tech industry wound down, questions began to arise about the future of Moore's Law. The prediction that had guided the computer industry for decades was approaching its physical limits, as microprocessor technology neared the level of individual atoms.

While Moore's Law in its original form may be reaching its end, its spirit of relentless technological progress continues to drive innovation in the tech industry. New frontiers like quantum computing, artificial intelligence, and nanotechnology are opening up, promising to carry forward the exponential growth in computing power that Moore first observed.

The challenge for the next generation of innovators will be to find the next "Moore's Law" - the next paradigm that will drive technological progress for decades to come. Who will be the next Gordon Moore, quietly revolutionizing the world through scientific insight and technological innovation?

Conclusion

Gordon Moore's life story is a testament to the power of following one's passions and talents. From his early love of chemistry and explosions to his groundbreaking work in semiconductors and microchips, Moore consistently pushed the boundaries of what was possible in technology.

His work laid the foundation for the personal computer revolution and the digital age we now live in. Moore's Law, his prescient observation about the exponential growth of computing power, became a self-fulfilling prophecy that drove decades of innovation in the tech industry.

But perhaps what's most remarkable about Moore is the contrast between his outsized impact on the world and his quiet, unassuming personality. Unlike many of Silicon Valley's more flamboyant figures, Moore preferred to let his work speak for itself. His analytical mind and visionary insights drove progress, while his humility and ethical approach to business set a standard for corporate leadership.

As we look to the future of technology, the example of Gordon Moore remains relevant. His ability to see long-term trends, his commitment to continuous innovation, and his understanding of the transformative power of technology are qualities that will be needed to tackle the challenges of the 21st century and beyond.

The story of Gordon Moore is not just the story of one man or one company, but of the digital revolution itself. It's a reminder of how a single insight, combined with dedication and hard work, can change the world in ways that even its originator might never have imagined. As we continue to push the boundaries of what's possible with technology, we would do well to remember the quiet revolutionary who helped set us on this path.