Of Microchips and Men: A Conversation About Intel

The first hundred employees of Intel pose for a photograph outside the company’s Mountain View, California, headquarters, in 1969. Photograph by Intel Corp./AP.

Before Silicon Valley was known for Google employees whisked to work in private shuttles, startups valued in the billions, and people walking around with optical displays strapped to their faces, it was known simply for silicon—the stuff used to make computer chips. Intel, perhaps the best-known chip manufacturer, was founded in 1968, in Mountain View, California.

The company and its engineers helped develop two key technologies: the integrated circuit and the microprocessor. Both of these relied on an earlier invention, the transistor, conceived in the nineteen-forties. The transistor facilitated the creation of “solid-state” electronics—electronic devices made from solid materials (as opposed to vacuum tubes) capable of conducting current. In the late nineteen-fifties, Robert Noyce, who would later co-found Intel, began experimenting with stringing transistors together on sheets of silicon to create integrated circuits—the structures also known as chips. Engineers made different kinds of chips for different purposes. In the early seventies, a team at Intel came up with one chip to suit several of these purposes, creating the first commercially available microprocessor.

I spoke to Michael S. Malone, a veteran technology reporter and the author of a history of the company, “The Intel Trinity,” released earlier this month, about Intel’s origins and its influence on science, finance, and culture. Here are some edited excerpts from our conversation.

Why is it important for the public to learn about Intel’s story?

I’m convinced that Intel is the world’s most important company not because of its size, not because of its most recent products, not because it’s the hottest company in the world right now—it was fifteen years ago, but not now. It’s because at the very core of the electronics revolution, at the heart of our modern lives, is the integrated-circuit chip, the microprocessor. And Intel is behind it all.

The two founders of Intel, Robert Noyce and Gordon Moore, were part of a group known as the Traitorous Eight. How was it that Intel was born out of mistrust and disappointment?

You begin with one of the greatest applied scientists of all time, but an incredibly difficult human being, William Shockley, who brings together the brightest young scientists in the country in Silicon Valley in the nineteen-fifties. He then just berates his employees. He doesn’t trust them, he makes them take lie-detector tests, on and on. And the eight of them just get up one day and quit. That’s the Traitorous Eight.

The eight men go found Fairchild Semiconductor, in 1957, a division of Fairchild Camera and Instrument, headquartered in New Jersey. But then Fairchild H.Q. betrays them. Fairchild Semiconductor has got the greatest accumulation of talent ever seen. Even now in Silicon Valley there’s nothing like the early days of Fairchild. But Fairchild Camera and Instrument won’t give stock options to these people, who are making them millions and millions of dollars. In the end, everyone in California is feeling pretty betrayed by this old-line East Coast company. Finally, Noyce and Moore can’t take it anymore. They walk out and start Intel.

At its start, in 1968, Intel raised $2.5 million in two days. How did the company attract so much interest so quickly?

Bob Noyce and Gordon Moore. They had the best reputations in technology, and when they walked out of Fairchild, everyone knew they were going to start a new company, and everybody wanted to be in on that game. Arthur Rock, a banker who helped raise the money, said that it took two days back then, but in today’s world of e-mail and text messaging he could have done it in two hours.

You say that you consider the microprocessor the most important product in the modern world. Why is that?

Because now you can bring intelligence down to the device level, to the chip level, and once you can do that you can stick it in anything. It’s in our phones, it’s in our cars, it’s in our appliances, kids’ toys, medical devices—just about anywhere you look, there are microprocessors. And they’re getting more and more powerful, as well as smaller and smaller.

You write that Moore’s Law—the idea that the number of transistors on a chip will double every two years—isn’t so much a law as a compact between Intel and the public. What do you mean by that?

When I say Moore’s Law isn’t a law, I mean that there are laws in technology, but Moore’s Law isn’t one. It’s essentially an observation that has been accepted as a code of conduct. It’s a social contract made between the semiconductor industry, especially Intel, and the rest of the world: “We are going to keep pushing the technology forward.” Engineers have dedicated their careers to doubling computing performance every couple of years, knowing that they will eventually hit a wall imposed by physics. And that’s what Intel’s done for almost fifty years now, and what they’ll keep doing for probably the rest of our lives. But if, tomorrow morning, Intel and others decided, “We don’t want to do this anymore,” Moore’s Law would end, and so would the rate of technological change we’ve all grown accustomed to.
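To make that doubling concrete, here is a minimal back-of-the-envelope sketch in Python (my illustration, not Malone’s math). It assumes a clean two-year doubling starting from the roughly 2,300 transistors of Intel’s 4004, the first commercial microprocessor, released in 1971; real chips have tracked this curve only approximately.

```python
# Idealized Moore's Law projection: transistor count doubles every two years.
# Starting point is the Intel 4004 (about 2,300 transistors, 1971); all later
# figures are hypothetical extrapolations, not actual product counts.

def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in range(1971, 2015, 10):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Run as written, the extrapolation lands in the low billions by 2011, which is roughly the order of magnitude of processors actually shipping then; the point is how relentlessly the exponential compounds, not the exact figures.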

What impact did Robert Noyce have on Steve Jobs?

I’ve only met four people in Silicon Valley history who, when they walked into a room, every head turned: David Packard, William Hewlett, Bob Noyce, and Steve Jobs. Steve and I were neighbors as kids, and he was kind of a lost soul. He was put above everybody at such an early age with his hot-shot company, Apple, but in Noyce he found someone he would always look up to, no matter how far he himself got—because Bob Noyce not only built Fairchild and Intel but also invented the integrated circuit.* So he had the best credentials imaginable. And he was also hugely charismatic. I think Steve Jobs wanted to grow up to be Bob Noyce. And Noyce’s death shattered Jobs more than just about anyone; it deeply affected him.

Since Intel’s initial public offering, in 1971, we’ve seen a few dramatic episodes in the world of tech financing. Can you explain the changes since then in how startup companies raise money?

Early on—I’m talking fifties, sixties, seventies—[to go public] you had to have an established company with a line of products, and you had to have around ten million dollars in revenues, which is like a hundred million now. You had to be a real company. And that continued up until the mid-nineties. Then you get into the dot-com bubble. All these new startups really had was an idea and a couple of founders. But there was so much money flowing into venture-capital funds that the V.C.s were giving money away. You would go in and ask for five million and they would say, “You know what? Take fifteen, for marketing.” It was just completely out of control. It was supercharged by the fact that these companies were going public with basically no revenues and a handful of employees, and they were getting billion-dollar valuations on the stock market.

The most interesting phenomenon of the last three or four years is that big, successful Valley companies like Facebook and Google and Apple are so flush with cash that the game is now, you build yourself to a certain size and look to be bought. Look at Mark Zuckerberg. He buys Instagram and then he buys WhatsApp. He spends nineteen billion dollars for WhatsApp. That’s a mind-boggling number for a startup. For the first time, acquisitions are more appealing than I.P.O.s. So we are going into this interesting era where maybe companies will choose not to go public anymore, which was always the big-money exit strategy, and instead go do a fan dance in front of Mark Zuckerberg in hopes of getting these insane valuations.

What’s your take on the worldly ambitions of the new tech companies?

I’m a little bothered by the hypocrisy exhibited by the new generation of Silicon Valley leaders. They’re code writers, and software is different from hardware. With software people, there is this big, romantic philosophy—“Don’t be evil”—yet it’s always combined with a sort of duplicity. These guys who are running the social-networking era, they’re really behaving like oligarchs: “You know the reason we’re successful is that we’re special. We’re smarter than other people.” You didn’t see that in the early generation of Silicon Valley leaders. They were the children of blue-collar working families. They worked with their hands. So they didn’t try to be your whole world. They didn’t build a campus for you to live on twenty-four hours a day, like in a dorm. They expected you to go home to your family. They had an admiration for working people. You just don’t see that right now with the social-networking guys. Average folks in the Valley, especially poor people, have a really strong sense that these guys don’t care about them. And I think it manifests itself in all sorts of ways, like working with the N.S.A., and the perpetual effort to monetize our private information. It’s a very different world.

*Note: The Institute of Electrical and Electronics Engineers, among other organizations, credits the development of the first working integrated circuit to Jack S. Kilby, an employee of Texas Instruments, who demonstrated his prototype in late 1958, shortly before Noyce independently conceived his.