
MICROCHIPS BLAST OFF

The first major market for microchips was the military. In 1962 the Strategic Air Command designed a new land-based missile, the Minuteman II; each one would require two thousand microchips just for its onboard guidance system. Texas Instruments won the right to be the primary supplier. By 1965 seven Minutemen were being built each week, and the Navy was also buying microchips for its submarine-launched missile, the Polaris. With a coordinated astuteness not often found among military procurement bureaucracies, the designs of the microchips were standardized. Westinghouse and RCA began supplying them as well. So the price soon plummeted, until microchips were cost-effective for consumer products and not just missiles.

Fairchild also sold chips to weapons makers, but it was more cautious than its competitors about working with the military. In the traditional military relationship, a contractor worked hand in glove with uniformed officers, who not only managed procurement but also dictated and fiddled with design. Noyce believed such partnerships stifled innovation: “The direction of the research was being determined by people less competent in seeing where it ought to go.”19 He insisted that Fairchild fund the development of its chips using its own money so that it kept control of the process. If the product was good, he believed, military contractors would buy it. And they did.

America’s civilian space program was the next big booster for microchip production. In May 1961 President John F. Kennedy declared, “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” The Apollo program, as it became known, needed a guidance computer that could fit into a nose cone. So it was designed from scratch to use the most powerful microchips that could be made. The seventy-five Apollo Guidance Computers that were built ended up containing five thousand microchips apiece, all identical, and Fairchild landed the contract to supply them. The program beat Kennedy’s deadline by just a few months; in July 1969 Neil Armstrong set foot on the moon. By that time the Apollo program had bought more than a million microchips.

These massive and predictable sources of demand from the government caused the price of each microchip to fall rapidly. The first prototype chip for the Apollo Guidance Computer cost $1,000. By the time they were being put into regular production, each cost $20. The average price for each microchip in the Minuteman missile was $50 in 1962; by 1968 it was $2. Thus was launched the market for putting microchips in devices for ordinary consumers.20

The first consumer devices to use microchips were hearing aids because they needed to be very small and would sell even if they were rather expensive. But the demand for them was limited. So Pat Haggerty, the president of Texas Instruments, repeated a gambit that had served him in the past. One aspect of innovation is inventing new devices; another is inventing popular ways to use these devices. Haggerty and his company were good at both. Eleven years after he had created a huge market for inexpensive transistors by pushing pocket radios, he looked for a way to do the same for microchips. The idea he hit upon was pocket calculators.

On a plane ride with Jack Kilby, Haggerty sketched out his idea and handed Kilby his marching orders: Build a handheld calculator that can do the same tasks as the thousand-dollar clunkers that sit on office desks. Make it efficient enough to run on batteries, small enough to put into a shirt pocket, and cheap enough to buy on impulse. In 1967 Kilby and his team produced almost what Haggerty envisioned. It could do only four tasks (add, subtract, multiply, and divide) and was a bit heavy (more than two pounds) and not very cheap ($150).21 But it was a huge success. A new market had been created for a device people had not known they needed. And following the inevitable trajectory, it kept getting smaller, more powerful, and cheaper. By 1972 the price of a pocket calculator had dropped to $100, and 5 million units were sold. By 1975 the price was down to $25, and sales were doubling every year. In 2014 a Texas Instruments pocket calculator cost $3.62 at Walmart.

MOORE’S LAW

That became the pattern for electronic devices. Every year things got smaller, cheaper, faster, more powerful. This was especially true—and important—because two industries were growing up simultaneously, and they were intertwined: the computer and the microchip. “The synergy between a new component and a new application generated an explosive growth for both,” Noyce later wrote.22 The same synergy had happened a half century earlier when the oil industry grew in tandem with the auto industry. There was a key lesson for innovation: Understand which industries are symbiotic so that you can capitalize on how they will spur each other on.

If someone could provide a pithy and accurate rule for predicting the trend lines, it would help entrepreneurs and venture capitalists to apply this lesson. Fortunately, Gordon Moore stepped forward at that moment to do so. Just as microchip sales were starting to skyrocket, he was asked to forecast the future market. His paper, titled “Cramming More Components onto Integrated Circuits,” was published in the April 1965 issue of Electronics magazine.

Moore began with a glimpse of the digital future. “Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment,” he wrote. Then he produced an even more prescient prediction that was destined to make him famous. “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year,” he noted. “There is no reason to believe it will not remain nearly constant for at least ten years.”23

Roughly translated, he was saying that the number of transistors that could be crammed, cost-effectively, onto a microchip had been doubling every year, and he expected it to do so for at least the next ten years. One of his friends, a professor at Caltech, publicly dubbed this “Moore’s Law.” In 1975, when the ten years had passed, Moore was proved right. He then modified his law by cutting the predicted rate of increase by half, prophesying that the future numbers of transistors crammed onto a chip would show “a doubling every two years, rather than every year.” A colleague, David House, offered a further modification, now sometimes used, which said chip “performance” would double every eighteen months because of the increased power as well as the increased numbers of transistors that would be put onto a microchip. Moore’s formulation and its variations proved to be useful at least through the subsequent half century, helping to chart the course for one of the greatest bursts of innovation and wealth creation in human history.
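To make the arithmetic of these competing doubling periods concrete, here is a minimal sketch; it is not drawn from Moore’s paper or Isaacson’s text, and the starting count of 64 components and the helper name moores_law are purely illustrative assumptions. It simply evaluates the exponential rule N(t) = N0 · 2^(t/T) over a ten-year span for each of the three doubling periods mentioned above.

```python
# Illustrative only -- not from the book. A count that doubles every T years
# grows as N(t) = n0 * 2**(t / T).

def moores_law(n0: float, years: float, doubling_period_years: float) -> float:
    """Project a starting count forward, doubling every `doubling_period_years`."""
    return n0 * 2 ** (years / doubling_period_years)

# Moore's original 1965 observation: doubling every year.
print(moores_law(64, 10, 1.0))   # 65536.0 -- roughly a thousandfold in a decade
# His 1975 revision: doubling every two years.
print(moores_law(64, 10, 2.0))   # 2048.0  -- about a thirtyfold increase
# David House's variant: "performance" doubling every eighteen months.
print(moores_law(64, 10, 1.5))   # ~6502
```

Over ten years, the one-year rule compounds to roughly a thousandfold increase while the two-year rule yields only about thirtyfold, which is why the choice of doubling period mattered so much to anyone forecasting the industry.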

Moore’s Law became more than just a prediction. It was also a goal for the industry, which made it partly self-fulfilling. The first such example occurred in 1964, as Moore was formulating his law. Noyce decided that Fairchild would sell its simplest microchips for less than they cost to make. Moore called the strategy “Bob’s unheralded contribution to the semiconductor industry.” Noyce knew that the low price would cause device makers to incorporate microchips into their new products. He also knew that the low price would stimulate demand, high-volume production, and economies of scale, which would turn Moore’s Law into a reality.24

Fairchild Camera and Instrument decided, not surprisingly, to exercise its right to buy out Fairchild Semiconductor in 1959. That made the eight founders rich but sowed seeds of discord. The corporation’s East Coast executives refused to give Noyce the right to hand out stock options to new and valued engineers, and they sucked up the semiconductor division profits to fund less successful investments in more mundane realms, such as home movie cameras and stamp machines.

There were also internal problems in Palo Alto. Engineers began defecting, thus seeding the valley with what became known as Fairchildren: companies that sprouted from spores emanating from Fairchild. The most notable came in 1961, when Jean Hoerni and three others of the eight who had defected from Shockley left Fairchild to join a startup, funded by Arthur Rock, that became Teledyne. Others followed, and by 1968 Noyce himself was ready to leave. He had been passed over for the top corporate job at Fairchild, which ticked him off, but he also realized that he did not really want it. Fairchild, the corporation as a whole and even the semiconductor division in Palo Alto, had become too big and bureaucratic. Noyce yearned to shed some managerial duties and return to being close to the lab.

“How about starting a new company?” he asked Moore one day.

“I like it here,” Moore replied.25 They had helped to create the culture of the California tech world, in which people left established companies to form new ones. But now, as they were both hitting forty, Moore no longer had the urge to jump off the roof in a hang glider. Noyce kept pressing. Finally, as the summer of 1968 approached, he simply told Moore he was leaving. “He had a way of making you want to take a leap with him,” Moore said many years later, laughing. “So finally I said, ‘Okay, let’s go.’ ”26

“As [the company] has grown larger and larger, I have enjoyed my daily work less and less,” Noyce wrote in his letter of resignation to Sherman Fairchild. “Perhaps this is partly because I grew up in a small town, enjoying all the personal relationships of a small town. Now we employ twice the total population of my largest ‘home town.’ ” His desire, he said, was to “get close to advanced technology again.”27

When Noyce called Arthur Rock, who had put together the financing deal that launched Fairchild Semiconductor, Rock immediately asked, “What took you so long?”28

ARTHUR ROCK AND VENTURE CAPITAL

In the eleven years since he had assembled the deal for the traitorous eight to form Fairchild Semiconductor, Arthur Rock had helped to build something that was destined to be almost as important to the digital age as the microchip: venture capital.

For much of the twentieth century, venture capital and private equity investing in new companies had been mainly the purview of a few wealthy families, such as the Vanderbilts, Rockefellers, Whitneys, Phippses, and Warburgs. After World War II, many of these clans set up firms to institutionalize the business. John Hay “Jock” Whitney, an heir to multiple family fortunes, hired Benno Schmidt Sr. to form J. H. Whitney & Co., which specialized in what they originally called “adventure capital” to fund entrepreneurs with interesting ideas who could not get bank loans. The six sons and one daughter of John D. Rockefeller Jr., led by Laurence Rockefeller, started a similar firm, which eventually became Venrock Associates. The most influential entry came in 1946, and it was based on business acumen rather than family wealth: the American Research and Development Corporation (ARDC). It was founded by Georges Doriot, a former dean of the Harvard Business School, in partnership with a former MIT president, Karl Compton. ARDC scored big with a startup investment in Digital Equipment Corporation in 1957, a stake that was worth five hundred times as much when the company went public eleven years later.29

Arthur Rock took this concept west, ushering in the silicon age of venture capital. When he put together Noyce’s traitorous eight with Fairchild Camera, Rock and his company took a stake in the deal. After that, he realized that he could raise a fund of money and do similar deals without relying on one corporate patron. He had a background in business research, a love of technology, an intuitive feel for business leadership, and a lot of East Coast investors he had made happy. “The money was on the East Coast but the exciting companies were in California, so I decided to move west knowing that I could connect the two,” he said.30

Rock grew up the son of Russian Jewish immigrants in Rochester, New York, where he worked as a soda jerk in his father’s candy store and developed a good feel for personalities. One of his key investment maxims was to bet primarily on the people rather than the idea. In addition to going over business plans, he conducted incisive personal interviews with those who sought funding. “I believe so strongly in people that I think talking to the individual is much more important than finding out too much about what they want to do,” he explained. On the surface, he wore the cloak of the curmudgeon, with a gruff and taciturn style. But those who looked at his face closely enough could tell from the light in his eyes and the hints of a smile that he enjoyed people and had a warm sense of humor.

When he got to San Francisco, he was introduced to Tommy Davis, a talkative deal maker who was investing the money of the Kern County Land Co., a cattle and oil empire flush with cash. They went into business together as Davis and Rock, raised $5 million from Rock’s East Coast investors (as well as some of the Fairchild founders), and started funding new companies in return for a chunk of the equity. Stanford’s provost Fred Terman, still seeking to build his university’s ties to the growing tech boom, encouraged his engineering professors to spend time advising Rock, who took a night course in electronics at the university. Two of his first bets were on Teledyne and Scientific Data Systems, which both paid off handsomely. By the time Noyce called him about finding an exit strategy from Fairchild in 1968, Rock’s partnership with Davis had amiably dissolved (their investments had shot up thirtyfold in seven years) and he was on his own.

“If I wanted to start a company,” Noyce asked, “could you find me the money?” Rock assured him it would be easy. What could better fit his theory that you place your money on the jockeys—that you invest based on your assessment of the people running the company—than an enterprise that would be led by Robert Noyce and Gordon Moore? He barely asked what they were going to make, and at first he didn’t even think they needed to do a business plan or description. “It was the only investment that I’ve ever made that I was 100 percent sure would succeed,” he later claimed.31

When he had sought a home for the traitorous eight in 1957, he pulled out a single piece of legal-pad paper, wrote a numbered list of names, and methodically phoned each one, crossing off names as he went down the list. Now, eleven years later, he took another sheet of paper and listed people who would be invited to invest and how many of the 500,000 shares available at $5 apiece he would offer to each. This time around, he would cross out only one name. (“Johnson at Fidelity” didn’t come in.) Rock needed a second sheet to revise the allocations because most people wanted to invest more than he offered them. It took him less than two days to raise the money. The lucky investors included Rock himself, Noyce, Moore, Grinnell College (Noyce wanted to make it rich, and he did), Laurence Rockefeller, Rock’s Harvard classmate Fayez Sarofim, Max Palevsky of Scientific Data Systems, and Rock’s old investment firm, Hayden, Stone. Most notably, the other six members of the traitorous eight, many of them now working at firms that would have to compete with this new one, were given a chance to invest. All did.

Just in case someone desired a prospectus, Rock himself typed up a three-and-a-half-page sketch of the proposed company. It opened by describing Noyce and Moore and then gave a perfunctory three-sentence overview of the “transistor technologies” the company would develop. “Lawyers later screwed up venture investing by forcing us to write prospectus books that were so long and complex and carefully vetted that it’s a joke,” Rock complained later, pulling the pages out of his file cabinet. “All I had to tell people was that it was Noyce and Moore. They didn’t need to know much else.”32

The first name that Noyce and Moore chose for their new company was NM Electronics, their initials. That was not very exciting. After many clunky suggestions—Electronic Solid State Computer Technology Corp. was one—they finally decided on Integrated Electronics Corp. That wasn’t very thrilling, either, but it had the virtue that it could be abridged—as Intel. That had a nice ring to it. It was smart and knowing, in many different ways.

THE INTEL WAY

Innovations come in a variety of guises. Most of those featured in this book are physical devices, such as the computer and the transistor, and related processes, such as programming, software, and networking. Also important are the innovations that produce new services, such as venture capital, and those that create organizational structures for research and development, such as Bell Labs. But this section is about a different type of creation. There arose at Intel an innovation that had almost as much of an impact on the digital age as any of these. It was the invention of a corporate culture and management style that was the antithesis of the hierarchical organization of East Coast companies.

The roots of this style, like much of what happened in Silicon Valley, were at Hewlett-Packard. During World War II, while Bill Hewlett was in the military, Dave Packard slept on a cot at the office many nights and managed three shifts of workers, many of them women. He realized, partly out of necessity, that it helped to give his workers flexible hours and plenty of leeway in determining how to accomplish their objectives. The management hierarchy was flattened. During the 1950s this approach merged with the casual lifestyle of California to create a culture that included Friday beer bashes, flexible hours, and stock options.33

Robert Noyce took this culture to the next level. To understand him as a manager, it’s useful to recall that he was born and bred a Congregationalist. His father and both grandfathers were ministers of the dissenting denomination that had as its core creed the rejection of hierarchy and all of its trappings. The Puritans had purified the church of all pomp and levels of authority, even going as far as eliminating elevated pulpits, and those who spread this Nonconformist doctrine to the Great Plains, including the Congregationalists, were just as averse to hierarchical distinctions.

It also helps to remember that, from his early days as a student, Noyce loved madrigal singing. Every Wednesday evening he attended rehearsals of his twelve-voice group. Madrigals don’t rely on lead singers and soloists; the polyphonic songs weave multiple voices and melodies together, none of them dominant. “Your part depends on [the others’ and] it always supports the others,” Noyce once explained.34

Gordon Moore was similarly unpretentious, nonauthoritarian, averse to confrontation, and uninterested in the trappings of power. They complemented each other well. Noyce was Mr. Outside; he could dazzle a client with the halo effect that had followed him since childhood. Moore, always temperate and thoughtful, liked being in the lab, and he knew how to lead engineers with subtle questions or (the sharpest arrow in his quiver) a studied silence. Noyce was great at strategic vision and seeing the big picture; Moore understood the details, particularly of the technology and engineering.

So they were perfect partners, except in one way: with their shared aversion to hierarchy and unwillingness to be bossy, neither was a decisive manager. Because of their desire to be liked, they were reluctant to be tough. They guided people but didn’t drive them. If there was a problem or, heaven forbid, a disagreement, they did not like to confront it. So they wouldn’t.

That’s where Andy Grove came in.

Grove, born András Gróf in Budapest, did not come from a madrigal-singing Congregationalist background. He grew up Jewish in Central Europe as fascism was rising, learning brutal lessons about authority and power. When he was eight, the Nazis took over Hungary; his father was sent to a concentration camp, and András and his mother were forced to move into a special cramped apartment for Jews. When he went outside, he had to wear a yellow Star of David. One day when he got sick, his mother was able to convince a non-Jewish friend to bring some ingredients for soup, which led to the arrest of both his mother and the friend. After she was released, she and András assumed false identities while friends sheltered them. The family was reunited after the war, but then the communists took over. Grove decided, at age twenty, to flee across the border to Austria. As he wrote in his memoir, Swimming Across, “By the time I was twenty, I had lived through a Hungarian Fascist dictatorship, German military occupation, the Nazis’ Final Solution, the siege of Budapest by the Soviet Red Army, a period of chaotic democracy in the years immediately after the war, a variety of repressive Communist regimes, and a popular uprising that was put down at gunpoint.”35 It wasn’t like mowing lawns and singing in a small-town Iowa choir, and it did not instill genial mellowness.

Grove arrived in the United States a year later and, as he taught himself English, was able to graduate first in his class at City College of New York and then earn a PhD in chemical engineering from Berkeley. He joined Fairchild in 1963 right out of Berkeley, and in his spare time wrote a college textbook titled Physics and Technology of Semiconductor Devices.

When Moore told him of his plans to leave Fairchild, Grove volunteered to come along. In fact, he almost forced himself on Moore. “I really respected him and wanted to go wherever he went,” Grove declared. He became the third person at Intel, serving as the director of engineering.

Grove had deep admiration for Moore’s technical skills but not his management style. That was understandable, given Moore’s aversion to confrontation and almost any aspect of management beyond proffering gentle advice. If there was a conflict, he would watch quietly from afar. “He is either constitutionally unable or simply unwilling to do what a manager has to do,” Grove said of Moore.36 The feisty Grove, by contrast, felt that honest confrontation was not only a managerial duty but one of life’s invigorating spices, which as a hardened Hungarian he relished.

Grove was even more appalled by the management style of Noyce. At Fairchild he had simmered with fury when Noyce ignored the incompetence of one of his division heads, who showed up late and drunk at meetings. Thus he groaned when Moore said that his new venture would be in partnership with Noyce. “I told him that Bob was a better leader than Andy gave him credit for,” Moore said. “They just had different styles.”37

Noyce and Grove got along socially better than they did professionally. They went with their families to Aspen, where Noyce helped Grove learn to ski and even buckled his boots for him. Nevertheless, Grove detected a detachment in Noyce that could be disconcerting: “He was the only person I can think of who was both aloof and charming.”38 In addition, despite their weekend friendship, Grove found himself irritated and sometimes appalled by Noyce at the office. “I had nothing but unpleasant, discouraging dealings with him as I watched Bob manage a troubled company,” he recalled. “If two people argued and we all looked to him for a decision, he would put a pained look on his face and say something like, ‘Maybe you should work that out.’ More often he didn’t say that, he just changed the subject.”39

What Grove did not realize at the time, but came to understand later, was that effective management need not always come from having one strong leader. It can come from having the right combination of different talents at the top. Like a metallic alloy, if you get the right mix of elements the result can be strong. Years later, after Grove had learned to appreciate this, he read Peter Drucker’s The Practice of Management, which described the ideal chief executive as an outside person, an inside person, and a person of action. Grove realized that instead of being embodied in one person, such traits could exist in a leadership team. That was the case at Intel, Grove said, and he made copies of the chapter for Noyce and Moore. Noyce was the outside guy, Moore the inside, and Grove was the man of action.40

Arthur Rock, who put together the funding for the trio and initially served as their board chair, understood the virtue of creating an executive team whose members complemented each other. He also noted a corollary: it was important that the three became CEO in the order that they did. Noyce he described as “a visionary who knew how to inspire people and sell the company to others when it was getting off the ground.” Once that was done, Intel needed to be led by someone who could make it a pioneer in each new wave of technology, “and Gordon was such a brilliant scientist he knew how to drive the technology.” Then, when there were dozens of other companies competing, “we needed a hard-charging, no-nonsense manager who could focus on driving us as a business.” That was Grove.41

The Intel culture, which would permeate the culture of Silicon Valley, was a product of all three men. As might be expected in a congregation where Noyce was the minister, it was devoid of the trappings of hierarchy. There were no reserved parking places. Everyone, including Noyce and Moore, worked in similar cubicles. Michael Malone, a reporter, described visiting Intel to do an interview: “I couldn’t find Noyce. A secretary had to come out and lead me to his cubicle, because his cubicle was almost indistinguishable from all the other cubicles in this vast prairie dog town of cubicles.”42

When one early employee wanted to see the company’s organization chart, Noyce made an X in the center of a page and then drew a bunch of other Xs around it, with lines leading to each. The employee was at the center, and the others were people he would be dealing with.43 Noyce noticed that at East Coast companies the clerks and secretaries got little metal desks while those of top executives were expansive ones made of mahogany. So Noyce decided that he would work at a small gray aluminum desk, even as newly hired support staffers were given bigger wooden ones. His dented and scratched desk was near the center of the room, in open view, for everyone to see. It prevented anyone else from demanding some vestment of power. “There were no privileges anywhere,” recalled Ann Bowers, who was the personnel director and later married Noyce. “We started a form of company culture that was completely different than anything had been before. It was a culture of meritocracy.”44

It was also a culture of innovation. Noyce had a theory that he developed after bridling under the rigid hierarchy at Philco. The more open and unstructured a workplace, he believed, the faster new ideas would be sparked, disseminated, refined, and applied. “The idea is people should not have to go up through a chain of command,” said one of Intel’s engineers, Ted Hoff. “If you need to talk to a particular manager you go talk to him.”45 As Tom Wolfe put it in his profile, “Noyce realized how much he detested the eastern corporate system of class and status with its endless gradations, topped off by the CEOs and vice-presidents who conducted their daily lives as if they were a corporate court and aristocracy.”

By avoiding a chain of command, both at Fairchild Semiconductor and then at Intel, Noyce empowered employees and forced them to be entrepreneurial. Even though Grove cringed when disputes went unresolved at meetings, Noyce was comfortable letting junior employees resolve problems rather than bucking them up to a higher layer of management that would tell them what to do. Responsibility was thrust on young engineers, who found themselves having to be innovators. Every now and then, a staffer might be unnerved by a tough problem. “He would go to Noyce and hyperventilate and ask him what to do,” Wolfe reported. “And Noyce would lower his head, turn on his 100 ampere eyes, listen, and say: ‘Look, here are your guidelines. You’ve got to consider A, you’ve got to consider B, and you’ve got to consider C.’ Then he would turn on the Gary Cooper smile: ‘But if you think I’m going to make your decision for you, you’re mistaken. Hey . . . it’s your ass.’ ”

Instead of proposing plans to top management, Intel’s business units were entrusted to act as if each were its own small, agile company. Whenever there was a decision that required buy-in from other units, such as a new marketing plan or a change in a product strategy, the issue would not be bucked up to bosses for a decision. Instead an impromptu meeting would be convened to hash it out, or try to. Noyce liked meetings, and there were rooms set aside for whenever anyone felt the need to call one. At these meetings everyone was treated as an equal and could challenge the prevailing wisdom. Noyce was there not as a boss but as a pastor guiding them to make their own decisions. “This wasn’t a corporation,” Wolfe concluded. “It was a congregation.”46

Noyce was a great leader because he was inspiring and smart, but he was not a great manager. “Bob operated on the principle that if you suggested to people what the right thing to do would be, they would be smart enough to pick it up and do it,” said Moore. “You didn’t have to worry about following up.”47 Moore admitted that he was not much better: “I was never very eager to exert authority or be the boss either, which might mean we were too much alike.”48

