The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution
Walter Isaacson

Such a management style needed someone to impose discipline. Early on at Intel, well before it was his turn in the lineup to become CEO, Grove helped institute some management techniques. He created a place where people were held accountable for sloppiness. Failures had consequences. “Andy would fire his own mother if she got in the way,” said one engineer. Another colleague explained that this was necessary in an organization headed by Noyce: “Bob really has to be a nice guy. It’s important for him to be liked. So somebody has to kick ass and take names. And Andy happens to be very good at that.”49

Grove began to study and absorb the art of management as if it were the science of circuitry. He would later become a best-selling author of books with titles such as Only the Paranoid Survive and High Output Management. He did not try to impose a hierarchical command on what Noyce had wrought. Instead he helped to instill a culture that was driven, focused, and detail-aware, traits that would not naturally have arisen from Noyce’s laid-back, nonconfrontational style. His meetings were crisp and decisive, unlike those run by Noyce, where people tended to hang around as long as possible knowing that he was likely to tacitly assent to the last person who had his ear.

What saved Grove from seeming like a tyrant was that he was so irrepressible, which made him hard not to like. When he smiled, his eyes lit up. He had a pixielike charisma. With his Hungarian accent and goofy grin, he was by far the most colorful engineer in the valley. He succumbed to the dubious fashions of the early 1970s by attempting, in an immigrant geek manner worthy of a Saturday Night Live skit, to be groovy. He grew his sideburns long and his mustache droopy and wore open shirts with gold chains dangling over his chest hair. None of which hid the fact that he was a real engineer, one who had been a pioneer of the metal-oxide semiconductor transistor that became the workhorse of modern microchips.

Grove nurtured Noyce’s egalitarian approach—he worked in an exposed cubicle his entire career, and loved it—but he added an overlay of what he called “constructive confrontation.” He never put on airs, but he never let down his guard. In contrast to Noyce’s sweet gentility, Grove had a blunt, no-bullshit style. It was the same approach Steve Jobs would later use: brutal honesty, clear focus, and a demanding drive for excellence. “Andy was the guy who made sure the trains all ran on time,” recalled Ann Bowers. “He was a taskmaster. He had very strong views about what you should do and what you shouldn’t do and he was very direct about that.”50

Despite their different styles, there was one thing that Noyce and Moore and Grove shared: an unwavering goal of making sure that innovation, experimentation, and entrepreneurship flourished at Intel. Grove’s mantra was “Success breeds complacency. Complacency breeds failure. Only the paranoid survive.” Noyce and Moore may not have been paranoid, but they were never complacent.

THE MICROPROCESSOR

Inventions sometimes occur when people are confronted with a problem and scramble to solve it. At other times, they happen when people embrace a visionary goal. The tale of how Ted Hoff and his team at Intel invented the microprocessor is a case of both.

Hoff, who had been a young teacher at Stanford, became the twelfth employee at Intel, where he was assigned to work on chip design. He realized that it was wasteful and inelegant to design many types of microchips that each had a different function, which Intel was doing. A company would come in and ask it to build a microchip designed to do a specific task. Hoff envisioned, as did Noyce and others, an alternative approach: creating a general-purpose chip that could be instructed, or programmed, to do a variety of different applications as desired. In other words, a general-purpose computer on a chip.51

This vision coincided with a problem that was dumped in Hoff’s lap in the summer of 1969. A Japanese company named Busicom was planning a powerful new desktop calculator, and it had drawn up specifications for twelve special-purpose microchips (different ones to handle display, calculations, memory, etc.) that it wanted Intel to build. Intel agreed, and a price was set. Noyce asked Hoff to oversee the project. Soon a challenge arose. “The more I learned about this design, the more concerned I became that Intel may have undertaken more than it was prepared to deliver,” Hoff recalled. “The number of chips and their complexity was much greater than I had expected.” There was no way Intel could build them at the agreed price. Making matters worse, the growing popularity of Jack Kilby’s pocket calculator was forcing Busicom to cut its price even further.

“Well, if there’s anything you can think of to simplify the design, why don’t you pursue it,” Noyce suggested.52

Hoff proposed that Intel design a single logic chip that could perform almost all of the tasks that Busicom wanted. “I know this can be done,” he said of the general-purpose chip. “It can be made to emulate a computer.” Noyce told him to try it.

Before they could sell the idea to Busicom, Noyce realized he had to convince someone who might be even more resistant: Andy Grove, who nominally worked for him. Part of what Grove saw as his mandate was keeping Intel focused. Noyce would say yes to almost anything; Grove’s job was to say no. When Noyce sauntered over to Grove’s workspace and sat on the corner of his desk, Grove was immediately on guard. He knew that Noyce’s effort to appear nonchalant was a sign that something was afoot. “We’re starting another project,” Noyce said, affecting a laugh.53 Grove’s first reaction was to tell Noyce he was crazy. Intel was a fledgling company still struggling to manufacture its memory chips, and it didn’t need any distractions. But after he heard Noyce describe Hoff’s idea, Grove realized that resistance was probably wrong and definitely futile.

By September 1969 Hoff and his colleague Stan Mazor had sketched out the architecture of a general-purpose logic chip that could follow programming instructions. It would be able to do the work of nine of the twelve chips that Busicom had requested. Noyce and Hoff presented the option to Busicom executives, who agreed that it was the better approach.

When it came time to renegotiate the price, Hoff made a critical recommendation to Noyce, one that helped create a huge market for general-purpose chips and assured that Intel would remain a driver of the digital age. It was a deal point that Bill Gates and Microsoft would emulate with IBM a decade later. In return for giving Busicom a good price, Noyce insisted that Intel retain the rights to the new chip and be allowed to license it to other companies for purposes other than making a calculator. He realized that a chip that could be programmed to perform any logical function would become a standard component in electronic devices, the way two-by-four pieces of lumber were a standard component in the construction of houses. It would replace custom chips, which meant it could be manufactured in bulk and thus continually decline in price. It would also usher in a more subtle shift in the electronics industry: the importance of hardware engineers, who designed the placement of the components on a circuit board, began to be supplanted by a new breed, software engineers, whose job it was to program a set of instructions into the system.

Because it was essentially a computer processor on a chip, the new device was dubbed a microprocessor. In November 1971 Intel unveiled the product, the Intel 4004, to the public. It took out ads in trade magazines announcing “a new era of integrated electronics—a micro-programmable computer on a chip!” It was priced at $200, and orders, as well as thousands of requests for the manual, began pouring in. Noyce was attending a computer show in Las Vegas on the day of the announcement and was thrilled to watch potential customers cramming into the Intel suite.

Noyce became an apostle of the microprocessor. At a reunion in San Francisco he hosted for his extended family in 1972, he stood up in the bus he had chartered and waved a wafer over his head. “This is going to change the world,” he told them. “It’s going to revolutionize your home. In your own house, you’ll all have computers. You will have access to all sorts of information.” His relatives passed the wafer around the bus like an object of veneration. “You won’t need money anymore,” he prophesied. “Everything will happen electronically.”54

He was exaggerating only slightly. Microprocessors began showing up in smart traffic lights and car brakes, coffeemakers and refrigerators, elevators and medical devices, and thousands of other gizmos. But the foremost success of the microprocessor was making possible smaller computers, most notably personal computers that you could have on your desk and in your home. And if Moore’s Law continued to hold true (as it would), a personal computer industry would grow up symbiotically with a microprocessor industry.

That is what happened in the 1970s. The microprocessor spawned hundreds of new companies making hardware and software for personal computers. Intel not only developed the leading-edge chips; it also created the culture that inspired venture-funded startups to transform the economy and uproot the apricot orchards of Santa Clara Valley, the forty-mile stretch of flat land from South San Francisco through Palo Alto to San Jose.

The valley’s main artery, a bustling highway named El Camino Real, was once the royal road that connected California’s twenty-one mission churches. By the early 1970s—thanks to Hewlett-Packard, Fred Terman’s Stanford Industrial Park, William Shockley, Fairchild and its Fairchildren—it connected a bustling corridor of tech companies. In 1971 the region got a new moniker. Don Hoefler, a columnist for the weekly trade paper Electronic News, began writing a series of columns entitled “Silicon Valley USA,” and the name stuck.55

I. Only living people can be selected for a Nobel.

II. The vehicle he used was convertible debentures, which were loans that could be converted into common stock if the company became successful but were worthless (at the end of the line of creditors) if it failed.

III. Edward “Ned” Johnson III, then running the Fidelity Magellan Fund. In 2013 Rock still had these two sheets, along with the older one seeking the patron for what became Fairchild, tucked in a filing cabinet in his office overlooking San Francisco Bay.

IV. After she married Noyce she had to leave Intel, and she moved to the fledgling Apple Computer, where she became Steve Jobs’s first director of human resources and also a calming maternal influence on him.

Dan Edwards and Peter Samson in 1962 playing Spacewar at MIT.

Nolan Bushnell (1943– ).

CHAPTER SIX

VIDEO GAMES

The evolution of microchips led to devices that were, as Moore’s Law forecast, smaller and more powerful each year. But there was another impetus that would drive the computer revolution and, eventually, the demand for personal computers: the belief that computers weren’t merely for number-crunching. They could and should be fun for people to use.

Two cultures contributed to the idea that computers should be things that we interact and play with. There were the hard-core hackers who believed in “the hands-on imperative” and loved pranks, clever programming tricks, toys, and games.1 And there were the rebel entrepreneurs eager to break into the amusement games industry, which was dominated by syndicates of pinball distributors and ripe for a digital disruption. Thus was born the video game, which turned out to be not merely an amusing sideshow but an integral part of the lineage that led to today’s personal computer. It also helped to propagate the idea that computers should interact with people in real time, have intuitive interfaces, and feature delightful graphic displays.

STEVE RUSSELL AND SPACEWAR

The hacker subculture, as well as the seminal video game Spacewar, emanated from MIT’s Tech Model Railroad Club, a geeky student organization founded in 1946 that met in the bowels of a building where radar had been developed. Its bunker was almost completely filled by a model train board with dozens of tracks, switches, trolleys, lights, and towns, all compulsively crafted and historically accurate. Most of its members obsessed over fashioning picture-perfect pieces to display on the layout. But there was a subset of the club that was more interested in what was underneath the sprawling chest-high board. The members of the “Signals and Power Subcommittee” tended to the relays, wires, circuits, and crossbar switches, which were rigged together on the underside of the board to provide a complex hierarchy of controllers for the numerous trains. In this tangled web they saw beauty. “There were neat regimental lines of switches, and achingly regular rows of dull bronze relays, and a long, rambling tangle of red, blue, and yellow wires—twisting and twirling like a rainbow-colored explosion of Einstein’s hair,” Steven Levy wrote in Hackers, which begins with a colorful depiction of the club.2

Members of the Signals and Power Subcommittee embraced the term hacker with pride. It connoted both technical virtuosity and playfulness, not (as in more recent usage) lawless intrusions into a network. The intricate pranks devised by MIT students—putting a live cow on the roof of a dorm, a plastic cow on the Great Dome of the main building, or causing a huge balloon to emerge midfield during the Harvard-Yale game—were known as hacks. “We at TMRC use the term ‘hacker’ only in its original meaning, someone who applies ingenuity to create a clever result, called a ‘hack,’ ” the club proclaimed. “The essence of a ‘hack’ is that it is done quickly, and is usually inelegant.”3

Some of the early hackers had been infused with the aspiration of creating machines that could think. Many were students at MIT’s Artificial Intelligence Lab, founded in 1959 by two professors who would become fabled: John McCarthy, a Santa Claus lookalike who coined the term artificial intelligence, and Marvin Minsky, who was so clever that he seemed a refutation of his own belief that computers would someday surpass human intelligence. The prevailing doctrine of the lab was that, given enough processing power, machines could replicate neural networks like those of the human brain and be able to interact intelligently with users. Minsky, a puckish man with twinkling eyes, had built a learning machine designed to model the brain, which he named SNARC (Stochastic Neural Analog Reinforcement Calculator), hinting that he was serious but might also be joking a bit. He had a theory that intelligence could be a product of the interaction of nonintelligent components, such as small computers connected by giant networks.

A seminal moment for the hackers of the Tech Model Railroad Club came in September 1961, when the Digital Equipment Corporation (DEC) donated the prototype of its PDP-1 computer to MIT. About the size of three refrigerators, the PDP-1 was the first computer to be designed for direct interaction with the user. It could connect to a keyboard and a monitor that displayed graphics, and it could be operated easily by a single person. Like moths to a flame, a handful of hard-core hackers began to circle this new computer, and they formed a cabal to conjure up something fun to do with it. Many of the discussions took place in a rundown apartment on Hingham Street in Cambridge, so the members dubbed themselves the Hingham Institute. The high-minded name was ironic. Their goal was not to come up with some elevated use for the PDP-1 but instead to do something clever.

Previous hackers had created a few rudimentary games for earlier computers. One at MIT had a dot on a screen that represented a mouse trying to navigate a maze to find a wedge of cheese (or, in later versions, a martini); another, at the Brookhaven National Lab on Long Island, used an oscilloscope on an analog computer to simulate a tennis match. But the members of the Hingham Institute knew that with the PDP-1 they had the chance to create the first real computer video game.

The best programmer in their group was Steve Russell, who was helping Professor McCarthy create the language LISP, which was designed to facilitate artificial intelligence research. Russell was a consummate geek, brimming with passions and intellectual obsessions that ranged from steam trains to thinking machines. Short and excitable, he had thick glasses and curly hair. When he spoke, he sounded like someone had punched his fast-forward button. Although he was intense and energetic, he was prone to procrastination, earning him the nickname “Slug.”

Like most of his hacker friends, Russell was an avid fan of bad movies and pulp science fiction. His favorite author was E. E. “Doc” Smith, a failed food engineer (an expert on the bleaching of flour, he concocted doughnut mixes) who specialized in a trashy sci-fi subgenre known as space opera. It featured melodramatic adventures filled with battles against evil, interstellar travel, and clichéd romance. Doc Smith “wrote with the grace and refinement of a pneumatic drill,” according to Martin Graetz, a member of the Tech Model Railroad Club and the Hingham Institute, who wrote a reminiscence about the creation of Spacewar. Graetz recalled a typical Doc Smith tale:

After some preliminary foofaraw to get everyone’s name right, a bunch of overdeveloped Hardy Boys go trekking off through the universe to punch out the latest gang of galactic goons, blow up a few planets, kill all sorts of nasty life forms, and just have a heck of a good time. In a pinch, which is where they usually were, our heroes could be counted on to come up with a complete scientific theory, invent the technology to implement it, and produce the weapons to blow away the baddies, all while being chased in their spaceship hither and thither through the trackless wastes of the galaxy.I

Afflicted by their passion for such space operas, it’s not surprising that Russell, Graetz, and their friends decided to concoct a space-war game for the PDP-1. “I had just finished reading Doc Smith’s Lensman series,” Russell recalled. “His heroes had a strong tendency to get pursued by the villain across the galaxy and have to invent their way out of their problem while they were being pursued. That sort of action was the thing that suggested Spacewar.”4 Proudly nerdy, they reconstituted themselves into the Hingham Institute Study Group on Space Warfare, and Slug Russell proceeded to code.5

Except that, true to his nickname, he didn’t. He knew what the starting point of his game program would be. Professor Minsky had stumbled upon an algorithm that drew a circle on the PDP-1 and was able to modify it so that it would display three dots on the screen that interacted with each other, weaving beautiful little patterns. Minsky called his hack the Tri-Pos, but his students dubbed it “the Minskytron.” That was a good foundation for creating a game featuring interacting spaceships and missiles. Russell spent weeks mesmerized by the Minskytron and grokking its ability to make patterns. But he bogged down when it came time to write the sine-cosine routines that would determine the motion of his spaceships.

When Russell explained this obstacle, a fellow club member named Alan Kotok knew how to solve it. He drove out to the suburban Boston headquarters of DEC, which made the PDP-1, and found a sympathetic engineer who had the routines necessary to make the calculations. “All right, here are the sine-cosine routines,” Kotok told Russell. “Now what’s your excuse?” Russell later admitted, “I looked around and I didn’t find an excuse, so I had to settle down and do some figuring.”6

Throughout the Christmas vacation of 1961 Russell hacked away, and within weeks he had produced a method to maneuver dots on the screen by using the toggle switches of the control panel to make them speed up, slow down, and turn. Then he converted the dots into two cartoonish spaceships, one of them fat and bulging like a cigar and the other thin and straight like a pencil. Another subroutine allowed each spaceship to shoot a dot out of its nose, mimicking a missile. When the position of the missile dot coincided with that of a spaceship, the latter would “explode” into randomly moving dots. By February 1962 the basics had been completed.
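Those sine-cosine routines are, in effect, what turns a heading angle into motion: thrust has to be split into horizontal and vertical components along the direction the ship is pointing. The following is a minimal modern sketch of that control loop in Python; the switch names, rates, and missile speed are illustrative assumptions, not Russell's PDP-1 code.

```python
import math

TURN_RATE = 0.1   # radians per update while a turn switch is held (illustrative)
THRUST = 2.0      # acceleration per update while the thrust switch is held

class Ship:
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y            # position on the screen
        self.vx, self.vy = 0.0, 0.0      # velocity
        self.heading = 0.0               # direction the ship points, in radians

    def update(self, turn_left=False, turn_right=False, thrust=False):
        """One frame of the toggle-switch controls: turn, accelerate, drift."""
        if turn_left:
            self.heading += TURN_RATE
        if turn_right:
            self.heading -= TURN_RATE
        if thrust:
            # The sine-cosine step: thrust is decomposed into x and y
            # components along the ship's current heading.
            self.vx += THRUST * math.cos(self.heading)
            self.vy += THRUST * math.sin(self.heading)
        self.x += self.vx
        self.y += self.vy

    def fire_missile(self):
        """A missile leaves the nose, inheriting the ship's motion and heading."""
        return {"x": self.x, "y": self.y,
                "vx": self.vx + 10.0 * math.cos(self.heading),
                "vy": self.vy + 10.0 * math.sin(self.heading)}

ship = Ship()
ship.update(turn_left=True, thrust=True)
missile = ship.fire_missile()
```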

At that point Spacewar became an open-source project. Russell put his program tape in the box that held other PDP-1 programs, and his friends began to make improvements. One of them, Dan Edwards, decided it would be cool to introduce a gravitational force, so he programmed in a big sun that exerted a tug on the ships. If you didn’t pay attention, it could suck you in and destroy you, but good players learned to whip close to the sun and use its gravitational pull to gain momentum and swing around at higher speeds.
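Edwards's sun amounts to one extra step per frame: an inverse-square pull on each ship's velocity toward the center of the screen, which is also what makes the slingshot maneuver pay off. Here is a minimal sketch of that single step; the constants and the destruction radius are illustrative assumptions, not the original code.

```python
import math

SUN_X, SUN_Y = 512.0, 512.0   # sun at the center of the display (illustrative)
GRAVITY = 5000.0              # strength of the pull, in arbitrary units

def apply_sun_gravity(x, y, vx, vy):
    """Nudge a ship's velocity toward the sun; returns (vx, vy, destroyed)."""
    dx, dy = SUN_X - x, SUN_Y - y
    dist = math.hypot(dx, dy)
    if dist < 8.0:                      # drift too close and the sun wins
        return vx, vy, True
    accel = GRAVITY / (dist * dist)     # inverse-square attraction
    return (vx + accel * dx / dist,
            vy + accel * dy / dist,
            False)

# Skimming past the sun converts its pull into extra speed: the slingshot
# that good players learned to exploit.
vx, vy, destroyed = apply_sun_gravity(x=600.0, y=512.0, vx=0.0, vy=3.0)
```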

Another friend, Peter Samson, “thought my stars were random and unrealistic,” Russell recalled.7 Samson decided the game needed “the real thing,” meaning astronomically correct constellations rather than miscellaneous dots. So he created a programming addition he called “Expensive Planetarium.” Using information from the American Ephemeris and Nautical Almanac, he encoded a routine that showed all the stars in the night sky down to the fifth magnitude. By specifying how many times a display point on the screen fired, he was even able to replicate each star’s relative brightness. As the spaceships sped along, the constellations slowly scrolled past.
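On a point-plotting display with no brightness control, a dot that is re-fired more often per frame simply looks brighter, so Samson could encode each star's magnitude as a repeat count. The toy mapping below illustrates the idea in Python; the star list and the scaling are assumptions for illustration, not his actual tables from the ephemeris.

```python
# Toy version of the "Expensive Planetarium" idea: encode brightness as a
# repeat count, since a display point fired more often per frame looks brighter.
STARS = [
    ("Sirius",     -1.46),
    ("Vega",        0.03),
    ("Polaris",     1.98),
    ("Faint star",  5.00),   # fifth magnitude: the dimmest stars included
]

def repeats_for_magnitude(mag, max_mag=5.0, max_repeats=8):
    """Lower magnitude means a brighter star, hence more firings per frame."""
    brightness = min(max_mag, max(0.0, max_mag - mag))   # clamp to [0, max_mag]
    return 1 + round(brightness / max_mag * (max_repeats - 1))

for name, mag in STARS:
    print(f"{name:11s} magnitude {mag:5.2f} -> fire {repeats_for_magnitude(mag)}x per frame")
```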

This open-source collaboration produced many more clever contributions. Martin Graetz came up with what he called “the ultimate panic button,” which was the ability to get out of a jam by toggling a switch and disappearing temporarily into another dimension of hyperspace. “The idea was that when everything else failed you could jump into the fourth dimension and disappear,” he explained. He had read about something similar, called a “hyper-spatial tube,” in one of Doc Smith’s novels. There were, however, some limits: you could toggle into hyperspace only three times in a game; your disappearance gave your opponent a breather; and you never knew where your spaceship would reappear. It might end up in the sun or right in the sights of your opponent. “It was something you could use, but not something you wanted to use,” Russell explained. Graetz added an homage to Professor Minsky: a ship disappearing into hyperspace left behind one of the signature patterns of the Minskytron.8

One lasting contribution came from two active members of the Tech Model Railroad Club, Alan Kotok and Bob Sanders. They realized that players crammed in front of a PDP-1 console jostling elbows and frantically grabbing at the computer’s switches was both awkward and dangerous. So they rummaged around under the train set in the clubroom and commandeered some of the toggles and relays. These they pieced together inside two plastic boxes to make remote controls, complete with all the necessary function switches and the hyperspace panic button.

The game quickly spread to other computer centers and became a staple of hacker culture. DEC began shipping the game preloaded into its computers, and programmers created new versions for other systems. Hackers around the world added more features, such as cloaking powers, exploding space mines, and ways to shift into a first-person perspective from the view of one of the pilots. As Alan Kay, one of the pioneers of the personal computer, said, “The game of Spacewar blossoms spontaneously wherever there is a graphics display connected to a computer.”9

Spacewar highlighted three aspects of the hacker culture that became themes of the digital age. First, it was created collaboratively. “We were able to build it together, working as a team, which is how we liked to do things,” Russell said. Second, it was free and open-source software. “People asked for copies of the source code, and of course we gave them out.” Of course—that was in a time and place when software yearned to be free. Third, it was based on the belief that computers should be personal and interactive. “It allowed us to get our hands on a computer and make it respond to us in real time,” said Russell.10

NOLAN BUSHNELL AND ATARI

Like many computer science students in the 1960s, Nolan Bushnell was a Spacewar fanatic. “The game was seminal to anyone who loved computers, and for me it was transforming,” he recalled. “Steve Russell was like a god to me.” What set Bushnell apart from other computer bums who got their kicks by maneuvering blips on a screen was that he was also enthralled by amusement parks. He worked in one to help pay for college. In addition, he had the boisterous temperament of an entrepreneur, relishing the mix of thrill-seeking and risk-taking. Thus it was that Nolan Bushnell became one of those innovators who turned an invention into an industry.11

When Bushnell was fifteen, his father died. He had been a construction contractor in a growing exurb of Salt Lake City, and he left behind several unfinished jobs for which he hadn’t been paid. Young Bushnell, already big and boisterous, finished them off, adding to his natural bravado. “When you do something like that as a 15-year-old, you begin to believe you can do anything,” he said.12 Not surprisingly, he became a poker player, and as luck would have it he lost, which fortuitously forced him to take a job on the midway at the Lagoon Amusement Park while studying at the University of Utah. “I learned all the various tricks for getting people to put up their quarters, and that sure served me well.”13 He was soon promoted to the pinball and game arcade, where animated driving games such as Speedway, made by Chicago Coin Machine Manufacturing Company, were the new rage.

He was fortunate as well in landing at the University of Utah. It had the best computer graphics program in the country, run by professors Ivan Sutherland and David Evans, and became one of the first four nodes on the ARPANET, the precursor to the Internet. (Other students included Jim Clark, who founded Netscape; John Warnock, who cofounded Adobe; Ed Catmull, who cofounded Pixar; and Alan Kay, about whom more later.) The university had a PDP-1, complete with a Spacewar game, and Bushnell combined his love of the game with his understanding of the economics of arcades. “I realized you could make a whole lot of quarters if you could put a computer with a game in an arcade,” he said. “And then I did the division and realized that even a whole lot of quarters coming in every day would never add up to the million-dollar cost of a computer. You divide twenty-five cents into a million dollars and you give up.”14 And so he did, for the moment.

When he graduated in 1968 (“last in his class,” he often bragged), Bushnell went to work for Ampex, which made recording equipment. He and a colleague there, Ted Dabney, continued to concoct schemes for turning a computer into an arcade video game. They considered ways to adapt the Data General Nova, a $4,000 refrigerator-size minicomputer that came out in 1969. But no matter how they juggled the numbers, it was neither cheap enough nor powerful enough.

In his attempts to push the Nova to support Spacewar, Bushnell looked for elements of the game, such as the background of stars, that could be generated by the hardware circuits rather than by the processing power of the computer. “Then I had a great epiphany,” he recalled. “Why not do it all with hardware?” In other words, he could design circuits to perform each of the tasks that the program would have done. That made it cheaper. It also meant that the game had to be a lot simpler. So he turned Spacewar into a game that had only one user-controlled spaceship, which fought against two simple saucers generated by the hardware. Eliminated, too, were the sun’s gravity and the panic button to disappear into hyperspace. But it was still a fun game, and it could be built at a reasonable cost.

Bushnell sold the idea to Bill Nutting, who had formed a company to make an arcade game called Computer Quiz. In keeping with that name, they dubbed Bushnell’s game Computer Space. He and Nutting hit it off so well that Bushnell quit Ampex in 1971 to join Nutting Associates.

As they were working on the first Computer Space consoles, Bushnell heard that he had competition. A Stanford grad named Bill Pitts and his buddy Hugh Tuck from California Polytechnic had become addicted to Spacewar, and they decided to use a PDP-11 minicomputer to turn it into an arcade game. When Bushnell heard this, he invited Pitts and Tuck to visit. They were appalled at the sacrifices—indeed sacrileges—Bushnell was perpetrating in stripping down Spacewar so that it could be produced inexpensively. “Nolan’s thing was a totally bastardized version,” Pitts fumed.15 For his part, Bushnell was contemptuous of their plan to spend $20,000 on equipment, including a PDP-11 that would be in another room and connected by yards of cable to the console, and then charge ten cents a game. “I was surprised at how clueless they were about the business model,” he said. “Surprised and relieved. As soon as I saw what they were doing, I knew they’d be no competition.”

Galaxy Game by Pitts and Tuck debuted at Stanford’s Tresidder student union coffeehouse in the fall of 1971. Students gathered around each night like cultists in front of a shrine. But no matter how many lined up their coins to play, there was no way the machine could pay for itself, and the venture eventually folded. “Hugh and I were both engineers and we didn’t pay attention to business issues at all,” conceded Pitts.16 Innovation can be sparked by engineering talent, but it must be combined with business skills to set the world afire.

