The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution

Walter Isaacson

Bushnell was able to produce his game, Computer Space, for only $1,000. It made its debut a few weeks after Galaxy Game at the Dutch Goose bar in Menlo Park near Palo Alto and went on to sell a respectable 1,500 units. Bushnell was the consummate entrepreneur: inventive, good at engineering, and savvy about business and consumer demand. He also was a great salesman. One reporter remembered running into him at a Chicago trade show: “Bushnell was about the most excited person I’ve ever seen over the age of six when it came to describing a new game.”17

Computer Space turned out to be less popular in beer halls than it was in student hangouts, so it was not as successful as most pinball games. But it did acquire a cult following. More important, it launched an industry. Arcade games, once the domain of pinball companies based in Chicago, would soon be transformed by engineers based in Silicon Valley.

Unimpressed by his experience with Nutting Associates, Bushnell decided to form his own company for his next video game. “Working for Nutting was a great learning experience, because I discovered that I couldn’t screw things up any worse than they did,” he recalled.18 He decided to name the new company Syzygy, a barely pronounceable term for when three celestial bodies are in a line. Fortunately, that name was not available because a hippie candle-making commune had registered it. So Bushnell decided to call his new venture Atari, adopting a term from the Japanese board game Go.

PONG

On the day that Atari was incorporated, June 27, 1972, Nolan Bushnell hired his first engineer. Al Alcorn was a high school football player from a rough neighborhood of San Francisco who taught himself television repair through an RCA correspondence course. At Berkeley he participated in a work-study program that brought him to Ampex, where he worked under Bushnell. He graduated just as Bushnell was forming Atari.

Many of the key partnerships in the digital age paired people with different skills and personalities, such as John Mauchly and Presper Eckert, John Bardeen and Walter Brattain, Steve Jobs and Steve Wozniak. But occasionally the partnerships worked because the personalities and enthusiasms were similar, as was the case with Bushnell and Alcorn. Both were burly and fun-loving and irreverent. “Al is one of my favorite people in the world,” Bushnell asserted more than forty years later. “He was the perfect engineer and funny, so he was well-suited to video games.”19

At the time, Bushnell had a contract to make a new video game for the Chicago firm Bally Midway. The plan was to do a car racing game, which seemed likely to be more appealing than spaceship navigation to beer drinkers in workingmen’s bars. But before tossing the task to Alcorn, Bushnell decided to give him a warm-up exercise.

At a trade show, Bushnell had checked out the Magnavox Odyssey, a primitive console for playing games on home television sets. One of the offerings was a version of Ping-Pong. “I thought it was kind of crappy,” Bushnell said years later, after he had been sued for stealing its idea. “It had no sound, no score, and the balls were square. But I noticed some people were having some fun with it.” When he arrived back at Atari’s little rented office in Santa Clara, he described the game to Alcorn, sketched out some circuits, and asked him to build an arcade version of it. He told Alcorn he had signed a contract with GE to make the game, which was untrue. Like many entrepreneurs, Bushnell had no shame about distorting reality in order to motivate people. “I thought it would be a great training program for Al.”20

Alcorn got a prototype wired up in a few weeks, completing it at the beginning of September 1972. With his childlike sense of fun, he came up with enhancements that turned the monotonous blip bouncing between paddles into something amusing. The lines he created had eight regions so that when the ball hit smack in the center of a paddle it bounced back straight, but as it hit closer to the paddle’s edges it would fly off at angles. That made the game more challenging and tactical. He also created a scoreboard. And in a stroke of simple genius, he added just the right “thonk” sound from the sync generator to sweeten the experience. Using a $75 Hitachi black-and-white TV set, Alcorn hard-wired the components together inside a four-foot-tall wooden cabinet. Like Computer Space, the game did not use a microprocessor or run a line of computer code; it was all done in hardware with the type of digital logic design used by television engineers. Then he slapped on a coin box taken from an old pinball machine, and a star was born.21 Bushnell dubbed it Pong.
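Pong itself ran no software; Alcorn’s eight-region paddle lived entirely in TTL logic. But the scheme translates naturally into a few lines of code. The Python sketch below is purely illustrative: the function name, the quantization details, and the 60-degree maximum deflection are assumptions made for the sake of the example, not a description of Alcorn’s circuit.

```python
# Illustrative sketch of Alcorn's eight-region paddle (hypothetical names
# and constants). Pong did this in hardware, not software; the point is
# the idea: a center hit rebounds nearly straight, an edge hit rebounds
# at a steep angle, making the game tactical.

def bounce_angle(hit_y: float, paddle_top: float, paddle_height: float) -> float:
    """Return the rebound angle (degrees from horizontal) for a paddle hit."""
    # Normalize the hit point to [-1, +1]: 0 is dead center, +/-1 the edges.
    center = paddle_top + paddle_height / 2
    offset = (hit_y - center) / (paddle_height / 2)
    offset = max(-1.0, min(1.0, offset))

    # Quantize into eight discrete regions, mimicking the hardware zones.
    region = min(7, int((offset + 1.0) / 2.0 * 8))        # region index 0..7
    region_center = (region + 0.5) / 8.0 * 2.0 - 1.0      # back to [-1, +1]

    MAX_ANGLE = 60.0  # assumed maximum deflection at the paddle's edge
    return region_center * MAX_ANGLE


if __name__ == "__main__":
    # Near-center hit: small deflection. Edge hit: steep deflection.
    print(bounce_angle(hit_y=50.0, paddle_top=40.0, paddle_height=20.0))  # ~7.5
    print(bounce_angle(hit_y=59.9, paddle_top=40.0, paddle_height=20.0))  # ~52.5
```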

One of Pong’s most ingenious features was its simplicity. Computer Space had required complex instructions; there were enough directives on its opening screen (among them, for example, “There is no gravity in space; rocket speed can only be changed by engine thrust”) to baffle a computer engineer. Pong, by contrast, was simple enough that a beer-sloshed barfly or stoned sophomore could figure it out after midnight. There was only one instruction: “Avoid missing ball for high score.” Consciously or not, Atari had hit upon one of the most important engineering challenges of the computer age: creating user interfaces that were radically simple and intuitive.

Bushnell was so pleased by Alcorn’s creation that he decided it should be more than a training exercise: “My mind changed the minute it got really fun, when we found ourselves playing it for an hour or two after work every night.”22 He flew to Chicago to persuade Bally Midway to accept Pong as a fulfillment of their contract rather than push for a car racing game. But the company declined to take it. It was wary of games that required two players.

This turned out to be a lucky break. To test out Pong, Bushnell and Alcorn installed the prototype at Andy Capp’s, a beer bar in the working-class town of Sunnyvale that had peanut shells on the floor and guys playing pinball in the back. After a day or so, Alcorn got a call from the bar’s manager complaining that the machine had stopped working. He should come fix it right away, because it had been surprisingly popular. So Alcorn hurried over. As soon as he opened the machine, he discovered the problem: the coin box was so filled with quarters that it was jammed. The money gushed onto the floor.23

Bushnell and Alcorn knew they had a hit on their hands. An average machine made $10 a day; Pong was taking in $40. Suddenly Bally’s decision to decline it seemed like a blessing. The true entrepreneur in Bushnell came out: he decided that Atari would manufacture the game on its own, even though it had no financing or equipment.

He took a gamble and decided to bootstrap the whole operation, funding as much as possible from the cash flow he made on sales. He looked at how much money he had in the bank, divided it by the $280 cost of making each machine, and figured that he could build thirteen of them initially. “But that was an unlucky number,” he recalled, “so we decided to build twelve.”24

Bushnell made a small model of the console shell he desired out of clay, then took it to a boat manufacturer who began producing them in fiberglass. It took just a week to build each complete game and another few days to sell it for $900, so with the $620 profit he had a positive cash flow to keep things going. Some of the early proceeds were spent on a sales brochure, which featured a beautiful young woman in a slinky sheer nightgown draping her arm over the game machine. “We hired her from the topless bar down the street,” Bushnell recounted forty years later to an audience of earnest high school students, who seemed somewhat baffled by the tale and unsure what a topless bar was.25

Venture capital, a realm that had just begun in Silicon Valley with Arthur Rock’s financing of Intel, was not available for a company proposing to make video games, which were not yet a known product and were associated with the mobbed-up pinball industry.II Banks demurred as well when Bushnell ambled in for a loan. Only Wells Fargo came through, providing a credit line of $50,000, which was far less than Bushnell had requested.

With the money, Bushnell was able to open up a production facility in an abandoned roller-skating rink a few blocks from Atari’s Santa Clara office. The Pong games were put together not on an assembly line but in the middle of the floor, with young workers ambling up to stick in the various components. Workers were dragooned from unemployment centers nearby. After weeding out the hires who were heroin addicts or who stole the television monitors, the operation scaled up rapidly. At first they were making ten units a day, but within two months they could make almost a hundred. The economics improved as well; the cost of each game was held to just over $300, but the sales price was raised to $1,200.

The atmosphere was what you might expect from the fun-loving Bushnell and Alcorn, both still in their twenties, and it took to the next level the casual style of Silicon Valley startups. Every Friday there would be a beer bash and pot-smoking party, sometimes capped by skinny-dipping, especially if that week’s numbers had been made. “We found out our employees would respond to having a party for hitting quotas as much as having a bonus,” Bushnell said.

Bushnell bought himself a nice house in the hills of nearby Los Gatos, where he sometimes held board meetings or staff parties in his hot tub. When he built a new engineering facility, he decreed that it should have its own hot tub. “It was a recruiting tool,” he insisted. “We found out that our lifestyle and the parties were hugely good for attracting workers. If we were trying to hire somebody, we’d invite him to one of our parties.”26

In addition to being a recruiting tool, the culture at Atari was a natural outgrowth of Bushnell’s personality. But it was not simply self-indulgent. It was based on a philosophy that drew from the hippie movement and would help define Silicon Valley. At its core were certain principles: authority should be questioned, hierarchies should be circumvented, nonconformity should be admired, and creativity should be nurtured. Unlike at East Coast corporations, there were no fixed working hours and no dress code, either for the office or the hot tub. “At that time in IBM you had to wear a white shirt, dark pants and a black tie with your badge stapled to your shoulder or something,” said Steve Bristow, an engineer. “At Atari the work people did counted more than how they looked.”27

The success of Pong prompted a lawsuit from Magnavox, which marketed the Odyssey home-television game that Bushnell had played at a trade show. The Magnavox game had been devised by an outside engineer named Ralph Baer. He could not claim to have invented the concept; its roots went back at least to 1958, when William Higinbotham at the Brookhaven National Lab rigged up an oscilloscope on an analog computer to knock a blip back and forth in what he called Tennis for Two. Baer, however, was one of those innovators, like Edison, who believed that filing for patents was a key element of the invention process. He had more than seventy of them, including for various aspects of his games. Instead of fighting the lawsuit, Bushnell came up with a clever deal that was a win for both companies. He paid a rather low flat fee, $700,000, for perpetual rights to make the game on the condition that Magnavox enforce its patents and demand a percentage royalty from the other companies, including his former partners Bally Midway and Nutting Associates, that wanted to make similar games. That helped put Atari at a competitive advantage.

Innovation requires having at least three things: a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product. Nolan Bushnell scored a trifecta when he was twenty-nine, which is why he, rather than Bill Pitts, Hugh Tuck, Bill Nutting, or Ralph Baer, goes down in history as the innovator who launched the video game industry. “I am proud of the way we were able to engineer Pong, but I’m even more proud of the way I figured out and financially engineered the business,” he said. “Engineering the game was easy. Growing the company without money was hard.”28

I. A sample of Doc Smith’s prose, from his novel Triplanetary (1948): “Nerado’s vessel was completely ready for any emergency. And, unlike her sister-ship, she was manned by scientists well-versed in the fundamental theory of the weapons with which they fought. Beams, rods and lances of energy flamed and flared; planes and pencils cut, slashed and stabbed; defensive screens glowed redly or flashed suddenly into intensely brilliant, coruscating incandescence. Crimson opacity struggled sullenly against violet curtains of annihilation. Material projectiles and torpedoes were launched under full-beam control; only to be exploded harmlessly in mid-space, to be blasted into nothingness or to disappear innocuously against impenetrable polycyclic screens.”

II. Three years later, in 1975, when Atari decided to build a home version of Pong, the venture capital industry had caught fire, and Bushnell was able to get $20 million in funding from Don Valentine, who had just founded Sequoia Capital. Atari and Sequoia helped to launch each other.

[Photographs: J. C. R. Licklider (1915–90); Bob Taylor (1932– ); Larry Roberts (1937– )]

CHAPTER SEVEN

THE INTERNET

VANNEVAR BUSH’S TRIANGLE

Innovations often bear the imprint of the organizations that created them. For the Internet this was especially true, since it was built by a partnership among three groups: the military, universities, and private corporations. What made the process even more fascinating was that this was not merely a loose-knit consortium with each group pursuing its own aims. Instead, during and after World War II, the three groups had been fused together into an iron triangle: the military-industrial-academic complex.

The person most responsible for forging this assemblage was Vannevar Bush, the MIT professor who in 1931 built the Differential Analyzer, the early analog computer described in chapter 2.1 Bush was well suited to this task because he was a star in all three camps: dean of the MIT School of Engineering, a founder of the electronics company Raytheon, and America’s top military science administrator during World War II. “No American has had greater influence in the growth of science and technology than Vannevar Bush,” MIT’s president Jerome Wiesner later proclaimed, adding that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”2

Bush was born near Boston in 1890, the son of a Universalist minister who had begun his career as a cook on a mackerel smack. Both of Bush’s grandfathers were whaling captains, which instilled in him a salty and forthright manner that helped make him a decisive manager and charismatic administrator. Like many successful technology leaders, he was an expert in both engineering products and making crisp decisions. “All of my recent ancestors were sea captains, and they have a way of running things without any doubt,” he once said. “That left me with some inclination to run a show once I was in it.”3

Also like many good technology leaders, he grew up loving both the humanities and the sciences. He could quote Kipling and Omar Khayyam “by the yard,” played the flute, loved symphonies, and read philosophy for pleasure. His family, too, had a basement workshop, where he built little boats and mechanical toys. As Time later reported in its inimitable old style, “Lean, sharp, salty, Van Bush is a Yankee whose love of science began, like that of many American boys, in a passion for tinkering with gadgets.”4

He went to Tufts, where in his spare time he built a surveying machine that used two bicycle wheels and a pendulum to trace the perimeter of an area and calculate its size, in effect an analog device for doing integral calculus. He got a patent on it, the first of forty-nine that he would accumulate. While at Tufts, he and his roommates consulted for a series of small companies and then, after graduating, founded Raytheon, which grew into a sprawling defense contractor and electronics firm.
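The principle behind such an area tracer is worth a line of mathematics. By Green’s theorem, a machine that traces the closed boundary of a region counterclockwise can accumulate the enclosed area as a line integral; this is the quantity that planimeter-style instruments, plausibly including Bush’s device (its exact mechanism is not described here), evaluate mechanically as their wheels roll:

```latex
% Green's theorem: the area of a region R can be read off its boundary.
% Tracing the closed curve counterclockwise, a mechanical integrator
% accumulates one of these equivalent line integrals as it rolls.
\[
A \;=\; \iint_R dx\,dy \;=\; \oint_{\partial R} x\,dy \;=\; \frac{1}{2}\oint_{\partial R}\bigl(x\,dy - y\,dx\bigr)
\]
```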

Bush earned a PhD in electrical engineering jointly from MIT and Harvard, then became a professor and dean of engineering at MIT, where he built his Differential Analyzer. His passion was elevating the role of science and engineering in society at a time, the mid-1930s, when not much exciting seemed to be happening in either field. Televisions were not yet a consumer product, and the most notable new inventions put into the time capsule at the 1939 New York World’s Fair were a Mickey Mouse watch and a Gillette safety razor. The advent of World War II would change that, producing an explosion of new technologies, with Vannevar Bush leading the way.

Worried that America’s military was lagging in technology, he mobilized Harvard president James Bryant Conant and other scientific leaders to convince President Franklin Roosevelt to form the National Defense Research Committee and then the military’s Office of Scientific Research and Development, both of which he headed. With an ever-present pipe in his mouth and a pencil in his hand, he oversaw the Manhattan Project to build the atom bomb as well as the projects to develop radar and air-defense systems. Time dubbed him “General of Physics” on its cover in 1944. “If we had been on our toes in war technology ten years ago,” the magazine quoted him as saying as he banged his fist on his desk, “we would probably not have had this damn war.”5

With his no-nonsense style tempered by a personal warmth, he was a tough but endearing leader. Once a group of military scientists, frustrated by some bureaucratic problem, walked into his office to resign. Bush couldn’t figure out what the snafu was. “So I just told them,” he recalled, “ ‘One does not resign in time of war. You chaps get the hell out of here and get back to work, and I’ll look into it.’ ”6 They obeyed. As MIT’s Wiesner later observed, “He was a man of strong opinions, which he expressed and applied with vigor, yet he stood in awe of the mysteries of nature, had a warm tolerance for human frailty, and was open-minded to change.”7

When the war ended, Bush produced a report in July 1945 at Roosevelt’s behest (which ended up being delivered to President Harry Truman) that advocated government funding of basic research in partnership with universities and industry. Bush chose an evocative and quintessentially American title, “Science, the Endless Frontier.” His introduction deserves to be reread whenever politicians threaten to defund the research needed for future innovation. “Basic research leads to new knowledge,” Bush wrote. “It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn.”8

Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9

Based on this report, Congress established the National Science Foundation. At first Truman vetoed the bill because it mandated that the director be appointed by an independent board rather than the president. But Bush turned Truman around by explaining that this would buffer him from those seeking political favors. “Van, you should be a politician,” Truman told him. “You have some of the instincts.” Bush replied, “Mr. President, what the hell do you think I’ve been doing around this town for five or six years?”10

The creation of a triangular relationship among government, industry, and academia was, in its own way, one of the significant innovations that helped produce the technological revolution of the late twentieth century. The Defense Department and National Science Foundation soon became the prime funders of much of America’s basic research, spending as much as private industry during the 1950s through the 1980s.I The return on that investment was huge, leading not only to the Internet but to many of the pillars of America’s postwar innovation and economic boom.11

A few corporate research centers, most notably Bell Labs, existed before the war. But after Bush’s clarion call produced government encouragement and contracts, hybrid research centers began to proliferate. Among the most notable were the RAND Corporation, originally formed to provide research and development (hence the name) to the Air Force; Stanford Research Institute and its offshoot, the Augmentation Research Center; and Xerox PARC. All would play a role in the development of the Internet.

Two of the most important of these institutes sprang up around Cambridge, Massachusetts, just after the war: Lincoln Laboratory, a military-funded research center affiliated with MIT, and Bolt, Beranek and Newman, a research and development company founded and populated by MIT (and a few Harvard) engineers. Closely associated with both of them was an MIT professor with a Missouri drawl and an easygoing talent for teambuilding. He would become the single most important person in creating the Internet.

J. C. R. LICKLIDER

In searching for fathers of the Internet, the best person to start with is a laconic yet oddly charming psychologist and technologist, with an open-faced grin and show-me attitude, named Joseph Carl Robnett Licklider, born in 1915 and known to everyone as “Lick.” He pioneered the two most important concepts underlying the Internet: decentralized networks that would enable the distribution of information to and from anywhere, and interfaces that would facilitate human-machine interaction in real time. Plus, he was the founding director of the military office that funded the ARPANET, and he returned for a second stint a decade later when protocols were created to weave it into what became the Internet. Said one of his partners and protégés, Bob Taylor, “He was really the father of it all.”12

Licklider’s father was a poor Missouri farm boy who became a successful insurance salesman in St. Louis and then, when the Depression wiped him out, a Baptist minister in a tiny rural town. As a doted-upon only child, Lick turned his bedroom into a model plane production facility and rebuilt clunker cars with his mother standing by his side handing him tools. Nevertheless, he felt trapped growing up in an isolated rural area filled with barbed-wire fences.

He escaped first to Washington University in St. Louis and then, after getting a doctorate in psychoacoustics (how we perceive sounds), joined Harvard’s psychoacoustics lab. Increasingly interested in the relationship between psychology and technology, how human brains and machines interacted, he moved to MIT to start a psychology section based in the Electrical Engineering Department.

At MIT Licklider joined the eclectic circle of engineers, psychologists, and humanists gathered around Professor Norbert Wiener, a theorist who studied how humans and machines worked together and coined the term cybernetics, which described how any system, from a brain to an artillery aiming mechanism, learned through communications, control, and feedback loops. “There was tremendous intellectual ferment in Cambridge after World War II,” Licklider recalled. “Wiener ran a weekly circle of forty or fifty people who got together. They would gather together and talk for a couple of hours. I was a faithful adherent to that.”13

Unlike some of his MIT colleagues, Wiener believed that the most promising path for computer science was to devise machines that would work well with human minds rather than try to replace them. “Many people suppose that computing machines are replacements for intelligence and have cut down the need for original thought,” Wiener wrote. “This is not the case.”14 The more powerful the computer, the greater the premium that will be placed on connecting it with imaginative, creative, high-level human thinking. Licklider became an adherent of this approach, which he later called “man-computer symbiosis.”

Licklider had a mischievous but friendly sense of humor. He loved watching the Three Stooges and was childishly fond of sight gags. Sometimes, when a colleague was about to give a slide presentation, Licklider would slip a photo of a beautiful woman into the projector’s carousel. At work he energized himself with a steady supply of Cokes and candies from the vending machines, and he gave out Hershey bars to his kids and students whenever they delighted him. He was also devoted to his graduate students, whom he would invite to dinners at his home in the Boston suburb of Arlington. “To him, collaboration was what it was all about,” his son Tracy said. “He wandered around setting up islands of people and encouraging them to be inquisitive and solve problems.” That was one reason he became interested in networks. “He knew that getting good answers involved distant collaboration. He loved spotting talented people and tying them together in a team.”15

His embrace, however, did not extend to people who were pretentious or pompous (with the exception of Wiener). When he thought a speaker was spouting nonsense, he would stand up and ask questions that seemed innocent but were in fact devilish. After a few moments, the speaker would realize he had been deflated, and Licklider would sit down. “He didn’t like poseurs or pretenders,” Tracy recalled. “He was never mean, but he slyly pricked people’s pretensions.”

One of Licklider’s passions was art. Whenever he traveled he would spend hours at museums, sometimes dragging along his two reluctant children. “He became a nut about it, couldn’t get enough of it,” said Tracy. Sometimes he would spend five hours or more in a museum marveling at each brushstroke, analyzing how each picture came together, and attempting to fathom what it taught about creativity. He had an instinct for spotting talent in all fields, arts as well as sciences, but he felt that it was easiest to discern in its purest forms, such as the brushstroke of a painter or the melodic refrain of a composer. He said he looked for the same creative strokes in the designs of computer or network engineers. “He became a really skilled scout of creativity. He often discussed what made people creative. He felt it was easier to see in an artist, so he tried even harder to spot it in engineering, where you can’t see the brushstrokes quite as readily.”16

Most important, Licklider was kind. When he worked at the Pentagon later in his career, according to his biographer Mitchell Waldrop, he noticed the cleaning woman admiring the art prints on his wall late one evening. She told him, “You know, Dr. Licklider, I always leave your room until last because I like to have time by myself, with nothing pressing, to look at the pictures.” He asked which print she liked most, and she pointed to a Cézanne. He was thrilled, since it was his favorite, and he promptly gave it to her.17

Licklider felt that his love of art made him more intuitive. He could process a wide array of information and sniff out patterns. Another attribute, which would serve him well when he helped put together the team that laid the foundations for the Internet, was that he loved to share ideas without craving credit for them. His ego was so tamed that he seemed to enjoy giving away rather than claiming credit for ideas that were developed in conversation. “For all his considerable influence on computing, Lick retained his modesty,” said Bob Taylor. “His favorite kind of joke was one at his own expense.”18

TIME-SHARING AND MAN-COMPUTER SYMBIOSIS

At MIT Licklider collaborated with the artificial intelligence pioneer John McCarthy, in whose lab the hackers of the Tech Model Railroad Club had invented Spacewar. With McCarthy in the lead, they helped to develop, during the 1950s, systems for computer time-sharing.

Up until then, when you wanted a computer to perform a task, you had to submit a stack of punch cards or a tape to the computer’s operators, as if handing an offering to the priests who shielded an oracle. This was known as “batch processing,” and it was annoying. It could take hours or even days to get results back; any little mistake might mean having to resubmit your cards for another run; and you might not be able to touch or even see the computer itself.

Time-sharing was different. It allowed a whole lot of terminals to be hooked up to the same mainframe, so that many users could type in commands directly and get a response almost instantly. Like a grandmaster playing dozens of games of chess simultaneously, the mainframe’s core memory would keep track of all the users, and its operating system would be capable of multitasking and running many programs. This provided users with an enchanting experience: you could have a hands-on and real-time interaction with a computer, like a conversation. “We had a kind of little religion growing here about how this was going to be totally different from batch processing,” said Licklider.19
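The mechanism is easy to caricature in code. The Python sketch below is a toy model, with invented names and numbers rather than anything drawn from CTSS or another real system of the era; it shows the essential move: the machine hands out short slices of processor time in rotation, so each of many terminals experiences what feels like a private, responsive computer.

```python
# Toy model of time-sharing: a single "mainframe" rotates through active
# user sessions, granting each a short quantum of work per turn. All names
# and values are illustrative, not taken from any historical system.

from collections import deque


class Session:
    def __init__(self, user: str, work_units: int):
        self.user = user
        self.remaining = work_units  # abstract units of computation left

    def run(self, quantum: int) -> None:
        done = min(quantum, self.remaining)
        self.remaining -= done
        print(f"{self.user}: ran {done} units, {self.remaining} remaining")


def time_share(sessions: list[Session], quantum: int = 3) -> None:
    """Round-robin scheduler: cycle through sessions until all are done."""
    ready = deque(sessions)
    while ready:
        session = ready.popleft()
        session.run(quantum)
        if session.remaining > 0:
            ready.append(session)  # back of the line; everyone gets a turn


if __name__ == "__main__":
    # Three "terminals" share one machine; each sees steady progress.
    time_share([Session("alice", 7), Session("bob", 5), Session("carol", 4)])
```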

