From The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution
by Walter Isaacson
Tim Berners-Lee (1955– ).
Marc Andreessen (1971– ).
Justin Hall (1974– ) and Howard Rheingold (1947– ) in 1995.
CHAPTER ELEVEN
THE WEB
There was a limit to how popular the Internet could be, at least among ordinary computer users, even after the advent of modems and the rise of online services made it possible for almost anyone to get connected. It was a murky jungle with no maps, filled with clusters of weird foliage with names like alt.config and Wide Area Information Servers that could intimidate all but the most intrepid pathfinder.
But just when the online services began opening up to the Internet in the early 1990s, a new method of posting and finding content miraculously appeared, as if it had burst into life from an underground atom smasher, which in fact was close to what happened. It made the carefully packaged online services obsolete, and it fulfilled—indeed far surpassed—the utopian dreams of Bush, Licklider, and Engelbart. More than most innovations of the digital age it was invented primarily by one man, who gave it a name that managed to be, as he was personally, both expansive and simple: the World Wide Web.
TIM BERNERS-LEE
As a kid growing up on the edge of London in the 1960s, Tim Berners-Lee came to a fundamental insight about computers: they were very good at crunching step by step through programs, but they were not very good at making random associations and clever links, the way that an imaginative human could.
This is not something that most kids ponder, but both of Berners-Lee’s parents were computer scientists. They worked as programmers on the Ferranti Mark I, the commercial version of the Manchester University stored-program computer. One evening at home his father, who had been asked by his boss to draft a speech on how to make computers more intuitive, talked about some books on the human brain that he was reading. His son recalled, “The idea stayed with me that computers could become much more powerful if they could be programmed to link otherwise unconnected information.”1 They also talked about Alan Turing’s concept of a universal machine. “It made me realize that the limitations on what you could do with a computer were just the limitations of your imagination.”2
Berners-Lee was born in 1955, the same year as Bill Gates and Steve Jobs, and he considered it a lucky time to be interested in electronics. Kids of that era found it easy to get hold of basic equipment and components that they could play with. “Things came along at the right time,” he explained. “Anytime we understood one technology, then industry produced something more powerful that we could afford with our pocket money.”3
In primary school, Berners-Lee and a friend hung around hobby shops, where they used their allowance to buy electromagnets and make their own relays and switches. “You’d have an electromagnet banged into a bit of wood,” he recalled. “When you turned it on, it would attract a bit of tin and that would complete a circuit.” From that they developed a deep understanding of what a bit was, how it could be stored, and the things that could be done with a circuit. Just when they were outgrowing simple electromagnetic switches, transistors became common enough that he and his friends could buy a bag of a hundred pretty cheaply. “We learned how to test transistors and use them to replace the relays we had built.”4 In doing so, he could visualize clearly what each component was doing by comparing them to the old electromagnetic switches they superseded. He used them to make audio sounds for his train set and to create circuits that controlled when the train should slow down.
“We began to imagine quite complicated logical circuits, but those became impractical because you would have to use too many transistors,” he said. But just as he ran into that problem, microchips became available at the local electronics store. “You buy these little bags of microchips with your pocket money and you’d realize you could make the core of a computer.”5 Not only that, but you could understand the core of the computer because you had progressed from simple switches to transistors to microchips and knew how each worked.
One summer just before he went off to Oxford, Berners-Lee had a job in a lumber yard. When he was dumping a pile of sawdust into a Dumpster, he spied an old calculator, partly mechanical and partly electronic, with rows of buttons. He salvaged it, wired it up with some of his switches and transistors, and soon had it working as a rudimentary computer. At a repair shop he bought a broken television set and used the monitor to serve as a display, after figuring out how the circuit of vacuum tubes worked.6
During his Oxford years, microprocessors became available. So, just as Wozniak and Jobs had done, he and his friends designed boards that they tried to sell. They were not as successful as the Steves, partly because, as Berners-Lee later said, “we didn’t have the same ripe community and cultural mix around us like there was at the Homebrew and in Silicon Valley.”7 Innovation emerges in places with the right primordial soup, which was true of the Bay Area but not of Oxfordshire in the 1970s.
His step-by-step hands-on education, starting with electromagnetic switches and progressing to microprocessors, gave him a deep understanding of electronics. “Once you’ve made something with wire and nails, when someone says a chip or circuit has a relay you feel confident using it because you know you could make one,” he said. “Now kids get a MacBook and regard it as an appliance. They treat it like a refrigerator and expect it to be filled with good things, but they don’t know how it works. They don’t fully understand what I knew, and my parents knew, which was what you could do with a computer was limited only by your imagination.”8
There was a second childhood memory that lingered: that of a Victorian-era almanac and advice book in his family home with the magical and musty title Enquire Within Upon Everything. The introduction proclaimed, “Whether You Wish to Model a Flower in Wax; to Study the Rules of Etiquette; to Serve a Relish for Breakfast or Supper; to Plan a Dinner for a Large Party or a Small One; to Cure a Headache; to Make a Will; to Get Married; to Bury a Relative; Whatever You May Wish to Do, Make, or to Enjoy, Provided Your Desire has Relation to the Necessities of Domestic Life, I Hope You will not Fail to ‘Enquire Within.’ ”9 It was, in some ways, the Whole Earth Catalog of the nineteenth century, and it was filled with random information and connections, all well indexed. “Enquirers are referred to the index at the end,” the title page instructed. By 1894 it had gone through eighty-nine editions and sold 1,188,000 copies. “The book served as a portal to a world of information, everything from how to remove clothing stains to tips on investing money,” Berners-Lee observed. “Not a perfect analogy for the Web, but a primitive starting point.”10
Another concept that Berners-Lee had been chewing on since childhood was how the human brain makes random associations—the smell of coffee conjures up the dress a friend wore when you last had coffee with her—whereas a machine can make only the associations that it has been programmed to make. He was also interested in how people work together. “You got half the solution in your brain, and I got half in my brain,” he explained. “If we are sitting around a table, I’ll start a sentence and you might help finish it, and that’s the way we all brainstorm. Scribble stuff on whiteboard, and we edit each other’s stuff. How can we do that when we are separated?”11
All of these elements, from Enquire Within to the brain’s ability to make random associations and to collaborate with others, were jangling around in Berners-Lee’s head when he graduated from Oxford. Later he would realize a truth about innovation: New ideas occur when a lot of random notions churn together until they coalesce. He described the process this way: “Half-formed ideas, they float around. They come from different places, and the mind has got this wonderful way of somehow just shoveling them around until one day they fit. They may fit not so well, and then we go for a bike ride or something, and it’s better.”12
For Berners-Lee, his own innovative concepts began to coalesce when he took a consulting job at CERN, the mammoth supercollider and particle physics lab near Geneva. He needed a way to catalogue the connections among the ten thousand or so researchers, their projects, and their computer systems. Both the computers and the people spoke many different languages and tended to make ad hoc links to one another. Berners-Lee needed to keep track of them, so he wrote a program to help him do so. He noticed that when people explained to him the various relationships at CERN, they tended to scribble diagrams with a lot of arrows. So he devised a method to replicate these in his program. He would type in the name of a person or project and then create links that would show which were related. Thus it was that Berners-Lee created a computer program that he named, after the Victorian almanac of his childhood, Enquire.
“I liked Enquire,” he wrote, “because it stored information without using structures like matrices or trees.”13 Such structures are hierarchical and rigid, whereas the human mind makes more random leaps. As he worked on Enquire, he developed a grander vision for what it could become. “Suppose all the information stored on computers everywhere were linked. There would be a single global information space. A web of information would form.”14 What he imagined, although he didn’t know it at the time, was Vannevar Bush’s memex machine—which could store documents, cross-reference them, retrieve them—writ global.
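The structure he liked can be made concrete with a small sketch in Python, offered purely as an illustration of the idea rather than as a reconstruction of Enquire itself; the node names, link labels, and class shapes below are invented:

```python
# A toy sketch, invented for illustration, of the structure Berners-Lee
# describes: named nodes joined by labeled links, forming a free-form graph
# rather than a rigid matrix or tree. Not his actual (long-lost) code.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str                    # a person, a project, a piece of software
    links: list = field(default_factory=list)

@dataclass
class Link:
    relation: str                # the arrow's label, e.g. "uses", "wrote"
    target: Node

class Enquire:
    def __init__(self):
        self.nodes = {}

    def node(self, name):
        return self.nodes.setdefault(name, Node(name))

    def link(self, src, relation, dst):
        # Any node may point at any other; no hierarchy is imposed.
        self.node(src).links.append(Link(relation, self.node(dst)))

db = Enquire()
db.link("Tim", "wrote", "module X")
db.link("module X", "is used by", "experiment Y")
for l in db.node("Tim").links:
    print(f"Tim --[{l.relation}]--> {l.target.name}")
```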
But before he got very far in creating Enquire, his consultancy at CERN came to an end. He left behind his computer and his eight-inch floppy disk containing all of the code, and it was promptly lost and forgotten. For a few years he worked in England for a company that made software for publishing documents. But he got bored and applied for a fellowship at CERN. In September 1984 he arrived back there to work with the group that was responsible for gathering the results of all of the experiments being done at the institute.
CERN was a cauldron of diverse peoples and computer systems using dozens of languages, both verbal and digital. All had to share information. “In this connected diversity,” Berners-Lee recalled, “CERN was a microcosm of the rest of the world.”15 In such a setting, he found himself returning to his childhood ruminations about how people with different perspectives work together to turn each other’s half-formed notions into new ideas. “I’ve always been interested in how people work together. I was working with a lot of people at other institutes and universities, and they had to collaborate. If they had been in the same room, they would have written all over the blackboard. I was looking for a system that would allow people to brainstorm and to keep track of the institutional memory of a project.”16
Such a system, he felt, would connect people from afar so that they could complete each other’s sentences and add useful ingredients to each other’s half-formed notions. “I wanted it to be something which would allow us to work together, design things together,” he said. “The really interesting part of the design is when we have lots of people all over the planet who have part of it in their heads. They have parts of the cure for AIDS, part of an understanding of cancer.”17 The goal was to facilitate team creativity—the brainstorming that occurs when people sit around fleshing out each other’s ideas—when the players are not in the same place.
So Berners-Lee reconstructed his Enquire program and began thinking about ways to expand it. “I wanted to access different kinds of information, such as a researcher’s technical papers, the manual for different software modules, minutes of meetings, hastily scribbled notes, and so on.”18 Actually, he wanted to do much more than that. He had the placid exterior of a congenital coder, but lurking underneath he harbored the whimsical curiosity of a child who stayed up late reading Enquire Within Upon Everything. Rather than merely devising a data management system, he yearned to create a collaborative playground. “I wanted to build a creative space,” he later said, “something like a sandpit where everyone could play together.”19
He hit upon a simple maneuver that would allow him to make the connections he wanted: hypertext. Now familiar to any Web surfer, hypertext is a word or phrase that is coded so that when clicked it sends the reader to another document or piece of content. Envisioned by Bush in his description of a memex machine, it was named in 1963 by the tech visionary Ted Nelson, who dreamed up a brilliantly ambitious project called Xanadu, never brought to fruition, in which all pieces of information would be published with two-way hypertext links to and from related information.
Hypertext was a way to allow the connections that were at the core of Berners-Lee’s Enquire program to proliferate like rabbits; anyone could link to documents on other computers, even those with different operating systems, without asking permission. “An Enquire program capable of external hypertext links was the difference between imprisonment and freedom,” he exulted. “New webs could be made to bind different computers together.” There would be no central node, no command hub. If you knew the web address of a document, you could link to it. That way the system of links could spread and sprawl, “riding on top of the Internet,” as Berners-Lee put it.20 Once again, an innovation was created by weaving together two previous innovations: in this case, hypertext and the Internet.
Using a NeXT computer, the handsome hybrid of a workstation and personal computer that Jobs created after being ousted from Apple, Berners-Lee adapted a protocol that he had been working on, called a Remote Procedure Call, that allowed a program running on one computer to call up a subroutine that was on another computer. Then he drew up a set of principles for naming each document. Initially he called these Universal Document Identifiers. The folks at the Internet Engineering Task Force in charge of approving standards balked at what they said was his “arrogance” in calling his scheme universal. So he agreed to change it to uniform. In fact he was pushed into changing all three words, turning it into Uniform Resource Locators—those URLs, such as http://www.cern.ch, that we now use every day.21 By the end of 1990 he had created a suite of tools that allowed his network to come to life: a Hypertext Transfer Protocol (HTTP) to allow hypertext to be exchanged online, a Hypertext Markup Language (HTML) for creating pages, a rudimentary browser to serve as the application software that retrieved and displayed information, and server software that could respond to requests from the network.
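How those pieces divide the work can be illustrated with a minimal sketch in modern Python, using only its standard library; nothing here is drawn from Berners-Lee's 1990 code, and the page content, hostnames, and port handling are invented for the example. The server answers HTTP requests, the URL names the host and document, and the returned HTML carries a hypertext link of the kind described above:

```python
# A minimal, modern sketch of the roles of Berners-Lee's tools -- server,
# URL, HTTP, HTML -- using Python's standard library. The page content and
# names are invented for illustration; this is not the 1990 CERN code.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

PAGE = b"""<html><body>
<h1>WorldWideWeb</h1>
<p>A hypertext link: <a href="http://info.cern.ch/">the original CERN site</a>.</p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):                     # answers e.g. "GET /hello.html HTTP/1.1"
        self.send_response(200)           # the HTTP status line
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)            # the HTML document itself

# The server side: listen on any free port and answer HTTP requests.
server = HTTPServer(("localhost", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: the URL names host, port, and document; HTTP fetches it.
html = urlopen(f"http://localhost:{port}/hello.html").read().decode()
print(html)                               # a real browser would render this
server.shutdown()
```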
In March 1989 Berners-Lee had his design in place and officially submitted a funding proposal to the top managers at CERN. “The hope would be to allow a pool of information to develop which could grow and evolve,” he wrote. “A ‘web’ of notes with links between them is far more useful than a fixed hierarchical system.”22 Unfortunately, his proposal elicited as much bafflement as enthusiasm. “Vague, but exciting,” his boss, Mike Sendall, wrote atop the memo. “When I read Tim’s proposal,” he later admitted, “I could not figure out what it was, but I thought it was great.”23 Once again, a brilliant inventor found himself in need of a collaborator to turn a concept into a reality.
More than most digital-age innovations, the conception of the Web was driven primarily by one person. But Berners-Lee did need a partner in bringing it to fruition. Fortunately, he was able to find one in Robert Cailliau, a Belgian engineer at CERN, who had been toying with similar ideas and was willing to join forces. “In the marriage of hypertext and the Internet,” said Berners-Lee, “Robert was best man.”
With his personable demeanor and bureaucratic skills, Cailliau was the perfect person to be the evangelist for the project within CERN and the project manager who got things done. A fastidious dresser who methodically scheduled his haircuts, he was “the kind of engineer who can be driven mad by the incompatibility of power plugs in different countries,” according to Berners-Lee.24 They formed a partnership often seen in innovative teams: the visionary product designer paired with the diligent project manager. Cailliau, who loved planning and organizational work, cleared the way, he said, for Berners-Lee to “bury his head in the bits and develop his software.” One day Cailliau tried to go over a project plan with Berners-Lee and realized, “He just did not understand the concept!”25 Because of Cailliau, he didn’t have to.
Cailliau’s first contribution was to sharpen the funding proposal that Berners-Lee had submitted to CERN administrators by making it less vague while keeping it exciting. He began with its title, “Information Management.” Cailliau insisted that they figure out a catchier name for the project, which shouldn’t be too hard. Berners-Lee had a few ideas. The first was Mine of Information, but that abbreviated to MOI, French for me, which sounded a bit egocentric. The second idea was The Information Mine, but that abbreviated to TIM, which was even more so. Cailliau rejected the approach, often used at CERN, of plucking the name of some Greek god or Egyptian pharaoh. Then Berners-Lee came up with something that was direct and descriptive. “Let’s call it the World Wide Web,” he said. It was the metaphor he had used in his original proposal. Cailliau balked. “We can’t call it that, because the abbreviation WWW sounds longer than the full name!”26 The initials have three times as many syllables as the name itself. But Berners-Lee could be quietly stubborn. “It sounds good,” he declared. So the title of the proposal was changed to “WorldWideWeb: Proposal for a HyperText Project.” Thus the Web was named.
Once the project was officially embraced, the CERN administrators wanted to patent it. When Cailliau raised the issue, Berners-Lee objected. He wanted the Web to spread and evolve as quickly as possible, and that meant it should be free and open. At one point he looked at Cailliau and asked accusingly, “Robert, do you want to be rich?” As Cailliau recalled, his initial reaction was “Well, it helps, no?”27 That was the incorrect response. “He apparently didn’t care about that,” Cailliau realized. “Tim’s not in it for the money. He accepts a much wider range of hotel-room facilities than a CEO would.”28
Instead Berners-Lee insisted that the Web protocols should be made available freely, shared openly, and put forever in the public domain. After all, the whole point of the Web, and the essence of its design, was to promote sharing and collaboration. CERN issued a document declaring that it “relinquishes all intellectual property rights to this code, both source and binary form, and permission is granted for anyone to use, duplicate, modify, and redistribute it.”29 Eventually CERN joined forces with Richard Stallman and adopted his GNU General Public License. The result was one of the grandest free and open-source projects in history.
That approach reflected Berners-Lee’s self-effacing style. He was averse to any hint of personal aggrandizement. Its wellsprings also came from someplace deeper within him: a moral outlook based on peer sharing and respect, something he found in the Unitarian Universalist Church he joined. As he said of his fellow Unitarians, “They meet in churches instead of wired hotels, and discuss justice, peace, conflict, and morality rather than protocols and data formats, but in other ways the peer respect is very similar to that of the Internet Engineering Task Force. . . . The design of the Internet and the Web is a search for a set of rules which will allow computers to work together in harmony, and our spiritual and social quest is for a set of rules which allow people to work together in harmony.”30
Despite the hoopla that accompanies many product announcements—think Bell Labs unveiling the transistor or Steve Jobs the Macintosh—some of the most momentous innovations tiptoe quietly onto history’s stage. On August 6, 1991, Berners-Lee was glancing through the Internet’s alt.hypertext newsgroup and ran across this question: “Is anyone aware of research or development efforts in . . . hypertext links enabling retrieval from multiple heterogeneous sources?” His answer, “from: timbl@info.cern.ch at 2:56 pm,” became the first public announcement of the Web. “The WorldWideWeb project aims to allow links to be made to any information anywhere,” he began. “If you’re interested in using the code, mail me.”31
With his low-key personality and even lower-key posting, Berners-Lee did not fathom what a profound idea he had unleashed. Any information anywhere. “I spent a lot of time trying to make sure people could put anything on the web,” he said more than two decades later. “I had no idea that people would put literally everything on it.”32 Yes, everything. Enquire Within Upon Everything.
MARC ANDREESSEN AND MOSAIC
For people to summon forth sites on the Web, they needed a piece of client software on their own computers that became known as a browser. Berners-Lee wrote one that could both read and edit documents; his hope was that the Web would become a place where users could collaborate. But his browser worked only on NeXT computers, of which there were few, and he had neither the time nor the resources to create other browser versions. So he enlisted a young intern at CERN, an undergraduate named Nicola Pellow who was majoring in math at Leicester Polytechnic, to write the first all-purpose browser for UNIX and Microsoft operating systems. It was rudimentary, but it worked. “It was to be the vehicle that allowed the Web to take its first tentative step on to the world stage, but Pellow was unfazed,” Cailliau recalled. “She was given the task and she simply sat down to do it, little realizing the enormity of what she was about to unleash.”33 Then she went back to Leicester Polytechnic.
Berners-Lee began urging others to improve on Pellow’s work: “We energetically suggested to everyone everywhere that the creation of browsers would make useful projects.”34 By the fall of 1991 there were a half-dozen experimental versions, and the Web quickly spread to other research centers in Europe.
That December it made the leap across the Atlantic. Paul Kunz, a particle physicist at the Stanford Linear Accelerator Center, was visiting CERN, and Berners-Lee recruited him to the world of the Web. “He twisted my arm and insisted that I come see him,” according to Kunz, who worried that he was in for a boring demonstration of information management. “But then he showed me something that opened my eyes.”35 It was a Web browser on Berners-Lee’s NeXT calling up information from an IBM machine somewhere else. Kunz brought the software back with him, and http://slacvm.slac.stanford.edu/ became the first Web server in the United States.
The World Wide Web hit orbital velocity in 1993. The year began with fifty Web servers in the world, and by October there were five hundred. One reason was that the primary alternative to the Web for accessing information on the Internet was a sending and fetching protocol developed at the University of Minnesota called Gopher,I and word leaked out that the developers were planning to charge a fee for use of the server software. A more important impetus was the creation of the first easy-to-install Web browser with graphic capabilities, named Mosaic. It was developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, which had been funded by the Gore Act.
The man, or overgrown kid, most responsible for Mosaic was a gentle but intense undergraduate named Marc Andreessen, a corn-fed six-foot-four jolly giant born in Iowa in 1971 and raised in Wisconsin. Andreessen was a fan of the pioneers of the Internet, and their writings inspired him: “When I got a copy of Vannevar Bush’s ‘As We May Think,’ I said to myself, ‘Yep, there it is! He figured it out!’ Bush envisioned the Internet as fully as you could, given that you didn’t have digital computers. He and Charles Babbage are in the same league.” Another hero was Doug Engelbart. “His lab was node four on the Internet, which was like having the fourth telephone in the world. He had the amazing foresight to understand what the Internet would be before it got built.”36
When Andreessen saw the Web demonstrated in November 1992, he was blown away. So he enlisted an NCSA staffer, Eric Bina, a first-class programmer, to partner with him in building a more exciting browser. They loved Berners-Lee’s concepts, but they thought CERN’s implementation software was drab and devoid of cool features. “If someone were to build the right browser and server, that would be really interesting,” Andreessen told Bina. “We can run with this and really make it work.”37
For two months they engaged in a programming binge that rivaled those of Bill Gates and Paul Allen. For three or four days straight they would code around the clock—Andreessen fueled by milk and cookies, Bina by Skittles and Mountain Dew—and then crash for a full day to recover. They were a great team: Bina was a methodical programmer, Andreessen a product-driven visionary.38
On January 23, 1993, with just a little more fanfare than Berners-Lee had indulged in when launching the Web, marca@ncsa.uiuc.edu announced Mosaic on the www-talk Internet newsgroup. “By the power vested in me by nobody in particular,” Andreessen began, “alpha/beta version 0.5 of NCSA’s Motif-based networked information systems and World Wide Web browser, X Mosaic, is hereby released.” Berners-Lee, who was initially pleased, posted a response two days later: “Brilliant! Every new browser is sexier than the last.” He added it to the growing list of browsers available for download from info.cern.ch.39
Mosaic was popular because it could be installed simply and enabled images to be embedded in Web pages. But it became even more popular because Andreessen knew one of the secrets of digital-age entrepreneurs: he fanatically heeded user feedback and spent time on Internet newsgroups soaking up suggestions and complaints. Then he persistently released updated versions. “It was amazing to launch a product and get immediate feedback,” he enthused. “What I got out of that feedback loop was an instant sense of what was working and what wasn’t.”40
Andreessen’s focus on continual improvement impressed Berners-Lee: “You’d send him a bug report and then two hours later he’d mail you a fix.”41 Years later, as a venture capitalist, Andreessen made a rule of favoring startups whose founders focused on running code and customer service rather than charts and presentations. “The former are the ones who become the trillion-dollar companies,” he said.42
There was something about Andreessen’s browser, however, that disappointed and then began to annoy Berners-Lee. It was beautiful, even dazzling, but Andreessen’s emphasis was on enabling rich media for publishing eye-catching pages, and Berners-Lee felt that the focus should instead be on providing tools that would facilitate serious collaboration. So in March 1993, after a meeting in Chicago, he drove “across the seemingly interminable cornfields” of central Illinois to visit Andreessen and Bina at NCSA.
It was not a pleasant session. “All of my earlier meetings with browser developers had been meetings of minds,” Berners-Lee recalled. “But this one had a strange tension to it.” He felt that the Mosaic developers, who had their own public relations staff and were garnering a lot of publicity, were “attempting to portray themselves as the center of Web development and to basically rename the Web as Mosaic.”43 They seemed to be trying to own the Web, he thought, and perhaps profit from it.II
Andreessen found Berners-Lee’s recollection amusing. “When Tim came, it was more of a state visit than a working session. The Web had already become a brush fire, and he was uncomfortable that he was no longer controlling it.” Berners-Lee’s opposition to embedding images struck him as quaint and purist. “He only wanted text,” Andreessen remembered. “He specifically didn’t want magazines. He had a very pure vision. He basically wanted it used for scientific papers. His view was that images are the first step on the road to hell. And the road to hell is multimedia content and magazines, garishness and games and consumer stuff.” Because he was customer-focused, Andreessen thought that this was academic hogwash. “I’m a Midwestern tinkerer type. If people want images, they get images. Bring it on.”44
Berners-Lee’s more fundamental criticism was that by focusing on fancy display features, such as multimedia and ornamental fonts, Andreessen was ignoring a capability that should have been in the browser: editing tools that would allow users to interact with and contribute to the content on a Web page. The emphasis on display rather than editing tools nudged the Web into becoming a publishing platform for people who had servers rather than a place for collaboration and shared creativity. “I was disappointed that Marc didn’t put editing tools in Mosaic,” Berners-Lee said. “If there had been more of an attitude of using the Web as a collaborative medium rather than a publishing medium, then I think it would be much more powerful today.”45
Early versions of Mosaic did have a “collaborate” button, which allowed users to download a document, work on it, and repost it. But the browser was not a full-fledged editor, and Andreessen felt it was impractical to turn it into one. “I was amazed at this near-universal disdain for creating an editor,” complained Berners-Lee. “Without a hypertext editor, people would not have the tools to really use the Web as an intimate collaborative medium. Browsers would let them find and share information, but they could not work together intuitively.”46 To some extent, he was right. Despite the astonishing success of the Web, the world would have been a more interesting place if the Web had been bred as a more collaborative medium.