The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution
Walter Isaacson

Jobs’s two main visits with his team to Xerox PARC were in December 1979. Jef Raskin, an Apple engineer who was designing a friendly computer that would eventually become the Macintosh, had already seen what Xerox was doing and wanted to convince Jobs to look into it. One problem was that Jobs found Raskin insufferable—the technical terminology he used for Raskin was “a shithead who sucks”—but eventually Jobs made the pilgrimage. He had worked out a deal with Xerox that allowed the Apple folks to study the technology in return for allowing Xerox to make a million-dollar investment in Apple.

Jobs was certainly not the first outsider to see what Xerox PARC had wrought. Its researchers had given hundreds of demonstrations to visitors, and they had already distributed more than a thousand Xerox Altos, the expensive computer developed by Lampson, Thacker, and Kay that used a graphical user interface and other PARC innovations. But Jobs was the first to become obsessed with the idea of incorporating PARC’s interface ideas into a simple, inexpensive, personal computer. Once again, the greatest innovation would come not from the people who created the breakthroughs but from the people who applied them usefully.

On Jobs’s first visit, the Xerox PARC engineers, led by Adele Goldberg, who worked with Alan Kay, were reserved. They did not show Jobs much. But he threw a tantrum—“Let’s stop this bullshit!” he kept shouting—and finally was given, at the behest of Xerox’s top management, a fuller show. Jobs bounced around the room as his engineers studied each pixel on the screen. “You’re sitting on a goldmine,” he shouted. “I can’t believe Xerox is not taking advantage of this.”

There were three major innovations on display. The first was Ethernet, the technology developed by Bob Metcalfe for creating local area networks. Like Gates and other pioneers of personal computers, Jobs was not very interested—certainly not as interested as he should have been—in networking technology. He was focused on the ability of computers to empower individuals rather than to facilitate collaboration. The second innovation was object-oriented programming. That, likewise, did not grab Jobs, who was not a programmer.

What caught his attention was the graphical user interface featuring a desktop metaphor that was as intuitive and friendly as a neighborhood playground. It had cute icons for documents and folders and other things you might want, including a trash can, and a mouse-controlled cursor that made them easy to click. Not only did Jobs love it, but he could see ways to improve it, make it simpler and more elegant.

The GUI was made possible by bitmapping, another innovation pioneered at Xerox PARC. Until then, most computers, including the Apple II, would merely generate numerals or letters on the screen, usually in a ghastly green against a black background. Bitmapping allowed each and every pixel on the screen to be controlled by the computer—turned off or on and in any color. That permitted all sorts of wonderful displays, fonts, designs, and graphics. With his feel for design, familiarity with fonts, and love of calligraphy, Jobs was blown away by bitmapping. “It was like a veil being lifted from my eyes,” he recalled. “I could see what the future of computing was destined to be.”
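
What “controlling each and every pixel” means in practice is easy to sketch. The minimal C fragment below writes into a bitmapped framebuffer; the dimensions, the 32-bit color format, and the in-memory layout are illustrative assumptions, not the design of the Alto, the Lisa, or the Macintosh.

```c
#include <stdint.h>
#include <string.h>

/* Illustrative framebuffer: every pixel is individually addressable by
 * the program. The 640x480 size and 32-bit color pixels are assumptions
 * for this sketch, not any historical machine's actual layout. */
#define WIDTH  640
#define HEIGHT 480

static uint32_t framebuffer[HEIGHT][WIDTH];

/* Turn a single pixel on in a given color. */
static void set_pixel(int x, int y, uint32_t color)
{
    if (x >= 0 && x < WIDTH && y >= 0 && y < HEIGHT)
        framebuffer[y][x] = color;
}

int main(void)
{
    memset(framebuffer, 0, sizeof framebuffer);   /* black background */

    /* Draw a one-pixel-thick horizontal line: a character-cell display
     * could not do this, because it could only place whole letters. */
    for (int x = 100; x < 540; x++)
        set_pixel(x, 240, 0xFFFFFFFFu);           /* white */

    return 0;
}
```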

As Jobs drove back to Apple’s office in Cupertino, at a speed that would have awed even Gates, he told his colleague Bill Atkinson that they had to incorporate—and improve upon—Xerox’s graphical interface in future Apple computers, such as the forthcoming Lisa and Macintosh. “This is it!” he shouted. “We’ve got to do it!” It was a way to bring computers to the people.108

Later, when he was challenged about pilfering Xerox’s ideas, Jobs quoted Picasso: “Good artists copy, great artists steal.” He added, “And we have always been shameless about stealing great ideas.” He also crowed that Xerox had fumbled its idea. “They were copier-heads who had no clue about what a computer could do,” he said of Xerox’s management. “They just grabbed defeat from the greatest victory in the computer industry. Xerox could have owned the entire computer industry.”109

In fact, neither explanation does Jobs and Apple justice. As the case of the forgotten Iowa inventor John Atanasoff shows, conception is just the first step. What really matters is execution. Jobs and his team took Xerox’s ideas, improved them, implemented them, and marketed them. Xerox had the chance to do that, and they in fact tried to, with a machine called the Xerox Star. It was clunky and kludgy and costly, and it flopped. The Apple team simplified the mouse so it had only one button, gave it the power to move documents and other items around the screen, allowed a file’s location to be changed just by dragging a document and “dropping” it into a folder, created pull-down menus, and allowed the illusion of documents piling on top of each other and overlapping.

Apple launched Lisa in January 1983 and then, more successfully, Macintosh a year later. Jobs knew when he unveiled the Mac that it would propel the personal computer revolution by being a machine that was friendly enough to take home. At the dramatic product launch, he walked across a dark stage to pull the new computer out of a cloth bag. The theme from Chariots of Fire began to play, and the word MACINTOSH scrolled horizontally across the screen, then underneath it the words insanely great! appeared in elegant script, as if being slowly written by hand. There was a moment of awed silence in the auditorium, then a few gasps. Most had never seen, or even imagined, something so spectacular. The screen then flicked through displays of different fonts, documents, charts, drawings, a chess game, a spreadsheet, and a rendering of Jobs with a thought bubble containing a Macintosh by his head. The ovation lasted for five minutes.110

The Macintosh launch was accompanied by a memorable ad, “1984,” that showed a young heroine outracing the authoritarian police to throw a hammer into a screen, destroying Big Brother. It was Jobs the rebel taking on IBM. And Apple now had an advantage: it had perfected and implemented a graphical user interface, the great new leap in human-machine interaction, while IBM and its operating system supplier Microsoft were still using curt command lines with c:> prompts.

WINDOWS

In the early 1980s, before the introduction of the Macintosh, Microsoft had a good relationship with Apple. In fact, on the day that IBM launched its PC in August 1981, Gates was visiting Jobs at Apple, which was a regular occurrence since Microsoft was making most of its revenue writing software for the Apple II. Gates was still the supplicant in the relationship. In 1981 Apple had $334 million in revenue, compared to Microsoft’s $15 million. Jobs wanted Microsoft to write new versions of its software for the Macintosh, which was still a secret development project. So at their August 1981 meeting, he confided his plans to Gates.

Gates thought that the idea of the Macintosh—an inexpensive computer for the masses with a simple graphical user interface—sounded, as he put it, “super neat.” He was willing, indeed eager, to have Microsoft write application software for it. So he invited Jobs up to Seattle. In his presentation there to the Microsoft engineers, Jobs was at his charismatic best. With a bit of metaphorical license, he spun his vision of a factory in California that would take in sand, the raw material of silicon, and churn out an “information appliance” that was so simple it would need no manual. The Microsoft folks code-named the project “Sand.” They even reverse-engineered it into an acronym: Steve’s Amazing New Device.111

Jobs had one major worry about Microsoft: he didn’t want it to copy the graphical user interface. With his feel for what would wow average consumers, he knew that the desktop metaphor with point-and-click navigation would be, if done right, the breakthrough that would make computers truly personal. At a design conference in Aspen in 1981, he waxed eloquent about how friendly computer screens would become by using “metaphors that people already understand such as that of documents on a desktop.” His fear that Gates would steal the idea was somewhat ironic, since Jobs himself had filched the concept from Xerox. But to Jobs’s way of thinking, he had made a business deal for the rights to appropriate Xerox’s idea. Plus he had improved it.

So Jobs wrote into his contract with Microsoft a clause that he believed would give Apple at least a year’s head start in having a graphical user interface. It decreed that for a certain period Microsoft would not produce for any company other than Apple any software that “utilizes a mouse or tracking ball” or had a point-and-click graphical interface. But Jobs’s reality distortion field got the better of him. Because he was so intent on getting Macintosh on the market by late 1982, he became convinced that it would happen. So he agreed that the prohibition would last until the end of 1983. As it turned out, Macintosh did not ship until January 1984.

In September 1981 Microsoft secretly began designing a new operating system, intended to replace DOS, based on the desktop metaphor with windows, icons, mouse, and pointer. It hired from Xerox PARC Charles Simonyi, a software engineer who had worked alongside Alan Kay in creating graphical programs for the Xerox Alto. In February 1982 the Seattle Times ran a picture of Gates and Allen that, as a sharp-eyed reader may have noted, had a whiteboard in the background with a few sketches and the words Window manager on top. By that summer, just as Jobs began to realize that the release date for the Macintosh would slip until at least late 1983, he became paranoid. His fears were heightened when his close pal Andy Hertzfeld, an engineer on the Macintosh team, reported that his contact at Microsoft had begun asking detailed questions about how bitmapping was executed. “I told Steve that I suspected that Microsoft was going to clone the Mac,” Hertzfeld recalled.112

Jobs’s fears were realized in November 1983, two months before the Macintosh was launched, when Gates held a press conference at the Palace Hotel in Manhattan. He announced that Microsoft was developing a new operating system that would be available for IBM PCs and their clones, featuring a graphical user interface. It would be called Windows.

Gates was within his rights. His restrictive agreement with Apple expired at the end of 1983, and Microsoft did not plan to ship Windows until well after that. (As it turned out, Microsoft took so long to finish even a shoddy version 1.0 that Windows would not end up shipping until November 1985.) Nevertheless, Jobs was livid, which was not a pretty sight. “Get Gates down here immediately,” he ordered one of his managers. Gates complied, but he was unintimidated. “He called me down to get pissed off at me,” Gates recalled. “I went down to Cupertino, like a command performance. I told him, ‘We’re doing Windows.’ I said to him, ‘We’re betting our company on graphics interface.’ ” In a conference room filled with awed Apple employees, Jobs shouted back, “You’re ripping us off! I trusted you, and now you’re stealing from us!”113 Gates had a habit of getting calmer and cooler whenever Jobs worked himself into a frenzy. At the end of Jobs’s tirade, Gates looked at him and, in his squeaky voice, replied with what became a classic zinger: “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”114

Jobs remained angry and resentful for the rest of his life. “They just ripped us off completely, because Gates has no shame,” he said almost thirty years later, shortly before he died. Upon hearing this, Gates responded, “If he believes that, he really has entered into one of his own reality distortion fields.”115

The courts ended up ruling that Gates was legally correct. A decision by a federal appeals court noted that “GUIs were developed as a user-friendly way for ordinary mortals to communicate with the Apple computer . . . based on a desktop metaphor with windows, icons and pull-down menus which can be manipulated on the screen with a hand-held device called a mouse.” But it ruled, “Apple cannot get patent-like protection for the idea of a graphical user interface, or the idea of a desktop metaphor.” Protecting a look-and-feel innovation was almost impossible.

Whatever the legalities were, Jobs had a right to be angry. Apple had been more innovative, imaginative, elegant in execution, and brilliant in design. Microsoft’s GUI was shoddy, with tiled windows that could not overlap with each other and graphics that looked like they had been designed by drunkards in a Siberian basement.

Nevertheless, Windows eventually clawed its way to dominance, not because its design was better but because its business model was better. The market share commanded by Microsoft Windows reached 80 percent by 1990 and kept rising, to 95 percent by 2000. For Jobs, Microsoft’s success represented an aesthetic flaw in the way the universe worked. “The only problem with Microsoft is they just have no taste, they have absolutely no taste,” he later said. “I don’t mean that in a small way. I mean that in a big way, in the sense that they don’t think of original ideas and they don’t bring much culture into their product.”116

The primary reason for Microsoft’s success was that it was willing and eager to license its operating system to any hardware maker. Apple, by contrast, opted for an integrated approach. Its hardware came only with its software and vice versa. Jobs was an artist, a perfectionist, and thus a control freak who wanted to be in charge of the user experience from beginning to end. Apple’s approach led to more beautiful products, a higher profit margin, and a more sublime user experience. Microsoft’s approach led to a wider choice of hardware. It also turned out to be a better path for gaining market share.

RICHARD STALLMAN, LINUS TORVALDS, AND THE FREE AND OPEN-SOURCE SOFTWARE MOVEMENTS

In late 1983, just as Jobs was preparing to unveil the Macintosh and Gates was announcing Windows, another approach to the creation of software emerged. It was pushed by one of the diehard denizens of the MIT Artificial Intelligence Lab and Tech Model Railroad Club, Richard Stallman, a truth-possessed hacker with the looks of an Old Testament prophet. With even greater moral fervor than the Homebrew Computer Club members who copied tapes of Microsoft BASIC, Stallman believed that software should be collaboratively created and freely shared.117

At first glance, this did not seem like an approach that would provide incentives for people to produce great software. The joy of free sharing wasn’t what motivated Gates, Jobs, and Bricklin. But because there was a collaborative and communitarian ethic that permeated hacker culture, the free and open-source software movements ended up being powerful forces.

Born in 1953, Richard Stallman was intensely interested in math as a child growing up in Manhattan, and he conquered calculus on his own as a young boy. “Mathematics has something in common with poetry,” he later said. “It’s made out of these true relationships, true steps, true deductions, so it has this beauty about it.” Unlike his classmates, he was deeply averse to competition. When his high school teacher divided the students into two teams for a quiz contest, Stallman refused to answer any questions. “I resisted the notion of competing,” he explained. “I saw that I was being manipulated and my classmates were falling prey to this manipulation. They all wanted to beat the other people, who were just as much their friends as were the people on their own team. They started demanding that I answer the questions so we could win. But I resisted the pressure because I had no preference for one team or the other.”118

Stallman went to Harvard, where he became a legend even among the math wizards, and during the summers and after he graduated he worked at the MIT Artificial Intelligence Lab, two subway stops away in Cambridge. There he added to the train track layout at the Tech Model Railroad Club, wrote a PDP-11 simulator to run on the PDP-10, and grew enamored with the collaborative culture. “I became part of a software-sharing community that had existed for many years,” he recalled. “Whenever people from another university or a company wanted to port and use a program, we gladly let them. You could always ask to see the source code.”119

Like a good hacker, Stallman defied restrictions and locked doors. With his fellow students, he devised multiple ways to break into offices where there were forbidden terminals; his own specialty was climbing through the false ceilings, pushing aside a tile, and lowering a long strip of magnetic tape tipped with wads of sticky duct tape to open door handles. When MIT instituted a database of users and a system of strong passwords, Stallman resisted, and he rallied his colleagues to do so as well: “I thought that was disgusting, so I didn’t fill out the form and I created a null-set password.” At one point a professor warned that the university might delete his directory of files. That would be unfortunate for everyone, Stallman replied, since some of the system’s resources were in his directory.120

Unfortunately for Stallman, the hacker camaraderie at MIT began to dissipate in the early 1980s. The lab bought a new time-sharing computer with a software system that was proprietary. “You had to sign a nondisclosure agreement even to get an executable copy,” Stallman lamented. “This meant that the first step in using a computer was to promise not to help your neighbor. A cooperating community was forbidden.”121

Instead of rebelling, many of his colleagues joined for-profit software firms, including a spinoff from the MIT lab called Symbolics, where they made a lot of money by not sharing freely. Stallman, who sometimes slept in his office and looked like he shopped in a thrift store, did not share their money-seeking motivations and regarded them as traitors. The final straw came when Xerox donated a new laser printer and Stallman wanted to institute a software hack so that it would warn users on the network when it jammed. He asked someone who had the printer’s source code to provide it, but the person refused, saying he had signed a nondisclosure agreement. Stallman was morally outraged.

All of these events turned Stallman into even more of a Jeremiah, railing against idolatry and preaching from a book of lamentations. “Some people do compare me with an Old Testament prophet, and the reason is Old Testament prophets said certain social practices were wrong,” he asserted. “They wouldn’t compromise on moral issues.”122 Neither would Stallman. Proprietary software was “evil,” he said, because “it required people to agree not to share and that made society ugly.” The way to resist and defeat the forces of evil, he decided, was to create free software.

So in 1982, repelled by the selfishness that seemed to pervade Reagan-era society as well as software entrepreneurs, Stallman embarked on a mission to create an operating system that was free and completely nonproprietary. In order to prevent MIT from making a claim to any rights to it, he quit his job at the Artificial Intelligence Lab, though he was allowed by his indulgent supervisor to keep his key and continue using the lab’s resources. The operating system Stallman decided to develop was one that would be similar to and compatible with UNIX, which had been developed at Bell Labs in 1971 and was the standard for most universities and hackers. With a coder’s subtle humor, Stallman created a recursive acronym for his new operating system, GNU, which stood for GNU’s Not UNIX.

In the March 1985 issue of Dr. Dobb’s Journal, a publication that sprang out of the Homebrew Computer Club and People’s Computer Company, Stallman issued a manifesto: “I consider that the Golden Rule requires that if I like a program I must share it with other people who like it. Software sellers want to divide the users and conquer them, making each user agree not to share with others. I refuse to break solidarity with other users in this way. . . . Once GNU is written, everyone will be able to obtain good system software free, just like air.”123

Stallman’s free software movement was imperfectly named. Its goal was not to insist that all software come free of charge but that it be liberated from any restrictions. “When we call software ‘free,’ we mean that it respects the users’ essential freedoms: the freedom to run it, to study and change it, and to redistribute copies with or without changes,” he repeatedly had to explain. “This is a matter of freedom, not price, so think of ‘free speech,’ not ‘free beer.’ ”

For Stallman, the free software movement was not merely a way to develop peer-produced software; it was a moral imperative for making a good society. The principles that it promoted were, he said, “essential not just for the individual users’ sake, but for society as a whole because they promote social solidarity—that is, sharing and cooperation.”124

To enshrine and certify his creed, Stallman came up with a GNU General Public License and also the concept, suggested by a friend, of “copyleft,” which is the flipside of asserting a copyright. The essence of the General Public License, Stallman said, is that it gives “everyone permission to run the program, copy the program, modify the program, and distribute modified versions—but not permission to add restrictions of their own.”125
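
In practice, that permission travels with the code. A GPL-covered program carries a notice at the top of each source file granting those freedoms and pointing to the full license text. The sketch below uses the wording the GNU project later suggested for version 3 of the license, lightly abridged, and a hypothetical file name; it is shown only to illustrate the convention, not to reproduce Stallman’s original 1989 wording.

```c
/* frobnicate.c -- part of a hypothetical GPL-covered program.
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 */
```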

Stallman personally wrote the first components for the GNU operating system, including a text editor, a compiler, and many other tools. But it became increasingly clear that one key element was missing. “What about the kernel?” Byte magazine asked in a 1986 interview. The central module of an operating system, a kernel manages the requests from software programs and turns them into instructions for the computer’s central processing unit. “I’m finishing the compiler before I go to work on the kernel,” Stallman answered. “I am also going to have to rewrite the file system.”126
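
Seen from the kernel’s side, “managing the requests from software programs” amounts to a dispatch routine: a program issues a numbered system call, and the kernel routes that number to the code that does the work. The C sketch below is a toy; the call numbers and helper names are invented for illustration and belong to no real kernel, GNU’s or otherwise.

```c
#include <stddef.h>

/* Hypothetical system-call numbers, invented for this sketch. */
enum { SYS_OPEN = 0, SYS_READ = 1, SYS_WRITE = 2, SYS_CLOSE = 3 };

/* Stubs standing in for the routines that would actually touch the
 * file system and the hardware. */
static long do_open(const char *path)            { (void)path; return 3; }
static long do_read(int fd, void *buf, size_t n) { (void)fd; (void)buf; (void)n; return 0; }

/* The kernel's entry point: turn a numbered request from a program into
 * a call to the routine that carries it out, and hand back the result. */
long syscall_dispatch(int number, long a, long b, long c)
{
    switch (number) {
    case SYS_OPEN: return do_open((const char *)a);
    case SYS_READ: return do_read((int)a, (void *)b, (size_t)c);
    /* ... SYS_WRITE, SYS_CLOSE, and hundreds of others ... */
    default:       return -1;   /* unknown request */
    }
}
```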

For a variety of reasons, he found it difficult to complete a kernel for GNU. Then, in 1991, one became available not from Stallman or his Free Software Foundation, but from a most unexpected source: a toothy, boyish twenty-one-year-old Swedish-speaking Finn at the University of Helsinki named Linus Torvalds.

Linus Torvalds’s father was a Communist Party member and TV journalist, his mother a student radical and then print journalist, but as a child in Helsinki he became more interested in technology than in politics.127 He described himself as “good at math, good at physics, and with no social graces whatsoever, and this was before being a nerd was considered a good thing.”128 Especially in Finland.

When Torvalds was eleven, his grandfather, a professor of statistics, gave him a used Commodore Vic 20, one of the first personal computers. Using BASIC, Torvalds began writing his own programs, including one that amused his younger sister by writing “Sara is the best” over and over. “One of the biggest joys,” he said, “was learning that computers are like mathematics: You get to make up your own world with its own rules.”

Tuning out his father’s urgings to learn to play basketball, Torvalds focused instead on learning to write programs in machine language, the numerical instructions executed directly by a computer’s central processing unit, exposing him to the joy of being “intimate with a machine.” He later felt lucky to have learned assembly language and machine code on a very basic device: “Computers were actually better for kids when they were less sophisticated, when dweebie youngsters like me could tinker under the hood.”129 Like car engines, computers eventually became harder to take apart and put back together.

After enrolling in the University of Helsinki in 1988 and serving his year in the Finnish Army, Torvalds bought an IBM clone with an Intel 386 processor. Unimpressed with its MS-DOS, which Gates and company had produced, he decided that he wanted to install UNIX, which he had learned to like on the university’s mainframes. But UNIX cost $5,000 per copy and wasn’t configured to run on a home computer. Torvalds set out to remedy that.

He read a book on operating systems by a computer science professor in Amsterdam, Andrew Tanenbaum, who had developed MINIX, a small clone of UNIX for teaching purposes. Deciding that he would replace MS-DOS with MINIX on his new PC, Torvalds paid the $169 license fee (“I thought it was outrageous”), installed the sixteen floppy disks, and then started to supplement and modify MINIX to suit his tastes.

Torvalds’s first addition was a terminal emulation program so that he could dial into the university’s mainframe. He wrote the program from scratch in assembly language, “at the bare hardware level,” so he didn’t need to depend on MINIX. During the late spring of 1991, he hunkered down to code just as the sun reappeared from its winter hibernation. Everyone was emerging into the outdoors, except him. “I was spending most of my time in a bathrobe, huddled over my unattractive new computer, with thick black window shades shielding me from the sunlight.”

Once he got a rudimentary terminal emulator working, he wanted to be able to download and upload files, so he built a disk driver and file system driver. “By the time I did this it was clear the project was on its way to becoming an operating system,” he recalled. In other words, he was embarking on building a software package that could serve as a kernel for a UNIX-like operating system. “One moment I’m in my threadbare robe hacking away on a terminal emulator with extra functions. The next moment I realize it’s accumulating so many functions that it has metamorphosed into a new operating system in the works.” He figured out the hundreds of “system calls” that UNIX could do to get the computer to perform basic operations such as Open and Close, Read and Write, and then wrote programs to implement them in his own way. He was still living in his mother’s apartment, often fighting with his sister Sara, who had a normal social life, because his modem hogged their phone line. “Nobody could call us,” she complained.130
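
The other side of that interface, the one an application sees, looks like the short C fragment below. It uses the classic UNIX file calls that any UNIX-compatible kernel, including the one Torvalds was writing, has to answer; only the file name is a made-up example.

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    char buf[256];

    /* open() asks the kernel for access to a file; the kernel hands back
     * a small integer "file descriptor" that names the open file. */
    int fd = open("example.txt", O_RDONLY);        /* file name is illustrative */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* read() asks the kernel to copy bytes from the file into our buffer. */
    ssize_t n = read(fd, buf, sizeof buf);
    if (n > 0)
        write(STDOUT_FILENO, buf, (size_t)n);      /* write() sends them to the screen */

    close(fd);                                     /* close() releases the descriptor */
    return 0;
}
```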

Torvalds initially planned to name his new software “Freax,” to evoke “free” and “freaks” and “UNIX.” But the person who ran the FTP site he was using didn’t like the name, so Torvalds resorted to calling it “Linux,” which he pronounced, similarly to the way he pronounced his first name, “LEE-nucks.”131 “I never wanted to use that name because I felt, OK, that’s a little too egotistical,” he said. But he later conceded that there was a part of his ego that enjoyed getting acclaim after so many years of living in the body of a reclusive nerd, and he was glad he went along with the name.132

In the early fall of 1991, when the Helsinki sun started disappearing again, Torvalds emerged with the shell of his system, which contained ten thousand lines of code.V Instead of trying to market what he had produced, he decided simply to offer it publicly. He had recently gone with a friend to hear a lecture by Stallman, who had become an itinerant global preacher for the doctrine of free software. Torvalds didn’t actually get religion or embrace the dogma: “It probably didn’t make a huge impact on my life at that point. I was interested in the technology, not the politics—I had enough politics at home.”133 But he did see the practical advantages of the open approach. Almost by instinct rather than as a philosophical choice, he felt Linux should be freely shared with the hope that anyone who used it might help improve it.

On October 5, 1991, he posted a cheeky message on the MINIX discussion newsgroup. “Do you pine for the nice days of minix-1.1, when men were men and wrote their own device drivers?” he began. “I’m working on a free version of a minix-lookalike for AT-386 computers. It has finally reached the stage where it’s even usable (though may not be depending on what you want), and I am willing to put out the sources for wider distribution.”134

“It wasn’t much of a decision to post it,” he recalled. “It was how I was accustomed to exchanging programs.” In the computer world, there was (and still is) a strong culture of shareware, in which people voluntarily sent in a few dollars to someone whose program they downloaded. “I was getting emails from people asking me if I would like them to send me thirty bucks or so,” Torvalds said. He had racked up $5,000 in student loans and was still paying $50 a month for the installment loan on his computer. But instead of seeking donations he asked for postcards, and they started flooding in from people all over the world who were using Linux. “Sara typically picked up the mail, and she was suddenly impressed that her combative older brother was somehow hearing from new friends so far away,” Torvalds recalled. “It was her first tip-off that I was doing anything potentially useful during those many hours when I had the phone line engaged.”

Torvalds’s decision to eschew payments came from a mix of reasons, as he later explained, including a desire to live up to his family heritage:

I felt I was following in the footsteps of centuries of scientists and other academics who built their work on the foundations of others. . . . I also wanted feedback (okay, and praise). It didn’t make sense to charge people who could potentially help improve my work. I suppose I would have approached it differently if I had not been raised in Finland, where anyone exhibiting the slightest sign of greediness is viewed with suspicion, if not envy. And yes, I undoubtedly would have approached the whole no-money thing a lot differently if I had not been brought up under the influence of a diehard academic grandfather and a diehard communist father.

