THE INNOVATORS
HOW A GROUP OF HACKERS, GENIUSES, AND GEEKS CREATED THE DIGITAL REVOLUTION
Walter Isaacson
CONTENTS
Illustrated Timeline
Introduction
CHAPTER 1
Ada, Countess of Lovelace
CHAPTER 2
The Computer
CHAPTER 3
Programming
CHAPTER 4
The Transistor
CHAPTER 5
The Microchip
CHAPTER 6
Video Games
CHAPTER 7
The Internet
CHAPTER 8
The Personal Computer
CHAPTER 9
Software
CHAPTER 10
Online
CHAPTER 11
The Web
CHAPTER 12
Ada Forever
Acknowledgments
About the Author
Notes
Photo Credits
Index
ILLUSTRATED TIMELINE
1843
Ada, Countess of Lovelace, publishes “Notes” on Babbage’s Analytical Engine.
1847
George Boole creates a system using algebra for logical reasoning.
1890
The census is tabulated with Herman Hollerith’s punch-card machines.
1931
Vannevar Bush devises the Differential Analyzer, an analog electromechanical computer.
1935
Tommy Flowers pioneers use of vacuum tubes as on-off switches in circuits.
1937
Alan Turing publishes “On Computable Numbers,” describing a universal computer.
Claude Shannon describes how circuits of switches can perform tasks of Boolean algebra.
Bell Labs’ George Stibitz proposes a calculator using an electric circuit.
Howard Aiken proposes construction of large digital computer and discovers parts of Babbage’s Difference Engine at Harvard.
John Vincent Atanasoff puts together concepts for an electronic computer during a long December night’s drive.
1938
William Hewlett and David Packard form company in Palo Alto garage.
1939
Atanasoff finishes model of electronic computer with mechanical storage drums.
Turing arrives at Bletchley Park to work on breaking German codes.
1941
Konrad Zuse completes Z3, a fully functional electromechanical programmable digital computer.
John Mauchly visits Atanasoff in Iowa, sees computer demonstrated.
1942
Atanasoff completes partly working computer with three hundred vacuum tubes, leaves for Navy.
1943
Colossus, a vacuum-tube computer to break German codes, is completed at Bletchley Park.
1944
Harvard Mark I goes into operation.
John von Neumann goes to Penn to work on ENIAC.
1945
Von Neumann writes “First Draft of a Report on the EDVAC” describing a stored-program computer.
Six women programmers of ENIAC are sent to Aberdeen for training.
Vannevar Bush publishes “As We May Think,” describing personal computer.
Bush publishes “Science, the Endless Frontier,” proposing government funding of academic and industrial research.
ENIAC is fully operational.
1947
Transistor invented at Bell Labs.
1950
Turing publishes article describing a test for artificial intelligence.
1952
Grace Hopper develops first computer compiler.
Von Neumann completes modern computer at the Institute for Advanced Study.
UNIVAC predicts Eisenhower election victory.
1954
Turing commits suicide.
Texas Instruments introduces silicon transistor and helps launch Regency radio.
1956
Shockley Semiconductor founded.
First artificial intelligence conference.
1957
Robert Noyce, Gordon Moore, and others form Fairchild Semiconductor.
The Soviet Union launches Sputnik.
1958
Advanced Research Projects Agency (ARPA) announced.
Jack Kilby demonstrates integrated circuit, or microchip.
1959
Noyce and Fairchild colleagues independently invent microchip.
1960
J. C. R. Licklider publishes “Man-Computer Symbiosis.”
Paul Baran at RAND devises packet switching.
1961
President Kennedy proposes sending man to the moon.
1962
MIT hackers create Spacewar game.
Licklider becomes founding director of ARPA’s Information Processing Techniques Office.
Doug Engelbart publishes “Augmenting Human Intellect.”
1963
Licklider proposes an “Intergalactic Computer Network.”
Engelbart and Bill English invent the mouse.
1964
Ken Kesey and the Merry Pranksters take bus trip across America.
1965
Ted Nelson publishes first article about “hypertext.”
Moore’s Law predicts microchips will double in power each year or so.
1966
Stewart Brand hosts Trips Festival with Ken Kesey.
Bob Taylor convinces ARPA chief Charles Herzfeld to fund ARPANET.
Donald Davies coins the term packet switching.
1967
ARPANET design discussions in Ann Arbor and Gatlinburg.
1968
Larry Roberts sends out request for bids to build the ARPANET’s IMPs.
Noyce and Moore form Intel, hire Andy Grove.
Brand publishes first Whole Earth Catalog.
Engelbart stages the Mother of All Demos with Brand’s help.
1969
First nodes of ARPANET installed.
1971
Don Hoefler begins column for Electronic News called “Silicon Valley USA.”
Demise party for Whole Earth Catalog.
Intel 4004 microprocessor unveiled.
Ray Tomlinson invents email.
1972
Nolan Bushnell creates Pong at Atari with Al Alcorn.
1973
Alan Kay helps to create the Alto at Xerox PARC.
Ethernet developed by Bob Metcalfe at Xerox PARC.
Community Memory shared terminal set up at Leopold’s Records, Berkeley.
Vint Cerf and Bob Kahn complete TCP/IP protocols for the Internet.
1974
Intel 8080 comes out.
1975
Altair personal computer from MITS appears.
Paul Allen and Bill Gates write BASIC for Altair, form Microsoft.
First meeting of Homebrew Computer Club.
1976
Steve Jobs and Steve Wozniak launch the Apple I.
1977
The Apple II is released.
1978
First dial-up Bulletin Board System.
1979
Usenet newsgroups invented.
Jobs visits Xerox PARC.
1980
IBM commissions Microsoft to develop an operating system for PC.
1981
Hayes modem marketed to home users.
1983
Microsoft announces Windows.
Richard Stallman begins developing GNU, a free operating system.
1984
Apple introduces Macintosh.
1985
Stewart Brand and Larry Brilliant launch The WELL.
CVC launches Q-Link, which becomes AOL.
1991
Linus Torvalds releases first version of Linux kernel.
Tim Berners-Lee announces World Wide Web.
1993
Marc Andreessen announces Mosaic browser.
Steve Case’s AOL offers direct access to the Internet.
1994
Justin Hall launches Web log and directory.
HotWired and Time Inc.’s Pathfinder become first major magazine publishers on Web.
1995
Ward Cunningham’s Wiki Wiki Web goes online.
1997
IBM’s Deep Blue beats Garry Kasparov in chess.
1998
Larry Page and Sergey Brin launch Google.
1999
Ev Williams launches Blogger.
2001
Jimmy Wales, with Larry Sanger, launches Wikipedia.
2011
IBM’s computer Watson wins Jeopardy!
INTRODUCTION
HOW THIS BOOK CAME TO BE
The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs—who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.
The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. Search the phrase “the man who invented” on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.
We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?
I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.
Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine.
Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that “the history of the world is but the biography of great men,” and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. “As a professor, I tended to think of history as run by impersonal forces,” Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. “But when you see it in practice, you see the difference personalities make.”1 When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.
The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. “There was no such thing as the Scientific Revolution, and this is a book about it,” is the wry opening sentence of the Harvard professor Steven Shapin’s book on that period. One method that Shapin used to escape his half-joking contradiction is to note how the key players of the period “vigorously expressed the view” that they were part of a revolution. “Our sense of radical change afoot comes substantially from them.”2
Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, “Bliss was it in that dawn to be alive.”
I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.
The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it’s fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.
The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence—machines that think on their own—has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.
Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.
Ada, Countess of Lovelace (1815–52), painted by Margaret Sarah Carpenter in 1836.
Lord Byron (1788–1824), Ada’s father, in Albanian dress, painted by Thomas Phillips in 1835.
Charles Babbage (1791–1871), photograph taken circa 1837.
CHAPTER ONE
ADA, COUNTESS OF LOVELACE
POETICAL SCIENCE
In May 1833, when she was seventeen, Ada Byron was among the young women presented at the British royal court. Family members had worried about how she would acquit herself, given her high-strung and independent nature, but she ended up behaving, her mother reported, “tolerably well.” Among those Ada met that evening were the Duke of Wellington, whose straightforward manner she admired, and the seventy-nine-year-old French ambassador Talleyrand, who struck her as “an old monkey.”1
The only legitimate child of the poet Lord Byron, Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.
So it was not surprising that her debut at court, despite the glamour of the occasion, made less impression on her than her attendance a few weeks later at another majestic event of the London season, at which she met Charles Babbage, a forty-one-year-old widowed science and math eminence who had established himself as a luminary on London’s social circuit. “Ada was more pleased with a party she was at on Wednesday than with any of the assemblages in the grand monde,” her mother reported to a friend. “She met there a few scientific people—amongst them Babbage, with whom she was delighted.”2
Babbage’s galvanizing weekly salons, which included up to three hundred guests, brought together lords in swallow-tail coats and ladies in brocade gowns with writers, industrialists, poets, actors, statesmen, explorers, botanists, and other “scientists,” a word that Babbage’s friends had recently coined.3 By bringing scientific scholars into this exalted realm, said one noted geologist, Babbage “successfully asserted the rank in society due to science.”4
The evenings featured dancing, readings, games, and lectures accompanied by an assortment of seafood, meat, fowl, exotic drinks, and iced desserts. The ladies staged tableaux vivants, in which they dressed in costume to re-create famous paintings. Astronomers set up telescopes, researchers displayed their electrical and magnetic contrivances, and Babbage allowed guests to play with his mechanical dolls. The centerpiece of the evenings—and one of Babbage’s many motives for hosting them—was his demonstration of a model portion of his Difference Engine, a mammoth mechanical calculating contraption that he was building in a fireproof structure adjacent to his home. Babbage would display the model with great drama, cranking its arm as it calculated a sequence of numbers and, just as the audience began to get bored, showing how the pattern could suddenly change based on instructions that had been coded into the machine.5 Those who were especially intrigued would be invited through the yard to the former stables, where the complete machine was being constructed.
Babbage’s Difference Engine, which could tabulate the values of polynomial functions, impressed people in different ways. The Duke of Wellington commented that it could be useful in analyzing the variables a general might face before going into battle.6 Ada’s mother, Lady Byron, marveled that it was a “thinking machine.” As for Ada, who would later famously note that machines could never truly think, a friend who went with them to the demonstration reported, “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”7
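The trick behind the engine, the method of finite differences, is simple enough to sketch in a few lines of modern Python (an anachronistic illustration, of course, not a model of Babbage's gearwork): seed the machine with a polynomial's starting value and its initial differences, and every later entry in the table emerges from addition alone.

# A sketch of the method of finite differences, the mathematical idea
# the Difference Engine mechanized: given a polynomial's starting value
# and its initial differences, each new table entry is produced purely
# by repeated addition, with no multiplication at all.

def difference_engine(column, steps):
    """column = [f(0), first difference, second difference, ...];
    returns f(0), f(1), ..., f(steps) using only addition."""
    col = list(column)
    values = [col[0]]
    for _ in range(steps):
        # One turn of the crank: fold each difference into the entry above it.
        for i in range(len(col) - 1):
            col[i] += col[i + 1]
        values.append(col[0])
    return values

# Example with f(x) = x*x + x + 41: f(0) = 41, first difference 2,
# constant second difference 2.
print(difference_engine([41, 2, 2], 7))
# prints [41, 43, 47, 53, 61, 71, 83, 97]

Altering the numbers in the difference column partway through a run changes the emerging pattern, which is how Babbage staged the surprise that revived his guests' flagging attention.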
Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery. It was a period that brought “imaginative intensity and excitement to scientific work,” Richard Holmes wrote in The Age of Wonder. “It was driven by a common ideal of intense, even reckless, personal commitment to discovery.”8
In short, it was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution—the computer, microchip, and Internet—have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”
LORD BYRON
Ada inherited her poetic and insubordinate temperament from her father, but he was not the source of her love for machinery. He was, in fact, a Luddite. In his maiden speech in the House of Lords, given in February 1812 when he was twenty-four, Byron defended the followers of Ned Ludd, who were rampaging against mechanical weaving machines. With sarcastic scorn Byron mocked the mill owners of Nottingham, who were pushing a bill that would make destroying automated looms a crime punishable by death. “These machines were to them an advantage, inasmuch as they superseded the necessity of employing a number of workmen, who were left in consequence to starve,” Byron declared. “The rejected workmen, in the blindness of their ignorance, instead of rejoicing at these improvements in arts so beneficial to mankind, conceived themselves to be sacrificed to improvements in mechanism.”
Two weeks later, Byron published the first two cantos of his epic poem Childe Harold’s Pilgrimage, a romanticized account of his wanderings through Portugal, Malta, and Greece, and, as he later remarked, “awoke one morning and found myself famous.” Beautiful, seductive, troubled, brooding, and sexually adventurous, he was living the life of a Byronic hero while creating the archetype in his poetry. He became the toast of literary London and was feted at three parties each day, most memorably a lavish morning dance hosted by Lady Caroline Lamb.
Lady Caroline, though married to a politically powerful aristocrat who was later prime minister, fell madly in love with Byron. He thought she was “too thin,” yet she had an unconventional sexual ambiguity (she liked to dress as a page boy) that he found enticing. They had a turbulent affair, and after it ended she stalked him obsessively. She famously declared him to be “mad, bad, and dangerous to know,” which he was. So was she.
At Lady Caroline’s party, Lord Byron had also noticed a reserved young woman who was, he recalled, “more simply dressed.” Annabella Milbanke, nineteen, was from a wealthy and multi-titled family. The night before the party, she had read Childe Harold and had mixed feelings. “He is rather too much of a mannerist,” she wrote. “He excels most in the delineation of deep feeling.” Upon seeing him across the room at the party, her feelings were conflicted, dangerously so. “I did not seek an introduction to him, for all the women were absurdly courting him, and trying to deserve the lash of his Satire,” she wrote her mother. “I am not desirous of a place in his lays. I made no offering at the shrine of Childe Harold, though I shall not refuse the acquaintance if it comes my way.”9
That acquaintance, as it turned out, did come her way. After he was introduced to her formally, Byron decided that she might make a suitable wife. It was, for him, a rare display of reason over romanticism. Rather than arousing his passions, she seemed to be the sort of woman who might tame those passions and protect him from his excesses—as well as help pay off his burdensome debts. He proposed to her halfheartedly by letter. She sensibly declined. He wandered off to far less appropriate liaisons, including one with his half sister, Augusta Leigh. But after a year, Annabella rekindled the courtship. Byron, falling more deeply in debt while grasping for a way to curb his enthusiasms, saw the rationale if not the romance in the possible relationship. “Nothing but marriage and a speedy one can save me,” he admitted to Annabella’s aunt. “If your niece is obtainable, I should prefer her; if not, the very first woman who does not look as if she would spit in my face.”10 There were times when Lord Byron was not a romantic. He and Annabella were married in January 1815.
Byron initiated the marriage in his Byronic fashion. “Had Lady Byron on the sofa before dinner,” he wrote about his wedding day.11 Their relationship was still active when they visited his half sister two months later, because around then Annabella got pregnant. However, during the visit she began to suspect that her husband’s friendship with Augusta went beyond the fraternal, especially after he lay on a sofa and asked them both to take turns kissing him.12 The marriage started to unravel.
Annabella had been tutored in mathematics, which amused Lord Byron, and during their courtship he had joked about his own disdain for the exactitude of numbers. “I know that two and two make four—and should be glad to prove it too if I could,” he wrote, “though I must say if by any sort of process I could convert two and two into five it would give me much greater pleasure.” Early on, he affectionately dubbed her the “Princess of Parallelograms.” But when the marriage began to sour, he refined that mathematical image: “We are two parallel lines prolonged to infinity side by side but never to meet.” Later, in the first canto of his epic poem Don Juan, he would mock her: “Her favourite science was the mathematical. . . . She was a walking calculation.”
The marriage was not saved by the birth of their daughter on December 10, 1815. She was named Augusta Ada Byron, her first name that of Byron’s too-beloved half sister. When Lady Byron became convinced of her husband’s perfidy, she thereafter called her daughter by her middle name. Five weeks later she packed her belongings into a carriage and fled to her parents’ country home with the infant Ada.
Ada never saw her father again. Lord Byron left the country that April after Lady Byron, in letters so calculating that she earned his sobriquet of “Mathematical Medea,” threatened to expose his alleged incestuous and homosexual affairs as a way to secure a separation agreement that gave her custody of their child.13
The opening of canto 3 of Childe Harold, written a few weeks later, invokes Ada as his muse:
Is thy face like thy mother’s, my fair child!
Ada! sole daughter of my house and of my heart?
When last I saw thy young blue eyes they smiled,
And then we parted.
Byron wrote these lines in a villa by Lake Geneva, where he was staying with the poet Percy Bysshe Shelley and Shelley’s future wife, Mary. It rained relentlessly. Trapped inside for days, Byron suggested they write horror stories. He produced a fragment of a tale about a vampire, one of the first literary efforts on that subject, but Mary’s story was the one that became a classic: Frankenstein, or The Modern Prometheus. Playing on the ancient Greek myth of the hero who crafted a living man out of clay and snatched fire from the gods for human use, Frankenstein was the story of a scientist who galvanized a man-made assemblage into a thinking human. It was a cautionary tale about technology and science. It also raised the question that would become associated with Ada: Can man-made machines ever truly think?
The third canto of Childe Harold ends with Byron’s prediction that Annabella would try to keep Ada from knowing about her father, and so it happened. There was a portrait of Lord Byron at their house, but Lady Byron kept it securely veiled, and Ada never saw it until she was twenty.14
Lord Byron, by contrast, kept a picture of Ada on his desk wherever he wandered, and his letters often requested news or portraits of her. When she was seven, he wrote to Augusta, “I wish you would obtain from Lady B some accounts of Ada’s disposition. . . . Is the girl imaginative? . . . Is she passionate? I hope that the Gods have made her anything save poetical—it is enough to have one such fool in the family.” Lady Byron reported that Ada had an imagination that was “chiefly exercised in connection with her mechanical ingenuity.”15
Around that time, Byron, who had been wandering through Italy, writing and having an assortment of affairs, grew bored and decided to enlist in the Greek struggle for independence from the Ottoman Empire. He sailed for Missolonghi, where he took command of part of the rebel army and prepared to attack a Turkish fortress. But before he could engage in battle, he caught a violent cold that was made worse by his doctor’s decision to treat him by bloodletting. On April 19, 1824, he died. According to his valet, among his final words were “Oh, my poor dear child!—my dear Ada! My God, could I have seen her! Give her my blessing.”16
ADA
Lady Byron wanted to make sure that Ada did not turn out like her father, and part of her strategy was to have the girl rigorously study math, as if it were an antidote to poetic imagination. When Ada, at age five, showed a preference for geography, Lady Byron ordered that the subject be replaced by additional arithmetic lessons, and her governess soon proudly reported, “She adds up sums of five or six rows of figures with accuracy.” Despite these efforts, Ada developed some of her father’s propensities. She had an affair as a young teenager with one of her tutors, and when they were caught and the tutor banished, she tried to run away from home to be with him. In addition, she had mood swings that took her from feelings of grandiosity to despair, and she suffered various maladies both physical and psychological.