4 HISTORIES AND MYTHOLOGIES
The American novels I found on the shelves of my lending library in Bombay were dense little packets of information and emotion and culture from across the globe. I consumed them and the values and mythologies they incarnated, and was transformed in some very intimate way. Once I was in America, face-to-face with the foreign, I wrote a novel about another Indian encounter with the Other: about colonialism, about the coming together and clash of cultures. Despite my love for American modernism, it turned out I didn’t want to write a modernist novel. I ended up writing a hybrid book, a kind of mongrel construction which used, in one half, the Indian storytelling mode of magical tale-within-tale and all the sacred and profane registers of classical Indian literature; the other half operated more or less within the mode of modern psychological realism. Colonialism exercised its depredations not only within the realms of economics and politics; an essential part of its ideology was the assertion that Indian narrative modes were primitive, or childish, or degenerate, and that Western aesthetic norms were more civilized and sophisticated. History was progress, the colonized were told, and the West was more evolved. The current state of the world was living proof of this developmental teleology. I wanted to write a book that incarnated in its very form a resistance to this Just-So story about culture.
I understood this intention quite clearly as I wrote, but looking back now I see, also, a very young writer finding a form to contain all his various selves. I was moving between cultures, from India to America and back. I was a wanderer between nation states, I negotiated my way through their rigid borders and bureaucracies, and what could be more modern than that? I was surely a postmodern lover of modernist fiction. Yet, in my creative urges, in the deepest parts of myself, I also remained somehow stubbornly premodern. I didn’t use those premodern forms only for political and polemical reasons; I wasn’t only trying to ironize psychological “realism” by placing it next to the epic and the mythical, or only to create lo real maravilloso as a critique of bourgeois Western imperial notions of the real. No, the impulse was not merely negative. This multiply layered narrative was how I lived within myself, how I knew myself, how I spoke to myself. There was the modern me, and also certain other simultaneous selves who lived on alongside. These “shadow selves” – to follow sociologist Ashis Nandy – responded passionately and instantly to epic tropes, whether in the Mahabharata or in Hindi films; believed implicitly and stubbornly in reincarnation despite a devotion to Enlightenment positivism; insisted on regarding matter and consciousness as one; and experienced the world and oneself as the habitations of devatas, “deities” who simultaneously represent inner realities and cosmic principles. So my book – to speak in my voice – had to contain these selves too.
This un-modern half of my book tended to confuse my American writing-program peers. In our workshops, the prevailing aesthetic tended toward minimalism; the models were Raymond Carver and Ann Beattie and Bobbie Ann Mason. The winding tales I brought in were judged, at least initially, to be melodramatic, mystical, exotic, strange. I didn’t try to explain what I was trying to do mainly because I didn’t have a vocabulary in which I could articulate the lived sensation of this shadow-world within me. I wrote on.
My other life as a computer geek was excitingly active and remunerative. As I taught myself about code, I discovered yet another culture on the newsgroups of Usenet and in meetings of the Houston Area League of PC Users (HAL-PC), “the world’s largest PC user group.” Programmers had their own lingo, their own hierarchies of value and respect, their own mythology. Many of these new norms were being created online. By the turn of the twenty-first century, Scott Rosenberg notes, programmers were writing
personally, intently, and voluminously, pouring out their inspirations and frustrations, their insights and tips and fears and dreams, on Web sites and in blogs. It is a process that began in the earliest days of the Internet, on mailing lists and in newsgroup postings … Not all of this writing is consequential, and not all programmers read it. Yet it is changing the field – creating, if not a canon of the great works of software, at least an informal literature around the day-to-day practice of programming. The Web itself has become a distributed version of that vending-machine-lined common room … an informal and essential place for coders to share their knowledge and kibitz. It is also an open forum in which they continue to ponder, debate, and redefine the nature of the work they do. 1
One of the urtexts in this shared folklore of computing is “The Story of Mel, a Real Programmer.” It first appeared on a Usenet discussion board in May 1983, as a riposte to a recently published article “devoted to the *macho* side of programming [which] made the bald and unvarnished statement: Real Programmers write in FORTRAN.” 2 Our Usenet storyteller here, like any chronicler of the days of yore, wants to set the quiche-eating, FORTRAN-writing young ’uns straight. He begins:
Maybe [real programmers] do [use FORTRAN] now, in this decadent era of Lite beer, hand calculators, and “user-friendly” software, but back in the Good Old Days, when the term “software” sounded funny and Real Computers were made out of drums and vacuum tubes, Real Programmers wrote in machine code. Not FORTRAN. Not RATFOR. Not, even, assembly language. Machine Code. Raw, unadorned, inscrutable hexadecimal numbers. Directly. 3
This post was originally written in straightforward prose by Ed Nather, an astronomer, but some anonymous coder responded to its rhythms and elegiac tone and converted it into free verse, and so it has existed on the Web ever since:
Lest a whole new generation of programmers
grow up in ignorance of this glorious past,
I feel duty-bound to describe,
as best I can through the generation gap,
how a Real Programmer wrote code.
I’ll call him Mel,
because that was his name. 4
Mel, the eponymous protagonist of this epic, is the kind of programmer who is already a rarity in 1983: he understands the machine so well that he can program in machine code. The conveniences afforded by high-level languages like FORTRAN and its successors – which now all seem primitive – have by 1983 already so cushioned the practitioners of computing from the metal, from the mechanics of what they do, that they are hard-pressed to debug Mel’s code. Mel’s understanding of his hardware seems uncanny, mystical, a remnant from a bygone heroic epoch:
Mel never wrote time-delay loops, either,
even when the balky Flexowriter
required a delay between output characters to work right.
He just located instructions on the drum
so each successive one was just *past* the read head
when it was needed;
the drum had to execute another complete revolution
to find the next instruction.
He coined an unforgettable term for this procedure.
Although “optimum” is an absolute term,
like “unique,” it became common verbal practice
to make it relative:
“not quite optimum” or “less optimum”
or “not very optimum.”
Mel called the maximum time-delay locations
the “most pessimum.”
…
I have often felt that programming is an art form,
whose real value can only be appreciated
by another versed in the same arcane art;
there are lovely gems and brilliant coups
hidden from human view and admiration, sometimes forever,
by the very nature of the process.
You can learn a lot about an individual
just by reading through his code,
even in hexadecimal.
Mel was, I think, an unsung genius. 5
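For readers who have never seen the layers in question, here is a rough sketch of the gap between the high-level programmer and Mel. Everything in it is invented for illustration: it assumes a modern x86-64 processor and a typical optimizing C compiler, and Mel’s drum computer used an entirely different instruction set. The C statement is what most of us now write; a compiler translates it into an assembly mnemonic; the assembler encodes that mnemonic as raw hexadecimal bytes. Mel composed the bytes directly.

    #include <stdio.h>

    /* One C statement, shown with the assembly mnemonic a typical
       x86-64 compiler produces for it, and the machine-code bytes
       that the mnemonic stands for. Mel worked only in that last,
       rightmost layer. */
    int add_one(int x) {
        return x + 1;  /* assembly: lea eax, [rdi+1]   machine code: 8D 47 01 */
    }                  /* assembly: ret                machine code: C3 */

    int main(void) {
        printf("%d\n", add_one(41));  /* prints 42 */
        return 0;
    }

Each layer above the bytes exists to cushion us from the machine; Mel wanted no cushion.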
Within the division of Microsoft that produces programming tools, a Mel-like programmer is represented by the persona “Einstein,” who is an “expert on both low level bit-twiddling and high-level object oriented architectures.” 6 There is also another persona named “Elvis,” a “professional application developer.” 7 As described by Eric Lippert, former senior software engineer at Microsoft, both Einstein and Elvis “got their jobs by studying computer science and going into development as a career.” 8 And then there is the persona “Mort,” who is “an expert on frobnicating [tweaking, adjusting] widgets, [who] one day realizes that his widget-tracking spreadsheets could benefit from a little [Visual Basic for Applications] magic, so he picks up enough VBA to get by.” 9
The vast majority of programmers in the world today are Morts. Despite my intermittent, fumbling attempts at studying data structures and algorithms – the bricks and mortar of computer science – I most definitely remain on the Mort end of the scale. The ever-receding minority of Mels and Einsteins has observed this democratization of the computer with mixed feelings: on the one hand, the legendary early hackers at MIT and Apple are revered precisely because they took on the bureaucratic priesthood that protected the mainframes, defeated its defenses, and made computing available to all; on the other, the millions of Morts who have benefited from the computer revolution produce awful, bloated, buggy software because they don’t know how the machine really works, and, what’s worse, most Morts don’t want to know. “Mort is a very local programmer – he wants to make a few changes to one subroutine and be done,” writes Lippert.
Mort does not want to understand how an entire system works in order to tinker with it. And my goodness, Mort hates reading documentation … Mort’s primary job is to frobnicate widgets – code is just a means to that end – so every second spent making the code more elegant takes him away from his primary job. 10
Mort lacks “mechanical sympathy,” that quality possessed by the best race-car drivers, who understand their machines so well that they flow in harmony with them.
To the Morts of the world, and even to the Elvii, Mel the Real Programmer’s programming is inscrutable and his mystique dazzling. The narrator of our epic is asked to investigate and change the behavior of a program that Mel has written. He reads through Mel’s code, and is baffled by an “innocent loop” which doesn’t have a test within it – as is usual – to break the loop. Code loops ordinarily contain a conditional test of the form “if numberOfLoops > 4 then break” (sketched in C after the verse below); without such a construct you are trapped in an endless circling repetition. “Common sense said that it had to be a closed loop, / where the program would circle, forever, endlessly.” 11 But Mel’s program doesn’t get stuck in the loop, it flows through, it works. It takes the narrator two weeks to comprehend Mel’s uncanny melding of code and machine, which uses the test-less loop and a programmer-forced malfunction in the system’s memory to position the next program instruction in the right location; such is the force of this revelation that “when the light went on it nearly blinded me.” After such knowledge, reverence is the only proper emotion; the narrator tells his Big Boss that he can’t fix the error because he can’t find it.
I didn’t feel comfortable
hacking up the code of a Real Programmer. 12
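The convention the narrator expected looks something like the following sketch, written in C for legibility rather than in Mel’s machine code, with a loop body and a bound invented purely for illustration:

    #include <stdio.h>

    int main(void) {
        int numberOfLoops = 0;

        /* A conventional loop: the conditional test guarantees an exit. */
        while (1) {
            printf("pass %d\n", numberOfLoops);
            numberOfLoops++;
            if (numberOfLoops > 4)
                break;  /* remove this test and the loop circles forever */
        }
        return 0;
    }

Mel’s loop had no such test anywhere inside it, and still his program flowed through.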
Despite the allusion above to “the *macho* side of programming,” the non-geek may not fully grasp that within the culture of programmers, Mel es muy macho. The Real Programmer squints his eyes, does his work, and rides into the horizon to the whistling notes of Ennio Morricone. To you, Steve Wozniak may be that cuddly penguin who was on a few episodes of Dancing with the Stars, and by all accounts, he really is the good, generous man one sees in interviews. But within the imaginations of programmers, Woz is also a hard man, an Original Gangsta: he wired together his television set and a keyboard and a bunch of chips on a circuit board and so created the Apple I computer. Then he realized he needed a programming language for the microprocessor he’d used, and none existed, so Woz – who had never taken a language-design class – read a couple of books, wrote a compiler, and then wrote a programming language called Integer BASIC in machine code. And when we say “wrote” this programming language we mean that he wrote the assembly code in a paper notebook on the right side of the pages, and then transcribed it into machine code on the left. 13 And he did all this while holding down a full-time job at Hewlett-Packard: “I designed two computers and cassette tape interfaces and printer interfaces and serial ports and I wrote a Basic and all this application software, I wrote demos, and I did all this moonlighting, all in a year.” 14
That second computer was the Apple II, the machine that defined personal computing, that is on every list of the greatest computers ever made. Woz designed all the hardware and all the circuit boards and all the software that went into the Apple II, while the other Steve spewed marketing talk at potential investors and customers on the phone. Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found, “not one bug in the hardware, not one bug in the software.” 15 The circuit design of the Apple II is widely considered to be astonishingly beautiful, as close to perfection as one can get in engineering.
Woz did both hardware and software. Woz created a programming language in machine code. Woz is hardcore.
Working in machine code is very hard, so assembly code was created by adding some mnemonics to machine code. Working in assembly code is still hard; doing anything complex – like making games – in it is insanely hard. On programmers.stackexchange.com, a user going by the nom de guerre “DFectuoso” asked, “Are there any famous one-man-army programmers?” 16 During the ensuing discussion about lone coders, one of the participants mentioned Chris Sawyer, who wrote the hugely successful 1999 game, RollerCoaster Tycoon: “He had a little help with music and graphics, but otherwise RollerCoaster Tycoon was all him. Amazing, especially given the physics engine. Last but not least, the entire game was written in assembly language.” And another commenter responded, “He wrote that in assembly?! Jesus Christ. I think I need to go boil my brain now.”
Sawyer’s achievements are indeed brain-boilingly immense, but why and how does a lone Scottish geek toiling obsessively over his virtual roller coasters become a “one-man-army”? Why do the Einsteins of programming affect – in their online personas and sometimes in person – a blustery True Grit swagger? Therein lies one of the tales that Nathan Ensmenger tells in his illuminating social history of computing, The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise. Ensmenger does for computing in North America what historians of science have done for other disciplines: locate the development of knowledge and technology firmly within a messy matrix of human agency and politics; there is no orderly teleological progress from triumph to triumph, only competing interests that struggle over authority, access, and power. The computer boys of Ensmenger’s title are the early software-builders, the pioneers; the irony, as Ensmenger shows, is that many of them were women. In fact, the earliest programmers were all women: the “ENIAC girls,” recruited by the (male) engineers and managers of the Electronic Numerical Integrator and Computer. The creators of the ENIAC had a clear division of labor in mind: the male scientist or “planner” would do the hard intellectual work of creating the mathematical algorithms and structures necessary to solve a problem; the female “coder” would then carry out the “static” manual labor of entering this plan into the machine by manipulating plugboard wiring and thousands of switches. Numerous wires had to be plugged and switches flipped for each machine instruction; each problem required thousands of instructions, so wiring – which was the programming – would take several days, and checking the wiring would take another few. 17
This division of tasks of course echoed the hierarchies already present; men did the thinking and inventing, women were clerks. “The telephone switchboard-like appearance of the ENIAC programming cable-and-plug panels,” Ensmenger writes, “reinforced the notion that programmers were mere machine operators, that programming was more handicraft than science, more feminine than masculine, more mechanical than intellectual.” 18 The planners considered the coding process so transparently simple that they couldn’t imagine that once in the machines, their algorithms might fault and hang, might need to be stopped. One of the ENIAC programmers, Betty Holberton, had to work very hard to convince John von Neumann that programs were complex and therefore fragile:
But to my astonishment, [Dr von Neumann] never mentioned a stop instruction. So I did coyly say, “Don’t we need a stop instruction in this machine?” He said, “No we don’t need a stop instruction. We have all these empty sockets here that just let it go to bed.” And I went back home and I was really alarmed. After all, we had debugged the machine day and night for months just trying to get jobs on it.
So the next week when I came up with some alterations in the code, I approached him again with the same question. He gave me the same answer. Well I really got red in the face. I was so excited and I really wanted to tell him off. And I said, “But Dr. von Neumann, we are programmers and we sometimes make mistakes.” He nodded his head and the stop order went in. 19
Once von Neumann and everyone else involved with computers understood this hitherto unimaginable fact – that programming, translating algorithms into the language of machines, was very difficult – programmers became valuable commodities. A 1959 Price Waterhouse report warned that “high quality individuals are the key to top grade programming. Why? Purely and simply because much of the work involved is exacting and difficult enough to require real intellectual ability and above average personal characteristics.” 20 Such individuals weren’t easy to find, and as corporations looked for competitive advantage by computerizing their business processes, a shortage resulted. Corporations set up training programs; fly-by-night vocational schools sprang up, guaranteeing jobs: “There’s room for everyone. The industry needs people. You’ve got what it takes.” 21
In 1967, Cosmopolitan magazine carried an article titled “The Computer Girls” that emphasized that programming was a field in which there was “no sex discrimination in hiring”—“every company that makes or uses computers hires women to program them … If a girl is qualified, she’s got the job.” Admiral Grace Hopper, programming pioneer, assured the Cosmo readers that programming was “just like planning a dinner … You have to plan ahead and schedule everything so it’s ready when you need it. Programming requires patience and the ability to handle detail. Women are ‘naturals’ at computer programming.” 22
Already, though, the “masculinization process” of the computing industry was underway. The severe limitations of memory and processing power in the machines of the day demanded Mel the Real Programmer’s wizardry; John Backus described programming in the fifties as “a black art, a private arcane matter … [in which] the success of a program depended primarily on the programmer’s private techniques and inventions.” 23 The aptitude tests used by the industry to identify potential Mels consisted primarily of mathematical and logical puzzles, which often required a formal education in these disciplines; even Cosmopolitan magazine, despite its breezy confidence about the absence of sexism in computing, recognized this as a barrier to entry for its readers. An industry analyst argued that the “Darwinian selection” of personnel profiling resulted in an influx of programmers who were “often egocentric, slightly neurotic, and [bordering] upon a limited schizophrenia. The incidence of beards, sandals, and other symptoms of rugged individualism or nonconformity are notably greater among this demographic group.” 24 The Real Programmers that the industry found through these aptitude tests were weird male geeks wielding keyboards.
As Ensmenger puts it:
It is almost certainly the case that these [personality] profiles represented, at best, deeply flawed scientific methodology. But they almost equally certainly created a gender-biased feedback cycle that ultimately selected for programmers with stereotypically masculine characteristics. The primary selection mechanism used by the industry selected for antisocial, mathematically inclined males, and therefore antisocial, mathematically inclined males were over-represented in the programmer population; this in turn reinforced the popular perception that programmers ought to be antisocial and mathematically inclined (and therefore male), and so on ad infinitum. 25
The surly male genius, though, was perceived as a threat by corporate managers, especially as the initial enthusiasm over computerization gave way to doubts about actual value being produced for the companies spending the money. A programmer-turned-management-consultant described the long-haired computer wonks as a “Cosa Nostra” holding management to ransom, and warned that computer geeks were “at once the most unmanageable and the most poorly managed specialism in our society. Actors and artists pale by comparison.” 26 Managers should “refuse to embark on grandiose or unworthy schemes, and refuse to let their recalcitrant charges waste skill, money and time on the fashionable idiocies of our [computer] racket.” 27
The solution that everyone agreed upon was professionalization. Managers liked the idea of standardization, testing, and certification; it would reduce their dependence on arty individuals practicing arcana: “The concept of professionalism affords a business-like answer to the existing and future computer skills market.” 28 Programmers wanted to be recognized as something other than mere technicians – the rewards would be “status, greater autonomy, improved opportunities for advancement, and better pay.” 29 Within academia, researchers were struggling to establish computer science as a distinct discipline and to build a theoretical foundation for their work. So, “an activity originally intended to be performed by low-status, clerical – and more often than not, female – [workers],” Ensmenger tells us:
was gradually and deliberately transformed into a high-status, scientific, and masculine discipline.
As Margaret Rossiter and others have suggested, professionalization nearly always requires the exclusion of women …
As computer programmers constructed a professional identity for themselves during the crucial decades of the 1950s and 1960s … they also constructed a gender identity. Masculinity was just one of many resources that they drew on to distance their profession from its low-status origins in clerical data processing. The question of “who made for a good programmer” increasingly involved in its answer the qualifier “male.” 30
In 1892, a British colonial official named Sir Lepel Griffin wrote:
The characteristics of women which disqualify them for public life and its responsibilities are inherent in their sex and are worthy of honour, for to be womanly is the highest praise for a woman, as to be masculine is her worst reproach. But when men, as the Bengalis, are disqualified for political enfranchisement by the possession of essentially feminine characteristics, they must expect to be held in such contempt by stronger and braver races, who have fought for such liberties as they have won or retained. 31
Sir Lepel was certain that Bengalis were unfit for political power because they were effeminate and weak; it was unthinkable that they might “represent … precede … and govern the martial races of India”—that is, certain other ethnic groups within India who were deemed sufficiently warlike by the English. If a Bengali were to demonstrate such aspirations to equality, “then the English, as the common conqueror and master of all, may justly laugh at his pretensions and order him to take the humbler place which better suits a servile race which has never struck a blow against an enemy.” 32
The gender politics of the Raj were of course much on my mind while I wrote my first novel in the late eighties and early nineties, as I traveled back and forth between India and America. The British “Cult of Manliness” had been an essential component of the creed of Empire, which – as above – conflated masculinity, violence, civic virtue, and morality. Even intelligence and intellectual capability were inextricably intertwined with masculinity; women and all others who exhibited symptoms of femininity were fuzzy-headed, illogical, and easily overcome by emotion; they were especially incapable of scientific reasoning, and therefore of self-knowledge and progress. The state of the world – women without power, Englishmen ruling Indians – bore out the truth of these propositions.
By now, I’d read my Edward Said, and I prided myself on being aware of the ideological mechanisms that transformed local contingencies of history and culture into Nature itself. The attractions of Nick Carter, Killmaster, seemed altogether more sinister now that I had listened to many scholarly deconstructions of imperial American masculinity. But at the time, I didn’t question much the demographics of programming. The meetings of the special interest groups of HAL-PC devoted to programming were all-male; I think in all my years of consulting work I met one female programmer. This was just the way things were. The male programmers I met were often astonishingly generous with knowledge and technical advice, and yet, the very same men were also abrupt and outright rude. Indians are frequently taken aback by the American virtues of quick intimacy and bluntness, which come across as shockingly bad manners; I knew to discount for this, and understood that our own predilection for face-saving, izzat-preserving niceties made us maddeningly opaque and slippery to the average American. Still, these coders were deliberately obnoxious by anyone’s standards, especially online. They ad-hominemed, flamed, name-called, dismissed, despised. Not to put too fine a point on it: these guys were assholes. Preeminence among programmers was often decided by competitions of assholery, a kind of ritual jousting.
This unfortunate condition has only intensified over the decades. The “masculinization process” that Ensmenger describes has resulted in a contemporary American culture of programming that is overwhelmingly male, as one can see at conferences, on websites and blogs. The metaphors used within this world of one-man armies are very often martial. Teams working against impossible deadlines go on “death marches.” Finding and fixing defects in software is a painstaking, detail-oriented task, one which Grace Hopper might have compared to housekeeping; but in the parlance of many programming shops, the most proficient bug sweepers are “bug slayers.”
In March 2011, David Barrett, CEO of Expensify (“Expense Reports That Don’t Suck”), blogged about how his start-up wouldn’t hire programmers who used Microsoft’s very large and elaborate .NET framework, which – according to him – provided ready-made, assembly-line tools that turned these programmers into drudges capable of only mass-producing pre-designed code, the programming equivalent of fast-food burgers. No, he wanted passionate programmers who could write “everything from assembly to jQuery, on PCs to mobile phones, [and code] hard core computer graphics to high level social networking.” 33 Barrett wanted Einsteins, not Morts – fair enough. But this is how he described his Einsteins:
As you might know, we’re hiring the best programmers in the world. Sure, everyone says that. But my coders will beat up your coders, any day of the week. For example, Mich is barely 5 foot tall, but is a competitive fencer. Witold is a 6’3” former professional hockey player. Nate practices knife fighting for fun. 34
Over a few days, I read hundreds of comments and blog posts debating the merits of Barrett’s case against .NET programmers; some argued that many great programmers used .NET, and that other frameworks had as many bad or lazy programmers. The discussions were long and nuanced. But nobody seemed to notice his very literal conflation of omnivorous intellectual curiosity with manly combat skills. He extends his fast-food riff—“Programming with .NET is like cooking in a McDonalds kitchen. It is full of amazing tools that automate absolutely everything”—but then turns the metaphor into a paean to programmer-as-blood-soaked-pioneer:
The sort of person [we are looking for] grew up cooking squirrels over a campfire with sharpened sticks – squirrels they caught and skinned while scavenging in the deep forests for survival. We don’t want a short order chef, we want a Lord of the Flies, carried by wolves into civilization and raised in a French kitchen full of copper-bottomed pots and fresh-picked herbs. 35
“A Lord of the Flies in a French kitchen” neatly catches the geek machismo and extraordinary privilege that are essential ingredients in the cultural paradox that is Silicon Valley. Wages are so high here, Rebecca Solnit reports, that “you hear tech workers complaining about not having time to spend their money.” 36 Depending on which San Francisco neighborhood you live in, your rent rose by anywhere from 10 percent to 135 percent over 2012, driven up by young techies outbidding each other. 37 In the booming restaurants and cafés, there’s a general disdain for government, which is often described as fatally broken, in desperate need of “disruption,” that condition beloved of programmers and venture capitalists. Workers’ unions are regarded as anachronisms that hold back progress. Company founders chafe at any restrictions imposed by local or federal government as leftover mechanisms from a failed system which prevent the markets from working properly. 38