The spring and summer of 2011 seem to have been dominated by uprisings of all sorts, and by governments that appeared deeply confused about how the technology enabling them works. From the response to Wikileaks to the Arab Spring to the U.K. riots to the shutdown of mobile phone service in certain San Francisco transit stations, the authoritarian response to civic protest is little more than hapless, n00bish button-mashing. Who do I blame for these FAILs? Not the button-mashers. Me, I blame Hackers.
I don’t mean actual hackers. I mean Hackers, the 1995 piece of bad William Gibson fanfic about kids who save their haxx0r reputations with rollerblades and holograms. And with it, I’d like to blame all other depictions of hacking as easy, technology as simple, and science as the work of solitary geniuses awaiting quick flashes of divine inspiration.
Often, when we talk about the politics of representation in media, we’re discussing how one group of people is depicted in comparison to another, and the fairness (or lack thereof) in that depiction. We talk about systemic privilege, and cultural bias, and how these things influence the contemporary myths with which we frame our identities. We do this because stories are important: they shine a light down pathways we might someday choose to take.
For example, when I was about five years old, I had a crush on Matthew Broderick’s character in WarGames. But I didn’t want to be with him, I wanted to be him. I wanted to sneak around military bases buried deep in the Rockies. I wanted to ferret out reclusive, misanthropic scientists and fly kites with them. I wanted to be what Broderick’s character was: a smart-mouthed genius hacker with enough 1337 sk1llz to not only start global thermonuclear war, but also end it.
Around ten years later, I had a crush on Robert Redford’s character in Sneakers. And while I found Redford dead sexy, I also wanted his character’s life: my own tiger team of pro hackers, a downtown loft, and enough 1337 social engineering sk1llz to not only thumb my nose at National Security Agents, but also pwn them.
In both cases, I thought hacking was really cool but not because it involved rollerblades or techno or Angelina Jolie. I thought hacking was cool because it looked extremely hard to do, but if you got it right there could be sweeping social change. You could liquidate the Republican Party’s assets and donate them to Greenpeace. You could get the United States military to reconsider automating nuclear weapons. To me, these seemed like epic feats of heroism, accomplished with the aid of humble communication technologies. Those technologies weren’t magic, and that was the whole point. If it were easy, it would have been done already.
All too often in fiction, we choose to batter our science and technology in a thick coating of MacGuffin and then deep-fry it in a vat of boiling handwavium. But just as we should avoid ignorant depictions of human beings whenever possible, we should also avoid ignorant depictions of science and technology, because how we discuss science and technology is inherently political.
This would still be true even if scientific research in university labs weren’t largely dependent on government grants, or if governments didn’t regulate telecommunications or food inspection or drug approval, or if criminal codes weren’t constantly being rewritten to account for how people use technology. In democracies, the people elect representatives to make decisions about those matters. And the people are influenced by the “debate” about the use of Twitter during disasters, or anthropogenic climate change, or embryonic stem cell research, or the MMR vaccine, or oil drilling in national forests. In turn, that “debate” is influenced by popular culture, and by fictional depictions of science and technology, even the ludicrous ones where James Franco cures Alzheimer’s and Natalie Portman models a functional wormhole with Arthur C. Clarke quotations.
I know, I know. You know that could never happen. But are midichlorians any more ridiculous than the idea of “curing” homosexuality? Is “clean coal” any more likely to fix air pollution than unobtanium? Are the “ethical governor” patches on the Predators circling Kabul any less fallible than one of Susan Calvin’s patients? Who’s really writing the science fiction, here?
Real science is hard. It’s also slow. It’s done by large, disparate teams of people who have resigned themselves to lives of constant petition, who proceed on the simple faith that even if this experiment (years in the framing and doing and writing) fails, the failure itself is a contribution to the global pool of knowledge. Depicting it as anything less shortchanges not only the ugly but meaningful grind of scientific progress, but also the people who push it forward day-in, day-out.
Holograms? No. Rollerblades? No. Password: Swordfish? No. Bad Chinese food? Yes. Too many hours spent with intelligent but irritating friends? Yes. Working for days before understanding how to solve the problem? Yes. That’s what science and hacking have in common. And I suspect that if more of our leaders (and more importantly, their policy advisors and constituents) understood that, our world would look different. Because then they’d know: a killswitch can’t stop the signal. You can’t shut down curiosity. People do science not because it is easy, but because it is hard, and as Kennedy observed, “because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win.”
Madeline Ashby is a science fiction writer, foresight consultant, and anime fan. She recently completed a design thesis on the future of border security. Her first novel vN will be available next summer.