There is a technology trend which even the determinedly non-technical should care about. The bad guys are winning. And even though I am only talking about the bad guys in computing — writers of viruses, malware and the like — they are actually the bad guys of all technology, since computing is now central to every aspect of technology. They might even be the bad guys of civilization in general, since computing-driven technology is central to our attacks on all sorts of other global problems ranging from global poverty to AIDS, cancer, renewable energy and Al Qaeda. So turning around and winning this war might even be the single most important challenge facing humanity today. Even that bastion of the liberal arts and humanities, The Atlantic Monthly, has taken note, with this excellent feature on how the best security researchers in the world are losing the battle against the Conficker worm. Simple-minded solutions, ranging from “everybody should get a Mac” to “just stick to Web-based apps and netbooks” to “practice better digital hygiene” are all temporary tactical defenses against an adversary that is gradually gaining the upper hand on many fronts. I have concluded that there is only one major good-guy weapon that has not yet been tried: sexual computing. And it hasn’t been tried because major conceptual advances in computer science are needed. I’ll explain what I mean by the term (it is a fairly obvious idea for those who know the background, so there may be more accepted existing terms for the vision), but I’ll need to lay some groundwork first.
The Ground and Air Wars
The bad-guy ecosystem today is unbelievably diverse. Digital threats abound, from viruses and malware on your PC, to attacks on Web servers, Twitter and Facebook hacks and large-scale identity data theft from banks. Add cyberwarfare among countries and between countries and MNCs (as in, Google vs. China), and the existence of organized botnets, and the term “hacker” starts to seem quaint. You could broadly break this down into a ground war (stuff happening at the level of your PC or other end-user devices) and an air war (involving web servers, organizations (criminal, state, stateless and “good guy”), and the cloud infrastructure).
The annoying old semantic quibbling by the good guys (“we are the hackers, they are the crackers”) has gone from being merely annoying to dangerously inaccurate, because it suggests that the war is between two anarchic groups of roughly similar lone individuals. The ecosystem on both sides is now so complex and organized that we need a whole new set of names just for the various roles. For instance, there are good guys who do nothing besides run rooms full of open, exploitable computers as “honeypots” to attract the newest malware. And there are other good guys working on esoteric encryption technologies. On the bad guys’ side, there are low-level foot soldiers who troll around online and offline looking for credit card numbers to steal, and then there are the big mob bosses running botnets, and (I assume) liaisons who manage the financial relationships between spammers and the pornography industry. Things are so complex that even the experts on both sides have to specialize. Even within my workgroup, the three people I rely on to educate me on this stuff have different areas of expertise.
I’ll leave it to a digital ethnographer to figure out good names all around. For now, I am just going to call them “good guys” and “bad guys.”
Barbarians at the Gates
Let me share my personal experiences with the bad guys, as a poorly-armed civilian digital homesteader. It is no contest. I feel like an 1850s settler in the American West, in my sod hut on the prairie, with only an old musket to defend against the Medellin Cartel, armed with Uzis and helicopters. I am actually particularly lucky, since I work closely with a couple of learned security guys, and can call on them for help when I need to. Most of you are on your own. The only reason I haven’t suffered really badly is that I am just not important enough to be worth individual attention (unlike say, Paris Hilton and her cellphone).
When I first encountered a virus in the mid-80s (an infected floppy, on a pre-hard-disk PC), the war against the bad guys was no more than a bunch of isolated skirmishes against not-very-skilful digital vandals who were in it for fun. Any technically-minded person could educate themselves on all the details in a week, and reformatting floppies was all it took to get rid of threats and back to your life, after a brief interruption (today, a serious digital security problem can stop your life cold, as comprehensively as a heart attack).
Then for a couple of decades, I was basically safe (and lucky) in Pax Digitalia. I recall no serious virus-like problems in my increasingly active digital life between about 1990 and 2007. I kept up with the news and best-practice advice and, rather sloppily, with my antivirus updates, which seemed to be working. Then around 2007 all hell started breaking loose. I was under the impression that if you kept your anti-virus software up to date, avoided shady (in particular porn) websites, were smart with passwords, and didn’t download suspicious attachments, you were safe. Apparently not. Here’s a rundown of stuff I’ve encountered since 2007.
- As End-User: Three serious malware infections. All three managed to block my main anti-virus software from getting updates, and mightily resisted attack by multiple alternative anti-malware programs, most of which failed even to find anything wrong. In the first case, what eventually did the trick was running a couple of different programs in a specific sequence, very quickly after a reboot. In the second case, I had to resort to one of the lesser-known anti-malware programs, because the bad guys had apparently figured out how to block all the most popular ones. In the last case, I had to upgrade from XP to Windows 7. I am now seeing small signs that my Windows 7 PC is probably infected again. In each case, the symptom was the usual one: unwanted ads popping up all over the place.
- As Website Owner: I discovered, purely via a casual check, that Google was blocking one of the parked domains I own, as “suspected of distributing malware.” Further checking revealed that 3 of my domains (in fact, ALL of them EXCEPT for ribbonfarm.com) had been hacked, and contained malware-distributing code. I had to clean up my sites, lock them down, and get them off Google’s blacklist. This shattered my illusion that Unix systems were fundamentally safer than Windows systems and that ISPs take care of this stuff. Ribbonfarm escaped (I think) because it runs WordPress, which adds an additional line of defense, but that’s hardly much comfort, since WordPress has its own changing set of exploitable security holes. Its vulnerability goes up and down as it evolves.
- As Customer of Big Organizations: I received one of those ominous letters from an organization I used to be part of, telling me that I was among several thousand users whose personal data had been stolen from the organization, and offering me a free subscription to an identity-theft fighting company’s services to manage any potential consequences. Fortunately, my identity didn’t seem to have been stolen.
- As Professional Technologist: Finally, Trailmeme, the project I manage for Xerox, which runs on Amazon’s EC2 infrastructure, was, for a while, an innocent civilian site caught in a broader war. We couldn’t send email from our servers because a vendor of lists of spam sources (used by many firewalls) had added a lot of Amazon-owned IP addresses to their blacklist. For those who aren’t familiar with cloud computing, services like Amazon’s allow you to juggle Web servers like a circus clown, which adds a whole new layer of obscurity and illegibility to Web infrastructure, something that helps the bad guys more than the good guys. Compute clouds, like the real things, obscure visibility. It took some hard work from my team members to get ourselves off the blacklist. More broadly, I’d estimate that the time my development team spends on building the security features of our product is a very non-trivial fraction of its total effort.
- An Autoimmune Collapse: Like many of you, my laptop, running XP, succumbed to that strange auto-immune mess a month ago, when a flawed McAfee update deleted a legitimate and critical system file, crashing my system comprehensively. I am sure the bad guys were laughing it up, watching the good guys trip over their own feet.
- Twitter hacks: I accidentally gave my Twitter login information to fake services twice, before I wised up and learned what to look for to tell legitimate Twitter ecosystem services from exploits (100% certainty is impossible, of course). Now it looks like I’ll have to learn a new set of Facebook security behaviors.
And I am not even counting baseline bad-guy stuff, like the fact that there is more mail caught in my spam folders than legitimate stuff in my inbox, or that this site attracts more spam comments than real ones (so far, Akismet is keeping up). That’s my relatively-informed civilian view of the war. That I even understand this much is because I am an engineer (aerospace, not computer) and work directly with computing technology and software professionals. Chances are, you are not exposed on all these fronts, but the fact is, the bad guys are slowly gaining the upper hand on all of them, and you will be affected, directly or indirectly. Chances are, you imagine your online life, like your city, is governed by social contracts and the rule of law. Perhaps you think that the online world is just a little bit more Wild West. Like that one small bad neighborhood you avoid in your town.
You are living in a bubble. There is no rule of law; the digital landscape is mostly small islands of civilization surrounded by ungoverned and (currently) ungovernable wild lands. The barbarians are at the gates, and Rome is closer to collapse than you think.
A Fragile Security Bubble
Movies like Live Free or Die Hard, which had a relatively sophisticated depiction of cyberanarchy, still base their scripts on omnipotent digital Merlins on both sides. The good guys can hack into anything anywhere with just a cell phone, while the bad guys are led by one evil genius who can “shut down Norad with a laptop.” There are two desires driving such perceptions.
The first one, easy to dismiss, is simply Hollywood’s preference for strong individual heroes and villains, wielding tons of individual power. They know, and we know, that this is unrealistic. The second is a more seriously dangerous desire: the desire to believe that what’s going on is actually simple enough that individuals or even small groups can comprehend and operate in the cyberanarchy. This is a problem of miscalibration. It is like mistaking World War II for a small-scale gang war in New York. Just because only a tiny fraction of the population is involved in combat does not mean that it is a small war. It merely means very few people have any combat training. If Hollywood were to truly do a cyberspace story, it would be more like The Longest Day, with multiple narratives and an ensemble cast, than a terrorist hostage thriller driven by a single pair of antagonists.
This isn’t entirely perverse Hollywood ignorance. The security companies and the major “good guy” vendors have fostered the illusion that they know what they are doing and have things under control. My Norton software for instance (and I don’t mean to pick on them particularly), has a reassuring UI composed of bold green “good” check marks and red iconography for dangerous stuff. There are things like shields and glossy metallic-looking color schemes. When it runs checks, it tells me reassuring things I want to hear, like “Your System is Secure.” It is a manufactured sense of assurance. Windows promptly delivers key security updates. You get the sense that if you just behave, avoid bad neighborhoods, and keep up with the good guys, you’ll automatically stay ahead of the bad guys. You don’t realize the good guys are the ones who are behind and trying to catch up, until they fail you. When your defenses fail, you end up in Dr. House mode; trying one diagnostic test after another, trying different defender programs in varying sequences, gradually losing heart as you contemplate that nuclear option, a full reformat and OS reload (and several weeks of lost work and costly information recovery). Norton would like you to believe that their program is all you need, and that big, reassuring button, “Scan Now” is all you need to hit to magically get rid of every digital ill.
Unfortunately, that’s not true, and can’t be true; there are no digital panaceas, any more than there are biological ones. There’s even a theoretical result, due to Fred Cohen, that states that figuring out whether there is a virus on your computer is a formally undecidable problem (“undecidable” has a very precise meaning in computer science, but for our purposes, all you need to know is that no single “Scan Now” button can ever ensure complete security, even in theory).
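The flavor of that undecidability result can be sketched with a toy diagonal argument. This is only an illustration of the self-reference at the heart of Cohen’s proof, not the proof itself; everything here (the detector, the “spreading” flag) is made up for the sketch:

```python
# Toy diagonal argument: any claimed perfect virus detector can be defeated
# by a program that asks the detector about itself and does the opposite.

def make_contrarian(claims_virus):
    """Build a program that consults the detector and inverts its verdict."""
    infected = []                    # stands in for observable 'spreading'

    def program():
        if claims_virus(program):
            pass                     # labelled a virus -> behave harmlessly
        else:
            infected.append(1)       # labelled clean -> 'spread'

    return program, infected

def naive_detector(p):
    # Stands in for *any* detector's fixed verdict on this program.
    return False

program, infected = make_contrarian(naive_detector)
program()

# The detector said 'clean', yet the program spread, so its verdict was wrong.
# Flipping the verdict to 'virus' would make it wrong the other way around.
print("detector verdict:", naive_detector(program), "| spread:", bool(infected))
```

Whatever fixed answer the detector gives, the contrarian program makes it wrong, which is why no “Scan Now” button can be complete.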
Which brings me to the biological metaphor that’s been around since the beginning of the war. The biological metaphor is getting more solid every day, as the digital ecosystem becomes increasingly organic. The good news is that things have gotten sophisticated enough that we can borrow very powerful elements from biology now. I am talking about an idea called the Red Queen.
Homogeneity and the Red Queen
The war we are talking about has the character of an arms race. The defenders and attackers both have to keep working harder and harder just to keep the defended where they are. Even if you’ve done nothing online in the last 15 years but send email and read the New York Times online, just maintaining those capabilities has cost both the bad guys and the security establishment steadily increasing amounts of money. It’s like the increasing military budgets on both American and Soviet sides during the Cold War. Eventually, one side can’t keep up the spending. In that case, it was communism that couldn’t keep up. In this war, it is starting to look like it is the good guys who can’t.
This “running to stay in the same place” is the reason people like the phrase “Red Queen” to describe such dynamics (I assume you know your Lewis Carroll; the Red Queen is from Through the Looking-Glass). In biology, the best example is the arms race between hosts and parasites (which is why the “virus” metaphor works so well). We all like big, powerful creatures and pay more attention to predator-prey interactions (and watch our shark shows and lion/tiger documentaries). But parasite-host dynamics may well have been the more important driver in evolution.
Matt Ridley’s very entertaining The Red Queen, a book about sexual selection in biology, explains the very compelling theory that sexual reproduction evolved primarily as a defense against parasitism. It turns out that this is the most general sort of defense known. Why?
The reason the bad guys are winning the cyberwars is that they have one major advantage: mass production of computing infrastructure. Find one hole in one computing system, attack it in every computing system that looks like it. Even penny-scale benefits multiply into millions of dollars. Economies of scale and mass production of any sort invariably create security brittleness and hand the bad guys a decisive advantage: enormous leverage. This isn’t a particularly new insight. In agriculture, monoculture crop lands can be devastated by a single bug. Airlines and air forces that use homogenous fleets can be laid low by a single defect. Diversity breeds robustness. In a diverse ecosystem, every bit of information that can be used to exploit a system has less leverage.
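The leverage argument is easy to make concrete with a toy calculation. Everything here (the population sizes, the number of design variants) is made up for illustration; the point is only the division of leverage:

```python
import random

def compromised_fraction(population, exploit_signature):
    """Fraction of systems whose design matches the exploited flaw."""
    return sum(1 for design in population if design == exploit_signature) / len(population)

random.seed(0)
N = 10_000

# Monoculture: every system runs the same design (call it design 0).
monoculture = [0] * N

# Diverse population: designs drawn uniformly from 100 hypothetical variants.
diverse = [random.randrange(100) for _ in range(N)]

exploit = 0  # the attacker finds a flaw in design 0
print(compromised_fraction(monoculture, exploit))  # every system falls
print(compromised_fraction(diverse, exploit))      # roughly 1/100 of them fall
```

One flaw owns the entire monoculture; the same flaw touches only about a hundredth of the diverse population. The attacker’s leverage is divided by the number of variants.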
The problem with diversity though, is that the amount of diversity required to stay ahead of the parasites is far higher than the amount of diversity required to actually accomplish whatever the systems are designed to do. You need only one airplane design to run an airline, but to make it robust against single-point failures, you need more varieties, which add costs faster than they add any useful advantages. That’s one reason Southwest is so cheap. They’d be in trouble if a serious flaw were discovered in the 737 design (and I imagine they’ve thought through and insured against such scenarios).
Let’s distinguish two types of diversity. One is simple inter-species diversity. If there are cats and dogs around, cat diseases will probably not jump over and decimate the dog population. But this fact doesn’t particularly help cats. It makes the ecosystem as a whole more stable, but not cat populations. Having both cats and dogs around in a mixed group reduces the frequency of cat-cat contact and transmissions (since there are now dog-cat interactions), so species diversity slows down the spread of dangers through individual populations. For the minority species, it also adds a kind of protection-of-minorities, since parasitic attackers will find more room to grow in the majority species (think Windows vs. Macs until recently). But these are minor advantages.
The technology ecosystem is undergoing an explosion of this kind of species diversity. There are now vastly more kinds of “computer” devices than ever before. It isn’t as significant as it might seem on the surface, since the number of operating systems behind this diversity is much smaller than the number of device types. It might even be a loss, because most of this new diversity is in the form of tethered devices (like your Wii or TiVo) that you don’t really have access to, and are hooked into a large-scale system with its own vulnerabilities of scale. You can’t defend yourself, and you are hooked into single-point failure modes on the backend.
The other kind of diversity is intra-species diversity. Different kinds of cats, in short.
Here, sexual reproduction drives security because it limits the utility of a parasitic advantage in time and space. At any given time, a parasite that evolves to exploit a flaw in a particular genetic type can only spread to other individuals that share that vulnerability. The advantage is also automatically temporary, since the next generational churn of the gene pool could remove the exploitable pattern, or contain a defense.
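This mechanism can be demonstrated with a toy simulation. The model is made up: hosts are short genomes, the parasite is keyed to the dominant genotype, infected hosts are assumed not to reproduce, and survivors mate at random with each gene drawn from either parent:

```python
import random

random.seed(1)
GENES, ALLELES, POP = 6, 4, 2000

def mate(a, b):
    # Sexual recombination: each gene comes from a randomly chosen parent.
    return tuple(random.choice(pair) for pair in zip(a, b))

# A near-monoculture: one dominant strain plus a minority of random variants.
dominant = (0,) * GENES
pop = [dominant if random.random() < 0.8
       else tuple(random.randrange(ALLELES) for _ in range(GENES))
       for _ in range(POP)]

parasite = dominant  # the parasite is keyed to the dominant genotype
history = []
for gen in range(5):
    vulnerable = sum(g == parasite for g in pop) / POP
    history.append(vulnerable)
    # Infected hosts don't reproduce; the rest of the gene pool churns.
    survivors = [g for g in pop if g != parasite] or pop
    pop = [mate(random.choice(survivors), random.choice(survivors))
           for _ in range(POP)]

print([f"{v:.1%}" for v in history])
```

The exploitable genotype starts out dominant and all but vanishes within a generation or two: the parasite’s advantage is confined to hosts that share the vulnerability, and generational churn takes even those away.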
The cost of this intra-species diversity defense is sex. A fun cost you might say, but a cost nevertheless, since continually churning out new, functionally identical designs is work, and because a whole new Red Queen’s race emerges (the focus of Ridley’s book): the one between male and female. Let’s not go there; read the book if you are curious. It might offend some of you politically, so you’ve been warned.
Biology isn’t the only place this happens. Among corporations, mergers and acquisitions serve a very similar function (an implicit premise in my Gervais Principle series; with the Clueless being the parasitic class).
That brings me to my big point. In this war, the good guys have no real offensive weapons, only defensive ones. They build what they hope are secure and safe systems, the bad guys find exploits, the good guys react, and the whole cycle repeats itself. Periodically, a good guy comes up with an architectural advantage that buys a period of peace.
This is asymmetric, and the advantage is with the bad guys. The good guys have to anticipate and block all known holes. The bad guys only have to find one oversight or new flaw (the Conficker story contains very scary examples of this kind of thing).
In biology and corporate ecosystems, sexual reproduction provides a true offensive weapon to the good guys. Sexual reproduction creates diversity fairly cheaply, without tying increasing diversity to the harder problem of increasing functionality. You have two control knobs, the frequency of mating, and the degree of mixing. Bad guys moving faster? Mix things up more frequently and broadly. The nice thing is that it is a generic defense, and one that can run somewhat ahead of the bad guys.
The problem is that nobody knows how to do sexual computing. That I know of. If any of you have kept up with the theoretical CS literature better than I have, please educate me. Von Neumann showed decades ago that computer programs could reproduce and evolve, just like real biological systems, so long as there was a source of random mutations. There are things called genetic algorithms that allow individual programs that fulfill the same function to reproduce and evolve sexually. But as far as I know, there is nothing that allows entire computers to behave like sexual beings.
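For the curious, here is the genetic-algorithm idea at its most minimal. This is a toy sketch: “fitness” is just counting bits (a stand-in for passing some functional test), and nothing here recombines actual computers; it only shows the selection-crossover-mutation loop:

```python
import random

random.seed(42)
GENOME_LEN, POP, GENS, MUT = 32, 60, 200, 0.01

def fitness(genome):
    # Toy 'functionality': number of 1-bits, standing in for tests passed.
    return sum(genome)

def crossover(a, b):
    # Single-point sexual recombination of two parent genomes.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit independently with small probability MUT.
    return [bit ^ (random.random() < MUT) for bit in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]  # truncation selection: keep the fitter half
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print(fitness(best), "/", GENOME_LEN)
```

Note the two knobs mentioned above are visible even in the toy: how often you run the loop (frequency of mating) and how aggressively crossover and mutation shuffle genomes (degree of mixing).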
What we want is an architectural paradigm that can churn the gene pool of computing design at a controllable rate, independently of advances in functionality. In other words, if you have a Windows PC, and I have one, we should be able to have our computers date, mix things up, and replace themselves with two new progeny, every so many weeks, while leaving the functional interface of the systems essentially unchanged. Malware threat levels go up? Reproduce faster. Today computing only evolves at the pace of needed (or available) new functionality. New OS versions come out when there are useful new features to be added (not counting cosmetic releases that are simply created to make money). That’s too slow. Yes, upgrading from XP to 7 cured one of my infections, but that was a side effect (and an unreliable one, since Windows 7 is an asexual descendant of Windows XP).
I have no idea how to do that, but it ought to be one of the Grand Challenges of computer science. I don’t believe it is, but the impending collapse of computing civilization, under the onslaught of digital barbarians, should really be a good enough reason to prioritize this challenge.
Are there other promising directions of attack? I can’t think of any, and neither, it seems, can biology, which has been at it for a billion years. It’s a whole other long debate, but every argument I’ve ever heard about how to make computing sustainably secure has been local and tactical. There will be no permanent victory, ever (theory tells us that), but we are in danger of losing even the fragile dynamic equilibrium that has been maintained so far. Parasites are not known for foresight. Even if they destroy themselves in destroying their hosts, they usually proceed anyway.
Sexual computing seems like the only strategic capability we could conceivably build to stay ahead. We might need a Manhattan project sized effort.