Love Your Parasites

by Jordan Peacock on March 26, 2014

Jordan is a 2014 blogging resident visiting us from his home turf on Google+.

Parasitism is usually defined as a multi-party ecological organization in which one party benefits at another’s expense, and is contrasted with commensalism (the host is neither harmed nor helped) and mutualism (a type of symbiosis in which both parties benefit). Missing from this triptych are organizations in which a harm is partially offset with second-order benefits.
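The three-way taxonomy can be sketched as a simple sign test on each party’s net payoff (an illustrative model of my own, not drawn from the ecological literature):

```python
def classify(host_payoff: float, guest_payoff: float) -> str:
    """Label a two-party interaction by the sign of the host's net payoff,
    assuming the 'guest' (the would-be parasite) benefits."""
    if guest_payoff <= 0:
        return "outside the taxonomy"  # the guest gains nothing
    if host_payoff > 0:
        return "mutualism"       # both parties benefit
    if host_payoff == 0:
        return "commensalism"    # host neither harmed nor helped
    return "parasitism"          # guest benefits at the host's expense

# The missing fourth case: a gross harm partially offset by
# second-order benefits still nets out to a single sign, which is
# exactly why such relationships are hard to classify.
```

Under this sign test, a relationship whose second-order benefits push the host’s net payoff past zero flips labels entirely, from parasitism to mutualism.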

New research brings a little light to the subject in its analysis of a notorious brood parasite, the common cuckoo. The cuckoo lays its eggs in the nests of other birds, externalizing the costs of raising its young to other species, which bear the burden of feeding and caring for the cuckoo chicks, who compete strenuously with the hosts’ own. However, it was found that the parasitized nests thrived relative to those left alone by the cuckoo; and this effect was causally related to the cuckoo chicks themselves, as moving the eggs to other nests moved the beneficent effects as well.

It turns out that cuckoo chicks defecate a kind of black, tarry substance that is incredibly toxic and serves to dissuade predators, resulting in net improved fitness for the host species despite the costs.

Ecological thinking is transforming our understanding of the natural world, and is blurring many of the firm boundaries erected under the old paradigms that fetishized ‘identity’ and assumed in advance the nature of benefit and harm. The world of software seems perfectly poised for ecological analysis, as many of its fundamental concepts parallel those of biological systems (source code as the genotype to compiled code’s phenotype, for instance).

So what would parasitism in software look like?

The examples that come most easily to mind are phenomena such as banner advertising, pre-installed software (PUPs) on new devices, keyloggers, licensing copy, backwards compatibility, SEO, spam, cookies, stored user profiles, and computer viruses. In most of these cases there are clear perceptions of harm, and a clear division between which entity could be considered the ‘host’ and which the ‘parasite’. A banner advertisement competes with the webpage it resides on for the user’s attention, but by stealing a marginal amount of attention it may finance the continued hosting of the webpage. Maintaining backwards compatibility for deprecated file formats (as Microsoft Office does) is costly, but it fosters user retention across upgrade cycles.

For our purposes, parasitism exists in systems with asymmetric ongoing costs. In some cases, these costs are offset by attendant gains elsewhere in the system (harm can also be negative benefit, as in the brood parasitism above: host chicks receive less food and attention than they otherwise would). These relationships are endemic in the system, persisting or reproducing as long as the ecosystem permits. Parasitism is never the theoretically optimal situation for fitness, and in a purely engineered environment all multi-party organizations would be relationships of mutualism. However, populations in nature and in software evolve and respond to their environments, and to the extent that design occurs, it’s limited in scope and subject to negative feedback.

At the limit, the harm is so severe as to kill the host. Historically, plague-level virulence is maladaptive, and parasitism has a much longer evolutionary history precisely because it doesn’t entirely saw off the branch upon which it sits. However, when host populations grow large enough to survive epidemics (originally with the rise of agriculture, and later with the trend toward urbanization), you see a commensurate shift from patterns of endemic infection toward epidemics.

Continuing with the metaphor, then, the rise of personal computing and pervasive networking functioned for technology much as urbanization did for human health: killing many and rendering the survivors immune or tolerant to infection. Moore’s Law played a role here, with increased hardware capabilities giving the average computer more margin that could be lost while keeping the system tolerably usable.

The history of viruses and malware in computing, and the existential nature of the risks they pose to systems and their users, has overshadowed the less virulent; and even amongst that subset (which includes, among other things, the all-too-common phenomenon of vendor bloatware), there is little examination of whether infection confers second-order benefits on the infected. Complicating things further, what counts as a harm may be relative: to an end user, software licensing may be at best an antifeature, but it confers benefit in that, by incentivizing purchases, it reproduces its own conditions for existence.

Not all forms of parasitism need be embodied in software. There are habits and practices that are endemic and costly, but whose costs are minimal to sustain. The QWERTY keyboard layout is a relatively benign example, where the ongoing cost of the initial learning curve is small compared to the cost of eradication.

Another involves software dependencies. The relationship a software project has to its libraries can have complex interdependencies reminiscent of the sloth’s. Here the relationships are typically classified as commensalist, but antiquated techniques, ill-documented functions and punitive licensing requirements tip the scales from “mutual benefit” or “benefit, no/low cost” to “mixed cost/benefit” (and in the case of some libraries which I won’t name, nearly pure cost).

An example of mixed cost/benefit is Ghostscript. My colleague and I worked substantially with the Ghostscript/GhostPDL library on one project. It’s uncontested if you need to work comprehensively with PostScript, PCL and PDF, particularly in a mainframe or midrange context. However, those gains come with the costs of a project that, while open source, can be punishingly difficult to enhance productively; like an eastern Mediterranean city, it is a code base that has become definitive through use and accretion, with all the inconsistency and opacity that implies. But for the same reason, it has so deeply internalized the problem space that numerous attempted replacements have failed to cover more than a fraction of it. Therefore, the costs continue to be borne.

Another example would be UNIX: by all accounts a perverse and arcane system which makes a certain amount of sense once you understand it (deriving less from more, for example), made virulent through licensing innovations, educational efforts and effective toolsets. UNIX isn’t immune to attempts at disruptive innovation, but the contenders face a punishing nest of challenges, technical and social. Here, the intuitive host/parasite terms flip, with networked applications tolerating the costs imposed by the legacies of operating system design.

In November 2010, Moises Velasquez-Manoff traveled to Mexico and infected himself with the larva of Necator americanus (the New World hookworm). Velasquez-Manoff had hay fever, allergies to peanuts, and an autoimmune disorder known as alopecia which caused his body to fight the development of hair follicles, leaving him mostly hairless. His decision to introduce this notorious parasite into his body is explained in his book, An Epidemic of Absence (and in this EconTalk interview), and is premised upon new insights that the medical community has been discovering with regards to the relationship between human health and parasitism. Velasquez-Manoff explains the connection with the example of Sardinia:

Velasquez-Manoff: Sardinia had an epidemic of malaria for possibly millennia…Malaria basically was eradicated in the 1940s, after WWII that is, in the space of just a few years. They went in with DDT and sprayed the whole island….People who had evolved, in theory, with malaria, suddenly no longer had it.


Two autoimmune diseases started increasing dramatically: multiple sclerosis and Type 1 diabetes… A chronic malaria infection suppresses your immune system. So if you evolved with this constant immune suppression, and then you suddenly remove it, it’s going to reveal things to natural selection… The very tendencies in the immune system that produce autoimmune disease in the absence of malaria may have actually helped manage the infection, may have actually helped fight off malaria when it’s present.

This example is telling, in that prior to eradication, malarial infection was seen as pure harm, and its benefits are only becoming apparent in its absence. This complicates the discovery process, but I think we can sketch out some possible examples in the technological realm.

It’s here that the information security community is ahead of the curve. “The idea of perfect security is a trap.” We have come to terms somewhat with the necessity of developed immunological responses in our technical systems—passwords, anti-virus, spam detection, etc. It’s still unclear to me what autoimmune disease would be in these contexts; perhaps that’s a metaphorical overreach, or perhaps we simply haven’t witnessed an event of the scale and severity of the Sardinian DDT inundation.

The one example to the contrary would be the continued fallout from Edward Snowden’s leaks. The NSA’s surveillance and cyberwarfare capabilities have come to be recognized by many individuals and organizations as threats, and we are witnessing a society-level immune response. It’s not so different in kind from the equally endemic private surveillance (in the form of cookies, social networking profiles, advertising algorithms, etc.) that confers the same complicated nest of benefits and harms; and as with the endemic malaria of Sardinia, the net benefit to the population does not obviate the existential risk to individuals.

Evolutionary biology has a concept known as the fitness landscape that gives us a helpful language for talking about the problem. A fitness landscape provides a topographic view of a population in its environment: the peaks identify “ideal fitness” given that environment. A population positioned on an incline will experience random mutations; mutations that sit higher on the peak are (tautologically) more fit in that environment, and over generations you see a drift upwards until the population reaches the peak or the landscape changes.


However, the presence of multiple peaks indicates what is patently obvious to genetic engineers everywhere: that there are other possibilities, some of which might be substantially better than the status quo, if only they could be reached. They are inaccessible to incremental change, due to the sustained troughs in between, but they could perhaps be reached by effective engineering.
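The dynamic can be made concrete with a minimal simulation (a sketch of my own; the two-peak landscape and mutation scale are arbitrary choices): a population improving only by small incremental mutations settles on the nearest peak and never crosses the trough to the higher one.

```python
import math
import random

def fitness(x: float) -> float:
    """A landscape with a local peak near x=2 and a higher peak near x=8."""
    return 3 * math.exp(-(x - 2) ** 2) + 5 * math.exp(-(x - 8) ** 2)

def hill_climb(x: float, steps: int = 5000, scale: float = 0.1) -> float:
    """Accept a small random mutation only when it strictly improves fitness."""
    rng = random.Random(0)
    for _ in range(steps):
        candidate = x + rng.uniform(-scale, scale)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

# Starting at x=1, incremental change drifts up the nearby slope and
# stalls at the local peak near x=2; the higher peak near x=8 is
# unreachable because every path to it first goes downhill.
```

Reaching the higher peak requires a jump wider than the trough, the “effective engineering” of the paragraph above, rather than an accumulation of small steps.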

Returning to the Ghostscript example: there are numerous projects that bring Ghostscript’s capacities up-to-date, or into new languages, or into new paradigms. But they fail to displace Ghostscript because they ultimately fail to take into account the full problem space that Ghostscript occupies, or the full costs of transition: in this respect, these movements reflect authoritarian high modernism (you also see this mindset in many technical approaches to social problems, a methodology Evgeny Morozov calls solutionism). Ghostscript is endemic malarial infection.

Our biomass as humans is predominantly our own, but our cells are significantly outnumbered by those which do not possess our own DNA (some 100 trillion, mostly in our gastrointestinal tract). A legible, “human-first” extermination regime would kill the host. But there are certainly bacteria we wish to fight, and if removal is not feasible, we focus on converting parasites to symbiotes. The first challenge is the mental shift in what identity consists in: our notion of our “self” is bleeding at the edges, much as technologically we have shifted from an account on a mainframe, to a personal computer, to accounts that move between multiple devices and live locally and in ‘the cloud’. Threat assessment becomes complicated — is the software from driver manufacturers that comes pre-loaded on your laptop beneficial, harmful, or bits of both? The illegibility is most stark when you look at dark patterns, where the existence of the threat depends entirely on whether you, as a user, find the intent of the application design in concord or in conflict with your own desires.

None of this is prescriptive. “Tip of the iceberg” is a generous assessment of our state of understanding of parasitic relationships in the medical and ecological spheres; in the realm of technology it’s arguably less. Rather, this is intended to open a vista that perhaps was closed or unknown and inspire a future practice of technological ecology.

John Doe March 26, 2014 at 1:48 pm

I think whether a “parasite” is harmful or beneficial can depend on the host. Take pre-installed software, for instance: the reason it exists is that it makes the vendors money, potentially lowering the price of their products. If you have the know-how to get rid of it, it is harmless or potentially beneficial; for the average user, however, it will be “harmful”.

Shawn Conn March 26, 2014 at 7:02 pm

Great post Jordan! Looking at a software system as a holistic ecosystem is something I hadn’t appreciated until the last few years of my career.

To address one of your thoughts:

“We have come to terms somewhat with the necessity of developed immunological responses in our technical systems—passwords, anti-virus, spam detection, etc. It’s still unclear to me what autoimmune disease would be in these contexts; perhaps that’s a metaphorical overreach, or perhaps we simply haven’t witnessed an event of the scale and severity of the Sardinian DDT inundation.”

It sounds like you need a good anecdotal example. In your three examples, the “harm” introduced is, respectively, password overload, CPU taxing (all the cycles spent analyzing benign programs), and lost messages (due to false positives). This conflicts with end users’ desire for simplicity, performance, and reliability. Users’ adapted responses to this harm are password sharing (or password managers), disabling AV systems (or upgrading hardware), and using another communication service (or whitelisting false positives). These adapted responses would probably cease to exist in a perfect spam- and virus-free world with bulletproof authentication.

On the other hand, if there are any automated systems that deal with the “harm” of security systems, I could see the potential for some sort of autoimmune response. I sadly admit that I’ve written software to circumvent security systems for the sake of “just fixing it.” I’m sure some well-used code is out there circumventing security systems that would trigger an autoimmune response if the underlying security mechanism were removed.

Jordan Peacock March 27, 2014 at 6:16 am

The initial examples that came to mind were the “market corrections” induced by high-frequency trading algorithms when they encounter a positive feedback loop that is unconstrained within the typical parameters.

But I’m still thinking that one through, and am very interested in what other people come up with.

mtraven March 26, 2014 at 8:23 pm

It’s been suggested that sexual reproduction evolved primarily as a response to infection. In asexual reproduction, offspring have the same genome as their parent (barring mutation), which means there is not much variation in a population and it is vulnerable to exploitation by infectious agents. Sex shuffles genes around, generates more variation and thus less chance of being wiped out.

The analogy for computer software is clear. Our systems are vulnerable largely because they are so similar, that is, because software reproduces asexually, with all copies of a program more or less the same. The solution is to have operating systems capable of mating with each other and generating varied offspring.

Jordan Peacock March 27, 2014 at 6:17 am

Yes! Actually from my notes on “An Epidemic of Absence”, I highlighted this excerpt:

“Only half the individuals of any given sexual species, the females, actively procreate. Nonsexual species, on the other hand, can reproduce at twice the speed. So why choose this slow strategy? The answer: to escape parasites.”

Kartik Agaram March 27, 2014 at 9:26 am

Some of you have already heard me recommend

gwern March 27, 2014 at 10:12 am

> However, it was found that the parasitized nests thrived relative to those left alone by the cuckoo; and this effect was causally related to the cuckoo chicks themselves, as moving the eggs to other nests moved the beneficent effects as well.

Incorrect. You need to read the original research here, not a popularization. The cuckoo nests performed better on one metric (any successes) and worse on another metric (total successes). The *net* effect is neutral.

The original paper “From Parasitism to Mutualism: Unexpected Interactions Between a Cuckoo and Its Host” says:

> Overall, throughout the 16 seasons, parasitized and nonparasitized broods did not significantly differ in the number of crows fledged (1.584 ± 0.149 versus 1.379 ± 0.068, respectively; z = 0.390, P = 0.694, n = 550), though results suggest a slight benefit from raising a cuckoo.

(At p=0.7, ‘suggest’ is really reaching for a cool counterintuitive interpretation….)

> This example is telling, in that prior to eradication, malarial infection was seen as pure harm, and its benefits are only becoming apparent in its absence.

Economies and life spans both grow in the absence of malaria. While it’s interesting that the pure harm view was not 100% correct, it was still 99% correct.

Jordan Peacock March 27, 2014 at 10:43 am

re: cuckoos

Thanks for the pushback and the clarification.

re: pure harm

It’s not so much about the %s, as much as shifting from a trinary perspective (good/neutral/bad) to thinking in terms of interdependent trade-offs. Is malaria good? Well, it depends; if you have the genes and the immune system to fend it off, it’s possible that exposure can protect you from some other problems. On the other hand, it can kill you. Ideally you’d have some malaria-like entity that provides the benefits with less risk or side effects, but that presumes a level of knowledge and technology that we don’t yet have. (One interesting discussion regarding stacked interface dependencies is here:

Kartik Agaram March 27, 2014 at 10:15 am

“You also see this [authoritarian high modern] mindset in many technical approaches to social problems, a methodology Evgeny Morozov calls solutionism.”

There’s a tension here with my previous post that I try to be vigilant of. But solutionism feels too blunt a criticism. Identifying inhumanly subtle problems is a kind of technological advance. Maybe I should call this idea problemism.

mtraven March 27, 2014 at 10:43 am

Here’s a cute, recent, and novel example although cataloging things like this could be a full-time job for a software naturalist.

Jordan Peacock March 28, 2014 at 6:49 am
Kartik Agaram April 11, 2014 at 8:16 am

At the risk of seeming self-promoting, my new post owes this one a debt that I found it hard to articulate when I wrote it. Before I read it I was thinking in terms of values: organizations as commons, employees as parasites, etc. Your reframing of parasites got me to the more value-neutral notion of agency transfer. So thanks!

Comments on this entry are closed.