Parasitism is usually defined as a multi-party ecological organization in which one party benefits at another’s expense, and is contrasted with commensalism (the host is neither harmed nor helped) and mutualism (a type of symbiosis in which both parties benefit). Missing from this triptych are organizations in which a harm is partially offset by second-order benefits.
New research sheds some light on the subject in its analysis of one of the most notorious brood parasites, the common cuckoo. The cuckoo lays its eggs in the nests of other birds, externalizing the costs of raising its young to other species, which bear the burden of feeding and caring for the cuckoo chicks as those chicks compete strenuously with their own. However, the researchers found that parasitized nests thrived relative to those the cuckoo left alone, and this effect was causally linked to the cuckoo chicks themselves: moving the eggs to other nests moved the beneficial effects as well.
It turns out that cuckoo chicks secrete a kind of black, tarry substance noxious enough to deter predators, resulting in net improved fitness for the host species despite the costs.
Ecological thinking is transforming our understanding of the natural world, and is blurring many of the firm boundaries erected under the old paradigms that fetishized ‘identity’ and assumed in advance the nature of benefit and harm. The world of software seems perfectly poised for ecological analysis, as many of its fundamental concepts parallel those of biological systems (source code as the genotype to compiled code’s phenotype, for instance).
So what would parasitism in software look like?
The examples that come most easily to mind are phenomena such as banner advertising, pre-installed software (PUPs) on new devices, keyloggers, software licensing, backwards compatibility, SEO, spam, cookies, stored user profiles, and computer viruses. In most of these cases there are clear perceptions of harm, and a clear division between which entity counts as the ‘host’ and which the ‘parasite’. A banner advertisement competes with the webpage it resides on for the user’s attention, but by stealing a marginal amount of attention it may finance the continued hosting of that webpage. Maintaining backwards compatibility for deprecated file formats (as Microsoft Office does) is costly, but it fosters user retention across upgrade cycles.
For our purposes, parasitism exists in systems with asymmetric ongoing costs. In some cases these costs are offset by attendant gains elsewhere in the system (harm can also be understood as negative benefit, as in the brood parasitism above: host chicks receive less food and attention than they otherwise would). Such relationships are endemic in the system, persisting or reproducing as long as the ecosystem permits. Parasitism is never the theoretically optimal situation for fitness, and in a purely engineered environment all multi-party organizations would be relationships of mutualism. However, populations in nature and in software evolve and respond to their environments, and to the extent that design occurs, it’s limited in scope and subject to negative feedback.
At the limit, the harm is so severe that it kills the host. Historically, plague-level virulence is maladaptive, and parasitism has a much longer evolutionary history precisely because it doesn’t entirely saw off the branch upon which it sits. However, when host populations grow large enough to survive epidemics (originally with the rise of agriculture, and later with trends towards urbanization), you see a commensurate shift away from patterns of endemic infection towards epidemics.
Continuing with the metaphor, then, the rise of personal computing and pervasive networking functions in technology much as urbanization did for human health: killing many and rendering the survivors immune or tolerant to the infection. Moore’s Law played a role here, with increased hardware capabilities giving the average computer more margin that could be lost while keeping the system tolerably usable.
The history of viruses and malware in computing, and the existential nature of the risks they pose to systems and their users, has overshadowed the less virulent forms; and even amongst that subset (which includes, among other things, the all-too-common phenomenon of vendor bloatware), there is little examination of whether infection confers second-order benefits on the infected. Complicating things yet further, what counts as a harm may be relative: to an end user, software licensing may be at best an antifeature, but it confers benefit in that, by incentivizing purchases, it reproduces its own conditions for existence.
Not all forms of parasitism need to be embodied in software. There are habits and practices that are endemic and carry costs, but whose costs are minimal to sustain. The QWERTY keyboard layout is a relatively benign example, where the costs of the initial learning curve are small compared to the cost of eradication.
Another involves software dependencies. The relationship a software project has to its libraries can have complex interdependencies reminiscent of the ecosystem in a sloth’s fur. Here the relationships are typically classified as commensalist, but antiquated techniques, ill-documented functions and punitive licensing requirements tip the scales from “mutual benefit” or “benefit, no/low cost” to “mixed cost/benefit” (and in the case of some libraries which I won’t name, nearly pure cost).
An example of mixed cost/benefit is Ghostscript. My colleague and I worked substantially with the Ghostscript/GhostPDL library on one project. It’s uncontested if you need to work comprehensively with PostScript, PCL and PDF, particularly in a mainframe or midrange context. However, those gains come with the costs of a project that, while open source, can be punishingly difficult to enhance productively; like an eastern Mediterranean city, it is a code base that has become definitive through use and accretion, with all the inconsistencies and opacity that implies. But for the same reason, it has so deeply internalized the problem space that numerous attempted replacements have failed to cover more than a fraction of it. Therefore, the costs continue to be borne.
Another example would be UNIX: by all accounts a perverse and arcane system which makes a certain amount of sense once you understand it (deriving more from less, for example), made virulent through licensing innovations, educational efforts and effective toolsets. UNIX isn’t immune to attempts at disruptive innovation, but the contenders face a punishing nest of challenges, technical and social. Here, the intuitive host/parasite terms flip, with networked applications tolerating the costs imposed by the legacies of operating system design.
In November 2010, Moises Velasquez-Manoff traveled to Mexico and infected himself with the larva of Necator americanus (the New World hookworm). Velasquez-Manoff had hay fever, allergies to peanuts, and an autoimmune disorder known as alopecia, which caused his body to fight the development of hair follicles, leaving him mostly hairless. His decision to introduce this notorious parasite into his body is explained in his book, An Epidemic of Absence (and in this EconTalk interview), and is premised on new insights the medical community has been uncovering about the relationship between human health and parasitism. Velasquez-Manoff explains the connection with the example of Sardinia:
Velasquez-Manoff: Sardinia had an epidemic of malaria for possibly millennia…Malaria basically was eradicated in the 1940s, after WWII that is, in the space of just a few years. They went in with DDT and sprayed the whole island….People who had evolved, in theory, with malaria, suddenly no longer had it.
Two autoimmune diseases started increasing dramatically: multiple sclerosis and Type 1 diabetes… A chronic malaria infection suppresses your immune system. So if you evolved with this constant immune suppression, and then you suddenly remove it, it’s going to reveal things to natural selection… The very tendencies in the immune system that produce autoimmune disease in the absence of malaria may have actually helped manage the infection, may have actually helped fight off malaria when it’s present.
This example is telling, in that prior to eradication, malarial infection was seen as pure harm, and its benefits are only becoming apparent in its absence. This complicates the discovery process, but I think we can sketch out some possible examples in the technological realm.
It’s here that the information security community is ahead of the curve. “The idea of perfect security is a trap.” We have come to terms somewhat with the necessity of developed immunological responses in our technical systems—passwords, anti-virus, spam detection, etc. It’s still unclear to me what autoimmune disease would be in these contexts; perhaps that’s a metaphorical overreach, or perhaps we simply haven’t witnessed an event of the scale and severity of the Sardinian DDT inundation.
The one example to the contrary would be the continued fallout from Edward Snowden’s leaks. The NSA’s surveillance and cyberwarfare capabilities have become recognized by many individuals and organizations as threats, and we are in the process of witnessing a society-level immune response. It’s not so different in kind from the equally endemic private surveillance (in the form of cookies, social networking profiles, advertising algorithms, etc.) that confers the same complicated nest of benefits and harms, and as with the endemic malaria of Sardinia, the net benefits to the population do not obviate the existential risk to individuals.
Evolutionary biology has a concept known as fitness landscapes that gives us a helpful language for talking about the problem. A fitness landscape provides a topographic view of a population in its environment: the peaks identify “ideal fitness” given that environment. A population positioned on an incline will experience random mutations; mutations that move individuals higher up the peak are (tautologically) more fit in that environment, and over generations you see a drift upwards until the population reaches the peak or the landscape changes.
However, the presence of multiple peaks indicates what is patently obvious to genetic engineers everywhere: that there are other possibilities, some of which might be substantially better than the status quo, if only they could be reached. They are inaccessible to incremental change, due to the sustained troughs in between, but they could perhaps be reached through effective engineering.
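The dynamic can be made concrete with a minimal sketch (illustrative only, not drawn from any particular model): a hill climber that accepts only non-worsening mutations converges on whichever peak it starts near, and a trough of low fitness blocks the incremental path to a higher peak elsewhere on the landscape.

```python
import random

def fitness(x):
    # A one-dimensional landscape with two peaks: a local optimum at
    # x = 2 (height 1.0) and a higher global optimum at x = 8
    # (height 2.0), separated by a flat trough of zero fitness.
    return max(0.0, 1.0 - abs(x - 2.0)) + max(0.0, 2.0 - abs(x - 8.0))

def hill_climb(x, step=0.1, generations=1000, seed=0):
    # Incremental evolution: propose a small random mutation each
    # generation and keep it only if fitness does not decrease.
    rng = random.Random(seed)
    for _ in range(generations):
        candidate = x + rng.uniform(-step, step)
        if fitness(candidate) >= fitness(x):
            x = candidate
    return x

# A population starting on the lower slope climbs to the local peak
# near x = 2 and stays there: crossing the trough towards x = 8 would
# require accepting a long run of strictly worse mutations.
stuck = hill_climb(1.5)

# Only a population that already starts within reach of the higher
# peak's slopes drifts up to the global optimum near x = 8.
found = hill_climb(7.0)
```

The “effective engineering” of the preceding paragraph corresponds to anything that is not incremental: relocating the population wholesale across the trough rather than waiting for small mutations to carry it there.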
Returning to the Ghostscript example: there are numerous projects that bring Ghostscript’s capacities up to date, or into new languages, or into new paradigms. But they fail to displace Ghostscript because they ultimately fail to take into account the full problem space that Ghostscript occupies, or the full costs of transition: in this respect, these movements reflect authoritarian high modernism (you also see this mindset in many technical approaches to social problems, a methodology Evgeny Morozov calls solutionism). Ghostscript is an endemic malarial infection.
Our biomass as humans is predominantly our own, but our cells are outnumbered significantly by those which do not possess our own DNA (some 100 trillion, mostly in our gastrointestinal tract). A legible, “human-first” extermination regime would kill the host. But there are certainly bacteria we wish to fight, and if removal is not feasible, we focus on converting parasites to symbiotes. The first challenge is the mental shift in what identity consists in: our notion of our “self” is bleeding at the edges, much as technologically we have shifted from an account on a mainframe to a personal computer to accounts that move between multiple devices and live locally and in ‘the cloud’. Threat assessment becomes complicated — is the software from driver manufacturers that comes pre-loaded on your laptop beneficial, harmful, or bits of both? The illegibility is most stark when you look at dark patterns, where the existence of the threat depends entirely on whether you, as a user, find the intent of the application design in concord or in conflict with your own desires.
None of this is prescriptive. “Tip of the iceberg” is a generous assessment of our state of understanding of parasitic relationships in the medical and ecological spheres; in the realm of technology it’s arguably less. Rather, this is intended to open a vista that perhaps was closed or unknown and inspire a future practice of technological ecology.