The Bloody-Minded Pleasures of Engineering

Welcome back. Labor Day tends to punctuate my year like the eye of a storm (I’ve been watching too much Hurricane-Gustav-TV). For those, like me, who do not vacation in August, August tends to be the hectic anchor month of the year’s work. On the other side of Labor Day, September brings with it the first advance charge of the year to come. The tense clarity of Labor Day is charged with the urgency of the present. There is none of the optimistic blue-sky vitality of spring-time visioning. But neither is there the wintry somnolence and ritual banality of New-Year-Resolution visioning. So I tend to pay attention to my Labor Day thoughts. This year I asked myself: why am I an engineer? The answer I came up with surprised me: out of sheer bloody-mindedness. In this year of viral widgetry, when everyone, degreed or not, became an engineer with a click on an install-this dialog on Facebook, this answer is important, because the most bloody-minded will win. Here is why.


Why Engineer?

There are three answers that preceded mine (“out of sheer bloody-mindedness”).

Engineering outgrew its ancestry in the crafts, and acquired a unique identity, around the turn of the century. Between about 1880 and 1910, as engineering transformed the world with electricity, steam and oil, the answer to the question, Why engineer? was an officiously triumphalist one — to conquer nature and harness its powers for humanity. Then, as World Wars I and II left the world with the mushroom cloud as the enduring symbol of engineering, the answer went from apologetic and defensive to subtle. Samuel Florman, in his 1976 classic The Existential Pleasures of Engineering, reconstructed engineering as primarily a private, philosophical act. The social impact of engineering was the responsibility, he suggested, of all of society. Making his case retroactive, he suggested that the triumphalist answer was largely an imputed one: part of a social perception of engineering that was mostly manufactured by non-engineers.

Florman’s answer to Why engineer? can probably be reduced to because it helps me become me.

Curiously, this denial of culpability on the part of engineers was largely accepted as legitimate. Possibly because it was true. As James Scott argues brilliantly in Seeing Like a State, to the extent that there is blame to be assigned, it attaches itself rather clearly to every citizen who participates in the legitimization of a state. Sign here on the social contract; we’ll try to make sure bullies don’t beat you up; you consent to be governed by an entity — the State — with less than 20/20 vision; you accept your part of the blame if we accidentally blow ourselves up by taking on large-scale engineering efforts.

So the first shift in the Big Answer, post WWII (let’s arbitrarily say 1960) was the one from triumphalist to existential. The third answer, which succeeded the existential one around 1980, was the ironic one. The ironic rhetorical non-answer goes, in brief, “Why Not?”


Let’s return for a moment to the surging waters pounding the levees of New Orleans as I write this. Levees are a symbol of that oldest of all engineering disciplines, civil engineering. As I watch Hurricane Gustav pound at this meek and archaic symbol of human defiance, with anxious politicians looking on, it is hard to believe that we ever had the hubris to believe that we could either discipline or destroy nature. The environmentalists of the 90s and the high modernists of 1910 were both wrong. They are as wrong about, say, Facebook, as they were about the dams and bridges of 1908.

This isn’t because technology cannot destabilize nature. It is because nature does such a bang-up job on its own. For every doomsday future we make possible — say nuclear holocaust or a nasty-minded all-conquering post-Singularity global AI — nature cheerfully aims another asteroid at Earth. I was particularly amused by all the talk of the Large Hadron Collider possibly destroying the planet. I don’t understand the physics of the possibility, but I suspect there is an equal probability that nature will randomly lob a black hole at us.

We are not capable of being the stewards of nature, any more than we are capable of mastering it. The most damage we are likely to do is just destroy ourselves as a species, after which Nature will probably shed a tear at the stupidity of a disobedient child, and move on.

The ironic answer then, is based on two observations. The first observation is that the legitimacy of the let’s-preserve-nature ethic, at least as an objective, selfless stance, is suspect. Postmodernist critiques of simple-minded environmentalism have been around for a while, and it seems to me that the takeaway is that the only good reason to have environmental concerns is a selfish one — to save ourselves. And maybe to be nicer to the cows in our factory farms.

The other observation that leads to the ironic answer is that, unlike in Florman’s time, no credible person today is asking the question why engineer? in the sort of accusatory tone that engineers endured in the 70s. Nobody is suggesting a return to “nature” in the sense of an ossified, never-changing Garden-of-Eden stable ecosystem. An entire cyber-generation has grown up with William Gibson as its saint. And this generation, rather shockingly, is the first human generation that understands at a deep, subconscious level that there is no such thing as technology. It is all nature. Herbert Simon may have been the first to articulate this idea at an intellectual level, but the Millennials are the first generation to get it at gut-level. Even if they haven’t read Gibson, the irony of creating a Facebook group to Save the Rainforests By Abandoning Technology isn’t lost on them.

So the ironic answer to why engineer? is really one that does not differentiate technology at all from, say, art, science or any other human endeavor or natural phenomenon. The rhetorical non-answer, why not? is not quite as shy, private and retiring as the existential one. And the point of this rhetorical non-answer is not (just) to create a decentered debate about engineering, but to legitimize a view of engineering as a socially-engaged aesthetic enterprise.

The iPod, perhaps, is the apotheosis of ironic engineering. It is because it can be, and because Steve Jobs chose to make it be. Its utilitarian inevitability (something like it had to disrupt the music industry) is overwhelmed by its overweening sense of Big-D Design; the aspects of it that didn’t have to be. By being so essentially artistic, the iPod reductively defines technology as art. Which is why the ironic answer fails.

And its child, the iPhone, is a symbol of the end of the short, few-decades-long age of Ironic Engineering. As another iconic designer of our times, James Dyson, said, “I have an iPhone and a BlackBerry. And I have to confess that I use the BlackBerry more.” Steve Jobs, for all his phenomenal creativity, seems to be missing an essential idea about what technology is.

So ironic engineering will not do. The raison d’être of engineering cannot be borrowed from art or science (both of which, I think, may truly be at a terminally ironic stage).


C. P. Snow (he of the Two Cultures fame), was wrong. There aren’t two opposed cultures in the world today. There are three. Besides the sciences and the humanities, engineering represents a third culture. One that is only nominally rooted in the epistemic ethos of the sciences and the design ethos of the fine arts. At heart, engineering is a wild, tribal, synthetic culture that builds before it understands. Its signature characteristics are quantity, energy and passion. By contrast, the dominant characteristics of science and the humanities are probably reason and emotion. Nature, in an editorial dated 22nd June, 2006, titled “The Mad Technologist,” discusses this very subtle distinction:

“We find that pure scientists are often treated kindly by film-makers, who have portrayed them sympathetically, as brooding mathematicians (A Beautiful Mind) and heroic archaeologists (Raiders of the Lost Ark). It is technology that movie-makers seem to fear. Even the best-loved science-fiction films have a distinctly ambivalent take on it. Blade Runner features a genetic designer without empathy for his creations, who end up killing him. In 2001: A Space Odyssey, computers turn against humans, and Star Wars has us rooting for the side that relies on spiritual power over that which prefers technology, exemplified by the Death Star.”

Science is content to poke at nature with just enough force to help verify or falsify its models. The humanities, to the extent that they engage the non-human at all, through art, return quickly to anthropocentric self-absorption, with entirely human levels of energy and passion.

Engineering asks, for the hell of it, just how powerfully can I mess with the world, in all its intertwined natural and artificial beauty? Sometimes — and this is why engineering is sometimes the agnostic force that the Hitlers and Saddams co-opt — the most interesting answer is “blow it up.”


Let’s return to today, and the it-idea of the Singularity. To the idea that some form of artificial intelligence might surpass human intelligence (scroll to the end of this earlier piece for some pointers on this interesting topic).

Here is a simple illustration of the sorts of reasoning that make people panic about a Googlezon global intelligence taking over the world. Start with the (reasonable) axiom that it takes a smarter person to debug a computer program than to write it in the first place. Conclusion: if the smartest programmer in the world were to write a flawed program, nobody would be able to debug it. If it happens to be some sort of protean, self-reconfiguring, critical-to-the-Internet sort of program, it might well trigger the Singularity.

This particular line of reasoning is suspect (a too-complex-for-anyone-to-debug program is far more likely to acquire entropy than intelligence), but the overall line of thinking is not. The idea that the connected beast of technology might become too complex to manage is a sound one. I personally suspect that in this sense, the Singularity actually occurred with the invention of agriculture.

So contemplate, as an engineer (and remember, this includes anyone who has ever chosen to install a Facebook widget), this globe-spanning beast called nature+technology (or nature-including-technology). It has a life of its own, and it is threatening today to either die of a creeping entropy that we aren’t smart enough to control, or become effectively sentient and smarter than us.

How can you engage it productively?

By being even more creatively-destructive than it is capable of being without human intervention. Bloody-minded, in short.


Let me make it more concrete. Imagine engineers from 1900, 1965, 1995 and 2008 (time-ported as necessary) answering the question why are you an engineer? within the 2008 context.

1900-engineer: “I thought it was to make the world a better place, but clearly technology is so complex today that any innovation is as likely to spawn terrorism or exacerbate climate change as it is to improve our lot. I quit; I will become a monk.”

1965-engineer: “I thought I was doing this to self-actualize within my lonely existence, but clearly engineering in 2008 has become as much self-indulgent ‘art’ as engagement of the natural world. I will not write a Facebook widget. I will become a monk.”

1995-engineer: “I thought I did it for the same reasons that drive that guy to make art and that other guy to do science, but it seems like whatever I do, be it designing a fixture or writing a piece of code, I am fueling the emergence of this strange Googlezon beast. That’s scarily large and impactful. It changes reality far more than any piece of art or science could, and I want no part of it. I am off to become a monk.”

2008-engineer: “Crap! This will either blow up in our faces or it will be the biggest thrill-ride ever. Awesome! Lemme dive in!” Carpe Diem.


I spent ten days in August in California, mostly in the Bay Area. It is a part of the world that cannot be matched for the sheer obscenity of its relentlessly positive technological energy. There is none of the sense of the tragic that pervades the air on the East Coast.

California is full of people who are cheerfully bloody-minded about their engagement of technology.

Here is a thought experiment about these curious folks. Imagine that a mathematician proved conclusively that a particular type of Uber-Machine, call it Uber-Machine A, was the most complex piece of technology theoretically possible. Call this the Uber-Machine theorem. Maybe Uber-Machine A is the theoretically most complex future-Internet possible, powered by the theoretically most-complex computer chip within its nodes.

Nothing more complex, intelligent or capable is theoretically possible. But there is a corollary. The theorem also implies that it is possible to make a different kind of ultimate artifact, call it Uber-Machine B. One that annihilates the Universe completely. Maybe Uber-Machine B is some descendant of the Large Hadron Collider, capable of provably destroying the fabric of space-time.

Which would you choose to help build? Secretly, I believe the bloody-minded technologists (and I am among them) would want to build Uber-Machine B because it represents the most impact we could ever have on reality. Uber-Machine A would depress us as representing a fundamental plateau.

There is even a higher morality to this. Technology-fueled growth — what Joel Mokyr called Schumpeterian growth — is the only kind of growth, towards the unknown, that leaves open the possibility that we may solve the apparently intractable problems of today. The cost is that we may create the truly intractable problems of tomorrow — civilizational death-forces — that we may have to accept the way we accept the inevitability of our individual deaths. Maybe we’ve already created these problems.

And that is why bloody-mindedness is the only defensible motivation for being a technologist today. You may delude yourself with culturally older reasons, but this is the only one that holds up. It is also the only reason that will allow you to dive in without second-guessing yourself too much, with enough energy to have any hope of having an impact. Because the people shaping the technology of tomorrow aren’t holding back out of fear of (say) green-house emissions from large data centers.


Alright. Holiday over. Back to recycling tomorrow.


About Venkatesh Rao

Venkat is the founder and editor-in-chief of ribbonfarm. Follow him on Twitter


  1. You’ve covered it all, but I can’t resist replying.

    My first answer would be a variant of the ironic: What else is there?

    I’d say there are four, not three cultures: The sciences and the humanities opposite each other, with the other two poles being engineering and… applied humanities. The last two concern themselves with the manipulation of the two forms of reality – objective reality and human reality (large standing waves of belief with often-tenuous connections to objective reality). One side is amoral and brutally honest, reflecting the nature of nature. The other, capricious, often dishonest, moral, reflecting the nature of the human brain-constructed reality.

    I think most people who become engineers have a visceral dislike for that half of the compass which comprises either useless wanking (which you delicately describe as self-indulgent Art) or worse – the manipulators of human minds, the meme merchants, the managers, the multi-level marketers, the salesmen, the politicians, the priests.

    So, what else is there? Either science, which is a rather hands-off, read-only investigation of nature (and requires rather a lot of brain)… or engineering – writing reality, sculpting it into amazing forms or blowing it up spectacularly. Whatever it is, it’s real, an engagement with nature which cannot be faked. You can fool people into being happy, into worshipping gods, into a docile social contract, into thinking you’re a world-class artist, but you cannot fool nature. You cannot fake going to the moon.

    (Aside: consider the schism between the Austrian and the Keynesian schools of economics. Makes more sense if you think of the Austrians as coming from the engineering side, with a preference for reality-based currencies, as opposed to the faith-based fiat preferred by Keynesians coming from the other side.)

    I’m a little puzzled by your view of the iPhone as self-indulgent. It would be self-indulgent if it was marketed as a hockey puck. But as a phone, it’s pretty strictly form-follows-function. A Blackberry is certainly preferable to the iPhone iff I was in the business of sending emails. The on-screen keyboard of the iPhone leaves a lot to be desired. But the average Facebook guy reads far, far more than he writes, and the only omission may have been a couple of large keys labelled OMG and LOL.

  2. I think I agree that there are probably 4, though I don’t actually dislike #4… MLM’s, sales, politicians. I am close to being one myself. They are just engineers with different materials…

    That “read only” phrase is gold. Engineering as ‘writing nature’ is a very good definition.

    I’d disagree though, that social-science engineering isn’t “real.” The idea of ‘currency’ is purely conceptual stuff, as is the notion of democracy, but both are very “real” to me. Green pieces of paper and ballot boxes are physical artifacts, but they aren’t central to currency or democracy respectively. In some ways, fooling people that there is a god is like going to the moon in its own way. Traveling from hunter-gatherer mind to capable-of-religious-conceptualization is an interplanetary journey. I think John Searle has a book about such ideas called ‘The Construction of Social Reality.’

    I think you’re getting at a real issue here, but the distinction between practical and impractical metaphysics isn’t one of “amount of physical-reality substance.” I don’t know what IS the distinction, but you’ve got me thinking now. In the specific case of economics and currency though, the current mess suggests that purely made up financial ideas are bad, as you say. But I am reluctant to go all Ayn Rand and gold. There is something limiting-seeming about conservative physicalist-literalism, to coin a phrase.

    iPhone…gut feel there. It is just too Steve-Jobs-ish for me. Yes, you can rationalize that it has a user-focused design, but it just strikes me as Jobs all over. Oddly enough I feel, with every Apple device, that it is more about Apple than about me, and that I am being bamboozled by a clever guy telling me this is what I MUST want because Lord Jobs tells me that’s how I think. By contrast, the PC has a clumsier, but less theatrical and more honest/utilitarian approach to user experience. I am being incoherent here, but the new PC-fights-back ads are actually making the point quite well. The PC is _real_ in a way nothing from Apple is. To use your own argument against you, the PC is more like Austrian currency; the Apple brand currency feels too much like Keynesian hot air.

    But ’tis late and I am rambling from insomnia rather than coherently debating here.


  3. I think we’re mostly in agreement… I’ve been trying my hand at #4 as well, though not in as structured and bloody-minded a manner as you :)

    Base reality and human reality can be considered different substrates (though physically, the latter is a subset of the former). Engineering of either substrate is challenging, the structures marvelous. Far from being independent, they’re the yin-yang of human advancement… the Pyramids could not have been built without a pyramid of social organization, which itself was built on the back of earlier technological achievement (agriculture) and so on. (Hey, the Dahi Handi makes for a rather better image than Hobbes’ Leviathan. Especially when you consider that at the apex of this very literal metaphor of the social pyramid is the self-styled peacock-feathered God-King, who scarfs down most of the butter and passes on the remnant to the chaps below.)

    The properties of the two substrates are very different. The laws governing nature are rigid and unchanging, never, ever yielding a free lunch. The laws of human reality are much more protean. Human fashions shift, human perception is riddled with systemic errors which are often exploited by “human reality engineers”.

    HR engineers constantly find themselves in positions which have more of a predator-prey or parasite-host nature, where they have to persuade their victims to do things which might not necessarily be in the victim’s interest. Most choices in human engineering have a moral dimension. The resentment and resistance to HR engineering is a shadow of the near-universal taboo against cannibalistic or vampirish behaviour.

    Many technology engineers, I suspect, have a mild version of the condition which in extremis, is the mind-blindness of autism. While his autistic cousin is perplexed by dishonest behaviour, the tech engineer merely dislikes and mistrusts the dishonesty and deception which pervade and lubricate the great engineered structures and vehicles in human reality. This is not his natural element. Such an attitude is either the cause or the effect of an affinity with nature’s unforgiving honesty, I guess.

    Managers and others in the middle of the human pyramid often have to function as reflectors of the party line. Either they should be genuinely stupid and subcontract their thinking to the High Command, or they have to suppress their individual opinion and basically lie.

    I was at the receiving end of an Amway sales pitch recently, and couldn’t help admiring the structure and beauty of the thing. Engineering that system – a self-replicating salesman – took a lot of skill. But that’s my big crib with Amway and such MLMs: they turn people into vampires and make them prey on their near and dear ones, burning their personal social network capital towards enhancing the MLM network.

    Read-only: I find it a useful epithet for much of my life, from being a good boy in school, always listening to what I was told, obeying the laws, saluting the flag on Independence day, to my transition to the great salaried middle class, constantly sniping about how things don’t work, sucking passively at the teats of television and the internet… It’s only recently that I’ve started looking at things from a writer’s – rather than a read-only critic’s – point-of-view.

  4. As for the Mac – I’ve used ’em all, Windows, Linux and Macs. And the Macs win. Linux comes next. Windows third. Some of what you say resonates strongly with me – I hate iTunes, for instance, and I am absolutely insane with fury that there’s no other way to play music on the iPod touch or iPhone. Apple has made several one-size-fits-all choices and they do rankle. However, what you call “Jobs all over” I call “unity of design”. Someone, somewhere in Apple – maybe Jobs, maybe Ive, maybe the Dilbert genius garbageman on One Infinite Loop – someone has a coherent vision of how something must be, and then goes and does it.

    Contrast this with Windows, a hodgepodge of many things put together without love, without unity of vision. It’s not meant for anyone in particular. And it *also* results in the same insane fury when you try deviating ever so slightly from the many-headed Microsoft “vision”. Hell, if you want flexibility, use Linux. If you want a headache, you can use Windows.

    As companies, both Microsoft and Apple are evil, the former significantly more than the latter, but evil nonetheless (witness Apple’s anti-competitive app store policies). But if you have to patronize one of them, which will you pick? Why, the one with better taste, of course.

    I’m starting to sound like Stephen Fry so I might as well quote him directly (an eminently serviceable replacement for Douglas Adams in the technophile writer-essayist category):

    …the usual bad design that “corporates” always seem prepared to put up with, as if they’re embarrassed and ashamed by any stylishness which might draw attention to them. The SmartPhone equivalent of living in a block of modern flats. Most mod cons, but no style, delight or emotional attachment to be had. Windows for Mobiles is certainly better than Windows for PCs or, God help us all, Vista, but it is still an insulting offering. The feeling, as with all things Microsoft, is that all design features and functions are there to suit MS rather than to delight, enthuse and compel the user. Compromise, short-cuts, inconveniences, vestigial residues – no one responsible is likely to pat themselves on the back for the design or the s’ware engineering, any more than the architect or project manager of a 60s council flat is likely to point it out with pride as he rides by with his grandchildren. You’re only on this planet once – do something extraordinary, imaginative and inspiring. That’s the difference, ultimately. Those behind Palm OS and the Psion can justifiably be proud of what they did, what they created. WinMob just muscled in on a market they never spotted and they did it in a clumsy, bullying, ugly manner, exactly as they had with Windows before, and exactly as IBM had with the PC itself a decade earlier. Break free, all you corporate software engineers and designers: the excuse that you are under the rule of dullards, greedy share-price number crunchers and visually and ergonomically illiterate yahoos is not good enough. Persuade them. Otherwise we all get a digital environment that’s as vile as a 60s housing estate.

    Design matters
    By design here, I mean GUI and OS as much as outer case design. Let’s go back to houses. The sixties taught us, surely, that architectural design, commercial and domestic, is not an extra. The office you work in every day, the house you live in every day, they are more than the sum of their functions. We know that sick building syndrome is real, and we know what an insult to the human spirit were some of the monstrosities constructed in past decades. An office with strip lighting, drab carpets, vile partitions and dull furniture and fittings is unacceptable these days, as much perhaps because of the poor productivity it engenders as the assault on dignity it represents. Well, computers and SmartPhones are no less environments: to say “well my WinMob device does all that your iPhone can do” is like saying “my Barratt home has got the same number of bedrooms as your Georgian watermill, it’s got a kitchen too, and a bathroom.” … I accept that price is an issue here; if budget is a consideration then you’ll have to forgive me, I’m writing from the privileged position of being able to indulge my taste for these objects. But who can deny that design really matters? Or that good design need not be more expensive? We spend our lives inside the virtual environment of digital platforms – why should a faceless, graceless, styleless nerd or a greedy hog of a corporate twat deny us simplicity, beauty, grace, fun, sexiness, delight, imagination and creative energy in our digital lives? And why should Apple be the only company that sees that? Why don’t the other bastards GET IT??

    And he hits the iPhone’s limitations right on the button in spite of being a rabid Apple fan…

    Server side apps only. No, no, no, no, no. This is NOT good. It’s one thing to want to keep the proprietary system closed, but to present a device sealed in digital Araldite is a Bad Idea.

    Text entry. I’m sorry Steve, but physical keyboards are okay. They’re fine. When in your iPhone introductory keynote late last year you dissed the stylus and keyboard, you may have noticed a deafening silence as tumbleweed and sage-brush whizzed through the hall. It is certainly true that the virtual kb used in the iPhone gets better the more you use it. It is also true that the glossary autocorrect system is immensely impressive. But I challenge anyone to type an email as fast on an iPhone as I can on a BB or Treo. I assure you it can’t be done. I’m pretty quick with an iPhone now, but nonetheless text entry just isn’t as satisfying as everything else about the device. It’s an example perhaps of ideology overcoming practicality, as in the early days of the single click mouse. Don’t be stubborn about this Steve, you know I’m right, as in their heart of hearts do the guys at Cupertino. Hence the lack of Quicktime movies on the Apple site showing happy users typing proper length emails and texts. Why else is the only footage of text entry hurried and very much on the short side? Because they know … they know perfectly well it’s a drawback.

    But that’s about it. Everything else in the iPhone lives up to, even surpasses the hype. Another triumph for Jonathan Ive and his design team, Apple have made a wholly desirable and beautiful object. Only a cross and silly person would pretend to be unimpressed or make claims of parity about their O2 xda Trion or similar lumpen beast.