The Unapologetic Case For Bullshit

In 1986 Harry Frankfurt published the first edition of On Bullshit, the essay that, in the years that followed, was to become the authoritative take on the topic. In it, he lamented the amount of bullshit plaguing every aspect of public life, arguing that the production of bullshit was tightly correlated with the increase in opportunities and (perceived) obligations for people to speak their mind, even in the absence of a strong “apprehension of reality”.

Thirty years later, this trend is anything but receding. The web in general, and social media in particular, have multiplied the number of channels where we can exercise our fundamental need to be consulted. At the same time, ‘reality’ is an increasingly opaque concept, challenged by fake news on one side and the genuine unintelligibility of a world in the midst of a technological, social and political revolution on the other.

How do we navigate this situation? Frankfurt, as we will see later, argues for self-restraint in the absence of certainty. In a previous post, I also put forward what I called a ‘precautionary principle’: when faced with common talk (a sub-category of bullshit), it is better to take the safe option and trust our common sense. But the more I think about methods and tools to resist bullshit, the more forgiving of it I become.

It is easy to dismiss bullshit as pure noise. To treat it as the inevitable, and yet insufferable, exhaust of a world in decline. In doing so, however, we risk falling into an excellence trap: the belief that progress is a smooth climb towards the highest peak.

Maybe, a perfect world would not be a world without bullshit, but rather one where there is just the right amount of it. Maybe, to reach higher peaks of truth we sometimes need to descend into bullshit valleys. Traverse a knowledge fitness landscape, in other words, where bullshit can be adaptive.

Can we make an unapologetic case for bullshit, without descending into post-truth relativism?

[Read more…]

Boat Stories

Last year, I discovered Ursula K. Le Guin’s fascinating talk, The Carrier Bag Theory of Fiction (transcript), by way of Donna Haraway’s equally interesting talk Anthropocene, Capitalocene, Chthulucene. Both have been nagging at me for a year now.

The theory, building on the significance of containers (bags, baskets) to early humans in forager societies — the default human here is female, of course — offers a model of narrative as a “carrier bag” of community context and its evolution. It is a model that stands in radical opposition to the hero’s journey model of narrative.

Panels from Asterix and the Great Crossing, a boat story.

Thinking about the two opposed theories, it struck me that between the carrier bag story and the hero’s journey, there is a third kind of story that is superior to both: the boat story. A boat is at once a motif of containment and journeying. The mode of sustenance it enables — fishing, especially with a net, a bag full of holes — is somewhere between gathering and hunting ways of feeding; somewhere between female and male ways of being. It at once stands for the secure attachment to home and a venturesome disposition towards the unknown. It incorporates the conscientiousness and stewardship of settled life, and the openness to experience of nomadic life. A boat is a home, but a home away from home. A boat story is a journey, but one on which you bring home, and perhaps even Mom, along with you. But it isn’t an insular home, even though it has a boundary. It is a territory but it is not territorial. It is socially open enough to accommodate encounters with strangers, and is in fact eager to accommodate them. Xenophobes do not generally go voyaging.

Boat stories, like hero’s journeys and carrier-bag stories, are a good way to understand the human condition. They are especially good as a mental model of blogging.

[Read more…]

A Glitch in the Theocratic Matrix

When I was a kid — I was about 12, I think — and relatively new to atheism and its social burdens, I had a little run-in with a sincerely religious classmate. He simply would not believe that my non-belief in religion was even possible. He was sure I was lying or being provocative for the hell of it. As a test, he pulled out a little picture of his favorite god from his wallet, and dared me to tear it up. I did, and he was suitably shocked. After a moment of stunned speechlessness, he said something weak, like “err… oh wow!”

I was reminded of this little episode when a little clip from CNN did the rounds a couple of days back. It features a religious conservative being visibly stunned speechless by the revelation that you do not need to swear on the Bible to assume an elected office in the United States. Ted Crockett really appeared to believe that a Muslim politician could not hold office because “You have to swear on a Bible to be an elected official in the United States of America…a Muslim cannot do that, ethically, swearing on the Bible.”

Like my old schoolmate, this guy was genuinely shocked to learn he was wrong in a fairly trivial way. Unlike my old schoolmate, however, we’re not talking about a 12-year-old boy. We’re talking about a man who appears to be in his late fifties or sixties, and has held an elected office.

Like many others, once I was done chuckling, I found myself wondering: how is it even possible to arrive at, and hold, this particular sort of bizarre false belief, about swearing-in ceremonies being necessarily tied to the Bible in a non-theocratic state?

[Read more…]

Prolegomena to Any Dark-Age Psychohistory

When I think about history, the picture in my head is that of a roiling canvas of many choppy, intertwingled narrative streams, enveloped by many-hued nebulous fogs of mood and temper. Star-like cosmic irruption-events, ranging from discoveries to disasters, wink through from the void, disturbing the flow of human affairs and forcing steering imperatives onto those living through them. The picture is as much a portrait of a sentimental sense of history, as it is a map of an unfolding gestalt of events.

When I try to capture this poetic mental image in a drawing however, all I get is the kind of crappy cartoon you see below.

It’ll do to get the idea across, though. This particular sample from my doodle files is what contemporary American history looks like to me today: a generally well-defined low-fog Blue story, getting interrupted by less well-defined, high-fog Red tendrils.

It is this kind of image that is conjured up for me when I ask myself the question many are asking today: Are we in a Dark Age?

[Read more…]

The Leaning Tower of Morality

Don’t hate the player, hate the game. — Ice-T

Game theory is asleep, cooperate for no reason. — Deity of Religion

There’s an image that’s taken root in my mind recently that I can’t seem to shake. I picture humanity living in a large, rickety tower, tilting at a precarious angle to the ground — like so:

The tower represents our capacity for moral behavior. Lower levels are more base; higher levels, more virtuous. We don’t need an exact floorplan, but here’s the kind of thing I’m imagining:

  • Ground floor: Perfect zero-sum selfishness. Aggression and exploitation. The war of all against all.
  • Middle floors: Various flavors of mutualism. “I’ll help you if you help me.” Reciprocity. Tit for tat.
  • Higher floors: Empathy and compassion. Turning the other cheek. True virtue (not just signaling). A tendency to cooperate in one-shot prisoner’s dilemmas.
  • Penthouse: Perfect self-sacrificing altruism. A willingness to give time, energy, money, or even one’s life to help a stranger for nothing in return.

Now, some people inhabit higher floors than others, but with the exception of bona fide psychopaths, all of us live somewhere in the tower, happily above ground.
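The middle floors lend themselves to a quick illustration. Here is a minimal sketch — in Python, with the standard textbook payoff numbers, and strategy names of my own choosing — of an iterated prisoner’s dilemma, showing how tit for tat sustains mutual cooperation while conceding only a little to a pure defector:

```python
# Minimal iterated prisoner's dilemma. Payoffs use the standard
# textbook values (temptation=5, reward=3, punishment=1, sucker=0).
PAYOFF = {  # (my move, their move) -> my score; 'C' cooperates, 'D' defects
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first round, then mirror the opponent's last move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """Ground-floor behavior: defect unconditionally."""
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    """Run the two strategies against each other; return their total scores."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        ma = strategy_a(moves_b)  # each player sees only the other's past moves
        mb = strategy_b(moves_a)
        score_a += PAYOFF[(ma, mb)]
        score_b += PAYOFF[(mb, ma)]
        moves_a.append(ma)
        moves_b.append(mb)
    return score_a, score_b
```

Over ten rounds, two tit-for-tat players settle into steady cooperation and score (30, 30), while tit for tat against a pure defector loses only the first round before matching defection with defection, scoring (9, 14). The middle floors, in other words, hold up under game-theoretic pressure; it is the higher floors — cooperation with no prospect of reciprocity — that need a different kind of support.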

Here’s the question I want to explore today: How does this structure remain standing? On what ultimate explanatory principles do our moral instincts rest?

[Read more…]

The Blockchain Man

The term Organization Man is a rich one. From it, we can conjure up an image and a life.

It’s a man, not a woman. He’s white, standing somewhere between 6’0 and 6’2. He has a strong chin and medium length light brown hair parted on the left.

He walks from one meeting to the next wearing a dark suit with a pressed white dress shirt and dark Oxford dress shoes. His wrist holds a watch – nice, but not extravagant, with a brown leather strap and a gold-rimmed face.

More than just an image, you can conjure up a life for The Organization Man, a term coined by William Whyte in his 1956 book of the same name. Even though the novel predates Whyte’s book by more than three decades, Sinclair Lewis’s Babbitt (1922) established the archetype perfectly.

Today, the successor of the Organization Man — the Blockchain Man — is starting to emerge. To understand how he might evolve, let us first look back.

[Read more…]

“It’s Only Cannibalism if We’re Equals”

This is a guest post by Graham Warnken.

Almost all accounts of cannibalism throughout the years agree on one thing: it’s a communal affair. Native funeral parties consume the flesh of the departed in a ritual of respect and grief. Foreign warriors devour foes in cruel rites of victory. A group of desperate survivors stranded on the sea or in a mountain pass draw straws to see which poor soul will offer himself up.

No matter the situation, the many consume the one—the deceased is partitioned out amongst his friends and relations, the defeated champion doled out to boost morale, the weakest link sacrificed that his companions might live. The latter in particular, while it doesn’t remove the central horror of the act, does possess a certain sense of justice. It allows us to see cannibals as more than monstrous. When we think of the Donner party, we don’t recoil in terror. We feel revulsion, but we understand. The doomed pioneers’ act, born of desperation, was all that allowed the community to scrape through its frigid circumstances, minus a few members.

In the Enlightenment era, this communal cannibalism was an excellent example of the bounds of natural law. Cătălin Avramescu’s An Intellectual History of Cannibalism describes the general philosophical view of anthropophagy by way of necessity:

When danger threatens us and another equally, we are obliged to think first of ourselves [. . .] we must set precedence on our own interests, when they enter into conflict with those of another. [. . .] If we accept that necessity—evident and unproblematic in the case of killing [an] aggressor—can excuse an action that is illicit in itself, then on the basis of this reasoning we must also tackle the aberration of forced cannibalism, since it is directed by the same natural and legal resorts.

As with any philosophical topic, there was a mind-numbing degree of back-and-forth about the anthropophagus over the course of the Enlightenment, chiefly because he functioned as a pawn in the larger game of whether or not natural law is valid. But the general philosophical consensus was clear. In cases where cannibalism is necessary for the survival of the community, it is abhorrent but permissible.

[Read more…]

The Internet of Electron Microscopes

This is a guest post by Chenoe Hart.

After you have stared at your computer screen for a while, it’s recommended that you give your eyes a break to refocus on a more distant outside view. In past years when our monitors looked more like boxes than tablets, you might have already been looking into such a space. The perception of digital content on the screens of CRT displays was inextricably accompanied by the additional perspective lines of the monitor enclosure extending behind it. Expanses of beige plastic stretching past the foreground of your observation might make the eyes operate in a slightly different manner compared to our modern condition of viewing flat panels whose minimal depth renders them closer to two-dimensional apparitions. We always knew that the internet was an ephemeral entity presented in translation from abstract code into pixels on our screen, but our immediate sensory feedback perceived it to be the front of a three-dimensional box possessing further physical extension.

Construction photograph of the interior of the Statue of Liberty, from the U.S. Library of Congress.

Many of the words we commonly used to describe the early emerging internet reference an implicit dimensionality: “cyberspace” was non-ironically used as a descriptive term, we explored those spaces through “web portals,” and the act of “surfing” the web implied negotiating the surface of a physical mass which contained further inaccessible fathoms underneath. The “-tron” suffix marketing the electron gun technology used in those CRTs became the name of a film in which our computers contained an alternate universe of extending light grids. Before “the cloud” gained popularity as an ephemeral metaphor abstracting away the details of how we store our data in other people’s computers, The Matrix rendered the physicality of the internet as a dark enclosed underworld of forbidden knowledge.

[Read more…]

Common Sense Eats Common Talk

In November 2008, with the financial crisis in full swing, Queen Elizabeth attended a ceremony at the London School of Economics. Facing an audience of high-ranking academics, she posed a simple question: “Why did nobody notice it?”

How could it be that no one among the smartest economists, commentators, and policymakers in all her kingdom – and beyond – had been able to see the formation of a bubble of such dimensions?

Illustration of The Emperor’s New Clothes by Vilhelm Pedersen, Andersen’s first illustrator

And yet critical facts were readily available – facts that could have warned about the craziness of the housing market, on which an even bigger financial house of cards had been erected. A short trip to a “regular” American neighbourhood – like the one undertaken by Mark Baum in The Big Short – would have presented an endless list of properties under foreclosure, real estate agents openly bragging about the laxity of credit requirements, and exotic dancers with multiple mortgage-financed properties.1

Such evidence would have been sufficient to convince most people of the existence of a bubble. However, in London, New York and the other financial centres of the world, an entire class of experts kept blatantly ignoring the facts, anecdotal evidence, and common sense that could have anticipated what was about to happen.

This is a high profile example of a more general situation in which a narrative establishes itself and resists being disproven, even when it is clearly contradicted by information right under our noses. Like the crowd in Hans Christian Andersen’s famous parable, we watch our sovereign parading naked in the street, but are unable to see through his invisible clothes. Until a young boy steps forward and with a little common sense lifts the veil on our “common talk”.

[Read more…]

How to Make History

In the past year, I’ve found myself repeatedly invoking, in all sorts of conversations, a hierarchy of agency with three levels: labor, making, and action. Here’s a visualization. The annotations on the left characterize the kind of agency. The annotations on the right characterize the locus where it is exercised, and the associated human condition.

The hierarchy is based on Hannah Arendt’s Human Condition, so I’ve named the visualization the Arendt hierarchy.

A mnemonic to remember the distinctions is mark time or make history. In everything you do, from posting a tweet or buying a coffee to running for President or tackling the Riemann hypothesis, you must choose between two extreme contexts: to either mark time with labor, or make history with action. In between there is a third context, where you can choose to slow time, which includes any sort of making, including art and trade (which is making in the sense of market-making). Naturally, Arendt thought (as do I) that you must choose action and history-making as much as possible. That is what it means to be fully human.

The scheme is non-intuitive, but once you’ve internalized the concepts, they turn out to be weirdly useful for thinking about what you’re doing and why, whether it is futile or meaningful, nihilistic or generative.

[Read more…]