When I had my first mid-life crisis at age 17, I really didn’t know how to handle it. I went from sociable and friendly to morose and uncommunicative overnight, and stayed that way for a year. Now, 24 years later, I am getting really good at navigating them. I predicted 11 of my last 5 mid-life crises. I’m now skilled enough that I can provide expert consulting support to people who are too busy to have mid-life crises frequently enough to get good at it.
The key is to freak out early and freak out often (FEFO) in an agile way, and work towards a lifestyle that (ideally) feels like one continuously integrated and deployed mid-life crisis. There is actually good intellectual justification for approaching life this way. It’s called the Lindy effect, which says you’ll live as long again as you already have, until you don’t.
Which means you’re always at mid-life. Until you’re not.
This can be a difficult idea to grasp, so as Matt Damon said recently about poop-grown potatoes on Mars, we’re going to have to discourse the shit out of this thing.
Most people seem to think of a mid-life crisis as something you experience when your age hits roughly half the average life-expectancy of the population you’re part of. The problem with this approach, besides the embarrassing levels of statistical and psychological illiteracy it reveals, is that you get to have only one. What’s more, you waste most of it vigilantly awaiting the urge to acquire a red sports car and a much younger second wife, so you can repress it aggressively and feel virtuous about having conquered the crisis and grown as a person instead of as a cliche.
Then some tastemaker declares that forty is the new thirty and that the sports car really ought to be silver-grey rather than red, and you wearily realize you have to do it all over again.
See, that’s a negative mindset. It’s not that you have to do it all over again, but that you get to.
A crisis is too good a thing to waste. Not only should you have as many as you have time for, you should succumb to each as quickly and completely as possible, and then bounce back as quickly as you can so you can have another one. Resistance is not just futile, it is counter-productive.
This aesthetically appropriate and functionally necessary response to a crisis is a crash. If you don’t crash, it wasn’t a crisis. A healthy, crisis-ridden life is one that evolves rapidly, one crash at a time. So it stands to reason that navigating such a healthy life well involves crashing early and crashing often (CECO; life is a FEFO in CECO out, or FICO system). I call such a life a crash-only life: one focused on minimizing maximum crash-recovery time (MCRT) rather than mean time between crashes (MTBC). Also, if you’re the sort of data-driven operations research or DevOps wonk to whom those last two sentences made sense, or worse, sounded like exciting insights, you will at some point in your life have a mid-life crisis where you question your faith in metrics and acronymizable truths. It will be priceless.
An important pro-tip here: the size of the crisis has nothing to do with the size of the crash. Not only will every crash not be a Magnitude 9.5 monster on the Mahayana-Richter scale, leading to a major You n.0 release, but you can’t even predict the size based on the size of the crisis. A major life event might only cause a minor crash, and a minor case of barista-rudeness might provoke a major crash.
Dem’s the crises and that’s the way the cookies crash. But if you follow the FICO principle, instead of resting all your hopes on the One Big One that will lead you to Figure It All Out, you’ll naturally generate a spike train of small and big crash events, spanning the whole spectrum of crisis and crash possibilities.
Thus you’ll be gradually awakened and enlightened, you wonderful self-organized-critical deep-learning neural sandpile, you.
But I’m bikeshedding here. The key to living a crash-only life is not operations research metrics or Buddhist power laws, but having urgent conversations and interrogating things. This is called being an architect.
Until recently, I didn’t realize there was an entire professionalized field devoted to the study of midlife crises and how to navigate them, complete with a specialized technical vocabulary. It’s called architecture. My self-taught high-agility mid-life crisis navigation skills actually make me an architect.
In architecture, everyday terms acquire specialized meanings that are primarily about navigating midlife crises, but occasionally also apply to things like designing buildings. This helps architects pay the bills.
Take the word crisis itself. In architecture, it means everything normal, nothing to see here, move along.
Useful word, huh? Makes the invisible visible and stuff. You could point in any random direction right now, wherever you are, and yell “Crisis!”, and architecturally, you’d likely be right. This is called “issuing a provocation,” a sort of architectural fatwa. Every architect within earshot would parkour-race there to have an urgent conversation. Unless of course, you randomly happened to point at a burning building or six-car pile-up. In which case you’d be wrong and they’d stare pityingly at you, shake their heads, and take away your gun and Deleuze-and-Guattari badge. Because it takes spectacular levels of incompetence to be wrong in architecture.
Clearly, a crisis is a matter of some urgency that we must have a conversation about right now.
In architecture, urgent means not urgent and conversation means sit down and shut up, I’m discoursin’ here. If you are feeling particularly aggressive, instead of having an urgent conversation, you can interrogate something problematic.
In architecture, and its parent discipline of Critical Theory (if you’re feeling expansive, you can just call it Theory, and if you’re feeling clever, you can call it Mindfulness in Extremely Unplain French), interrogate means aggressively question a suspect who is not in the room, and problematic means something that is functioning smoothly, making somebody you dislike steadily more powerful. Usually the suspect is a rich, white, straight, cis-male who is driving around in a silver-grey sports car with a trophy younger wife on the other side of the planet from wherever the interrogation is going on. The problematic (noun) is usually some obscenely lucrative neoliberal monopolistic platform that he owns.
This is all very useful. These days, as an established blogger, when people ask me what I do in the tech world, I raise an eyebrow, sip my wine, flick an invisible speck off my black sweater, and reply: “Theory.” If they work for a big Silicon Valley company with free mindfulness meditation classes, I reply: “Mindfulness in extremely unplain French.” I am hoping one of those answers will land me a particularly lucrative gig some day if I A/B test them long enough.
So a mid-life crisis, in my sense of freak out, can be defined, more technically in architectural terms, as critically interrogating a problematic conversation about an urgent crisis. Or having an urgent crisis by critically interrogating a problematic conversation. Or urgently having a problematic conversation about an interrogated crisis.
Or any syntactically well-formed permutation of those words in a sentence. Critical Theory, like Management Science or Buddhist Metaphysics, is antifragile to word-order perturbations. Manglings only make them stronger. The process is similar to the forging of swords: you make them stronger through work hardening, which increases their internal metallurgical screwed-upness.
Theory is obviously a valuable activity to have going on in society, so it is good that it is actually sustainable, through the applied work of designing buildings (and more recently, designing angsty and poignant user interfaces, or my own preferred activity of designing sublime corporate strategies using 2x2s).
How, you might wonder, do urgent conversations help design buildings, user interfaces and corporate strategies? I wondered that too, but some architect friends I recently acquired have assured me that all these activities are actually well within the scope of the field, which is really about using Design to render visible the invisible nature of your place in the universe. A mid-life crisis is just one example of such a project. Architecture has, surprisingly, as much to say about the design of the Parthenon, the Googleplex, a Starbucks store, or an app icon, as it does about mid-life crises.
If you want to be all tediously bureaucratic about making the invisible visible in embodied ways, you can do so by designing a building that helps you negotiate an Urgent Conversation with the Universe. But that sort of literal-minded doerism is considered really bad taste these days.
Really, to do architecture these days, you have to wear black (not a turtleneck though; black turtlenecks have been appropriated by neoliberal auteur capitalists who own monopolistic platforms), and get really good at having mid-life crises.
This skill allows you to critically and profitably engage one of those Big Man types (neoliberal monopolists are only one variety) when they pause for breath between putting Dents in the Universe.
Speaking of your place in the universe, the reason humans have the genetic capacity for midlife crises is that they mostly do not have one.
A place in the universe that is.
I mean, sure, you snagged a table at Starbucks right next to the only power outlet today, but I mean an interesting place. Say one that allows you to own a large chunk of an obscenely lucrative neoliberal monopolistic platform so you can afford to outsource all your urgent critical conversational needs to architects like me while you drive around in your silver-grey sports car.
If you want to think of your seat in the bleachers as a ‘place’ in the universe, feel free to do so, but I wouldn’t brag about it. Just sit down and shut up. I am having an urgent conversation at you here.
Now, you might argue that a seat in the bleachers is all any of us have, Pale Blue Dot and all.
But if you think about it, that argument is really just sour grapes. Deployed thusly from a critical perspective, Carl Sagan’s pale-blue-dot passage becomes the biggest sour-grape rant in the universe. It is just disguised as awe-infused poetry. It is not that this particular shade of the pale-blue-dot perspective on life is wrong, but that it is most often used as an excuse to cut people down to size by scaling the y-axis appropriately, rather than to inspire ambitions of astronomical grandeur. Wow, look at the humbling vastness of the cosmos! Those Big Men look like little ants! Say yes to progressive taxation!
On the other hand, the pale blue dot is a somewhat impractical source of inspiration for ambitions of astronomical grandeur. You can’t easily write a personal growth tract titled How to Pale-Blue-Dot the Shit Out of Your Dent-in-the-Universe Unicorn with Evernote, though I suspect emerging productivity star Tiago Forte could pull it off.
See, the real reason you don’t have a place in the universe is Elon Musk.
A midlife crisis is about dealing with the fact that you’re not Elon Musk, and no clever y-axis scaling is going to change that. In fact no such trick at the level of mere ideas can alter the fact that you’re barreling down a path from could-have-been to never-was without even a pit stop at has-been, while Musk goes merrily dentin’ in the universe and arguing with Bezos on Twitter about whose rocket is more used.
A midlife crisis is about suddenly being forced to contemplate the possibility that you might be irrelevant at every scale from quark to quasar. About the possibility that you might never be a billionaire with your own rocket to get you off this damn pale blue dot full of goddamn hippies.
George Costanza of Seinfeld, who was good enough at navigating midlife crises to have been an architect, recognized this. In one episode, he laments, “I’m thirty-three years old. I haven’t outgrown the problems of puberty, I’m already facing the problems of old age. I completely skipped healthy adulthood.”
Sadly he never realized the true implication of his condition: he’d been middle-aged all along. He also didn’t realize how good a real architect he was. So he went around feeling like an imposter, and thinking he had to pretend to be an architect. If he’d had the money to hire me as an executive coach, I’d have told him, stop PRETENDING to be an architect and BE an architect, George. Crash or don’t crash. There is no try.
If only he’d interrogated the problematic notion that you’re only an architect if others say you are and give you a license to critically theorize. Or worse, that you’re only an architect if you can point to buildings you’ve designed. As some of my new Theory friends would say, Costanza was always-already a middle-aged architect. He just didn’t know it.
So are all of you. To paraphrase Big Man Milton Friedman on the subject of that other Big Man Keynes, we are all architects now.
If it helps to make it impersonal, another way to think of it is this: you’re almost certainly not on what engineers call the critical path of the many-streamed flow of history.
If you are on the critical path, your delays delay the universe. Your accelerations accelerate the universe. The order in which you say or do things matters.
If you’re not on the critical path, however, nothing particularly significant hinges on whether you show up late or early, or indeed, whether you show up at all.
On second thoughts, that didn’t make it impersonal, and it probably didn’t help.
People on critical paths — Big Man types like Elon Musk and Jeff Bezos — make dents in the universe beyond the pale blue dot. People not on critical paths have urgent critical conversations about critical paths.
So an architectural crisis is really an urgent critical conversation designed to interrogate a problematic critical path that you’re not on. Or a problematic path on which your design conversation is not urgently critical. Or a conversational path on which your criticism is not urgently interrogated by design. Or something.
In other words, it is a freak-out about non-events starring absent figures who are mostly not even aware that you are having urgent conversations in their dark Jungian underbellies while they go about a-dentin’ in the universe.
This is only cause for insecurity and angst if you view it as a bug, rather than a feature.
Here’s another way to think of it: you’re off the hook, you can say or do whatever you like!
When the going gets easy, the lazy declare a crisis.
If you’re actually lazy (very few people are; it takes a lot of effort), there is a lot to be said for not being on the critical path of capital-P Progress. In particular, if you can engineer the right kind of rent or sinecure for yourself, such as a tenure-track position in an architecture department, UX design consultancy, Buddhist seminary, or the right kind of blogger-consultant niche, you can sit back, relax, and enjoy the free ride. For the record, I am aiming for that last one, but haven’t yet managed it. I invariably crash and end up doing some actual work by accident while rebooting. But it seems like I’ve at least managed to create the illusion of a lily of the field who neither toils nor spins. A new LinkedIn acquaintance connected to me with the introductory remark, “I didn’t think a life of pure intellectual play was possible.”
I am glad I help create the illusion that it is. It’s my architectural magnum-opus in progress, designed to distract you from the fact that the rent is too damn high.
But fine, some of you can’t handle the demands of FEFO laziness, leisure, and continuous crisis, and want to make that ambitious leap from critical conversation to critical path (an entirely irrational and not-even-wrong aspiration as I’ll demonstrate later, which is precisely why you should harbor it). You want to go from one mid-life crisis a week to one a decade. You want long periods of boring maya punctuated by short periods of panicked crashing. You want to make history. You want to get on that critical path. So let me offer a few helpful hints.
Critical paths — which, as a reminder, are the paths you are likely not on — are the stuff of what historians like to call Big Histories. Big Men like Elon Musk are the ex-officio heroes of Big Histories. If a Big History pretends not to be about individual Big Men, but about (say), beer or food or interest rates, just do a double take.
Big Men are people through whom the course of the history of the universe flows like a mighty torrent of Significance and Meaning (those are German nouns, so they must always be capitalized). Delays in whose lives delay the whole universe and make the whole show a little less meaningful for all.
All this is also true of Big Women, only more so, since they blindside history more unexpectedly.
To understand history, Big or otherwise, it is useful to start with Paul Graham’s definition of it: history is simply all the data we have so far.
This definition is so very problematic, you can learn a lot simply by interrogating it critically, or criticizing it interrogatively. For example, who’s we? What’s data? What does it mean to have it? Can you have your data and eat it too? Should people without data eat cake instead? Are we going to have to Hadoop the shit out of all this data using an obscenely lucrative neoliberal monopolistic platform?
But it is easier to simply skip to the conclusion that the definition is wrong, not merely problematic.
You see, history has historically been how we choose what data to forget. Our brains work that way too. We don’t form memories to write our stories. We write our stories to suppress inconvenient memories. If you ever go spelunking in the Big Data dump that is your subconscious, you will find a spaghetti landscape of crime-scene tape. History is the technology of forgetting, not the technology of remembering.
A Big History happens when a Big Man on a critical path chooses what to forget for all. Often by commissioning an architect to erect a distracting and awe-inspiring building on the periphery of what needs to be forgotten. Which means you can characterize a Big History by grokking the gestalt of what it works assiduously to forget; the stuff that blips in and out of your peripheral vision when you admire impressive monuments.
Of course, the technology of forgetting also works in more direct ways. Back in the day, the Pharaoh would say, so let it be written, so let it be done, and regret it a minute later. So the high priest would diplomatically fail to lock up the inconvenient precious scroll in a cool dry place.
Today, it means conveniently forgetting to back up a hard drive somewhere.
They say Big Data is when saving the data is easier than deciding what to do with it. So tomorrow, Big History will mean saving everything in an unstructured Big Data slum in the cloud, and making sure almost everything is hopelessly buried and largely undiscoverable except in the form of psychedelic Deep Dream images.
At a startup event I was at recently, I heard a founder remark that roughly 99% of the data collected is never analyzed. I doubt it will ever be, but maybe one day it can be psychoanalyzed. Dibs on that job. It’s early days yet, but going by Deep Dream results so far, we’ve already learned that our online obsession with cats is far unhealthier than it looks. It’s all creepy alien furballs with 17 eyes down there in our collective unconscious.
Here’s a line you’ll hear shortly. “You don’t like my version of events, huh? Well, the data is out there, why don’t you go write your own damn version?”
About 90% of the time, you won’t. About 9% of the time, you will get as far as tweeting the screenshot of a politician’s tweeted wiener picture after it has been deleted, and having a little laugh at the hypocrisy of it all. But when more consequential things hang in the balance, you’ll just mutter, “fuck it, this is too much work” and give yourself permission to not see. Move along, nothing to see here. Let’s go have an urgent conversation over wine and cheese.
But see, the random obsessive crackpot next to you won’t move along. He will actually go Big Data spelunking to construct an insane alternative Big History, demonstrating conclusively (in green font) that all the data can be explained by Hadooping the shit out of a 3.2 petabyte corpus of politician-wiener pictures using the Golden Ratio, proving conclusively that climate change is a hoax perpetuated by a 3-year old Chinese-Syrian orphan refugee with wiener envy who hates pale blue unicorns and must be stopped by building a wall along the Mexican border.
What unites all Big Histories, whether institutionally approved Big Man myth-making or crackpot, is that they are defined by a sensibility that helps you not see certain things, a sensibility that is not the scientific sensibility. Anglo-Saxon Big History, for example, is defined by hypocrisy. Continental European Big History? Denialism. Middle Eastern Big History? Pomposity. Indian Big History? Cravenness. Chinese Big History? Insecurity.
You don’t even have to try very hard to figure out these core sensibilities. You just invert the sacred qualities that mark declared and advertised aspirations behind every Big History: truth, beauty, humility, peace and stability for the civilizational ones above.
Or maybe you prefer your narrative-industrial complexes horizontally organized. We manufacture our macroeconomic Big Histories out of our irrationalities, our military Big Histories out of our terrors, our labor Big Histories out of hatred of community, our capitalist Big Histories out of fear of being alone.
Again, not rocket science. Just invert rationality, courage, solidarity and individualism. The terminally irrational person is usually the one reaching for game theory or Bayes’ formula. The coward is usually the guy with the bigger gun. You haven’t seen fraternal hatred until you’ve seen an egalitarian argument among a bunch of hippies in a commune of universal love. There is nobody more tribal and mortally afraid of being alone than the Brave Individualist Randian Entrepreneur.
Big History is a depressingly simple Jungian-shadow sausage factory. Once you realize this, it can make you hate almost everybody, almost all the time.
The specific essentialist reductions I’ve offered aren’t critical, and you can go have an urgent conversation about what the sensibilities of various Big Histories actually are. What is critical is that they are not the scientific sensibility.
They are not dispassionate.
A Big History is necessarily a passionate history. One that asks, with a sort of desperate anxiety, where did we come from, where do we go, what should we avoid looking at to go there? What sorts of monuments can we erect to help us assiduously forget if we can’t avoid looking?
And all the data we have so far, be it one clay tablet or 10 zettabytes or 3 yottabytes won’t defeat our spectacular ability to retcon the past into a sequential teleological narrative that goes where we’ve already decided we want to go.
That was not helpful, and that was not a hint. That was me shaving a monstrous deep-dream yak with 17 eyes.
Here’s the actual helpful hint. If you want to be a Big Man, develop a blinding passion. It doesn’t matter what the passion is. It just has to be as blinding as possible.
Big History is an awful thing, and those among us who traffic in Big Histories (I’ve personally made up and blogged at least sixteen) are completely awful people cynically offering bespoke blindnesses for sale. In my defense, it is really fun to do, and writing one is a great way to recover from a major crash.
The only thing worse than Big History is Little History.
Little History is exactly like Big History, except that Big Men and Women are excised from it with extreme prejudice, and replaced with Big Process. Little History reaches out just as broadly, and with as much overweening authority.
An example of Little History is an essay by Matt Might (clearly a Marvel superhero in a counterfactual universe) titled The Illustrated Guide to a Ph.D. Go read it. It’ll only take a minute. It frames the sum of all human knowledge as a big circular bubble, and your PhD as a little pimple on the surface of it. I’ll call this the Mighty Diagram. It gets passed around in graduate student circles with depressing frequency.
Instead of a dent in the universe, you get a pimple on an uncritically proceduralist conceptualization of the frontier of knowledge as the sum of all the peer-reviewed academic literature in the world.
What makes this essay utterly horrifying is that it is actually an accurate description of what a PhD is; it calibrates academic career expectations correctly and offers an accurate sense of perspective on the peer-reviewed life. I suspect Matt Might sincerely intended the essay as a helpful guide to academic survival, but its effect is to put aspiring scholars in their place, rather than help them find a sense of place in the universe. It’s a You Are Here map for your intellectual journey at the end of a PhD, you disgusting little pimple, you. Kneel before this awe-inspiring edifice of knowledge that you’re lucky to be allowed to add a pimple to.
As Haley Thurston argued in The Awe Delusion, quoting the Total Perspective Vortex parable of Douglas Adams, if you are going to make sense of your place in the universe, the last thing you can afford is a sense of perspective. This is why you should strive to move from critical conversation to critical path, even though it makes absolutely no sense and is a not-even-wrong aspiration. Because if you’re cursed with the inability to not pay attention to what’s going on, it is the best way to sufficiently distort your perspective to find passion for life. It’s the next best thing after a true passionate blindness.
I almost did a Mighty PhD and turned into Pimple-Man (you must allow me my one humble joke, my pimple of a contribution to the grand historical tradition of peer-reviewed jokes), but was fortunate enough to have a mid-life crisis in the middle of it. So I pivoted to a very different kind of PhD that left me with a warped and twisted romantic perspective of intellectual work as intellectual play. So I ended up unrepentantly striving to be the whole bubble instead of just a pimple on the surface. Or to put a less flattering spin on it, I crashed out of the publish-or-perish treadmill via yet another mid-life crisis at the end of my postdoc, and began my non-peer-reviewed crash-about in the world of ideas, as an exile.
A Little History is a case of a map-territory confusion caused by functional fixedness (which is a feature, not a bug, if you have to live inside a somebody-else’s-map like “academia”). If you have a hammer and everything looks like a nail, leading you to fetishize the hammer, you might write a hammer-and-nail Little History. Which is incidentally why the communist flag has two crossed tools on it.
David Foster Wallace had a great example: concluding that murder is wrong because it is illegal.
The Mighty Map is something like that. Concluding that something is knowledge because it is declared by Peer Review to be Original, Important, Correct and Sufficiently Humbly Pimpular.
The Mighty Map represents a crisis in academia’s sense of itself that deserves to be interrogated. You academics probably ought to go and have an urgent critical conversation about it and how it is adversely selecting for false humility, lack of ambition, rampant irreproducible p-value fishing, and armies of masochistic adjuncts having urgent conversations in return for minimum wages.
This is the kind of silliness that ensues when you decide to labor under somebody else’s map-territory confusion instead of manufacturing one uniquely your own.
Academia isn’t the only source of Little History in the marketplace of grand narratives. Anywhere you find a cargo-cult process from which Big Men have been excised and replaced by Big Process, you will find a Little History full of Uriah Heeps extolling the virtues of humble striving, praising the anonymous masses and cutting Big Histories down to size via carefully designed anti-Big-Man map-territory confusions.
The Californian version of democracy for instance, which I’m told is a process entirely pwned by myriad anti-democratic little interest groups abusing ballot initiatives. I recently learned that the Californian constitution has been amended 521 times in less than a hundred years, while the US constitution has been amended only 27 times in over 200 years.
Of course the Californian political-historical sensibility must be evolving faster than elsewhere. Its constitution is changing faster. Little Historian George Clooney said so, therefore it must be true. Of course China must be beating the United States in science and technology, they are producing more papers and patents, aren’t they?
Big History allows us to forget anything that might threaten our fragile passion for life. Little History allows us to forget any inconvenient map-territory distinctions that threaten our precarious capacity for life.
Here’s a quick way to tell them apart. If it arouses either a burst of fanboy motivation or a burst of envy, it’s a Big History. If it offers a humbling reality check and a true perspective on a false map designed to cut you down to size so you’re fit for an institutionalized life, it’s a Little History.
Between the two kinds of history, it is possible to forget everything of consequence.
In a perfectly efficient market of historical narratives, adequately supplied with Big and Little narratives to suit every unbridled passion and every functionally fixed skill, it is possible to forget the reality of everything, yet deny the significance of nothing. A marketplace of timeless, unchanging histories that only grows more unassailable with every exabyte of data poured into its moats.
Fortunately the market of historical narratives is not perfectly efficient. History is not merely all the data we have forgotten so far. In that tiny, liminal gap of sensation, across which the indeterminate obscene must leap to become data, there is a possibility of raw sensation, a possibility of crisis, a possibility of histories crashing.
And in the process of recovering from a crash, there is the possibility of accommodating indeterminacy and provoking unpredictable change. A possibility of architecture.
History only ever evolves by crashing. And the only way to make it crash is to provoke a crisis. That is ultimately the function of all architecture, including the built kind: to keep the liminal gap of sensation open long enough for indeterminacy to sneak into history, so it can free itself from both the blinding passions of Big Histories and the fearful determinate proceduralism of Little Histories.
The history of history is neither Big, nor Little, nor a pile of data, nor a progressive forgetting. It is the story of a story breaking free of itself.
If you are reasonable and very bright, a midlife crisis is an opportunity to rewrite your own personal history a little more thoughtfully and coherently. You might sublimate the envy and stoke the passion evoked by your favorite Big History. You might develop a humbler, more realistic perspective on life, informed by the various Little Histories in the zeitgeist.
This is of course, a game for brilliant idiots.
If you are unreasonable and hopefully not too bright, a midlife crisis is an opportunity to rewrite the history of the world to revolve around yourself a little more. This is not anthropocentrism (that is something you can find in both the Big History and Little History sections of Walmart), it is egocentrism.
A mid-life crisis is not a way to recenter yourself in the universe through an act of epistemology. It is a way to recenter the universe around yourself through an act of architecture. An act of architecture capable of making the universe all about you.
The enlightened among you, who haven’t already forgotten what I said a couple of thousand words ago, might have jumped to a tempting false conclusion: if it’s all about you, you’re on the critical path. You’re the Big Man or Woman.
If history is a story breaking free of itself, rather than one converging towards a determinate shared fate, then no path is a critical path. Or equivalently, every path is a critical path in a diverging, branching, generative space of possibilities.
I suggest you use the latter formulation on Mondays, Wednesdays and Fridays, and the former on Tuesdays, Thursdays and Saturdays. On Sundays, you should relax by arguing pointlessly with everybody else who thinks it’s all about them. This philosophy is what I called Divergentism a few posts ago, and people on Twitter are claiming I stole it from Schopenhauer or something. I didn’t. Damn Little Historians.
Once you do this, history goes from being a determinate technology for forgetting the known to an indeterminate technology for feeding on the unknown. A technology of curiosity. Because once you make the universe all about you, personally, data flowing from it towards you becomes nurturing instead of threatening.
This means, ironically enough, that egocentrism is how you shrink the ego. Because if it’s all about you, it’s not threatening. Which means you can keep your eyes open and look instead of screwing them shut and trying to forget. And as the universe floods your mind, you can vacate it. Deep, huh?
And maybe, just maybe, if you can do that sufficiently early and sufficiently often, you can become yourself faster than data can be forgotten into history.
This post owes a lot to the good people at the Guggenheim, particularly Troy Conrad Therrien, who invited me to do a Breaking Smart workshop for a bunch of architects last month, and also had me do a panel discussion with Mark Wigley. I’ve liberally stolen ideas from his essay Flash Theory, for this post. Unfortunately I can’t find a copy online.