Algorithmic Governance and the Ghost in the Machine

This is a guest post by Sam Bhagwat from Moore’s Hand.

Moore’s Law has granted 21st-century organizations two new methods for governing complexity: locally powerful god-algorithms we’ll call Athenas, and omniscient but bureaucratic god-algorithms we’ll call Adjustment Bureaus.

As an example of an Athena, consider this case:

Eighteen months ago a Buddhist convert in Los Angeles named Rick Ruzzamenti donated his kidney. It was flown on ice on a Continental red-eye to a retiree in Newark in need of a transplant. The retiree’s niece then sent her kidney to a woman in Madison. The woman’s ex-boyfriend sent his kidney to a secretary in Pittsburgh. The secretary’s boyfriend sent his to a young father in San Diego.

When the chain halted six months later, in December 2012, with a final transplant in Chicago, sixty operations had taken place, enabled by an algorithm that crunched through billions of match possibilities.

And for an example of an Adjustment Bureau, consider this case:

For several months in 2011, including the holiday sale season, JC Penney was at the top of a curious number of Google search results. “Dresses.” “Bedding.” “Area rugs.” “Skinny jeans.” “Home decor.” “Furniture.” Even “grommet top curtains.”

The placement generated huge traffic – 3.8 million monthly visitors just for ‘dresses’ – and revenue in the tens to hundreds of millions of dollars. Then an enterprising reporter noticed thousands of paid links in link directories and brought the matter to the attention of Google’s webspam team, which flagged the site:

At 7 p.m. Eastern time on Wednesday, J. C. Penney was still the No. 1 result for “Samsonite carry on luggage.”

Two hours later, it was at No. 71.

At 7 p.m. on Wednesday, Penney was No. 1 in searches for “living room furniture.”

By 9 p.m., it had sunk to No. 68.

In other words, one moment Penney was the most visible online destination for living room furniture in the country. The next it was essentially buried.

Wide-eyed advocates of ‘algorithmic governance’ (Tim O’Reilly) beware: each god may grant you your local optimization, but their intervention is far from free.

Athena will probably stoke conflicts with other local gods; the Adjustment Bureau will add the very complexity-driven arbitrariness you probably hoped to avoid.

Keeping the paper-pushers in line

If you see the poor oppressed in a district, and justice and rights denied, do not be surprised at such things; for one official is eyed [cheated] by a higher one, and over them both are others higher still. (Ecclesiastes 5:8, NIV)

The perennial challenge of kings was to accumulate power over the populace, without frittering it away to those who enforced that power (aristocracies).

To solve this problem, they created classes of childless eunuchs (Byzantium), elevated commoners (Louis XIV), and enforced rigid meritocracy (China). The goal was to create a bureaucracy (more) loyal to the crown, rather than kindred ties or inherited privilege.

In the English-speaking world, between the age of kings and the late 19th century, the size of government grew gradually, with bureaucracy and civil service able to develop apace.

In the 20th century, as government grew 10x as a proportion of GDP — and even more in absolute terms — rulers demanded new techniques to stave off death by entropy.

They turned to numbers-based control mechanisms: economics and its more popular cousin, management.

Economics, beginning in the late 1930s, became a discipline of mathematical modeling; a simplifying framework, creating equations for GDP with four variables and modeling complex situations in 2×2 game matrices. This hedgehog approach put the discipline in a position to identify levers for political entities to pull.

By creating processes for both revenue collection (tax withholding) and disbursal (means-testing benefits, projecting Social Security solvency), economists helped make the bureaucratic process legible (to the president, Congress, and journalists).

Management similarly created accepted metrics and standards as financial control systems — EBITDA, GAAP, auditing, the P/E ratio, and so forth. In the 1930s and 40s, Alfred Sloan at GM and the Whiz Kids at Ford became famous for their metrics to give shape to the bureaucracies under them. The 1960s saw the rise of conglomerates held together by earnings reports and capital allocation.

Often these control mechanisms foster complexity growth that outstrips the ability of the selfsame mechanisms to function.

Mergers made conglomerates outgrow their divisional reporting systems; they had to be split up.  Whiz Kids leader Robert McNamara became infamous for his application of meaningless body count metrics in Vietnam. Means-testing for multiple welfare programs has created 90% marginal tax rates for low-income Americans.

The upshot of this short history: the problem of “keeping the paper-pushers in line” is timeworn, complex, and vital. Rulers tend to deploy any and all tools available in a frantic battle against entropy.

Entropy is winning

Of late, this doesn’t seem to be going too well.

“Bureaucracies ate Obama,” proclaimed one article, noting that crises blamed on the current president actually happened in the civil service five or six levels below. (Interestingly, the article was written before the recent rollout fiasco.)

In 2009, the president appointed prominent economist Cass Sunstein to head the Office of Information and Regulatory Affairs. The OIRA is in charge of “regulatory review,” ensuring regulations work as intended; essentially, the government QA department.

OIRA has 50 full-time personnel. It is outspent by other regulatory agencies by a factor of 7000:1.

Presidential candidates of late have campaigned alternately on “hope” and “change” or “management experience.”

The subtext of each is I’ll keep the paper-pushers in line; only the method varies.

Making your organization into a piece of software

The crisis has invited attention to proponents of a 21st-century solution: algorithmic governance.

It’s the political version of a new organizational approach to complexity.

The basic organizational idea is to make your firm into a piece of software, closely constraining the actions of executors (workers) and end-users (customers).

One prominent example, McDonald’s, began development in the 1950s. Sometimes confused with a restaurant, this is actually a piece of licensed software 600 pages long, with a QA department, currently running around 14,000 instances in America.

The 21st century has, of course, created companies like Google and Facebook that are literally built around their search and Newsfeed algorithms.

Less well known are algorithms created by economists to solve narrower “matching” problems, such as the previously mentioned kidney matching system:

Each candidate and donor carry with them more than twenty parameters that need to be optimally matched, including blood type, body mass index, viral history, willingness to donate or accept a left or right kidney, age, blood pressure, relationship to candidate, types of antibodies in their blood, and even where they’re willing to travel and the difficulty of their match in general (acutely difficult matches are given extra preference as they should be taken care of before more easily matched candidates). All of these factors make a computer run through millions of possibilities to find the best solution — and each calculation is more complicated than creating solution trees for a chessboard (p. 149-50, Steiner, Algorithms).

The economist behind this algorithm, Alvin Roth, also worked on matching med school residents with hospitals, and NYC ninth-graders with high schools. For this work, he won the 2012 Nobel Prize in economics.
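Roth’s designs build on the Gale–Shapley “deferred acceptance” procedure, in which applicants propose down their preference lists and institutions tentatively hold their favorites. A minimal sketch follows — the names are made up, and the real NRMP implementation handles couples and many other complications this toy version ignores:

```python
# Deferred acceptance (Gale-Shapley), resident-proposing variant: a toy sketch.
def deferred_acceptance(resident_prefs, hospital_prefs, capacities):
    """resident_prefs: {resident: [hospitals, most preferred first]}
       hospital_prefs: {hospital: [residents, most preferred first]}
       capacities:     {hospital: number of slots}"""
    # rank[h][r] = position of resident r on hospital h's list (lower = better)
    rank = {h: {r: i for i, r in enumerate(prefs)}
            for h, prefs in hospital_prefs.items()}
    free = list(resident_prefs)                    # residents still proposing
    next_choice = {r: 0 for r in resident_prefs}   # next hospital each will try
    matched = {h: [] for h in hospital_prefs}      # tentative assignments

    while free:
        r = free.pop()
        if next_choice[r] >= len(resident_prefs[r]):
            continue                               # list exhausted: r unmatched
        h = resident_prefs[r][next_choice[r]]
        next_choice[r] += 1
        matched[h].append(r)
        matched[h].sort(key=lambda x: rank[h][x])  # hospital keeps its favorites
        if len(matched[h]) > capacities[h]:
            free.append(matched[h].pop())          # bump the least-preferred
    return matched
```

The resulting match is stable: no resident and hospital would both prefer each other to their assigned partners, which is what keeps participants from circumventing the system.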

Hedgehog algorithms, fox algorithms

Athena algorithms (which tend to be based on economics) focus on problems where both sides of the transaction — residents and hospitals, kidney recipient and donor — can be completely modeled. Unsurprisingly, especially given economists’ hedgehog modeling impulse, Athena algorithms tackle only specialized problems.

Adjustment Bureau algorithms (which tend to be based on data science) tend to solve general-purpose problems. Completely modeling preferences is impossible; the algorithm must extrapolate content attributes and customer desires.

The cost is complexity. The kidney matching algorithm takes 20 medical factors into consideration, not the 100,000 factors Facebook claims for its algorithms. New York City’s matching system uses one algorithm, not 100 sub-algorithms (the Netflix Prize winner) or 200 (Google).

This hedgehog/fox distinction (focusing on few/many predictive factors) becomes even starker when looking at governance implications.

Athena (hedgehog) algorithms play nicely with modern interest-group democracy; they are amenable to the type of design changes that politicians tend to request.

Privileging more difficult kidney matches, preferring certain groups of med students, prioritizing certain hospitals: all of these can be written into the relevant code.

Yet Athena does not now govern the whole world; just a few silos. She can currently respond to Odysseus and Achilles, without caring much about Paris and Hector.

Adjustment Bureau algorithms have no such luxury.

Tech firms qua governments

Strategic behavior

With a minimal set of inputs, the (Athena) med-school-resident matching algorithm can be tested for “strategic” preference listing, such as students listing a less-popular school first even if they liked it less, because it would be easier to get into.

Gaming Google’s algorithm, on the other hand, is both inevitable and the stated purpose of the entire SEO industry.

The result is a constant tug of war.

Since its inception, Google has ranked sites by the number of links pointing to them. SEO practitioners exploited this by buying links all over the Internet to their sites. So Google’s 2012 Penguin update penalized sites pursuing this behavior. In response, devious minds invented ‘negative SEO’: producing spammy links to a site’s competitors, falsely triggering a penalty.
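The link-counting idea Google published as PageRank can be sketched as a power iteration over the link graph. This is a toy model of the published 1998 algorithm only — Google’s production ranking combines hundreds of proprietary signals — but it shows why buying inbound links directly inflates a site’s score:

```python
# Toy PageRank: each page's score is spread across its outbound links.
def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]} -> {page: score}, scores sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}   # random-jump baseline
        for p, outs in links.items():
            if not outs:                              # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                                     # pass score to linked pages
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank
```

Every paid link a spammer acquires adds another term to the target’s score, which is exactly the input Penguin-style updates try to discount — and exactly the input negative SEO weaponizes against competitors.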


In imposing site-specific penalties, Google’s algorithms and employees flag sites deemed to be violating policies.

The webmaster who does nothing will get whatever hand Google chooses to deal; often, as above, past best-practice SEO is penalized under current algorithms.

The proactive webmaster must review all inbound links individually, guess which ones are triggering Google’s alerts, attempt to contact the relevant webmasters and have them removed, document this exhaustively, submit an appeal to Google’s Search Quality team, and wait one to several weeks to receive a form letter in response.

Sounds fun.

Algorithm interaction

Algorithm interaction, though common in complex systems (think alcohol + sleeping pills), is difficult to both isolate and correct in Adjustment Bureau black boxes where intended behavior is unclear.

One candidate here is apps like Socialcam: in auto-publishing their users’ viewing activity, they tend to push quite a few NSFW items onto their friends’ Newsfeeds. That’s usually not intended behavior for either the user or Facebook.

Local knowledge

It’s reasonably easy to generate public policy analogies to each of the above categories. And the history of economic thought gives some interesting lessons about attempts to create complete world-models.

In the 1930s, with the downfall of capitalism seemingly inevitable, Western economists began sketching out hybrid neoclassical/Marxist views of the state as a sole economic actor that could model consumer preferences and production costs.

In response, other economists formulated the concept of “local knowledge,” possessed by individual people, made legible by market prices.

Leonard Read’s 1958 “I, Pencil” has its titular protagonist list all the different types of knowledge required to produce him — to mine graphite, cut timber, ship goods, produce the coffee for the lumberjacks, and so on.

“No single person on the face of this earth knows how to make me,” the pencil asserts.

Adjustment Bureau algorithm complexity growth is usually an attempt to capture more of this local knowledge.

A Foxy Bureaucratic Future

Government has moved from pre-1900 hedgehog planning to 20th-century foxy bureaucratic planning.

Techno-futurists may think the post-algorithmic future is hedgehog and non-bureaucratic.

The truth is that the post-algo world — as illustrated by the internal structure of Google, Facebook, etc — will be far foxier, bureaucratic, and illegible than futurists would like to believe.

Limited deployment of Athena algorithms is easy — but doing the wrong things more efficiently is Athena run amok. And broader deployment means ensuring your local optimizations aren’t in conflict with each other (or with people’s welfare generally).

Whatever mechanism ensures that will behave like an Adjustment Bureau. Further local knowledge-driven optimizations will create algorithm complexity. Gains in local adaptation will be offset by increased illegibility in the algo-bureaucracy administration.

Software may be designed to hide complexity, but can’t make it magically go away. There is no free lunch.

Further reading:

The Whiz Kids: The Founding Fathers of American Business by John Byrne.

Automate This: How Algorithms Came to Rule Our World, by Christopher Steiner

“The Use of Knowledge in Society,” F. A. Hayek

“The Cloistered Hedgehog and Dislocated Fox,” (Tempobook)

“Stable Matching: Theory, Evidence, and Practical Design,” Nobel Committee, 2012

Thanks to Venkatesh, Mike Bailey, Kyle Mathews, Brian Thomas, and Laura Brignone for reading drafts of this essay.


About Sam Bhagwat

Sam Bhagwat is an economist/data scientist by training. His ribbonfarm posts explore complex economic systems. Follow him on Twitter.

