A few years ago, I was part of a two-day DARPA workshop on the theme of “Embedded Humans.” These things tend to be brain-numbing, so you know an idea is a good one if it manages to stick in your head. One idea really stayed with me, and we’ll call it Bay’s conjecture (John Bay, who proposed it, has held several senior military research positions and is the author of a well-known technical textbook). It concerns the effect of intelligent automation on work. What happens when the matrix of technology around you gets smarter and smarter, and becomes able to make decisions on your behalf, for itself and for “the overall system”? Bay’s conjecture is the antithesis of the Singularity idea (machines will get smarter and rule us, a la Skynet – I admit I am itching to see Terminator Salvation). In some ways its implications are scarier.
Bay’s conjecture is simply this: Autonomous machines are more demanding of their operator than non-autonomous machines. The implication is this picture:
The point of the picture is this: when technology gets smarter, the total work being performed increases. Or in Bay’s words, “force multiplication through accomplishment of more demanding tasks.” Humans are always taking on challenges that sit at the edge of the combined capability of humans and machines. So, like a muscle being stressed to failure, total capacity grows, but the work we take on grows faster. We never build technology that actually relieves the load on us and makes things simpler. We only end up building technology that creates MORE work for us.
The one exception is what we might call Bay’s corollary: he asserts that if you design systems around the principle of “human override protection,” total work capacity collapses back down to the capability of humans alone. We are both too greedy and too lazy to accept that: too greedy to give up the force multiplication, too lazy to keep carrying the load ourselves. We are motivated by the delusional picture in Case 1, and we end up creating Case 2.
Here’s why this is the opposite of Skynet/Singularity. Those scenarios rest (in the caricature sci-fi/horror version) on the premise that machines, once they get smarter than us, will want to enslave us. In the Matrix, humans are reduced to batteries. In the Terminator series, it is unclear what Skynet wants to do with humans, though I am guessing we’ll find out, and it will probably be some sort of naive enslavement.
The point is: the greed-laziness dynamic will probably apply to computer AIs as well. To get the most bang for the buck, the machines will need humans at their most free/liberated/creative within the Matrix; humans reduced to batteries add nothing to total capacity. So that’s good news. But on the other hand, the complexity of the challenges we take on cannot increase indefinitely. At some point, the humans+machines matrix will take on a challenge that’s too much for us, and we’ll attempt it with a creaking, high-entropy worldwide technology matrix built on rotting, stratified layers of techno-human infrastructure. The whole thing will fail to rise to the challenge and will collapse, dumping us all back into the Stone Age.