There are two ways to defuse a bomb.
A sapper uses the laws of physics to dismantle a device that was designed to explode. The bomb’s workings are linear and causal. Given enough time, the expert can understand how it works and defuse it with a high degree of certainty, no matter how complicated the device.
A therapist’s work is a lot messier. The laws of physics, or even of logic, do not apply so rigorously to the human being they are trying to defuse. While the person was not designed to explode, they are capable of it, and they can wreak as much damage as the device.
The device is a machine. The human is a complex adaptive system. Understanding one does not really help you understand the other. And mistaking one for the other leads even the smartest people into fundamental errors of logic.
I see this mistake everywhere – people viewing all manner of things — objects, beings, systems and phenomena — as though they were machines.
Your business — like you, the economy, the planet, even the universe — is not a machine, it’s a complex adaptive system.
From ChatGPT…
“A machine is built for control and efficiency—it’s complicated but not complex.
“A complex adaptive system is characterized by learning and evolution—it’s alive, in a sense, and must be understood in terms of relationships, feedback, and context rather than components.”
The “mechanistic thinking” of viewing complex adaptive systems as machines explains why many people struggle to lose weight.
It’s one of the reasons why the climate change debate is so divisive.
It’s at the heart of the Innoficiency Problem, the mistaken idea that you can improve efficiencies and innovation at the same time.
Mechanistic thinking is why so many smart, successful people assume that LLMs and generative AI will lead to AGI, even though not one of them can explain how. (It won’t.)
In a complex adaptive system, inputs do not directly translate to outputs in a linear, causal way. Cut your calories and your body has numerous ways of adapting, thwarting you, making you feel like a loser who can’t do a simple thing. It’s not simple. It’s not even complicated. It’s complex.
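The difference can be sketched in a few lines of code. This is a toy model, not physiology: the numbers (the 3,500-calorie rule of thumb, the 2% daily adaptation rate) are invented for illustration. The point is the shape of the two curves: the "machine" converts inputs to outputs linearly, while the adaptive system pushes back against the input, shrinking its effect over time.

```python
def machine_loss(daily_cut, days):
    """'Machine' view: every cut calorie converts straight into
    weight lost (3500 kcal ~ 1 lb, the classic rule of thumb)."""
    return daily_cut * days / 3500

def adaptive_loss(daily_cut, days, adapt_rate=0.02):
    """Complex-adaptive view: the body responds to the deficit by
    slowing expenditure a little each day, so the effective deficit
    shrinks over time. adapt_rate is an invented parameter."""
    lost = 0.0
    effective_cut = daily_cut
    for _ in range(days):
        lost += effective_cut / 3500
        effective_cut *= 1 - adapt_rate  # feedback: the system pushes back
    return lost

# Same input (cut 500 calories a day for six months), very different outputs:
print(round(machine_loss(500, 180), 1))   # → 25.7
print(round(adaptive_loss(500, 180), 1))  # → 7.0
```

The machine model predicts about 26 pounds lost; the adaptive model, under these made-up assumptions, delivers about 7, and the gap grows the longer you run it. That gap is the dieter's frustration in miniature: the input was faithfully applied, but the system adapted.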
The tools of AI allow you to bring massive efficiencies to your content creation and outreach, but the market adapts by ignoring it all. Your machine is more efficient, but it’s not leading to more revenue.
Would the world be a better place if my AI could sell and deliver to your AI and we didn’t have to interact? An efficient Skynet marketplace of buying and selling, with no need for human interaction?
I was being facetious. The answer is no. And that future is not coming anytime soon.
So if optimizing the machine for efficiency isn’t your guiding management principle, then what is?
Look again at the characteristics of complex adaptive systems, articulated by humans and summarized by my robot servant: “learning, evolution, relationships, feedback and context.”
Those will do for me. (Nice job, Chat. Take the rest of the night off.)
-Blair