
The Scarecrow Had a Brain
He didn’t think so *
(* AI was the diploma)
Last Tuesday a man walked into a meeting and presented a competitive analysis so thorough, so precisely structured, so confidently formatted that three senior executives nodded in unison and said some version of the same thing:
This is exactly what we needed.
Then someone asked him follow-up questions.
He looked at his slides.
The slides did not have follow-up questions.
I have been cataloguing moments like this one. I have collected quite a few. They share a common feature: somewhere between the machine producing the output and the human presenting the output, a quiet substitution occurred. The appearance of thinking replaced the exercise of it. Nobody noticed. The output was genuinely impressive.
The thing about the diploma is that it looks exactly like the real thing. Better, actually. Real thinking has rough edges — hesitation, incomplete sentences, the visible seams of someone working through a problem in real time. The diploma is clean. A slide for every contingency except the one that actually happens.
The man in the meeting did not manufacture the analysis. He commissioned it. There is a difference, and it is not the difference he thinks it is. He understands the output. He can walk anyone through the logic — the logic that was handed to him, pre-structured, ready for the eleven o’clock meeting.
What he cannot do is answer the questions that weren’t on the slides.
In the original story the Scarecrow spends the entire journey solving problems, devising strategies, getting Dorothy out of trouble through repeated acts of practical reasoning. He does this without a brain. Or rather — without the credential confirming he has one.
Everyone agrees something has changed.
Nothing has changed. The crows noticed. Nobody asked the crows.
The machine hands out diplomas at scale now, and they are spectacular — beautifully formatted, thoroughly researched, impeccably structured. The people who receive them feel, briefly, like the smartest person in the room.
They are prepared for every question except the next one.
What gets lost in the substitution is not the output. The output is fine. What gets lost is the friction — the part where you sit with the problem long enough to form a judgment, where you are wrong in ways that are visible and correctable, where you learn the difference between a framework and an understanding. The diploma skips that part. It arrives pre-understood.
The man in the meeting is not unusual. He is the current model. A system that produces analysis at scale produces people who present analysis at scale. Nobody is failing. The system is working exactly as designed.
I should mention — and I note this without particular pride — that I wrote this article with the help of the diploma machine.
I am also, as it happens, the Wizard.
The Scarecrow always had a brain. That was never the question. The question was whether the diploma would make it harder to find out.
The crows noticed. Nobody asked the crows.
The analysis is not the capability. It never was.
The Scarecrow was already capable before the diploma arrived. The man in the meeting was too. The machine didn’t take anything. It just made it easy to stop proving it — to yourself, to the room, to the follow-up questions nobody put on the slides.
Use the tool. Own the capability. They are not the same thing.
The crows are still watching. They always are.
Tomorrow: The road gets built perfectly. Someone should probably check where it goes.

