The Idiocracy Trap: Why Smart Machines Are Making Humans Dumb & Dumber
Jacob Ward warned us. Back in January 2022, the Oakland-based tech journalist published The Loop, a warning about how AI is creating a world without choices. He even came on this show to warn about AI's threat to humanity. Three years later, we've all caught up with Ward. So where is he now on AI? Moderately vindicated, but more pessimistic. His original thesis has proven disturbingly accurate: we're outsourcing decisions to AI at an accelerating pace. But he admits his book's weakest section was "how to fight back," and he still lacks concrete solutions. His fear has evolved, too: less worried about robot overlords, he is now more concerned about an "Idiocracy" of human serfs to AI. It's a dystopian scenario in which humans become so stupid that they won't even be able to appreciate Gore Vidal's quip that "I told you so" are the four most beautiful words in the English language.
I couldn't resist asking Anthropic's Claude about Ward's conclusions (not, of course, that I rely on it for anything). "Anecdotal," it countered with characteristic coolness. Well, Claude would say that, wouldn't it?
1. The "Idiocracy" threat is more immediate than AGI concerns. Ward argues we should fear humans becoming cognitively dependent rather than superintelligent machines taking over. He is seeing it already: Berkeley students can't distinguish between reading books and reading AI summaries.
2. AI follows market incentives, not ethical principles. Despite early rhetoric about responsible development, Ward observes the industry prioritizing profit over principle. Companies are openly betting on when the first single-person billion-dollar business will emerge, a signal of massive job displacement ahead.
3. The resistance strategy remains unclear. Ward admits his book's weakest section was "how to fight back," and he still lacks concrete solutions. The few examples of resistance he cites, such as Signal's president protecting user data from training algorithms, require significant financial sacrifice.
4. Economic concentration creates systemic risk. Massive capital investments, such as Nvidia's $100 billion commitment to OpenAI, create dangerous loops in which AI companies essentially invest in themselves. Ward warns that this resembles classic bubble dynamics and could crash the broader economy.
5. "Weak perfection" is necessary for human development. Ward argues we need friction and inefficiency in our systems to maintain critical thinking skills. AI's promise to eliminate all cognitive work may also eliminate the mental exercise that keeps humans intellectually capable.
This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit keenon.substack.com/subscribe