The Idiocracy Trap: Why Smart Machines Are Making Humans Dumb & Dumber
Jacob Ward saw this coming. Back in January 2022, the Oakland-based tech journalist published The Loop, a warning about how AI is creating a world without choices. He even came on this show to sound the alarm about AI’s threat to humanity. Three years later, we’ve all caught up with Ward. So where is he now on AI? Moderately vindicated, but more pessimistic. His original thesis has proven disturbingly accurate: we’re outsourcing decisions to AI at an accelerating pace. Yet he admits his book’s weakest section was “how to fight back,” and he still lacks concrete solutions. His fear has also evolved: he is now less worried about robot overlords than about an “Idiocracy” of AI human serfs. It’s a dystopian scenario in which humans become so stupid that they won’t even be able to appreciate Gore Vidal’s quip that “I told you so” are the four most beautiful words in the English language.
I couldn’t resist asking Anthropic’s Claude about Ward’s conclusions (not, of course, that I rely on it for anything). “Anecdotal” was how it countered, with characteristic coolness. Well, Claude would say that, wouldn’t it?
1. The “Idiocracy” threat is more immediate than AGI concerns. Ward argues we should fear humans becoming cognitively dependent rather than superintelligent machines taking over. He’s already seeing it: Berkeley students can’t tell the difference between reading books and reading AI summaries.
2. AI follows market incentives, not ethical principles. Despite early rhetoric about responsible development, Ward observes the industry prioritizing profit over principles. Companies are openly betting on when single-person billion-dollar businesses will emerge, signaling massive job displacement.
3. The resistance strategy remains unclear. Ward admits his book’s weakness was the “how to fight back” section, and he still lacks concrete solutions. The few examples of resistance he cites, like Signal’s president protecting user data from training algorithms, require significant financial sacrifice.
4. Economic concentration creates systemic risk. Massive capital commitments, like Nvidia’s $100 billion investment in OpenAI, create dangerous loops in which AI companies essentially invest in themselves. Ward warns this resembles classic bubble dynamics that could crash the broader economy.
5. “Weak perfection” is necessary for human development. Ward argues we need friction and inefficiency in our systems to maintain critical thinking skills. AI’s promise to eliminate all cognitive work may strip away the very mental exercise that keeps humans intellectually capable.
This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit keenon.substack.com/subscribe