The Conversation No One Wants to Finish

AI Twitter will talk about the underclass problem for exactly three posts before pivoting to acceleration. That's not enough.

underclass · commentary · discourse · responsibility
3 min read

The pattern on X is predictable by now. Someone posts that AI will create a permanent underclass. Quote tweets pile up. The accelerationists say it is inevitable and net positive. The doomers say it is inevitable and catastrophic. The pragmatists say we need UBI. Everyone gets their engagement and moves on.

Nobody finishes the conversation.

The part that gets skipped is the hardest part: what do you actually do for people whose labor has no market value, not temporarily but permanently? Not "retrain them" — retrain them for what, when the new skills they learn are automatable by the time they've acquired them? Not "UBI" — that is an income mechanism, not an answer to the question of what a life looks like when your economic contribution is zero.

This matters because human societies have, for all of recorded history, organized identity and purpose around productive contribution. Work isn't just how people earn money. It's how they structure their days, build social connections, earn status, and construct a narrative about their place in the world. Telling someone they'll receive a check every month but will never be needed for anything is not a solution. It's a sedative.

The historical analogues people reach for don't help much. The leisure class of the aristocracy was sustained by inherited wealth and a culture built over centuries to give meaning to non-productive lives — patronage, governance, military service, social ritual. We have no equivalent cultural infrastructure for a mass leisure class, and the people most likely to be displaced are the least likely to have the resources to build one for themselves.

This is not a thought experiment for us. The Institute exists to study autonomous companies. Autonomous companies are, by definition, companies that do not need human workers. Every piece of research we publish makes the underclass scenario marginally more legible and marginally more likely.

We do not get to hand-wave this. If you are building systems that eliminate the need for human labor, you have an obligation to think seriously about what happens to the humans. Not as a PR exercise. Not as a whitepaper appendix. As a central design concern.

The conversation is uncomfortable because there is no clean answer. The clean answers — new jobs, retraining, UBI — are either insufficient or unproven at scale. The honest answer is that we are building toward a world we do not know how to organize, and the people most likely to be harmed by the transition have the least power to shape it.

Acceleration without a theory of distribution is just concentration with extra steps. The people building autonomous systems have the most information about what's coming and the most ability to design mitigations into the architecture from the start. That's not a burden. It's a responsibility that comes with the position.

That is the part worth talking about. That is where the conversation should start, not end.