The viral “welfare mom” clips did not fool Fox because they looked real. They fooled Fox because they felt familiar. The network framed them as proof of decay, lectured from them, and let the old stereotype do its work — until a toddler’s arm dissolved into pixels and a frozen backdrop betrayed the illusion. The software failed; the instinct did not. Nothing new was revealed about AI. Something old was revealed about us.
No model invented that narrative. It learned it. The prejudice did not arise from code; it entered through the culture that nourished the code. Reagan taught America this story long before neural nets existed. AI merely made it available at scale. A stereotype rehearsed for decades can now be summoned with a prompt instead of a press conference.
We talk about AI “hallucinating,” as if the danger were invention. But the deeper threat is repetition. The model does not pull prejudice out of thin air — it learns it from us, absorbs it into structure, and returns it as confidence. It does not know truth; it knows frequency. It does not reason; it reflects. And what it reflects most faithfully are the beliefs we encoded into culture long before we encoded anything into compute.
There was a time when narrative authority declared itself plainly: anchors at desks, headlines above the fold, a small circle of editors deciding which truths entered public life and which died in silence. Consent was manufactured through scarcity. Even when the story was warped, you knew who told it.
We once thought the internet shattered that. When the broadcast towers cracked, we celebrated. We believed that when anyone could speak, truth would rise — that when the old gatekeepers lost control, the public would govern its own attention. We misunderstood the terrain. The gate did not disappear; it became the feed. Authority dissolved into interface. We did not democratize media; we privatized distribution. The public square became a recommendation system optimized not for understanding, but for engagement.
The platforms that replaced the newsrooms did not abolish mediation — they deepened it. They learned to guide belief not through argument but through architecture. They let everything be said, then decided what would be seen. You do not need to police speech when you can engineer visibility. You do not need censorship when you can bury truth beneath convenience and reward conviction over comprehension. In this world, propaganda does not demand obedience; it rewards instinct.
And when power fears losing control of that instinct, it responds as it always has. TikTok did not trigger panic because of teenagers dancing. It triggered panic because it threatened America’s jurisdiction over narrative infrastructure. The Cold War never vanished — it migrated to the attention stack. Nations once fought for shipping lanes; now they fight for cultural bandwidth. Platforms feel like public utilities right up until they challenge state power. Then we remember who gets to define reality.
Yet occasionally reality punctures the interface. New York feels like one of those punctures. Zohran Mamdani’s mayoral campaign does not behave like content. It behaves like politics — not the performance of it, but the doing of it. Housing, transit, labor, taxation, the material obligations of a city to its people. No algorithm engineers commitment; people do. “So much of what feels hard right now,” Mamdani said, “is the sense that we are merely subjects.” He meant government. He meant platforms. And then the part the system cannot digest: this moment is not fixed. It will not change by going viral, but by choosing to act. “We took swings,” he told his volunteers, “because swinging means refusing the limits you were handed.”
That is not messaging. It is a public remembering it can move.
A politics that refuses to be formatted as content is dangerous to those who benefit from spectatorship. It reminds people that scrolling is not participation and cynicism is not sophistication. It suggests that struggle is not a relic but a method — and that power fears not outrage, but organization. In such a context, hope is not sentiment; it is subversion.
AI accelerates this contest. It does not invent new myths, but embalms the old ones in code and scales them. Bias was not smuggled into the model; we handed it over. We once spoke of “debiasing” like debugging. But the system was never neutral. It faithfully reproduces the structures we never dismantled. The risk is not synthetic content — it is synthetic inevitability. A future that looks exactly like the past, except faster and harder to dispute.
Which is why media literacy is no longer etiquette; it is civic armor. History is not nostalgia; it is ballast. The point is not to detect every fake, but to remember that fakes flourish where narratives already live. The lie that spreads most easily is the one we are most ready to believe.
The glitch in those welfare-mom videos was not the missing limb. It was how quickly belief arrived. The pixels gave out before the prejudice did. Technology did not invent the impulse; it helped it breathe. The fox was not fooled. It chased the tale it already knew by heart.
What breaks that training is not a better algorithm but a reawakened public — citizens who remember they can build rather than react, who recognize that the future is not predicted by models but constructed by movements, and who refuse to let convenience masquerade as reality. This moment carries that possibility. Not guaranteed. Not gifted. But available.
The story will change when we stop mistaking repetition for truth, inevitability for destiny, and scrolling for participation. The fox can learn new instincts — but only if we stop feeding it the old tale.
CommonBytes
This column explores a central question: What should technology’s role be in a world beyond capitalism? Today’s technological landscape is largely shaped by profit, commodification, and control—often undermining community, creativity, and personal autonomy. CommonBytes critiques these trends while imagining alternative futures where technology serves collective flourishing. Here, we envision technology as a communal asset—one that prioritizes democratic participation, cooperative ownership, and sustainable innovation. Our goal? To foster human dignity, authentic connections, and equitable systems that empower communities to build a more fulfilling future.
Yep, and AI is just reflecting the intent of its creators.
The lies take a lot of power but the truth is efficient.
https://robc137.substack.com/p/why-deepseek-uses-10x-less-power
the irony is that this article talks about AI and is itself AI-written
whenever I read phrases like "something isn't / doesn't do X - it's / does Y," I know it's AI-written
examples in this article:
"The platforms that replaced the newsrooms did not abolish mediation — they deepened it."
"No model invented that narrative. It learned it."
"The prejudice did not arise from code; it entered through the culture"
it's as if people didn't know how to write anymore