I recently watched a thought-provoking video essay, The New Aesthetics of Fascism by Ben, a content creator from Germany. Many believe they’d recognize fascism if it returned, expecting uniforms, parades, and brute force. Today’s threat, however, is much subtler, even as the underlying principles remain the same. Technology, AI, and the design philosophies shaping our digital lives are converging to create a softer, more insidious form of authoritarianism, one masked by seamless convenience, corporate branding, algorithmic decision-making, and the polished veneer of bureaucracy. This “friendly fascism” doesn’t announce itself. It blends in through frictionless interfaces and familiar language, making it easy to overlook because we are still searching for the old warning signs. The danger is how effortlessly it becomes part of everyday life, reshaping our reality without fanfare.
Ben presents a visually compelling and wide-ranging narrative, weaving together history, art, meme culture, the manosphere, and current events to show how aesthetics, technology, and politics can intertwine in unexpected ways. Whether or not you agree with every conclusion, it is a fascinating watch for anyone interested in the deeper forces shaping our digital age. I highly recommend watching it in full for its visual storytelling, excellent score, and the broader context, which cannot be fully captured in writing. Below are some reflections on the themes that resonated with me.
Serving Looks, Serving Power
Today’s tech aesthetic is built on clean minimalism, sleek branding, and a constant promise of progress. On the surface, it feels modern and open, reflecting the spirit of efficiency and ease. But this visual language carries echoes of the early 20th-century futurists — artists who celebrated speed, machinery, and industrial power, often in the service of authoritarian politics. Their glorification of force and hierarchy was overt; today’s Silicon Valley offers a softer, friendlier face. We see smooth surfaces and calming interfaces designed to invite trust and lower resistance.
Yet beneath this polished appearance, similar patterns of control and hierarchy still operate. Technology shapes behavior and reinforces existing structures of power, guiding users down managed paths. The veneer of convenience conceals algorithms and interfaces optimized for institutional interests, shaping what we see, how we interact, and even what we believe.
Older forces of militarism and authoritarianism don’t simply vanish when new ideologies or styles arrive; they adapt and persist. Just as remnants of feudal authority lived on within capitalism, today’s tech world still draws on motifs of force and control from earlier authoritarian systems. Companies like Palantir and Anduril blend advanced technology with national security, positioning themselves as leaders in algorithmic warfare. Their branding leans heavily on themes of strength, vigilance, and existential threat, recycling classic tropes of heroism and dominance in digital form. In areas like autonomous weapons and defense-focused AI, the fusion of futurist style and militarism is unmistakable, revealing how old logics of power reappear under the guise of innovation.
The desire for control, however, extends far beyond defense contractors. Mainstream consumer tech pursues a subtler form of managed order, most famously in Apple’s “walled garden.” Here, comfort and predictability are achieved by hiding complexity and enforcing uniformity. The message is: "You will like it here," but the cost is a narrowing of autonomy and alternatives. Security and seamlessness are traded for confinement within boundaries defined by the corporation, discouraging dissent or deviation from the prescribed experience.
Nowhere is this orchestration of behavior and discourse more visible than on social media, especially in the evolution of Facebook and Twitter. Both platforms have shifted from open public forums to tightly engineered environments where algorithms dictate what users see, how conversations unfold, and even which ideas are permitted to circulate. Over the last decade, Facebook’s algorithm has consistently amplified sensational, divisive, and often far-right content, particularly on issues like trans rights, creating echo chambers and facilitating the spread of misinformation and hate. Investigations have shown that white supremacist and extremist groups continued to thrive on Facebook for years, often evading moderation or benefiting from weak enforcement. More recently, Meta’s decision to phase out professional fact-checkers in favor of crowd-sourced moderation has raised concerns about increased exposure to radical content, especially among older users.
Twitter has followed a similar arc, especially since Elon Musk’s takeover. Moderation was scaled back, previously banned far-right and extremist accounts were reinstated, and algorithmic biases toward right-wing content became more pronounced. Research has documented a sustained spike in hate speech, including racist, anti-LGBTQ+, and anti-Muslim slurs, and algorithmic amplification of right-wing voices and news sources. Recent policy and design changes—including paid “visibility,” trending topics, and opaque content moderation—have increased the reach of ideas and personalities once considered fringe or unacceptable in polite society. Through repeated exposure, what was once shocking has become normalized, eroding public resistance and shifting the boundaries of mainstream debate.
There is a deeper cost to this comfort and control. When unpredictability and dissent are designed out of digital environments, users are separated from the possibility of real agency and debate. Conflict and ambiguity are reframed as technical glitches, flaws to be optimized away. Tech companies, presenting themselves as neutral engineers of usability, recast complex social and political questions as problems for algorithms and moderation queues, rather than matters for democratic contestation. The result is a worldview that privileges order, efficiency, and predictability at the expense of dissent, pluralism, and authentic engagement.
This aestheticization of technology is never neutral. What appears to be the triumph of user experience is, in practice, the centralization of power and the narrowing of acceptable behavior. Human unpredictability—essential to democracy and creativity—becomes something to be managed and controlled. In this way, the tech aesthetic expresses not just a commitment to making life easier, but a deeper commitment to control—one that reaches far beyond the surface of our screens and into the structure of everyday life.
Memes & Machines
Layered on top of AI’s cultural flattening is the weaponization of irony and meme culture. Today’s so-called “friendly fascism” does not march in lockstep or shout through megaphones. Instead, it slips in through jokes, edgy memes, and shared online references. What starts as playful or provocative humor can, over time, become a powerful vehicle for spreading exclusionary and even authoritarian ideas.
Meme culture makes controversial or extreme positions feel unserious. Everything is "just a joke" or "just trolling." When critics push back, meme creators and communities quickly hide behind irony. If you take something seriously, you are accused of lacking a sense of humor or being "cringe." This dynamic does more than deflect real critique—it actively discourages honest engagement about genuinely harmful ideologies.
The effects go even deeper. The repetition of ironic or outrageous memes gradually shifts what feels normal. What was once shocking becomes a running joke, then background noise, and eventually just another part of the discourse. As time passes, the line between irony and sincerity gets blurry. People are left asking: is this for real, or just another bit? That ambiguity is deliberate. It gives cover for radicalization and plausible deniability for those spreading harmful messages.
Meme culture also builds strong in-groups, bonding people through humor, shared references, and even collective mockery of outsiders. This sense of belonging can make individuals more susceptible to adopting the group’s views, while outsiders—those who protest or point out harm—are labeled as humorless, overly sensitive, or even enemies. In this way, meme communities enforce boundaries, incrementally radicalize members, and create an environment where even the most fringe beliefs can feel mainstream.
The result: irony, memes, and “shitposting” are not just digital noise—they’re tools for softening extremism, desensitizing the public, and helping dangerous ideologies move from the margins to the mainstream, often without anyone noticing until it’s too late.
A Profitable Distraction
Today's endless culture wars are not simply organic societal disagreements. They are strategically profitable phenomena, meticulously engineered and amplified through digital media. Social platforms, with their engagement-driven algorithms, thrive on outrage and controversy. The constant churn of cultural conflict, often over symbolic, identity-driven, or sensational issues, keeps users locked into perpetual cycles of emotional reaction. This produces valuable clicks, shares, and comments. For tech companies, each new controversy means more user activity and more advertising revenue.
The consequences, however, reach far beyond commercial gain. By constantly steering public discourse toward emotionally charged but ultimately inconsequential debates, culture wars act as a smokescreen. This draws attention away from the structural realities that create deep social tension, such as widening economic inequality, systemic exploitation, the erosion of labor rights, diminished social mobility, and creeping authoritarianism. The more the public is locked in battles over symbols, identity, or "hot button" topics, the less energy remains for building coalitions to challenge power or pursue meaningful reform. In this environment, urgent conversations about healthcare, housing, climate, or democracy are drowned out by the latest manufactured outrage.
Manufactured polarization also erodes social cohesion. It fractures communities and pits people against one another along lines of identity, culture, and values, rather than shared economic or civic interests. Algorithmic sorting of content and friends on social media encourages us to see the world through an "us versus them" lens, deepening mistrust and making it harder to find common ground. As communities fragment, collective action and solidarity, essential for democratic progress, become much more difficult to achieve.
Ironically, as culture wars deepen these divisions, they also make people more receptive to authoritarian messaging. Populist and demagogic leaders step into the breach, offering narratives that promise a return to stability, order, and traditional values. These simplistic stories, usually anchored in nostalgia or fantasies about a better past, offer emotional comfort in chaotic times. But while they promise safety, they often lay the groundwork for rolling back rights, increasing surveillance, and concentrating power, which are classic hallmarks of authoritarian drift.
We should not underestimate how effective this dynamic remains. In recent years, politicians and media figures have turned culture wars over “wokeness,” school curricula, immigration, policing, and public health into tools for division and distraction, not genuine debate. These manufactured controversies are deliberately amplified to avoid addressing systemic failures in economics, social mobility, or power structures. While the topics may vary, the aim is the same: keep the public agitated, divided, and disengaged. Even if the issues themselves are constructed or exaggerated, the consequences are very real: weakened democracy, policy inertia, and increased susceptibility to authoritarian narratives.
Crowdsourcing Resistance
It’s easy to miss the spread of “friendly fascism” in our lives. Its power isn’t loud or obvious; it is woven into the interfaces we trust, the algorithms that guide us, and the smooth promises of progress. This soft authoritarianism has already shaped the choices we make, the culture we consume, and the way our democracy functions. If we don’t push back by questioning, challenging, and organizing, its hold will only tighten.
If this got you thinking, go watch The New Aesthetics of Fascism. The questions it raises and the images it shows are too important to ignore.
If you’ve come across “friendly fascism” in your daily life—whether online, at work, or elsewhere—consider sharing what you’ve noticed or what concerns you. Your stories, warning signs, or ideas for resistance are welcome below. The more we name and map these patterns together, the better chance we have of understanding and responding to them.
CommonBytes
This column explores a central question: What should technology’s role be in a world beyond capitalism? Today’s technological landscape is largely shaped by profit, commodification, and control—often undermining community, creativity, and personal autonomy. CommonBytes critiques these trends while imagining alternative futures where technology serves collective flourishing. Here, we envision technology as a communal asset—one that prioritizes democratic participation, cooperative ownership, and sustainable innovation. Our goal? To foster human dignity, authentic connections, and equitable systems that empower communities to build a more fulfilling future.



Yes, we used to call it happy face fascism.
The issue seems to be clear from a Marxian perspective.
Perhaps I am wrong, but I see the means of production, of which technology is a part, galloping ahead of the relations of production.
The domination by machines in all aspects of life, as the article alludes to, has groomed and will groom a new generation that knows less and less.
The explosive technology, in the hands of a few billionaires and militaries, has outpaced the changes needed in subjective thought or class consciousness.
This is why bio-technology, transhumanism, techno-fascism (Palantir), international banking and cartels, to name a few, combine to create a toxic techno-fascist world where social control masquerades as convenience.
“Relax, the machines have got this.” 👹