Peter Thiel, one of Silicon Valley’s most influential investors, the man behind PayPal and Palantir and an early backer of Facebook, is spending a month in San Francisco talking about… the Antichrist and the end of the world, in front of a sold-out audience that paid $200 to hear him speak.

In these lectures, Thiel links the “Antichrist” to a global government that wants to “put the brakes” on technology, especially artificial intelligence, presenting any regulation as a step toward the Apocalypse. This anti-rationalist discourse, a blend of theological fear and economic ideology, is not mere rhetorical hyperbole; it is a strategy for normalizing a Dark Enlightenment worldview. It transforms the debate on technology regulation from a political issue into a metaphysical war between the “good of innovation” and the “evil of slowdown,” ultimately demonizing regulators and Luddites (Luddism was a workers’ movement in 19th-century England that smashed machines to protest losing jobs, not out of fear of technology but against its unfair use), even the performative ones.

Technological progress is presented as an autonomous force, independent of human intentions or social responsibility, by those who adopt the principles of the Dark Enlightenment (a modern anti-democratic ideology that rejects equality and reason, arguing that technology and hierarchy, not collective progress, should guide society) and who profit from this interpretation of the world. “Progress” no longer needs a moral foundation, only faith in its inevitable trajectory, and any criticism of the priesthood of technological eschatology is translated into heresy against progress. Technology is a social construct, with dimensions of social impact and political economy; it should not be placed on a supernatural pedestal that ignores social inequalities or the undermining of democratic oversight itself.

If everything is determined by the “inevitable” course of the machine, then no one is responsible for it. Is there an answer? Not one offered with the certainty claimed for the inevitable course of technological progress; what is emerging instead is more agnostic. A proposal is taking shape in France, the UK, and the US: the neo-Luddite response. It is not romantic nostalgia but a political critique of the ideology of acceleration. Contemporary neo-Luddism does not reject technology; it rejects the narrative of its omnipotence. It targets the unequal distribution of power, the dependence of innovation on capital and corporate interests, and the imposition of the “technological rhythm” as a normative rule for society. It is, in other words, a form of counter-power.

The historical origin of the term “Luddism” remains distorted: early 19th-century workers did not hate machines but the way machines transformed labor relations to the benefit of employers. In essence, the Luddites practiced “collective bargaining through rebellion.” Their violence was not blind; it was class-targeted. They viewed technology not as progress but as a tool of exploitation. The historical misconception that “Luddism” simply means fear of the new serves precisely the same circles that today repeat the propaganda of “innovation without limits.”

In contrast to this culture of acceleration, neo-Luddites—artists such as Molly Crabapple or journalists such as Edward Ongweso Jr—do not propose a “return to nature.” They propose a return to consciousness. They remind us that behind the “cloud” there is materiality: mining, labor, energy consumption. Behind every “smart” system, there are power relations. Their resistance is not romantic; it is ecological, class-based, and political. 

The great irony is that the accelerationist discourse resembles theology more than Luddism itself. While neo-Luddites ask “who benefits?”, techno-prophets answer “everyone, sooner or later”. In reality, this “everyone” is the most successful mechanism for concealing inequalities in the history of technology. Acceleration is political only as long as it remains invisible; once you see it, it loses its appeal.

The rhetoric surrounding Artificial Intelligence follows the logic of the stock market: it promises that the next version will solve the problems created by the previous one. The entire ideology of accelerationism is based on an economic fantasy, that speed produces value on its own. Progress, therefore, does not need a social purpose; it suffices to increase GDP or the return on shares of technology companies. The companies developing AI, such as OpenAI, Google, and Meta, present their tools as “democratizing knowledge,” while in practice they are gaining unprecedented control over the distribution of information and labor markets.

The critical shift is that AI does not merely replace labor; it restructures the very meaning of production. From weaving fabric in 1811, we have moved on to weaving data in 2025. The modern worker does not operate machines—he feeds algorithms with his own attention, voice, and image. User-generated content is the new labor, only unpaid.

Neo-Luddite criticism brings the concept of social utility back to the table: it is not a question of rejecting devices, but of demanding transparency and collective control. Who decides which algorithms are promoted? Who defines the boundaries between innovation and surveillance? When the cloud consumes as much energy as a small country, who benefits from progress? Neo-Luddites ask the question that governments avoid: progress for whom? The irony is that acceleration is presented as a response to the crisis of capitalism, when in fact it deepens it. Algorithms are replacing wage labor, but not the need for income; productivity is increasing, but wealth is becoming more concentrated. Every promise of automation is accompanied by greater inequality.

Behind the spectacle of robots and algorithms lies the redistribution of power. Techno-capitalism—the coupling of technological and financial dominance—is not concerned with whether artificial intelligence will replace humans, but with how it will make them cheaper. According to MIT research, automation has contributed more to inequality in the US than taxes or globalization. Low-wage workers are losing their bargaining power, while tech companies are becoming de facto employers without obligations.

The same phenomenon is confirmed by a report from the Institute for Public Policy Research in the United Kingdom: the problem is not mass job losses, but the shift in income from wages to corporate profits. The report highlighted these issues as early as 2017, and the present has confirmed its predictions. Algorithms that promise “productivity” translate into unequal distribution of time and value: workers’ time is compressed as capital’s time multiplies. Automation does not eliminate jobs en masse, but it drastically reduces the wages of middle-skilled workers, creating a “new cloud peasantry.”

Public discourse around technology maintains a strange ritualistic optimism. From Silicon Valley to Brussels think tanks, “innovation” is presented as an end in itself, regardless of its social cost. Acceleration is christened neutrality. This is precisely what neo-Luddite criticism denounces: behind the dogma of “progress for all” operates a mechanism of exclusion, where decisions are made by the few who own the software, data, and infrastructure. Technophobia, then, is an invention of the accelerationists; a scarecrow to dismiss any criticism as “emotional.” A democratic technopolitics is not reactionary; it is defensive. It proposes accountability institutions for AI models, transparency in corporate investments, the right to algorithmic control, and citizen participation in setting the pace of acceleration. The future is not predetermined by the power of servers; it depends on whether we reclaim the concept of “human purpose.” Machines have no soul, but they gain power when we give it to them. If there is fear, it is of silence, not of technology.
