In the 1990s, Fredric Jameson famously said, “It seems to be easier for us today to imagine the thoroughgoing deterioration of the earth and of nature than the breakdown of late capitalism.” He was pointing out that capitalism is presented as the natural state of the world, so self-evident that even catastrophic scenarios, from nuclear holocaust to climate collapse, seem more plausible than a society without capitalism. He called this “a cultural inability to imagine the alternative,” a kind of collective blockage of utopian thinking. Today we find ourselves in an analogous situation: it is easier to imagine the end of the world than the end of social media as we know it.

Social media today favors speed to the point of deregulation, extremism, emotionalism, and images at the expense of text and meaning, but it did not start out that way. When social media first appeared, the narrative was almost messianic: a technology was coming that would abolish borders, a decisive step in leaving homo sapiens behind and moving on to homo digitalis. It would give a voice to the invisible and make the world more democratic. Pioneers such as Friendster, MySpace, and later Facebook and Twitter began by fulfilling some of these wet dreams, but today the situation is different.

The era of that naivety is long gone; now only 7% of users say they trust platforms to convey accurate information, while 67% believe they have a negative impact on society, a figure that rises to 69% among Gen Z, who are natives of the space and, for the most part, refuse to move out. Recently, pop culture, through Netflix’s Adolescence, a series that became a global phenomenon, showed how algorithmic toxicity can steer young people toward misogynistic or reactionary ideologies. Not that these problems do not exist in society at large; the problem with platforms is overexposure: people are funneled into communities where the influence is one-sided, consume that content, and trap themselves in echo chambers stripped of any critical perspective that deviates from the narrative, all in order to keep them in these spaces for commercial reasons.

Naturally, scandals such as Cambridge Analytica have led users to understand that platforms prioritize clicks and screen time over truth or the health of the public sphere. Although social media are marketed as public squares, they are closer to shopping malls. Jürgen Habermas had already described the “structural transformation of the public sphere” in the 1960s, pointing out that when public discourse depends on the market, the very relationship between political power and the communicative power of citizens is threatened. Add to the discussion the extraction of behavioral surplus, a new form of capitalist exploitation that turns every click and scroll into marketable data, and the argument becomes even more compelling. Frances Haugen’s revelations reinforced this criticism: Facebook’s own internal documents confirmed that its algorithms amplify hatred and polarization, not out of malice, but because the company makes more money when people consume more content.

The result is a public discourse that appears to be addressed to everyone but is in fact mediated by algorithms that decide what we will see and when, based not on the quality of the discussion but on maximizing the time we spend on the platform. Communication becomes a self-feeding loop in which we see not only events but also others’ reactions to them, creating an illusion of universal consensus or opposition. However, the answer is not to reject social networks outright: the existence of many parallel communities does not in any way mean the collapse of democracy; on the contrary, it is an opportunity to hear voices that were previously excluded from public discourse. The question is how to transform this polyphony into productive conflict that leads to new forms of consensus rather than fragmentation.

To understand the power of social media, and how a digital public sphere operating in the public interest can bring about change, it suffices to look at the Arab Spring. Citizen journalism flourished on the streets of Tunis, with citizens broadcasting to the whole world via their mobile phones, providing the raw material that channels such as Al-Jazeera relayed to a mass audience. This complementarity between social media and the mainstream media took small, immediate images and turned them into a narrative the whole planet could see. However, the moment was nothing more than a historical window that quickly closed, as states upgraded their surveillance tools and the platforms themselves changed their algorithms to limit dialectical and productive radicalism, ultimately favoring hatred and fragmentation.

Towards a prosocial media ecosystem

If our criticism so far shows us what is going wrong, then perhaps it is time for the next step: a new structure. The solution lies neither in censorship nor in deregulation in the name of a vague freedom, but in redesigning platforms so that they unite rather than divide. The core of such a change could be metrics that show which posts resonate across communities and which ones divide them, together with ranking algorithms that highlight commonalities and build consensus between different groups, while business models that reward platforms for creating new, intersecting communities instead of breaking the social fabric could be the carrot. At the same time, algorithmic transparency should be more than a formal prerequisite, and not of the kind practiced today. But what does that mean?
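Before turning to that question, it is worth making the bridging idea above concrete. What follows is a minimal, purely illustrative sketch in Python, not any platform’s actual ranking code; the function name, the community labels, and the +1/-1 votes are assumptions introduced only for the example. The intuition is that a post should score high only when approval comes from several otherwise distinct communities, and low when approval is confined to a single bloc or when communities disagree sharply.

    # Illustrative sketch only (hypothetical names, toy data): a "bridging" score
    # that rewards posts approved across different communities.
    from collections import defaultdict

    def bridging_score(reactions):
        """reactions: list of (community_id, vote) pairs, with vote in {+1, -1}."""
        by_community = defaultdict(list)
        for community, vote in reactions:
            by_community[community].append(vote)
        if not by_community:
            return 0.0

        # Average approval within each community.
        approval = {c: sum(v) / len(v) for c, v in by_community.items()}
        mean = sum(approval.values()) / len(approval)
        # Disagreement between communities lowers the score...
        spread = sum((a - mean) ** 2 for a in approval.values()) / len(approval)
        # ...while breadth (how many distinct communities weigh in) raises it.
        breadth = len(approval)
        return breadth * max(mean, 0.0) * (1.0 - min(spread, 1.0))

    # A post endorsed by three different communities outranks one cheered by a
    # single community and booed by another.
    print(bridging_score([("a", 1), ("b", 1), ("c", 1)]))   # 3.0
    print(bridging_score([("a", 1), ("a", 1), ("b", -1)]))  # 0.0

A real system would weigh far richer signals, but even this toy formula makes the design choice visible: engagement from a single echo chamber counts for less than agreement that crosses community lines.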

Edgar Allan Poe, in The Purloined Letter, says that “the easiest way to hide something is to leave it in plain sight.” Accordingly, platforms publish extensive, overly technical texts about their policies and data: they create the impression of transparency, while in practice few can understand what they mean. This “transparent” volume of information replaces real accountability, which is precisely why it is not enough to demand transparency; we must define its purpose, with the ultimate goal of making content understandable and the operation of the algorithms verifiable, rather than leaving users to solve a puzzle on their own.

The alternative? Prosocial media: platforms deliberately designed to cultivate empathy, mutual respect, and space for dialogue, and to prioritize users’ mental health. They do not chase engagement for engagement’s sake, but reward the quality and significance of interactions. BeReal, for example, breaks the culture of curated profiles, while Glass removes public likes and popularity metrics. Dispo, in an interesting move, deliberately delays the appearance of photos to slow down the pace of content consumption.

VML’s research shows there is real social demand for such solutions, with an audience, admittedly limited compared with mainstream trends, that for now sustains alternative platforms such as Bluesky, with its self-curated feed tools, or Mozi, which encourages meeting in person and pulls users away from endless scrolling; whether that sustainability holds in the long term remains doubtful. Bluesky itself began as an experiment and, as Twitter came to be seen as toxic in the Musk era, grew into an independent network that allows users to create their own feeds. CEO Jay Graber described it as a “choose your own adventure” experience, one that aims to reduce toxicity and give users back control of their information flow. Even so, the issue is ultimately one of culture: we are all voluntarily present everywhere, and the culture of digital consumption must be transformed into a culture of digital dialogue.

Within this landscape, we must bear in mind that the “platforming” of information itself shifts the emphasis from journalistic diligence to diffuse, unregulated user-generated content, which may ultimately work to the detriment of the democratic functioning of society when form becomes the decisive criterion for dissemination at the expense of content. The question becomes whether democratic information can survive meaningfully in a fragmented and commercialized public discourse. In short, we must decolonize the public sphere from platformization as we know it today.

The Sphinx’s riddle

The discussion about solutions must go beyond moderation measures and fragmented self-regulation; we need to reconsider the political economy of platforms so that they function as a public good rather than a tool for profit. This vision is in line with existing proposals that highlight common ground rather than anger, offer tools for collective curation that build consensus, and back business models that reward social cohesion, alongside others that cultivate empathy, respect, an aversion to superficiality, and concern for users’ mental health.

Democracy needs digital spaces that are open, transparent, and collective, allowing the dialectical triad of thesis, antithesis, and synthesis to function. I wish I could conclude by saying that the moment is now, and perhaps it is; I would like to believe that we are on the threshold of a decision, but we are more likely in the situation described by the poet Dinos Christianopoulos: “Neither to die, nor to be healed, I only want to settle into my own destruction.” If we were at that threshold, the Sphinx’s question would be whether to accept the cynical normality of an online ecosystem that thrives on division, or to rethink what we expect from the digital public sphere and invest in a new type of social media that puts the word “social” back at the center. Who knows what we will ultimately choose as a society; Oedipus, at least, answered the Sphinx with “man” and passed.
