Remember the Human

Reddit, the interface between digital niches and the mainstream, is one of the battlegrounds for debates on freedom of expression and toxicity in the era of the great algorithm.

Navy children boxing | Harris & Ewing, Library of Congress | No known restrictions on publication

As it has grown, Reddit has taken decisions on the moderation of toxic content. The first attempts at minimal moderation caused hostile reactions among users, although over the years Reddit has achieved an alternative, decentralised moderation model. It is a more horizontal, cooperative and human model than that of other platforms, and the result is the closest thing to a space where it is possible to converse with true freedom.

The experiment had very clear rules. A collaborative canvas of 1,000 × 1,000 pixels, which over a period of 72 hours any and all Reddit users could paint however they chose, pixel by pixel, with each user allowed to make one change every five minutes. The title: Place. It was April 2017, and of the 250 million “redditors”, over a million took part. What could possibly go wrong?
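For the technically curious, those rules fit in a few lines. The sketch below is purely illustrative – the names, the data structure and the cooldown check are assumptions, not Reddit’s actual implementation – but it captures the mechanics described above: a shared 1,000 × 1,000 grid in which each user may change one pixel and must then wait five minutes before placing another.

```python
import time

# Illustrative sketch of the rules of Place (not Reddit's actual code):
# a shared 1,000 x 1,000 grid of colour indices, where each user may change
# one pixel and must then wait five minutes before placing another.

SIZE = 1000
COOLDOWN = 5 * 60  # seconds a user must wait between placements

canvas = [[0] * SIZE for _ in range(SIZE)]  # 0 = blank/white pixel
last_edit = {}                              # username -> time of their last placement

def place_pixel(user, x, y, colour, now=None):
    """Place one pixel for `user`; returns False if their cooldown has not expired."""
    now = time.time() if now is None else now
    if now - last_edit.get(user, float("-inf")) < COOLDOWN:
        return False
    canvas[y][x] = colour
    last_edit[user] = now
    return True
```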

Reddit is a massive network of user-created communities of interest dedicated to hundreds of millions of topics, ideas, memes, places, ideologies, sensations and debates. In 2017, when the team behind Reddit decided to launch the experiment Place, the question (and reason why the management were somewhat nervous) was whether the canvas would be filled up with swastikas and insults.

The result was a dispersed, mutating microcosm of factions, alliances, creation and destruction, of communities battling it out for the same space, incursions, counter-incursions, toxic trolls, resistance fronts… In several areas of the canvas swastikas emerged, to be wiped out by armies of users or turned into the Windows 95 logo just minutes later.

The final result is a visual metaphor of what Reddit aspires to be – a flag fragmented into a thousand coexisting pieces. Place was a performance art representation of the solution offered by Reddit to the problems of the dark, corporate, algorithmic web of the Trump era. The canvas showed that a living ecosystem is one in which everyone enjoys equal conditions and freedom of expression and where community control puts a stop to hate.

But can such a model really work?

A hippy-yuppie fusion

Reddit is the result of the wave of digital entrepreneurship that followed the dot-com crisis. The idea of a news aggregator came up during the first startup bootcamp run by Y Combinator, a startup incubator co-founded by the tech guru Paul Graham. Graham’s idea was to offer young entrepreneurs from universities in the US just enough money to be able to develop their projects, protecting them from the major investment funds that had overvalued and inflated so many web projects during the tech bubble that peaked around the year 2000.

Paul Graham’s startup school was launched with the aim of creating a new, fresher and more unrestrained approach to offering pragmatic solutions to specific problems. Examples of alumni companies include Dropbox, Twitch and Airbnb. What these digital projects have in common is that they combine an eccentric, revolutionary and easygoing image with an underlying philosophy that is decidedly capitalist. This ambiguous, paradoxical combination is a direct cultural byproduct of what the theorists Richard Barbrook and Andy Cameron have termed the Californian ideology.

The Californian ideology describes the social and cultural environment in which Silicon Valley as we know it emerged. According to the authors of the term, it combines the liberal and eccentric spirit of 1960s counterculture with a technological determinism focused on capitalist and commercial production. The coming together of hippy and yuppie. Revolution and capitalism. The digital ecosystem offers the promise of freedom, assuring us that any problem can be solved with an app, and that forging a better world is synonymous with becoming a multimillionaire – without a tie, of course. And all thanks to creative ingenuity… and the ingenuity of investors!

Reddit is the prototype of this kind of cool entrepreneurship. Its founders, Steve Huffman and Alexis Ohanian, launched it in 2005 when they were fresh out of university, and gradually added new features to make it more attractive. The idea was simple – a website where anyone could post interesting links and receive upvotes or downvotes. The more upvotes a link received, the more visible it would become.
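That visibility mechanism is easy to sketch. Reddit’s early codebase was open source, and the version below is a simplified Python rendering of the “hot” ranking it contained (the constants and reference date come from that old public code; the ranking Reddit uses today is not public and may differ): the net vote score is damped logarithmically and a time term is added, so heavily upvoted links rise while newer posts steadily displace older ones.

```python
from datetime import datetime, timezone
from math import log10

# Simplified rendering of the "hot" ranking from Reddit's old open-source
# codebase; illustrative only, the current ranking is not public.

EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)  # reference date used in the old code

def hot_score(upvotes, downvotes, posted_at):
    score = upvotes - downvotes
    order = log10(max(abs(score), 1))               # the first 10 votes weigh as much as the next 90
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted_at - EPOCH).total_seconds()   # age term: newer posts get a head start
    return round(sign * order + seconds / 45000, 7)

# A newer link with a modest score can outrank an older one with far more votes.
print(hot_score(50, 10, datetime(2017, 4, 1, tzinfo=timezone.utc)))
print(hot_score(500, 100, datetime(2017, 3, 30, tzinfo=timezone.utc)))
```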

Reddit began to attract users with profiles similar to those of its creators – young university students with an interest in the worlds of videogames, programming, science fiction, etc., with a nerdy and edgy sense of humour, who welcomed a space where they could find links about things of interest to them.

Huffman and Ohanian decided to divide the page into subpages, which could be created by users to generate microcommunities on their specific topics of interest, with no limits. And this is how subreddits were born. Quite by chance, they had invented a system with the ingredients needed to accommodate hundreds of thousands of small digital tribes with their own icons, languages and traditions. A space for everyone with peculiar tastes, for authenticity and niche interests. A heterogeneous, chaotic and raw place in times of diaphanous homogenisation.

And that’s what Reddit is today – a site that is a myriad of sites. A space that is hard to explain or define.

If you like cute animals, you can join r/aww; if you’re from Ontario, there’s r/Ontario; if your thing is cringeworthy antics, you can check out r/Cringetopia; if your plans for revenge always occur to you too late, you can amuse yourself with the stories on r/PettyRevenge, and if you’re totally hooked on the latest Call Of Duty, then obviously there’s r/CODWarzone. In r/AskParents you can find mums and dads who share parenting advice, at r/EarthPorn you can see photos of incredible places from around the planet, and if you can’t handle too much excitement, then r/MildlyInteresting is the place for you. In r/nonononoyes there are scenes that seem to be heading towards catastrophe but turn out well in the end. Then there’s r/Transpassing, a community of people who share tips and advice on gender transition. r/jazz, r/Catalonia, r/mademesmile. r/nextfuckinglevel, r/oldpeoplefacebook, r/wholesomememes… the list isn’t endless, but it might seem that way.

More than a few toxic spaces have also emerged on Reddit, such as r/Jailbait, where users shared erotic images of teenagers, r/FatPeopleHate, a community dedicated to humiliating overweight people, and r/incels, a community of misogynistic men that incited violence against women. Dozens of racist, antisemitic, sexist and far-right communities have found a home and a voice on Reddit.

In an attempt to protect and justify the platform, one of its founders, Steve Huffman, described it as a place for “open and honest” conversations, with “‘open and honest’ meaning authentic, meaning messy, meaning the best and the worst and realest and weirdest parts of humanity”.

Freedom to harass?

As it has grown, Reddit has taken decisions on the moderation of toxic content. It has deleted certain hate communities on different occasions, although its approach to dealing with trolls, conspiracy theories and harassment has varied depending on the moment and the climate.

When in 2011 Reddit decided to ban r/Jailbait, the company’s then CEO, Yishan Wong, justified the decision almost apologetically, clarifying that it had only been deleted because it shared content that could be considered illegal under US law. “We stand for free speech”, he said at the time, “we are not going to ban distasteful subreddits”. At the beginning of the decade, this notion of absolute freedom of expression was shared by all the social media sites, which were expanding significantly at the time. It was the time of the great digital transformations, the Obama era, in which the general feeling was that web 2.0 really was democratising the world. The CEO of Twitter said: “We are the free speech wing of the free speech party”.

Social media platforms had to be “neutral”, which at that time was synonymous with a passive attitude regarding any type of user behaviour. When in 2012 Reddit announced that it would ban the sharing of suggestive or sexual content involving minors on the platform, the most upvoted comment on the announcement read: “Good job, mods. You’ve now opened up yourselves to outside influences over what content can and cannot be posted to Reddit”.

In 2015, when Ellen Pao took the decision to close the five most toxic subreddits, which included r/FatPeopleHate and r/shitniggerssay, a significant proportion of users reacted with violent indignation against Pao, sharing images of swastikas linked to her name so they would appear whenever someone searched for her on Google. Many of them threatened to switch to a platform with no censorship.

Users reacted with hostility towards the first attempts at minimal moderation. Why were they so annoyed about the closure of forums that pushed humiliation, misogyny and racism? Why did they feel so under attack?

The crumbs of cyberutopia

The origin of this specific idea of unconditional freedom harks back to the cyberutopian discourse of the early years of the internet.

In 1996, the digital activist and Grateful Dead lyricist John Perry Barlow began his Declaration of the Independence of Cyberspace like this:

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.

In this declaration, penned in response to the approval of the United States Telecommunications Act, Barlow epically declared that internet users were creating a world free from privilege and prejudice due to race, economic power, military force or station of birth… echoing the foundational tone of Thomas Jefferson’s Declaration of Independence. In this new world, he said, anyone could express their beliefs without fear of being silenced, “no matter how singular”.

This libertarian, poetic stance has had a practical application in many projects that have tried to position the internet as a global alternative to capitalism. Cyberutopianism has translated into fights for the free circulation of digital content, free access to information, free communication between individuals, digital neutrality, the digitalisation of democracy and an autonomous digital currency, among many other things.

However, as pointed out by the critical thinker Evgeny Morozov, the cyberutopian project is essentially a capitalist project with North American roots which, rather than offering a real alternative to capitalism, is nothing more than capitalism in its most complete and developed state.

Despite their limitations, cyberutopian ideas have crystallised in the ethics of digital communities, which have realised that they need to defend their bastions of freedom of expression from the establishment and the big traditional corporations. This idea of a freedom of expression under permanent threat from external forces has filtered through into the worldview of platforms like Reddit. Who is this hostility aimed at? Who would be capable of destroying their space?

The usual suspects are the traditional media, which mock and fail to grasp the dynamics of the internet, the corporations that want to exploit the authenticity of communities for economic gain… and the new users, who trample the grass and breach the community’s netiquette and customs. This last threat is known as “Eternal September”.

The idea of “Eternal September” refers to the huge influx of new users that appeared on the platform Usenet in September 1993, the moment when the internet service provider AOL opened the floodgates, giving a whole wave of newcomers access to this discussion system, overwhelming the small community of veteran users and overturning the social norms they had created. Since then, the term has been used to refer to the fear of a social invasion of new, often uninformed users who undermine and damage the harmony of digital platforms that have their own dynamics, language and balance.

As the internet of the big platforms continues to develop, fear of corporate control and fear that the Eternal September will wash away their collective philosophy are why many Reddit communities react with hostility to the influence of the establishment and reject any type of external intervention.

This anti-establishment attitude in favour of personal freedom, exploited by certain far-right groups, is being used to nourish a new conservative populism that reworks the old concepts of cyberutopia to fight against what it calls the dictatorship of political correctness.

But what exactly is this freedom of expression that they defend tooth and nail? What kind of place would the internet be without any kind of human intervention or moderation?

Freedom, for whom?

One case that illustrates the confusion between freedom of expression and a lack of rules is that of Etika. Etika was an African American streamer who broadcast himself playing all types of video games on YouTube and Twitch. Someone suggested that he play the popular video game Minecraft, but on one of its “anarchy servers”. Minecraft has hundreds of thousands of public servers on which players from around the world explore the map, build bases and make and dismantle communities. Its anarchy servers are characterised by having no type of moderation system, rules or administration team. All the players can do what they like without fear of being banned.

A few minutes after Etika joined, the server’s chat began to fill up with racist insults and comments wishing him dead, and wherever he went, users built monuments in the form of swastikas. Etika, who had talked on several occasions about his mental health problems and the fact that he was going through a difficult period, took his own life just a few weeks after entering the server. After his death, the same racist jokes kept appearing in the chat, as if nothing had happened.

Contrary to what John Perry Barlow asserted, the internet does not offer an alternative to the real world, but merely a long and often caricatured shadow. All the structural problems of the physical world are reproduced and amplified in the digital world – it doesn’t resolve them as if by magic.

A clear example of how this occurs on a structural basis is explained by Safiya Umoja Noble in Algorithms of Oppression, in which she argues that search engines like Google reinforce pre-existing social prejudices.

She explains how, when she did a search with the words “black girls”, hoping to find fun activities to do with her African American daughters, many of the results on the first page were pornographic videos. Similarly, the search results for racialised people brought up negative, humiliating comments, unlike those for white people. After she published her findings, Google decided to manually correct the results.

As described in a conversation on Healthy Gamer GG, the mental health channel for streamers and gamers, the daily experience of many female streamers on YouTube and Twitch is a round of insults, disdain and sometimes even death threats. For them, freedom of expression involves creating moderation systems that ensure their safety in male-dominated environments, where they are often seen as a threat, and making constant efforts to highlight their work and not allow sexist comments and reactions to be normalised.

To put it another way, real freedom of expression on the internet needs clear rules, the banning of toxic users, prevention and erasure of harmful content and constant monitoring of the health of the community. But who should these tasks fall to?

Social corporate deplatforming

After the organised attack on the US Capitol, many of the big platforms, like Twitter and Google, decided to ban Donald Trump from their corporate spaces and issue decisive declarations of intent. Amazon and Google also cut off platforms used to organise hate communities, such as Parler.

We won’t allow our platform to be used to express attitudes that promote violence, they said. After years of turning a blind eye when Trump breached the rules and threatened or insulted other users, his synchronised deplatforming when he was already on his way out of the White House smelt more of marketing and damage control than firm corporate ethics.

The problem with purely corporate moderation mechanisms is that they are dysfunctional and arbitrary by nature. It is corporate, economic and political interests that influence what decisions are taken and when. What’s more, the task of controlling all the minor details of user activity is often outsourced to exploited workers who are traumatised by violent messages and images and who sign confidentiality agreements covering the real moderation criteria.

The moderatocracy of Reddit, an alternative model

In 2020, Reddit jumped onto the bandwagon and changed its policy to make it more restrictive, more inclusive and more proactive with regard to controlling its conversations. The changes included banning communities that promoted hate towards people on the grounds of their identity or vulnerability. This announcement was followed by the closure of more than two thousand communities, including r/the_donald, the subreddit that helped Trump win the 2016 election, and r/ChapoTrapHouse, the subreddit of the left-wing podcast.

All in all, over the years, Reddit has arrived at an alternative moderation model that combines the rules of the platform with the community rules of each subreddit. This decentralised model spreads the task of moderation among volunteer users, the Reddit team and different democratic systems of social self-control, such as referendums to update the rules of a certain subreddit or mechanisms for anonymous reporting and complaints.
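Purely as an illustration of that layered model (the function names, data structures and rules below are assumptions, not Reddit’s code): a post must first pass the platform’s site-wide content policy, then the rules of the subreddit it is posted to, and anything that slips through can still be reported anonymously for review by the volunteer moderator team.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the layered moderation model described above;
# names, data structures and rules are assumptions, not Reddit's actual code.

def no_harassment(post):
    # Site-wide policy, rule one: "remember the human".
    return "harassment or incitement to hate" if post.get("harassing") else None

SITEWIDE_RULES = [no_harassment]  # applies to every community on the platform

@dataclass
class Subreddit:
    name: str
    rules: list = field(default_factory=list)    # community rules: post -> violation or None
    reports: list = field(default_factory=list)  # review queue for the volunteer mod team

def submit(post, sub):
    # Platform rules apply everywhere; each subreddit layers its own rules on top.
    for rule in SITEWIDE_RULES + sub.rules:
        violation = rule(post)
        if violation:
            return f"removed ({violation})"
    return "published"

def report(post, sub, reason):
    # Anonymous reports from ordinary users feed the moderators' review queue.
    sub.reports.append((post, reason))
```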

To help them in their work, Reddit stays in direct contact with moderators and offers them psychological support. In addition, a lot of communities have created subreddits dedicated wholly to detecting hateful discourse and toxic subreddits, so they can be quickly reported before they have time to grow.

The Reddit model isn’t perfect. It can generate mini oligarchies of all-powerful moderator users, and the voluntary work of the moderators can be used by the platform to diminish its responsibility for what users say or do. But this combination of corporate responsibility, community moderation carried out by volunteers and structural support is at least a more horizontal, cooperative and human model than that of other platforms.

The first rule of Reddit’s content policy, as on many other sites and services nowadays, is this:

Remember the human. Reddit is a place for creating community and belonging, not for attacking marginalized or vulnerable groups of people. Everyone has a right to use Reddit free of harassment, bullying, and threats of violence. Communities and users that incite violence or that promote hate based on identity or vulnerability will be banned.

A somewhat less seductive statement than the calls for freedom of the cyberutopians of the 90s, but in view of the experience of the last 30 years, the result is closer to a space where it is possible to converse with true freedom, and where swastikas are erased by users, not algorithms or opaque platforms, even if only to turn them into the Windows 95 logo.

All rights of this article reserved by the author
