Digital Gentrification

Content platform algorithms reward lowest-common-denominator material and narrow the space for experimentation.

Group of dancers in clown costume performing in the original Wintergarten ‘Pony Ballet’. Brisbane, Queensland, 1926 | State Library of Queensland | Public domain

Songs written to match the next TikTok trend, series based on the dictates of platform metrics. Algorithms not only mediate our recommendations, they also have an increasingly large impact on the content and format of cultural creations. An article published courtesy of Caja Negra Editora.

In early 2018, musician and writer Damon Krukowski noticed that something strange was happening on Spotify with his old band, Galaxie 500. On the platform, the most popular song by the Boston band (together from 1987 to 1991) was “Strange,” which had never been released as a single, did not appear in any popular films or series, and was not included on any of Spotify’s playlists. It was also markedly different to the rest of the band’s output, even to those songs that seemed to be most popular with Galaxie 500 fans. Another thing that caught Krukowski’s attention was the fact that this was only happening on Spotify, not on the other music streaming platforms.

“Strange” was, indeed, peculiar in terms of the compositional style of the dream pop group. Written almost as a parody of the pop music of the time – the late 1980s – it sounded more like an old radio hit than the band’s own creation, which seemed to explain, at least partially, the song’s popularity on Spotify. When Krukowski wrote about this oddity in his newsletter, it caught the attention of Glenn McDonald, one of Spotify’s top data analysts (remember the name, because we’ll come back to him towards the end).

McDonald did some internal snooping into what was going on with “Strange” and discovered that the track sounded similar to songs from bands that were more mainstream and popular than Galaxie 500. “Strange” had managed to “slip into the algorithm” thanks to the characteristics of the music itself. Another reason was that, in 2017, Spotify introduced a feature called autoplay, which automatically plays a similar-sounding track at the end of each album or playlist.
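This kind of sound-alike matching can be pictured as a nearest-neighbour search over audio feature vectors. Below is a minimal sketch of the idea – the feature names, values and track labels are invented for illustration, not Spotify’s actual data or method:

```python
import math

# Hypothetical audio-feature vectors (energy, tempo, acousticness):
# all names and numbers here are invented for the example.
tracks = {
    "Strange":             [0.82, 0.70, 0.15],
    "mainstream_hit_a":    [0.85, 0.72, 0.12],
    "mainstream_hit_b":    [0.80, 0.68, 0.18],
    "galaxie_500_typical": [0.35, 0.40, 0.75],
}

def cosine(a, b):
    # Cosine similarity: how closely two feature vectors point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def autoplay_next(current, catalogue):
    # Pick the most similar-sounding track that isn't the one just played.
    return max(
        (t for t in catalogue if t != current),
        key=lambda t: cosine(catalogue[current], catalogue[t]),
    )

print(autoplay_next("mainstream_hit_a", tracks))  # → Strange
```

Because “Strange” sits close to the mainstream hits in this feature space, it is the track such a system keeps surfacing after them – regardless of how unrepresentative it is of the band that wrote it.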

To put it simply, anything that sounds generic and popular is much more likely to be recommended on today’s internet. As a result, it’s difficult for less conventional material to stand out. Although this sounds quite logical today, it clashes head-on with the spirit of the 1990s, when the internet seemed to open the doors to a new world, one ruled less by power brokers and more by the curiosity of internet users.

That utopian vision did not last long. The Galaxie 500 anecdote features in Filterworld, the book in which journalist Kyle Chayka analyses how recommendation algorithms are flattening culture. “This is how algorithmic normalization happens. Normal is a word for the unobtrusive and average, whatever won’t provoke negative reactions,” he writes. “Whichever content fits in that zone of averageness sees accelerated promotion and growth, like ‘Strange’ did, while the rest falls by the wayside.”

Photo by Fath | Unsplash

This is not a particularly new phenomenon, but it has clearly expanded and accelerated in recent years and is affecting industries outside of music and film. Leaving our choices in the hands of algorithms (ones fed by our previous behaviours), coupled with the overwhelming amount of content being constantly produced, is creating a consumer culture where what works is what stands out. An “average culture,” one where experimentation and transgression are not allowed if you want to be visible.

A representative example of this state of affairs is the current crop of hits, which bring the chorus in early to avoid being skipped by the anxious listener. Spotify only monetises tracks that notch up at least a thousand plays, and it has just introduced an option that lets the user listen to the “best 20 seconds” of each track – everything is designed for immediate consumption.

Some have called it the “TikTokisation” of culture. And it makes sense – although it had already been around for a few years, TikTok became globally popular in 2020, the year of the lockdowns, when people spent long hours indoors and most socialising and entertainment was via social media. TikTok’s algorithm takes just a few minutes to detect the user’s interests, before offering a list of videos that appeal to them.

As a social network, the Chinese platform changed the rules of the game set by its predecessors. You can follow accounts, but its greatest strength lies in its categories. Users follow interests, not people. This not only differentiated it from the classic versions of Facebook, Twitter and Instagram, but it also forced these platforms to change their own algorithms and imitate TikTok’s formats (reels in the case of Instagram; the For you and Following tabs in the case of the platform now called X).

This new state of play has also changed the behaviour of influencers – the pop figures of platform capitalism. While they once stood out for their area of expertise (fashion, gastronomy, travel, fitness, film and a long etcetera), many of them now “chase” the algorithm to stay relevant, keeping an eye on what does and doesn’t work in order to adapt to a reality that is changing at breakneck speed – in terms not only of content, but also of formats (images, vertical and horizontal video, texts, photo galleries). This threatens any chance of quality, as it is very hard to stand out on all fronts. Which is why, in recent times, content creators have emerged who are experts in a particular subject before suddenly becoming coaches, yoga masters or financial gurus: as their original style begins to weaken, the survival instinct pushes them towards content with greater engagement.

The technology that came to democratise knowledge and offer new ways of accessing information is going through a critical moment, an example of what the French urbanist Paul Virilio called “integral accidents” – the idea that a technology cannot exist without the accidents or side effects that arise after its mass implementation. On the surface, recommendation algorithms are useful because they allow users to save time and get what they want, but their effects are far from neutral. The obvious consequence is the flattening mentioned at the beginning – songs written to match the next TikTok trend, series based on the dictates of on-demand platform metrics, serialised content for fragmented, fast and ultra-processed consumption.

Photo by Kivanc Erdirik | Unsplash

In an investigation titled “Algorithms and taste-making: Exposing the Netflix Recommender System’s operational logics” (2021), Niko Pajkovic experimented with inventing three users based on different stereotypes, namely the die-hard sports fan, the culture snob and the hopeless romantic. In the first few days of his experiment, he began to notice changes in the homepage of each of these users (something to be expected), but he also made discoveries elsewhere – in the thumbnail images of each film or series, for example. The home screen of the sports fan showed “images that included movement and dynamic colours” (even though they were not strictly sports material), while the screen of the culture snob “was dominated by darker hues, black and white artwork images and plenty of actor headshots.”

This can even happen with the same product. In October 2018, some Netflix users accused the platform of using different thumbnails of the film Love Actually depending on the profile. For example, a poster featuring Chiwetel Ejiofor, who doesn’t have a leading role in the film, raised suspicions that black subscribers were being shown posters with black actors. Netflix was quick to dismiss the issue in a statement noting that its algorithm was not guided by “race, gender or ethnicity,” but solely by the user’s viewing history.

So-called “filter bubbles” have been around since at least 2009, when Google started customising its search engine results, but even after several investigations, the power of their influence has never been fully defined. The effects of this echo chamber are clearest in news consumption: when a user becomes interested in information that leans in a certain political direction, the algorithm, drawing on their behaviour (likes, shares, comments), tends to show them similar content.

This system of distribution and validation of web content is susceptible to generating polarisation, as we have been seeing in the sphere of politics for some time now. But just as on some issues it produces extreme opinions, in terms of cultural consumption it seems to go the other way. “While political bubbles silo users into opposing factions by disagreement, cultural recommendations bring them together toward the goal of building larger and larger audiences for the lowest-common-denominator material,” argues Chayka in his book.

Another example he uses to illustrate this problem is what he calls “Instagram coffee shops” – coffee shops that follow a pattern taken from the predominant aesthetic of this social network (pastel-coloured walls and armchairs, generic pictures on the wall and baristas ready to prepare their umpteenth flat white of the day), and which can be found in any city in the world, in a kind of global gentrification influenced by the culture of algorithms.

These algorithms do not have “taste” in the sense that a human might, which is why using the word “recommendation” is problematic. They are mathematical formulas programmed to detect patterns in collective human consumption and then link them to individual consumption. This is what Glenn McDonald explained in a recent interview in the newspaper El Diario. The “data alchemist,” as he is popularly known, was responsible for some of Spotify’s most popular algorithms, as well as creating Every Noise at Once, a massive music map with genres from across the world.
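In its simplest form, that linking of collective consumption to individual consumption is collaborative filtering: recommend what listeners with overlapping histories also played. A toy sketch of the pattern, with listening data invented for illustration:

```python
# Toy user-based collaborative filtering: suggest tracks that people with
# overlapping listening histories also played. Data invented for the example.
listens = {
    "ana":   {"track_a", "track_b", "track_c"},
    "ben":   {"track_a", "track_b", "track_d"},
    "carla": {"track_b", "track_c", "track_d"},
}

def recommend(user, histories):
    mine = histories[user]
    scores = {}
    for other, theirs in histories.items():
        if other == user:
            continue
        overlap = len(mine & theirs)   # how much our tastes coincide
        for track in theirs - mine:    # tracks this user hasn't heard yet
            scores[track] = scores.get(track, 0) + overlap
    return max(scores, key=scores.get) if scores else None

print(recommend("ana", listens))  # → track_d
```

No notion of taste enters anywhere: the formula only counts coincidences in the crowd’s behaviour and projects them onto the individual, which is exactly why popular, generic material keeps accumulating recommendations.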

McDonald left the company last December, following staff cuts. “In public libraries there’s public ownership. Not at Spotify – there are just economic interests. But my work has never been driven primarily by business imperatives,” he said at one point in the interview, before adding that “we should all be afraid. Anyone who works with technology related to human activity should keep that in mind in everything they do. And listen to their conscience. I always felt relieved to work for a music streaming platform rather than a health insurance service.”

All rights of this article reserved by the author
