The internet has represented a change of scale on all levels, one that has affected the way we access, consume and relate to information. In a few years, we have moved from a process in which our searches were motivated by curiosity to an automated system that suggests content it predicts we will like. Meanwhile, the boundaries between medium, channel and source have all become blurred, and we all believe ourselves to be transmitters and receivers of content. This new paradigm of supposed socialisation of information is occurring in a hyper-centralised scenario where the algorithms of companies such as Google, Facebook and Twitter channel and influence a high percentage of the information that we consume. How does all this affect us? What are the potential risks and benefits?
The internet has represented a revolution and a change of scale in the creation and distribution of and access to information, modifying the cognitive ecosystem on a worldwide level.
The number and diversity of people able to receive and emit information has grown exponentially. This can be considered a democratising and positive development: the average cultural level of humanity is much higher than it was a hundred years ago. Since knowledge is not a good in scarce supply, the more dissemination, the better for everyone. When the bases of any society improve, the entire system improves.
However, how does this massification affect the information hierarchy? Now that many more people are talking, how can we discern who has the authority to talk about a subject? How can we define which sources are reliable? Who determines which voices become canonical, in other words, how are new reference models defined?
In the current context, the canon as traditionally set by an institutional elite has died to make way for an ecosystem of canons, in which as many hierarchies converge as there are identities and motivations within each individual or each society. The institutions that to date decided who ought to be the reference points in the different fields of knowledge have lost power, which is now diversified among new voices.
In recent years, society has become progressively more aware of how it consumes. As citizens we want to know where the products we acquire have come from, and we expect the rights of the people involved in their production processes to be respected. This has led to the birth of initiatives such as fair trade, organic food, responsible investment funds and medicines not tested on animals.
Ethics has demanded traceability in production systems. Today, any product purchased from the supermarket carries a batch number that enables us, if an incident arises, to trace its origin. In this case, technology has served to improve the quality of production and distribution systems, and also of service to consumers.
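The traceability idea above can be sketched in a few lines of code: a batch number acts as a key into a registry of origin records. All the names, records and checks below are hypothetical illustrations, not a real supply-chain system.

```python
# A minimal sketch of product traceability via batch numbers.
# Every producer, place and check here is an invented example.

batch_registry = {
    "LOT-2024-0142": {
        "producer": "Example Farm Co-op",
        "origin": "Girona, Catalonia",
        "packed": "2024-03-18",
        "checks_passed": ["quality-control", "fair-trade-audit"],
    },
}

def trace(batch_number: str) -> dict:
    """Return the origin record for a batch, or raise if unknown."""
    record = batch_registry.get(batch_number)
    if record is None:
        raise KeyError(f"Unknown batch: {batch_number}")
    return record

print(trace("LOT-2024-0142")["origin"])  # Girona, Catalonia
```

The essay's question is whether news could carry the same kind of key: a way to look up, for any claim, the source it was extracted from and the filters it passed through.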
However, can the same be said in the news consumption field? At the height of the hyperlink era, how can it be that we spend the day talking about false news, about internet fakes? Should it not be possible to “trace” pieces of information and be able to validate the sources from which data have been extracted, in such a way that consumers have peace of mind that what they are reading has passed through various filters and quality controls and, if an incident arises, they can contact the original producer? Was this not journalism’s role?
What holds the most power today? The information producer, the source, or the distribution channel?
The internet was born as an open project, one that enabled decentralised and horizontal communication between any two nodes on the network. Today, major corporations such as Google and Facebook endeavour to concentrate information and users to the maximum, then retain them inside their environments, wanting to convert the network into a series of bunkers that are increasingly isolated from each other.
These companies want to be channel and source at the same time. When we seek weather information on Google, we do not notice which agency made the prediction (Meteocat or Aemet?). The data are presented as if the source were Google itself, while the real source appears increasingly more hidden. The same thing happens if we search for information on the stock market, the state of the traffic… or when we read the news.
These platforms demand our continual attention and do everything to ensure that we consume ever more information without having to leave them. Some projects, such as Instagram, go even further and no longer allow the use of URL links.
The media have also joined in with this centralising effort. To avoid readers abandoning their websites, the main newspapers no longer include hyperlinks to external sources in their news pieces.
It is necessary to be aware of this situation and to fight the growing monopoly so that the internet continues to be multi-channel and multi-source, guaranteeing, promoting and defending diversity on the web. And it is necessary to give some consideration to what the role of the fourth estate is and should be within this context.
Emotion and cognitive fact
The media are not only tending to concentrate information, but have also begun a bloody battle for clicks, because they see how their income increasingly depends on Google advertising. This means that journalists increasingly endeavour to produce click-seeking headlines, which appeal to emotion instead of reason and make the click irresistible.
Emotion per se is not a negative bias for cognition, since curiosity has always been a source of knowledge. The risk appears when, to win clicks, many traditional media forget their ethics and their style guides and come dangerously close to the working methods of the sensationalist press. Popularity, number of visits, likes, retweets and the like have caused a progressive crisis in argumentation, in favour of an increase in emotional content, which is increasingly polarised. Headlines have lost neutrality in favour of scandal. Quality has fallen in favour of repetition.
Algorithm and active searching
Added to this new scenario is the fact that we have shifted from a "search" environment to a "feed" environment, or content channel. Now we no longer consult the newspapers; rather, the news reaches us via our timelines.
It is important to be aware that these channels, whether news-based or cultural, are not neutral. There is an algorithm behind them that filters, orders and presents to us those pieces of news or knowledge that we are most likely to enjoy, according to our behaviour history.
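The filtering and ordering described above can be sketched in a few lines. This is a toy illustration, not any platform's actual algorithm: the topics, items and scoring rule are all invented, and real systems use far richer signals.

```python
# A toy sketch of a feed algorithm ranking items by similarity
# to a user's behaviour history. All data here is invented.

from collections import Counter

# What the user has clicked before becomes their "pattern".
history = ["football", "football", "politics", "football", "cooking"]
profile = Counter(history)

feed_items = [
    {"title": "Match report", "topics": ["football"]},
    {"title": "Budget debate", "topics": ["politics"]},
    {"title": "Opera premiere", "topics": ["culture"]},  # never clicked
]

def score(item: dict) -> int:
    # Items matching past behaviour score higher; novel topics score 0.
    return sum(profile[topic] for topic in item["topics"])

ranked = sorted(feed_items, key=score, reverse=True)
print([item["title"] for item in ranked])
# → ['Match report', 'Budget debate', 'Opera premiere']
```

Note the feedback loop implicit in this design: the "culture" item scores zero because it has never been clicked, so it sinks to the bottom, so it will keep not being clicked. That is the "news bubble" in miniature.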
An algorithm is nothing more than code designed by an organisation. Who designs it, how, and with what aims, commercial or political, is a question that should occupy a prominent position in the social debate. Three or four companies worldwide are deciding, in an opaque way, what material we consume in the news and cultural spheres. Progressively they are leading us to stop searching, while trapping us in a "news bubble" made to measure for us, or for all those people who fit our same pattern. We saw this with companies such as Cambridge Analytica and the 2016 elections in the USA.
Algorithms tend towards the convergence of patterns and try, by default, to simplify our complexity. They interpret us and assign us to a given pattern, a fact that is a direct attack on our individuality. They reinforce stimuli that work, making it difficult for our tastes or interests to evolve.
Faced with such a powerful tool, we have the responsibility of keeping our curiosity active, of getting out of our pattern, of going out to discover new things so as not to end up "framed" within a given social profile, however small and segmented it may be.
In the same way that, years ago, we made an effort to find information on subjects that interested us (music, books, etc.), now it is necessary to make an effort to escape the information that finds us. Only this way can we break with and expand our limits and tastes. We have more patterns than ever; we can travel more paths than ever before, as long as we keep our curiosity alive.
Towards a global news ethic
We humans have always needed filters to access information: teachers, books, manuals, the media… the disseminator is a basic, necessary tool for accessing knowledge.
This task is today increasingly assumed by machines, with the potential benefits and risks that this involves. It is necessary to keep this in mind and act accordingly. For this, these technologies must be open by default and developed as free software, so that economic or cognitive biases in their design can be detected and avoided.
In an ideal world, a good algorithm could become a “good disseminator”. The good disseminator translates from the top down, adapting the discourse to the level of the recipient and respecting the original source. An algorithm could do this task, but the technology is not neutral. For this reason, it is necessary for us to incorporate ethics into the technical decision-making process.
Now we are more aware than ever that, as members of a society, we are each a node in a network, where we play a role. The action is collective, but we, as individuals, are responsible for it. It is necessary to continue defending an open, free and decentralised internet. It is necessary to fight to ensure that the change of scale the internet represents moves in the right direction. We cannot accept as the only reality the biased products of large corporations. It is necessary to encourage the individual responsibility of choosing and of discovery, and the responsibility and collective power that we have as a community of users. It is necessary to differentiate between source and channel, and to fight to ensure that there is not just one single channel.
It is necessary to continue to strive to know the canon, but with the freedom to avoid it. Divergence shows new possibilities until it establishes new paradigms. Let's defend them.