Orwell in times of facial recognition

George Orwell’s dystopian vision, 1984, describes situations that resemble our own times and raises ethical dilemmas that are still difficult to tackle today.

[Illustration: Martín López, 2019. CC BY-NC]

Seven decades have now passed since the publication of 1984, the most quoted work of political fiction in contemporary culture. Advances in big data, surveillance and artificial intelligence make reading it today a disquieting exercise in relevance. To mark Orwell Day, held at the CCCB since 2013, we re-read the novel from a new perspective and find similarities with the most immediate present, as well as ethical dilemmas that are still difficult to tackle today.

The 70th anniversary of the publication of 1984, the dystopian novel par excellence, fell on 8 June 2019. Written by Eric Arthur Blair under his pen name, George Orwell, it describes a future society in which a totalitarian regime controls all aspects of life through continuous surveillance. It is a nightmare scenario that made a strong impact at the height of the Cold War, but that has endured because of its inspired capacity to foresee phenomena such as post-truth and video-surveillance and, more subtly, machine learning and artificial intelligence.

The tracking of emotions

Behind Winston’s back the voice from the telescreen was still babbling away about pig-iron and the overfulfilment of the Ninth Three-Year Plan. The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it, moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live — did live, from habit that became instinct — in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.

George Orwell wrote 1984 on the Isle of Jura, Scotland, between 1947 and 1948. In the mid-20th century, television was still a minority medium, and video-surveillance cameras were not commercialised in the United States until 1949, barely a year before the writer’s death. The novel nevertheless anticipates a mass proliferation of screens that is still expanding today, from multiplex cinemas to smartwatches. The telescreen of 1984, vaguely described as “an oblong metal plaque, a kind of dulled mirror”, is a two-way instrument that emits messages while at the same time capturing the most subtle sounds and facial expressions. These devices are ubiquitous and implacable, registering everything from an “unconscious look of anxiety” to a nervous tic or even a rumbling stomach.

Although wariness towards video-surveillance is nothing new, it is striking to see how recent developments in artificial intelligence bring us closer to scenarios like those described in 1984. The most obvious are facial recognition systems: computer programs capable of identifying a person in images or videos and matching their face against pre-existing databases such as police records, Internet browsing habits or publications on social media. This technology is legal, cheap and accessible, as shown by a report in The New York Times in which one of its journalists built a fully functioning system of this type for around sixty dollars.
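To make the idea concrete, the sketch below shows, in very reduced form, how such a system matches a face in an image against a known record. It uses the open-source face_recognition Python library rather than the specific setup described in The New York Times report; the file names and tolerance value are illustrative placeholders, not details from the article.

    # Minimal, illustrative face-matching sketch using the open-source
    # "face_recognition" library. File names and tolerance are placeholders.
    import face_recognition

    # Load a reference photo; here the "database" is a single known face.
    known_image = face_recognition.load_image_file("known_person.jpg")
    known_encoding = face_recognition.face_encodings(known_image)[0]

    # Load an image captured elsewhere, e.g. a frame from a video feed.
    unknown_image = face_recognition.load_image_file("camera_frame.jpg")

    # Compare every face found in the frame against the known encoding.
    for encoding in face_recognition.face_encodings(unknown_image):
        # compare_faces returns True if the two faces fall within the tolerance.
        match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]
        print("Match found" if match else "No match")

In a real deployment the single reference photo would be replaced by a database of thousands of encodings, which is precisely what makes the technology so cheap to scale.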

Fear of the potential of these tools has already led the city of San Francisco to ban their use by the police and other public institutions. “We can have good policing without being a police state”, argued Aaron Peskin, the politician who championed the measure, “and part of that is building trust with the community based on good community information, not on Big Brother technology”. This debate is still a long way off in other places around the world, such as China, where the government of Xi Jinping is laying the groundwork for quasi-universal technological control.

[Illustration: Martín López, 2019. CC BY-NC]

From the Great Helmsman to Big Brother

Although facial recognition and other technologies are starting to be called into question in the United States, their use in China is following a disturbing pattern. Advances in big data and artificial intelligence, combined with the extensive network of video-surveillance cameras already present in the country, provide fertile ground for testing and implementing advanced experiments in social control. The most emblematic is the so-called Social Credit System. Although still in its testing phase, this system aspires to combine the personal information and behaviour of citizens and companies to assign each of them a trustworthiness score, which can then rise or fall. Giving money to an NGO scores points, as does caring for elderly people or leaving a hotel room tidy. In contrast, not paying a fine subtracts points, as does behaving badly on public transport or smoking in a hospital.

A good social credit score can mean reductions in utility bills or access to certain services without having to leave a deposit, for example. A low score may be penalised with limited access to bank loans or to good schools for one’s family, among many other possibilities. In the worst cases, and for behaviour that the government considers especially harmful, people may be placed on a blacklist with harsher restrictions, such as being unable to buy plane tickets or purchase property. How people are added to and removed from these blacklists remains unclear, especially given that the Chinese judicial system is not independent by international standards. As a result, some people find themselves trapped in a legal limbo, with the sensation, to paraphrase Winston Smith in 1984, that “nothing is illegal since there are no longer any laws”.
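Public reporting describes the logic in terms of points gained or lost for specific behaviours, with low scores triggering restrictions. The toy sketch below illustrates only that reward-and-penalty logic; the behaviours, point values and blacklist threshold are invented for illustration and do not reflect any real or uniform scoring rules.

    # Purely illustrative toy model of a points-based "trustworthiness" score.
    # Behaviours, point values and the threshold are invented examples, not the
    # actual rules of China's Social Credit System (which are not fully public).
    POINT_DELTAS = {
        "donate_to_ngo": +5,
        "care_for_elderly": +5,
        "tidy_hotel_room": +1,
        "unpaid_fine": -10,
        "misbehave_on_transport": -5,
        "smoke_in_hospital": -5,
    }

    BLACKLIST_THRESHOLD = 600  # hypothetical cut-off

    def update_score(score, behaviours):
        """Apply a list of recorded behaviours to a starting score."""
        for b in behaviours:
            score += POINT_DELTAS.get(b, 0)
        return score

    def status(score):
        """Map a score to a (hypothetical) level of access to services."""
        if score < BLACKLIST_THRESHOLD:
            return "blacklisted: travel and credit restrictions"
        return "normal access to services"

    score = update_score(700, ["donate_to_ngo", "unpaid_fine", "smoke_in_hospital"])
    print(score, "->", status(score))

What such a sketch cannot capture is precisely what worries critics: who decides the rules, how the underlying data is gathered, and how anyone contests a score.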

Those who defend the Social Credit System argue that in reality it rewards people and companies who contribute to the social good. However, the mere existence of a mechanism on this scale implies mass data collection that necessarily invades every aspect of people’s personal lives: a compilation of information that combines existing government records, tracking by Internet service providers and increasingly sophisticated surveillance systems trained to recognise faces, voices and even the way people walk.

The perfect prison

Although Orwell’s novel does not describe a reputation-based system like China’s, it does envision a future in which social correctness is rewarded as well as infringements punished. Thus, it is good to celebrate the victories of Big Brother, to report traitors, to show public hatred towards the enemy or to join the Junior Anti-Sex League. If everyone is certain “that every sound you made was overheard, and, except in darkness, every movement scrutinized”, the only way to survive is to modify one’s behaviour, both in public and in private.

Prominent among the authors who have studied the social effects of surveillance is Michel Foucault, with his re-examination in the 1970s of the “panopticon”. The panopticon, let us remember, is a prison architecture designed by Jeremy Bentham that allows guards to observe the inmates at all times. The inmates, in turn, live aware that they are being watched, but never know exactly when. For Foucault, this construction is the perfect disciplinary system, and its philosophy has spread to society as a whole through corrective institutions such as schools and hospitals. The panoptic society isolates people in order to analyse them individually and constantly; it is a mechanism offering the ability “to see constantly and to recognise immediately”. Its greatest effect is to induce in each person “a state of conscious and permanent visibility that assures the automatic functioning of power”. As in the case of prisons, where “it is at once too much and too little that the prisoner should be constantly observed by an inspector: too little, for what matters is that he knows himself to be observed; too much, because he has no need in fact of being so”.

Seen from the perspective of 1984, a tale whose narrative tension rests on a suffocating sensation of control, little can be added to Foucault’s words. Unfortunately, it is not hard to find parallels in a world already aware of Edward Snowden’s revelations. Let us remember that in 2013 this former analyst for the United States National Security Agency confirmed suspicions that his government and those of other countries had been invasively collecting data on millions of people, including the communications of heads of state.

[Illustration: Martín López, 2019. CC BY-NC]

Suspicions for the future

Although Orwell was a renowned author during his lifetime, he did not live long enough to see the success of his final novel. He died of tuberculosis in London in 1950, barely a few months after its publication. This also prevented him from seeing how some of his predictions were drawing ever closer to reality. As well as living surrounded by telescreens, people in the novel work with the “speakwrite”, a device to which they dictate texts, similar to today’s voice recognition systems. The author even intuited automated creativity, illustrated by gadgets such as the “versificator”, which combines rhyming words to create songs without any human intervention.

Curiously, both the inventions and the scenarios Orwell imagined relate to the perverse use of machines or to their potential to destroy humanity: a fear as old as the myth of Prometheus, but one that connects with the dilemmas generated by today’s technological acceleration. At what point does an innovation bring about political and cultural change? Do we take technology’s social impact sufficiently into account? What are the implications of a society based entirely on interpersonal communication?

Isaac Asimov said that 1984 was a novel more talked about than read, something that can only happen when a story becomes an icon of popular culture. It is a status that, in Orwell’s case, seems to have no expiry date: he sketched out a fine metaphor for the totalitarianisms of the 20th century that is also useful for analysing the most controversial aspects of the digital revolution.
