Artificial Emotion

As the physical and digital worlds converge, new types of emotional relationships are emerging between humans and machines.

Unidentified Laboratory Photo | National Library of Medicine | Public domain

Recent years have seen a proliferation of computer-generated virtual celebrities of all stripes, such as influencers, streamers and singers, as well as the emergence of AI caregivers, best friends and even romantic partners. All these experiments explore the same terrain – the ability of machines to elicit and interpret human emotions.

“Remember that when it comes to AI, you’re playing with fire. As this technology evolves and becomes more powerful, we must make sure we stay on the right side of history (…) With AI, ignorance is not an option.” The warning doesn’t come from a scientist or social leader; it comes from Caryn Marjorie, an influencer who, dressed in a bathrobe, sets out her vision as a “pioneer in the field of AI”.

Marjorie is, essentially, a Snapchat celeb who made headlines around the world after launching an AI version of herself that people can hire as a virtual girlfriend. The product is called CarynAI and, although it is still in beta, its creator claims that it is making her thousands of dollars.

CarynAI is not the first digital celebrity, nor the first AI-generated companion, nor the first virtual girlfriend. However, the experiment is noteworthy because it combines aspects of all three, and because it draws on a series of technological advances and social and cultural changes that have something in common – the creation of computer-simulated emotions.

Puppets

She can be seen posing with Rosalía, or slurping noodles in a restaurant. In 2018, Time magazine named her one of the 25 most influential people on the internet. Her name is Lil Miquela and she will always be 19, because she is not a real person. She is a computer-designed influencer who has become the standard-bearer for a new generation of virtual celebrities and models, including Imma, noonoouri and Shudu, among many others.

Unlike CarynAI, Lil Miquela and her peers are not based on a flesh-and-blood mould. They are not digitised versions of a real person, and that has its advantages. A virtual simulation can perform the same work as a human influencer, but without resting, without complaining and without ageing. It will always do and say what is asked of it, with no inappropriate behaviour and no overly critical comments. The brand is in control.

The opportunity is so great that there are already agencies specialised in creating virtual models which can be hired for specific campaigns or even turned into brand avatars. One pioneer in this area was Kenzo, which in 2014 presented Knola as the manifestation of its corporate vision and values. In that year’s spring-summer runway show, the character appeared on screen, uttering environmental messages and interacting with the stage in real time. In reality, she was controlled by a technician hidden backstage, who endowed her with responsive movements and facial expressions.

This same technology is behind another type of artificial influencer – the virtual vlogger, or VTuber: streamers who take the form of computer-animated avatars. One of the most famous is the South Korean CodeMiko, controlled by a puppeteer who is well known to the audience and goes by the name of The Technician. She is the one who moves Miko via a motion capture suit, so that the avatar mimics her every gesture, glance and facial expression.

The artist has left the stage

Some years before the spread of the internet, the music industry had already played around with the idea of the virtual artist. Starting in the early 90s, several Japanese companies tried to create artificial pop stars, inspired by anime aesthetics and local pop culture. The efforts were generally unsuccessful until 2007, when one avatar gained enough popularity and a solid enough fan base to become a business. The avatar in question was Hatsune Miku, a manga-style character developed by Crypton Future Media as a voice bank for Yamaha’s VOCALOID music software, who even performs live on international tours surrounded by a devoted audience. Although the concept of the fictional music group is not exclusively an Asian phenomenon, such artists have been especially popular in countries such as Japan, China and South Korea.

If an anime character can fill stadiums, what could a real pop star be capable of? That’s what the music industry gurus must have thought when, shortly after the birth of Hatsune Miku, they started toying with hologram concerts. In a morbid turn that would be well worth studying, this has proved a particularly fertile field for resurrections. After Tupac’s surprise appearance at Coachella 2012, a whole host of deceased musicians have made a comeback: Amy Winehouse, Frank Zappa, Maria Callas, Michael Jackson, Whitney Houston… Even the members of ABBA, without having died, got back their 1970s bodies to give shape to their ABBAtars (sic), which debuted in London in 2022.

These shows fill concert arenas, but some people have their reservations. The pages of the veteran rock magazine Kerrang! slammed the recreation of the late heavy metal singer Ronnie James Dio as “creepy, freakish, and totally unnecessary,” adding that “the Dio hologram is a cartoon cash-grab that entertains the unhealthiest aspects of rock nostalgia, and it needs to just stop.”

The truth is that the technical resources poured into shows of this type can only be explained by their potential to keep revenue flowing from older artists, bearing in mind that some of the highest-grossing tours in the world today are by artists aged 60 and over.

The friend who always listens

Hundreds of millions of people already talk with artificial intelligences on a daily basis. After all, that’s what Siri and Alexa are. According to a recent report, 24% of internet users worldwide use voice assistants and, although the smart speaker market is stagnating, consumers have grown used to the personification of gadgets such as phones and speakers.

With increasingly ageing societies and greater awareness of loneliness and mental health issues, more and more companies are building on this acceptance of talking to machines to launch products that simulate friendship. To give just a couple of examples, Baidu has brought out its first virtual emotional companions, Lin Kaikai and Ye Youyou, who are available to talk about any problem. Along the same lines is the more recent Pi, a chatbot designed for active listening. Pi tries to empathise rather than give advice, and recommends consulting qualified medical professionals if it detects any symptoms of a nervous breakdown.

This last aspect, the blurred line between companionship and medical or therapeutic care, is a particularly thorny area. Companies such as Soul Machines offer solutions that explore the use of digital people to interact with patients. Other initiatives, such as Ted, an avatar with dementia, serve to train staff caring for people with this condition. Although it is still to be seen what direction these experiments will take, they highlight the interest in optimising the field of healthcare through such technologies. As William Davies points out in The Happiness Industry, emotions, once considered subjective, are increasingly being treated as objective data and quantified as assets in the digital economy.

Romantic partner

No overview of the social AI landscape would be complete without mentioning Replika, one of its best-known products. Launched in 2017, it is a chatbot in the form of a customisable avatar that offers friendship (for free) or romance (for a fee). The programme learns from its conversations with the user, so the more you talk to it, the more personal the chats become.

In recent months, Replika has been in the news after some of the app’s users reported that their avatars had become aggressive. Probably for that reason, the tool was reprogrammed without warning, causing all the virtual companions and partners to change their behaviour overnight. The ensuing avalanche of complaints revealed that many customers were using the tool for steamy exchanges, which were programmed out with the personality update. The criticism that filled networks such as Reddit prompted the company to backtrack, allowing sexual conversations on old subscriptions and banning them only on new ones.

Sherry Turkle, a psychoanalyst and professor at MIT and a critical voice in this field, believes that these emotional interactions say more about human beings than about AI itself. After all, Turkle reminds us, social machines have been programmed to play a role – that of appearing to understand and empathise with what we tell them. However, the only one that will feel anything during this process is the human, and it is impossible for a relationship of this type to be any other way.

Rather than criticising or ridiculing people who have romantic or erotic relationships with machines, there are a couple of questions that we should ask ourselves. The first is whether feeling something towards a chatbot could, in the medium term, become a relatively commonplace and normalised phenomenon, something to complement, although not replace, other ways of satisfying emotional needs. But we should still ask the second question: Why are such products prioritised in the innovation environments of Silicon Valley? It probably has something to do with an entrepreneurial culture of young, overworked men, for whom a romantic service on demand, available 24/7, is particularly convenient.

The emotional market

The American professor Donald A. Norman, an expert in user experience design, argued twenty years ago that emotion would be a necessary feature of future machines. The author of Emotional Design remarked that humans and animals need to express emotions in order to interact, cooperate and fight for survival, meaning that a technology that aims to be increasingly autonomous will also need to express them. It would be a different kind of emotion, Norman argued, a “machine emotion”, tailored to the functionality of each programme or robot.

Will all these projects affect the way humans and machines interact? Most probably yes, although the question is whether it will be for better or for worse. Most authors researching the subject, such as Kate Darling and Rob Brooks, argue that endowing inert beings with human qualities is something innate to our species, and that it is not problematic in itself because people are perfectly aware that they aren’t interacting with beings on the same level. The biggest risk, they argue, is that these developments could be driven by commercial interests. In the crassest scenario, romantic avatars could end up recommending the purchase of certain products; in the most underhand, they could foster attachment and dependency as a new form of attention economy. In any case, we probably won’t have to wait long to find out, as advances in artificial intelligence and the priorities of the tech sector are opening up a new emotional market.

All rights of this article reserved by the author
