Technogrief: Love, Loss and Labour in an Age of Digital Domesticity

The possibility of establishing emotional relationships with home technology raises new dilemmas and questions.

Actress Ruth Roman and her son with a toy robot. New York, 1956 | Al Ravenna, Library of Congress | No known copyright restriction

Although it may seem dystopian, home devices designed for emotional work are a useful and important part of life for more and more people. This phenomenon raises questions about the limits of companionate care and highlights the dangers of allowing the tech industry to enter our most intimate spheres.

I click a link and see a weeping American child resting her head upon her father’s chest. “Enjoy the few days that you have left with her, OK?” he says, stroking her fair hair, his own eyes pricked with tears. “I don’t want her to leave,” says the girl, burying her face deeper into her father’s shoulder. A dramatic instrumental soundtrack swells around them. “I know baby. I know you like her.” He kisses the top of the girl’s head, rubs her back. “She’s your friend. But it’s gonna be OK, alright? Enjoy the time you have now, OK? I’m sorry baby.” It appears that there is soon to be a death in the family, and emotions are running high.

I feel sorrowful and uncomfortable consuming this little girl’s grief. It is not the kind of content that typically crosses my feed, and I do not follow the creator; rather, I have been directed here by a social media contact. But why? As the intimate father-daughter moment unfolds, I start to realise that things are not as they seem. “There’s honestly nothing that we can do about it,” explains Dad, gently, “because the company ran out of money. And you know, who knows, maybe somebody will be able to buy the company and make it work again. But right now I just want you to play with her as much as you can, OK?”

It is not a cherished grandmother who is passing away, then, or even a beloved pet. Rather, the child is in pre-emptive mourning for her companion robot, and this is why I have been forwarded scenes of her grief. The video cuts to the girl sitting cross-legged on the living room floor, surrounded by signifiers of everyday life – discarded shoes, toys and craft projects, a lit Christmas tree and an oblivious, tail-wagging dog. And there, in the middle of it all, is Moxie, a “robot with a large face that maintained eye contact with children, had seven points of articulation, and was made from a soft-touch material for hugging.”[1]

While the phrase “domestic technology” is still more commonly associated with white goods such as washing machines, dishwashers and refrigerators, these kinds of robots are nevertheless a real growth area for devices in the home. Indeed, the global personal robotics market reached $21.5 billion in 2019, with one market intelligence firm projecting that it will reach $51.5 billion by 2030. Personal robots have a range of uses – security and physical assistance, for example. But the category predicted to witness the fastest growth over the next ten years is companion robots – robots intended to keep the user company, or to evoke a sense of companionship.[2] Designed to generate a relational connection, these devices can be thought of as doubly domestic; they are situated not just within the physical space of the home, but within the affective space of intimacy, togetherness, and (dare one say it?) love.

At a basic level, Moxie operates as a conversational chatbot using cloud processing for a large language model (LLM). It is designed to be maximally responsive in ways that map onto human communication. The “robot and its AI algorithms not only hear and process what you have to say but can detect your facial expressions and tone of voice. This will result in responses from Moxie to your comments and questions, along with accompanying arm, eyebrow and mouth movements.”[3] Unlike some other social robots, this device does not showcase any additional bonus features; it does not offer to read your voicemail messages, or order your dinner, or control your smart home. Rather, it is marketed as a much more focused tool – one precision engineered to foster a deep, companionate bond and to support the emotional development of five- to ten-year-olds.

It is counterintuitive, perhaps, to countenance the idea that domestic technologies can successfully undertake emotional labour – that they can not only do love, but do it well.[4] Indeed, the notion that companionate care might be roboticized or outsourced to technology is often viewed as a dystopian proposition – a nightmare vision of inauthentic feeling and lack of real connection. As the writer and activist Emily Kenway puts it, “Do you want [your carer] to recognise your expression as sadness because it matches what their internal database says sadness looks like, but have no inner recognition of what that might actually feel like? Do you want to be serviced, or cared for? They’re not the same thing.”

An early critique of Fully Automated Luxury Communism similarly insists that care robots and other forms of automation cannot meet the “requirement for human interaction” which will form “a key part of the provision of care in any future communist society.” Robots can’t love you back, these comments suggest, and as such the emotional labour they provide will always be a poor substitute for the care that humans can deliver. But recognising that these robot relationships are parasocial – that they are not (not yet?) reciprocal or premised on mutual recognition – is not to discount them as ineffectual. In fact, if TikTok videos of technogrief can teach us anything at all, it is that these devices are problematic precisely because they can work so well. That is to say, it is not the failures of this tech but rather its pleasures, joys and successes that make it so troubling under current social conditions.

Moxie’s death was occasioned by the fact that “an expected funding round fell through at the last minute,”[5] causing its maker to go under. Commentators have attributed the company’s financially perilous situation to, amongst other things, the “cost of the cloud-based LLM that the company uses to run the robot.”[6] The alternative – to have the processing done on the device itself – can be costly and difficult:

For a start, you need some serious hardware to run LLMs locally, because it’s extremely processor intensive. You also need to store a lot of local data, the Large Language Models (LLMs) and other information required for the AI to function, which could be considerable in some cases […] Storing that data, collecting more over time, and processing it, takes a lot of resources to pull off. Certainly too much of the $799 sale price of the Moxie to properly pull off.[7]

These seem like justifiable reasons to rely on the cloud, until one considers how vulnerable it leaves not only the device but also its users. After all, “Moxie cannot run with local processing, and it cannot be used offline at all.”[8] It therefore risks being reduced to a very expensive plastic ornament in the face of bad luck and financial uncertainty (with little children left weeping over their soon-to-be-departed robo-friends as a result).

The most obvious reaction to this situation is to highlight the host of issues that accompanies the rise of companion robots, and relationships between humans and AI agents more generally. As parts of the emotional labour currently associated with domestic attachment are taken up by AI-powered friends, lovers, caregivers and confidantes, we will increasingly face the risks and consequences of inanimate intimacies. Indeed, as Annabel Blake remarks, the whole Moxie situation “raises questions that would have seemed absurd to parents/guardians a decade ago: How do we mourn machines? What rights should children have to memories they shared with their AI ‘friends’? And perhaps most crucially, how do we prepare young people for a world where relationships increasingly blur the line between the ‘artificial’ and the organic?”

More than this, however, Moxie’s demise reminds us that the tech industry (and the capitalist system of which it is increasingly assumed to be paradigmatic) is not an appropriate mechanism via which to strive for collective human flourishing. Just as with an increasing portion of analogue, human-to-human care, its motivation lies in profit not people. Tamara Kneese notes that Silicon Valley is “open to failure, often encouraging it as a badge of honour in a system that privileges hubris and risk-taking more than dependability. Fail fast, fail often; move fast and break things: these are the mantras of the tech world.” This provides a particularly troubling context in which to situate the emergence of technologies aimed at emotional labour and companionate bonding.

As the online discourse of machine mourning makes clear, these devices can and do mean things to people, regardless of the fact that the feeling clearly isn’t mutual. I agree with Seth Lazar’s assertion that as “AI systems become ever better at simulating everything that we care about, a fully worked-out theory of the value of the real, the authentic, will become morally and practically essential.” While he ultimately comes down on the side of authenticity, championing the moral worth of the “real thing” over its simulation, I feel compelled to defend inanimate intimacies, in all their artificiality. That parasocial care is putatively less authentic does not diminish its potential utility. If making the user feel a certain way is the whole point of a technology, then the user’s subjective experience represents the sole metric of its success; if the user feels comforted, calmed, encouraged or supported, the fact that the device feels nothing in return is perhaps of little consequence.

After all, these devices are tools not friends; affect machines harnessed for their ability to induce feelings and states of mind, rather than conscious beings to recognise and be recognised by. If people are willing and able to form personally consequential attachments to companion robots, and these attachments are not a fallback or a sticking plaster in the face of inadequate provision of interpersonal care, then a lack of reciprocity seems to me to be a non-issue. This requires a wider culture of emotional and material abundance, albeit abundance of a kind that does not delegate the bulk of techno-affective experimentation to profit-making enterprises. For as long as it can be manipulated, transformed, or snatched away in deference to the bottom line, anxieties around technologized care will continue to be sensible and well-founded.

As of right now, it seems that Moxie might have a second chance, thanks to an “eleventh-hour open-source attempt to keep the robot running.”[9] Its maker recently announced that it’s planning “to release a local server application, ‘OpenMoxie,’ which will run on a computer on the home network.”[10] Following an “over-the-air update for Moxie itself, owners can then run the server app, which Moxie will connect to for processing.”[11] There are no guarantees this will work, as the company is at pains to point out, and its success will depend on a massive collective effort. The Moxie Community Reddit page is currently awash with pleas for assistance and posts about time-consuming technical woes (signs of a dramatic proliferation of high-tech housework). But all is not lost; people are stepping up to help each other and to keep Moxie going for those who love it. Given the amount of thought and energy going into salvaging this niche bit of kit, it would seem that some companions really are worth the effort.

This article is part of a series curated by Marta Echaves on the future of work.


[1] Wuerthele, M. (2024). “The death of a robot designed for autistic children proves Apple’s on-device AI is the right path.” Apple Insider.

[2] Prescient and Strategic Intelligence (2020). “Personal Robots Market Research Report: By Offering (Hardware, Software), Type (Cleaning Robots, Entertainment & Toy Robots, Educational Robots, Handicap Assistance Robots, Companion Robots, Personal Transportation Robots, Security Robots) – Global Industry Analysis and Growth Forecast to 2030”.

[3] Lee, B. Y. (2024). “Moxie: How This AI Robot Is Designed To Teach Kids Emotional Intelligence”, Forbes.

[4] There is a lot to be said about emotional labour in this context – what it is and isn’t, and who can or cannot perform it. My thoughts on this can be found elsewhere. See Hester, H. (2024). “The Automated Heart: Digital Domesticity and Emotional Labour Saving”, New Vistas 10.2.

[5] Maxwell, T. (2024). “Moxie’s $799 Robot Companion for Children Is Going to Die”, Gizmodo.

[6] Wuerthele (2024).

[7] Wuerthele (2024). The substantial price tag attached to Moxie reminds us that, as things stand, the issue of technogrief is primarily a problem for the privileged. One has to be able to afford a companion robot in the first place in order to later mourn its loss. That being said, Moxie was also available for monthly rental at one point, and could even be checked out free of charge from some forward-thinking US public libraries. Efforts to equalise access to this tech and its benefits will also result in more people being exposed to their associated risks, meaning that technogrief is more than just a niche issue.

[8] Wuerthele (2024).

[9] Owen, M. (2024). “Moxie robot may be saved by a last-minute open-sourcing effort”, Apple Insider.

[10] Ibid.

[11] Ibid.

All rights of this article reserved by the author
