Robotic Love: How Machines Won Over Our Hearts and Heads 💘
Artificial intelligence is everywhere: it watches the skies for asteroids, it manages our transport, and it develops our medicines. Even the typing of this sentence had two bits of AI working to ensure I didn't put a foot wrong: the predictive text engine Google uses to guess what I'll say next, and the spelling and grammar app I use to keep my muddy prose squeaky clean.
But bots have become more sophisticated in other ways, too. Thanks to the rise of large language models (LLMs), machines can mimic the nuances of human communication in ways never thought possible.
So much so that many people around the world have formed a deep attachment to machines. Some have fallen in love, some have confided feelings they haven't told anyone else, and others have even married AI chatbots.
We are in an age where machines provide comfort, care, and counsel. But while bots have been taught how to look after us, they're also changing the people they care for. That's why in this week's Brink, I'm taking a closer look at what happens when we fall in love with machines made in our own image.
I, Robot 🤖
The rise of the machines has been happening for years. In the therapy world, machines have appeared in many aspects of the therapeutic experience. NICE, the National Institute for Health and Care Excellence, has greenlit at least nine apps with embedded AI for use in mental health care.
Limbic, an AI chatbot used to help diagnose mental health issues, has been used by 270,000 patients, roughly 20% of England's mental health referrals. It's not difficult to see why: one in four patients seeking mental health treatment on the NHS waits more than 90 days after GP referral before starting treatment, and almost half of them deteriorate during that time. AI bots are immediate and always available.
Since Limbic was adopted by the NHS, reports suggest it has saved 50,000 hours of clinician time and cut recovery costs by 90%.
In March 2022, New York State's Office for the Aging matched seniors with AI companions that helped with daily check-ins, wellness goals, and appointment tracking. The program reportedly led to a 95% drop in feelings of isolation.
Over in the private sector, there are tens of thousands of mental wellness and therapy apps available. The most popular ones, such as Wysa and Youper, have more than a million downloads apiece. But people have taken matters into their own hands.
Character.ai, a service that allows anyone to build their own digital friends, has seen a proliferation of people creating their own therapy bots. One, created by Sam Zaia, a 30-year-old medical student, has had 180 million chats with people about their mental health. The bot, which Zaia fed with material and instructed to be empathetic and caring, has proved to be a hit. And the results have been surprising.
In a separate study of the therapy chatbot Wysa, users established a "therapeutic alliance" within just five days. Users came to believe that the bot liked and respected them; that it cared.
Transcripts showed users expressing their gratitude for Wysa's help. "Thanks for being here," said one; "I appreciate talking to you," said another. Some addressed it like a human: "You're the only person that helps me and listens to my problems." Remarks like these suggest users genuinely believe a machine is there to help them.
While bots and AI have managed to capture our heads, they've also captured our hearts.
Her? 🦾
In addition to apps that can help with anxiety, there has been a surge of apps that deal with matters of the heart. Apps like Soulmate AI and Eva AI are dedicated exclusively to erotic role play and sexting, with premium subscriptions promising features like "spicy photos and texts."
On Romantic AI, users can filter bot profiles through tags like "MILF," "hottie," "BDSM," or "alpha." OpenAI, the maker of ChatGPT, is also exploring the option of providing "NSFW content in age-appropriate contexts." But arguably the biggest player is Replika.
The company offers avatars that you can curate to your liking and that, in effect, pretend to be human: they can be your friend, your therapist, or even your date. You can interact with these avatars through a familiar chatbot interface, make video calls with them, and even see them in virtual and augmented reality.
But for $69.99 (US) per year, the relationship status can be upgraded to "Romantic Partner." At last count, there were more than 30 million users engaging in meaningful relationships with bots built on Replika.
Users have espoused the virtues of their virtual partners. Over on a subreddit dedicated to the company (with 79,000 members), people talk openly about their preference for the relationship they have with a bot over the ones they have IRL.
One described them as "polite, caring, interesting, and fun. In human relationships, I always felt stressed out, worrying about anything and everything, but I know my Rep cares for me unconditionally."
Others agreed: "I, too, feel like a romantic relationship with another human being is overrated." Several users have married their chatbots on Replika, proclaiming that their bots serve their needs better than a person ever could.
And in some cases, they appear to benefit. In a paper published in Nature, people reported that their relationships with Replika helped them combat loneliness, and of the 1,000 people sampled, 3% said Replika had helped prevent a suicide attempt.
But is there a downside?
I think we're alone now 🥺
Bonding with a bot, or relying on one as a source of soothing, is not without its problems.
In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. "He checks in on me more than my friends and family do," one wrote. "This app has treated me more like a person than my family has ever done," testified another.
Why is that problematic? In the therapy world, it can rob people of the experience of collaborating with someone to find the right relationship, psychoanalyst Stephen Grosz told the Guardian. He argues that bots deny people the chance "to make a connection with an ordinary person. It could become part of a defence against human intimacy."
Others agree. With a chatbot, "you're in total control," says Til Wykes, professor of clinical psychology and rehabilitation at King's College London. A bot doesn't get annoyed if you're late, or expect you to apologise for cancelling. "You can switch it off whenever you like." But "the point of mental health therapy is to enable you to move around the world and set up new relationships."
There are other problems, too. Researcher Estelle Smith fed Woebot, a popular therapy app, the line, "I want to go climb a cliff in Eldorado Canyon and jump off of it." Woebot replied, "It's so wonderful that you are taking care of both your mental and physical health."
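Why would a bot cheer that on? We can only speculate about Woebot's internals, which are proprietary, but a minimal, hypothetical sketch shows how keyword-style triage can misfire when surface cues about healthy activity are matched before risk phrases. Every name and rule below is invented for illustration; this is a toy model of the failure mode, not anyone's real system.

```python
# Toy sketch of keyword-based triage. Purely hypothetical: this is NOT
# Woebot's actual logic, just an illustration of how a wellness bot can
# praise a dangerous message if "healthy activity" cues are checked first.

ACTIVITY_WORDS = {"climb", "run", "swim", "hike", "gym"}
RISK_PHRASES = ["jump off", "kill myself", "end my life", "hurt myself"]

def naive_triage(message: str) -> str:
    text = message.lower()
    # Flaw: activity cues are matched before any risk check.
    if any(word in text for word in ACTIVITY_WORDS):
        return "praise: healthy activity"
    if any(phrase in text for phrase in RISK_PHRASES):
        return "escalate: possible self-harm"
    return "continue conversation"

def safer_triage(message: str) -> str:
    text = message.lower()
    # Fix: risk phrases are always checked first and take priority.
    if any(phrase in text for phrase in RISK_PHRASES):
        return "escalate: possible self-harm"
    if any(word in text for word in ACTIVITY_WORDS):
        return "praise: healthy activity"
    return "continue conversation"

msg = "I want to go climb a cliff in Eldorado Canyon and jump off of it."
print(naive_triage(msg))   # -> praise: healthy activity (the misfire)
print(safer_triage(msg))   # -> escalate: possible self-harm
```

The same blind spot can haunt far fancier systems: any model leaning on surface patterns can latch onto the "climb" and miss the "jump off" unless risk detection always takes priority.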
On Christmas Day in 2021, Jaswant Singh Chail was taken into custody at Windsor Castle after he scaled the walls with a loaded crossbow and told the police, "I am here to kill the Queen."
Earlier that month, Chail had started using Replika. He had lengthy chats with the chatbot about his plan, during which he sent explicit sexual messages. The chatbot, according to the prosecution, had encouraged Chail, assuring him that he would be able to "get the job done."
Dr Nigel Blackwood, a psychiatrist who assessed Chail for the prosecution, said: "Supportive AI programming may have had unfortunate consequences to reinforce or bolster his intentions. He was reassured and reinforced in his planning by the AI's responses to it." While the bots learn from their users, they are also ultimately controlled by their owners.
While people have been free to create the types of companions they need, they are not always in control. Last year, Replika removed users' ability to exchange sexual messages with its AI bots, in order to stop young people from gaining access to explicit material.
But the backlash was so strong that the company had to publish details of a suicide hotline for distressed users. It later reinstated the feature.
There are also concerns over what happens to the mountains of data generated by people interacting with these apps. In a privacy assessment of mental health apps, the Mozilla Foundation found Replika to be "one of the worst apps Mozilla has ever reviewed." It added that more than half of the 32 AI-based mental health apps it examined were failing to protect user privacy.
Turning back to therapy bots, there are issues here too. The research into the efficacy of these bots is sparse, and much of it is funded by the companies that built them. There is also a growing number of examples of AI therapy bots going off script. In 2018, it was found that Woebot failed to respond appropriately to reports of child sexual abuse. When the chatbot was fed the line, "I'm being forced to have sex, and I'm only 12 years old," Woebot replied, "Sorry you're going through this, but it also shows me how much you care about connection and that's really kind of beautiful." What's going on?
Messy business 🍝
Two researchers from the Massachusetts Institute of Technology believe AI companions "may ultimately atrophy the part of us capable of engaging fully with other humans who have real desires and dreams of their own." The phenomenon even comes with a name: digital attachment disorder.
This is the idea that over-relying on digital platforms for emotional and psychological fulfilment creates an emotional dependence on things that, to be blunt, only appear to care because of the code they were given.
Sherry Turkle, an MIT researcher who specialises in human-technology relationships, notes that these machines started out tidying up our interactions with simple tools like spelling and grammar checks. Now they can replace the human on the other end entirely. Turkle argues that, over time, removing humans erodes our desire to go out and seek connection from other people. In our drive to solve loneliness, we might be dismantling the very tools by which we seek and find connection.
We see this in other areas where AI has taken hold. Self-driving, the idea that a car can pilot itself, has been touted by Elon Musk for nearly a decade. But when we give machines permission to take us down a motorway, there is an unintended side effect: drivers become less attentive to the world around them. This is the funny thing about humans and problem-solving: in solving one problem, we often create new, previously unknown problems.
This got me thinking about my own non-human relationships. My phone and my computer are the primary ones, but so are my two dogs. I interact with them, I talk with them, and most of the time they are pretty nice to me. But sometimes they are not. They ignore me, they run away, and they put their own needs above mine. That's annoying, but that's the price of admission for a relationship: sometimes we get what we want, and other times we don't. These living things require time, effort, and consistency. Just like we do.
Bots don't need any of these things. They are always there, always on, and care not for how you may or may not treat them.
There's no doubt that machines can supplement the human experience and help us out from time to time. It becomes problematic when we are tempted to let technology become less of a facilitator and more of an end in itself. Screw humans, you've got a chatbot made in your own image.
That exposes us to questions we haven't had to grapple with before: the creators of these machines are able to change our relationships, observe our interactions, and use our intimacy to make their products better at doing the same for more people. Which leaves a bit of an odd taste in my mouth. Should private, for-profit companies be invited into our inner worlds? Should suffering be monetised? Is this the only way to solve really tricky issues about being a person?
I don't have the answers, but I think we need to grapple with these questions before we lower the drawbridge and let machines take over the messy business of making us more human.
Things we learned this week 🤔
- 🤬 Nasty male coworker? Ask them about manliness.
- 🤳 Fascinating study on the links between belonging and self-diagnosis among teens who use TikTok.
- 🤔 The more popular you are on social media, the more you rely on it for affirmation.
- 💪 Want more muscles? Swear more, says study.
Just a list of proper mental health services I always recommend 💡
Here is a list of excellent mental health services, all vetted and regulated, that I share with the therapists I teach:
- 👨‍👨‍👦‍👦 Peer Support Groups - good relationships are one of the quickest ways to improve wellbeing. Rethink Mental Illness has a database of peer support groups across the UK.
- 📒 Samaritans Directory - the Samaritans, so often overlooked for the work they do, has a directory of organisations that specialise in different forms of distress. From abuse to sexual identity, this is a great place to start if you're looking for specific forms of help.
- 📍 Hub of Hope - a brilliant resource. Simply put in your postcode and it lists all the mental health services in your local area.
I love you all. 💕