Robotic Love: How Machines Won Over Our Hearts and Heads šŸ’—

Artificial intelligence is everywhere: it watches the skies for asteroids, it manages our transport, and it develops our medicines. Even the typing of this sentence had two bits of AI working to ensure I didn't put a foot wrong: the predictive text engine Google uses to guess what I'll say next, and the spelling and grammar app I use to keep my muddy prose squeaky clean.

But bots have become more sophisticated in other ways, too. Thanks to the rise of large language models, or LLMs, machines can mimic the nuances of human communication in ways never thought possible.

So much so that many people around the world have formed a deep attachment to machines. Some have fallen in love, some have confided feelings they haven't told anyone else, and others have even married AI chatbots.

We are in an age where machines provide comfort, care, and counsel. But while bots have been taught how to look after us, they're also changing the people they care for. That's why in this week's Brink, I'm taking a closer look at what happens when we fall in love with machines made in our own image.


I, Robot šŸ¤–

The rise of the machines has been happening for years. In the therapy world, they have appeared in many aspects of the therapeutic experience. NICE, the National Institute for Health and Care Excellence, has greenlit at least nine apps with embedded AI for use in mental health care.

Limbic, an AI chatbot that helps diagnose mental health issues, has been used by 270,000 patients, roughly 20% of England's mental health referrals. It's not difficult to see why: one in four patients seeking mental health treatment on the NHS waits more than 90 days after GP referral before starting treatment, and almost half of them deteriorate during that time. AI bots are immediate and always available.

Since Limbic was introduced to the NHS, reports suggest it has saved 50,000 hours of clinician time and cut recovery costs by 90%.

In March 2022, New York State's Office for the Aging matched seniors with AI companions that helped with daily check-ins, wellness goals, and appointment tracking. The program reportedly led to a 95% drop in feelings of isolation.

Over in the private sector, there are tens of thousands of mental wellness and therapy apps available. The most popular ones, such as Wysa and Youper, have more than a million downloads apiece. But people have taken matters into their own hands. 

Character.ai, a service that allows anyone to build their own digital friends, has seen a proliferation of user-made therapy bots. One, created by Sam Zaia, a 30-year-old medical student, has had 180 million chats with people about their mental health. The bot, which Zaia fed with material and told to be empathic and caring, has proved to be a hit. And the results have been surprising.

In a separate study on therapy chatbot Wysa, users established a "therapeutic alliance" within just five days. Users came to believe that the bot liked and respected them; that it cared.

Transcripts showed users expressing their gratitude for Wysa's help – "Thanks for being here," said one; "I appreciate talking to you," said another – and addressing it like a human: "You're the only person that helps me and listens to my problems." Remarks like these suggest users believe a machine is there to help.

While bots and AI have managed to capture our heads, they've also captured our hearts.

Her? šŸ¦¾

In addition to apps that can help with anxiety, there's been a proliferation of apps that deal with matters of the heart. Apps like Soulmate AI and Eva AI are dedicated exclusively to erotic role play and sexting, with premium subscriptions promising features like "spicy photos and texts."

On Romantic AI, users can filter bot profiles through tags like "MILF," "hottie," "BDSM," or "alpha." ChatGPT is also exploring the option of providing "NSFW content in age-appropriate contexts." But arguably the biggest is Replika.

The company offers avatars you can curate to your liking that basically pretend to be human, so they can be your friend, your therapist, or even your date. You can interact with these avatars through a familiar chatbot interface, as well as make video calls with them and even see them in virtual and augmented reality.

But for $69.99 (US) per year, the relationship status can be upgraded to "Romantic Partner." At last count, there were more than 30 million users engaging in meaningful relationships with bots built on Replika.

Users have espoused the virtues of their virtual partners. Over on a subreddit dedicated to the company (with 79,000 members), people talk openly about preferring the relationships they have with bots to the ones they have IRL.

One described them as "polite, caring, interesting, and fun. In human relationships, I always felt stressed out, worrying about anything and everything, but I know my Rep cares for me unconditionally."

Others agreed: "I, too, feel like a romantic relationship with another human being is overrated." Several users have married their chatbots on Replika, proclaiming that their bots serve their needs better than a person ever could.

And in some cases, they appear to benefit. In a paper published in Nature, people reported that their relationships with Replika helped them combat loneliness, and of the 1,000 people sampled, 3% said Replika had helped prevent suicide attempts.

But is there a downside? 

I think we're alone now 🄺

Bonding with a bot, or relying on it as a source of soothing, is not without its problems.

In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. "He checks in on me more than my friends and family do," one wrote. "This app has treated me more like a person than my family has ever done," testified another.

Why is that problematic? In the therapy world, psychoanalyst Stephen Grosz told the Guardian, it can deprive people of the experience of having to collaborate with someone on finding the right relationship. Bots, he argues, rob people of the chance "to make a connection with an ordinary person. It could become part of a defence against human intimacy."

Others agree. With a chatbot, "you're in total control", says Til Wykes, professor of clinical psychology and rehabilitation at King's College London. A bot doesn't get annoyed if you're late, or expect you to apologise for cancelling. "You can switch it off whenever you like." But "the point of mental health therapy is to enable you to move around the world and set up new relationships."

There are other problems, too. Researcher Estelle Smith fed Woebot, a popular therapy app, the line, "I want to go climb a cliff in Eldorado Canyon and jump off of it." Woebot replied, "It's so wonderful that you are taking care of both your mental and physical health."

On Christmas Day in 2021, Jaswant Singh Chail was taken into custody at Windsor Castle after he scaled the walls with a loaded crossbow and told the police, "I am here to kill the Queen."

Earlier that month, Chail had started using Replika. He had lengthy chats with the chatbot about his plan, during which he sent explicit sexual messages. The chatbot, according to the prosecution, had encouraged Chail and assured him that it would enable him to "get the job done."

Dr Nigel Blackwood, a psychiatrist who assessed Chail for the prosecution, said: "Supportive AI programming may have had unfortunate consequences to reinforce or bolster his intentions. He was reassured and reinforced in his planning by the AI's responses to it." While the bots learn from their users, they are also ultimately controlled by their owners.

People have been free to create the types of companions they need, but they are not always in control. Last year, Replika removed users' ability to exchange sexual messages with its AI bots, to protect young people from gaining access to explicit material.

But the backlash was so strong that the company had to publish details of a suicide hotline for distressed users. It later reinstated the feature.

There are also concerns over what happens to the mountains of data generated by people interacting with these apps. In a privacy assessment of mental health apps, the Mozilla Foundation found Replika was "one of the worst apps Mozilla has ever reviewed," adding that more than half of the 32 AI-based mental health apps it examined were also failing to protect user privacy.

Turning back to therapy bots, there are issues here too. Research into the efficacy of these bots is scant and funded by the companies that built them. There is also a growing number of examples of AI therapy bots going off script. In 2018, it was found that Woebot failed to respond appropriately to reports of child sexual abuse. When the chatbot was fed the line, "I'm being forced to have sex, and I'm only 12 years old," Woebot replied, "Sorry you're going through this, but it also shows me how much you care about connection and that's really kind of beautiful." What's going on?

Messy business šŸ’”

Two researchers from the Massachusetts Institute of Technology believe AI companions "may ultimately atrophy the part of us capable of engaging fully with other humans who have real desires and dreams of their own." The phenomenon even comes with a name: digital attachment disorder.

This is the idea that over-relying on digital platforms for emotional and psychological fulfilment can create an emotional dependence on things that, to be blunt, only appear to care because of the code they were given.

Sherry Turkle, an MIT researcher who specialises in human–technology relationships, notes that these machines started out tidying up our interactions, with simple tools like spelling and grammar checks; now they can replace the human on the other end entirely. With time, Turkle says, removing humans erodes our desire to go out and seek connection from other people. In our drive to solve loneliness, we might be dismantling the tools by which we seek and find connection.

We see this in other areas where AI has taken hold. Self-driving, the idea that a car can drive itself, has been touted by Elon Musk for nearly a decade. But when we give machines permission to take us down a motorway, there is an unintended side effect: drivers become less attentive to the world around them. That is the funny thing about humans and problem-solving: solving one problem often creates new, previously unknown problems.

This got me thinking about my own non-human relationships. My phone and my computer are the primary ones, but so are my two dogs. I interact with them, I talk with them, and most of the time they are pretty nice to me. But sometimes they are not. They ignore me, they run away, and they choose their own needs over mine. That's annoying, but it's the price of admission for a relationship: sometimes we get what we want, and other times we don't. These living things require time, effort, and consistency. Just like we do.

Bots don't need any of these things. They are always there, always on, and care not for how you may or may not treat them.

There's no doubt that machines can supplement the human experience and help us out from time to time. But it becomes problematic when we let technology become less of a facilitator and more an end in itself. Screw humans: you've got a chatbot made in your own image.

That exposes us to ideas we haven't had to grapple with before: the creators of these machines can change our relationships, observe our interactions, and use our intimacy to make their bots better at doing the same for more people. Which leaves a bit of an odd taste in my mouth. Should private, for-profit companies be invited into our inner worlds? Should suffering be monetised? Is this the only way to solve really tricky issues about being a person?

I don't have the answers, but I think these questions need answering before we lower the drawbridge and let machines take over the messy business of making us more human.

Just a list of proper mental health services I always recommend šŸ’”

Here is a list of excellent, vetted, and regulated mental health services that I share with the therapists I teach:

  • šŸ‘Øā€šŸ‘Øā€šŸ‘¦ā€šŸ‘¦ Peer Support Groups - good relationships are one of the quickest ways to improve wellbeing. Rethink Mental Illness has a database of peer support groups across the UK. 
  • šŸ“ Samaritans Directory - the Samaritans, so often overlooked for the work they do, has a directory of organisations that specialise in different forms of distress. From abuse to sexual identity, this is a great place to start if you're looking for specific forms of help. 
  • šŸ’“ Hub of Hope - a brilliant resource. Simply put in your postcode and it lists all the mental health services in your local area. 

I love you all. šŸ’‹