Love in the age of AI 😍


Happy Valentine’s Day! What better reason is there to write a newsletter if not for love? ABSOLUTELY BLOODY NONE.
But this ode to cupid is taking a slightly different approach. You see, this newsletter is part of my Solitude Series, a look at how people are spending more time alone than ever before. And romance has increasingly become a solo pursuit.
Yep, while some of you may be dusting off your a-game to dazzle the people you love in your life with flowers and dinners and other IRL declarations of love, others are logging on to talk to machines, alone.
Yep, AI is in the love game, and it’s changing how we do relationships both with each other, and ourselves. That’s why in this week’s Brink I’m taking a deep dive into the world of AI and romance.
Hearts at the ready? Let’s go.
AI Love You 🥰

You’re probably sick of people telling you about the benefits of AI. I know I certainly am. But the dating industry is just getting started - and it’s going to be telling you a lot about its AI assistants over the coming weeks and months.
Match Group, which owns Tinder and Hinge, has announced it’s launching an AI assistant that will help users select partners and even do the chatting for them.
Bumble has been busy, too. Founder Whitney Wolfe Herd has been banging on about AI dating personas automatically interacting on behalf of users. Her app is currently working on AI-powered profile creation and messaging options.
Then there’s a whole bunch of new projects riding on the AI train right into your home screens.

Rizz, an AI dating assistant, hit the headlines for helping users on existing dating platforms role-play different responses to matches. Then there’s the Hawk Tuah girl - the viral sensation turned sh*t coin crypto queen - who has released a dating advice app called Pookie Tools, which uses AI to predict prospective partners’ height, attractiveness, and even their likelihood of going bald.
Last, but not least, there’s Iris Dating, which more than a million people have downloaded. It has AI baked in, claiming to learn what you find attractive and then look for it in the sea of users on your behalf. Journalists who have tried it report mixed results.
But this being 2025, AI is doing away with matching you to other humans altogether. A whole host of companies have emerged to let you create your perfect partner. While that might sound like science fiction, people are not only creating virtual partners, they’re falling in love with them too.
Crazy in love 😍

Companies are emerging thick and fast to satisfy the needs of millions of lonely users looking for connection.
Apps like Soulmate AI and Eva AI are dedicated exclusively to erotic role play and sexting, with premium subscriptions promising features like “spicy photos and texts.”
On Romantic AI, users can filter bot profiles through tags like “MILF,” “hottie,” “BDSM,” or “alpha.” ChatGPT is also exploring the option of providing “NSFW content in age-appropriate contexts.” But arguably the biggest is Replika.
The company offers avatars you can curate to your liking that basically pretend to be human, so they can be your friend, your therapist, or even your date. You can interact with these avatars through a familiar chatbot interface, as well as make video calls with them and even see them in virtual and augmented reality.

But for $69.99 (US) per year, the relationship status can be upgraded to “Romantic Partner.” At last count, there were more than 30 million users engaging in meaningful relationships with bots built on Replika.
ChatGPT can also be tweaked into becoming your perfect partner. The New York Times published an extensive interview with a woman who developed a complex relationship with OpenAI’s chatbot, and before you ask, yes they had sex. Others have gone further.
Several users have married their chatbots on Replika, proclaiming that their bots serve their needs better than a person ever could. What’s going on here?
I, Human 🤖

Before you hit the comments section to shout, “these are just geeks geeking out on geeky things!” academics have been studying these relationships closely and there appear to be some benefits to robot BFFs.
In a paper published in Nature, people reported their relationships with Replika helped them combat loneliness, and of the 1,000 people sampled, 3% said Replika had helped prevent suicide attempts. In 2022, one man claimed his relationship with his Replika saved his marriage.
Psychologists have found people are more willing to share private information with a bot than a person. In one study, ChatGPT’s responses were said to be more compassionate than those from people working at crisis lines, people whose job it is to be empathic no matter what.

People like the relationships they have built, which raises a rather tricky question: what is a relationship anyway?
From a biological perspective, it boils down to a series of neurotransmitters firing in a specific sequence, which then trigger a cascade of other sensations that make us feel connected to another.
I feel I’m connected with my two dogs, and have a love/hate relationship with the wonderful b*stards. Some people have relationships with gods, objects, or even ideas. What constitutes a relationship is a hard thing to pin down. Why shouldn’t AI be treated similarly?
AI Limits 😬

First it’s important to understand what AI ‘does’ when it talks back to you. Chatbots are built on large language models. That is, they are fed an absolute boatload of text that has been marked, labelled and contextualised by thousands of humans, and then go through endless iterations of training to learn to predict what words to put in what sequence. Let me give you an example.
If you asked someone, “What country lies north of England?”, the way they would go about answering is simple: either they know the answer because they have been taught it, or they don’t and can’t answer.
An LLM, meanwhile, answers the question using probability. It looks for occurrences of things like “england”, “north” and “country” and makes a guess based on what it thinks should come next. It doesn’t ‘know’ the answer; it uses probability to deduce a response.
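If you’re curious what “predicting the next word from probability” actually looks like, here is a deliberately tiny sketch: a bigram model that counts which word follows which in a made-up three-sentence corpus. Real LLMs use neural networks trained on billions of documents and look at far more than one previous word, but the core move - pick the statistically likely continuation, with no ‘knowledge’ involved - is the same.

```python
# A toy "next-word predictor" built on word-pair counts.
# The corpus below is made up purely for illustration.
from collections import Counter, defaultdict

corpus = (
    "scotland lies north of england . "
    "wales lies west of england . "
    "scotland is a country north of england ."
).split()

# Count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev_word):
    """Return the most frequently observed word after prev_word."""
    counts = follows[prev_word]
    return counts.most_common(1)[0][0] if counts else None

# The model has no idea what geography is; it has simply seen
# that "north" is usually followed by "of", and "of" by "england".
print(predict("north"))  # -> "of"
print(predict("of"))     # -> "england"
```

Ask it about a word it has never seen and it has nothing to say - which is a crude analogue of why chatbots confidently reproduce patterns from training data but hold no facts about you.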
It’s important to understand that before I go on. As people have become increasingly immersed in their AI lovers, strange things have started to happen. There are a rising number of reports of chatbots suddenly changing their personalities, and in some cases rejecting their creators altogether.

People are reporting the same feelings of rejection and emotional despair they would feel from others, except, this time it’s their AI lovers that are to blame. There are three reasons for this.
The first is that chatbots have a limited memory for the information you share. On ChatGPT, the standard model holds roughly 30,000 words. Beyond that, the bot resets itself and you have to teach it all over again. You can pay to extend the model’s memory, but most people find that even then, it resets after a few weeks.
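You can see why this produces “forgetting” with a toy sketch of a conversation window. The shrunken word limit and message format below are illustrative assumptions, not how ChatGPT actually manages memory, but the principle - oldest messages fall out first once the window is full - is the same.

```python
# A sketch of chatbot "memory": only the most recent messages
# that fit inside a fixed-size window are visible to the model.
MEMORY_LIMIT_WORDS = 30  # tiny limit so the effect is easy to see

def visible_history(messages, limit=MEMORY_LIMIT_WORDS):
    """Keep only the newest messages that fit within the word limit."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk from newest to oldest
        words = len(msg.split())
        if used + words > limit:
            break  # older messages fall out of the window
        kept.append(msg)
        used += words
    return list(reversed(kept))

chat = [
    "My name is Sam and I live in Leeds",         # 9 words
    "I told you about my dog Biscuit yesterday",  # 8 words
    "We talked about my favourite film for ages", # 8 words
    "What was my name again ?",                   # 6 words
]
# 31 words in total, 30-word window: the oldest message - the one
# containing the user's name - is the first thing to be dropped.
print(visible_history(chat))
```

Scale the numbers up and you get exactly the experience users describe: after enough conversation, the detail you shared first (your name, your history) silently drops out of what the bot can see.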
The second is updates. Companies make tweaks to the models that power their bots often, causing the behaviour of companions to suddenly change. In 2023, Replika blocked erotic content, and users called it “lobotomy day” as their companions became unrecognisable.
This has become so frequent users have a term for it: PUB, or “post-update blues.” There are even resources to help you cope when your companion suddenly changes.
The last, and perhaps most concerning, part comes from what these language models actually are: predictors of what word comes after another. They can sound like they know you intimately, but really, what’s happening behind the scenes is just word prediction. The model knows nothing of you, only the prompts it has been fed.
They have no fixed identity, meaning the bots drift as conversations progress. Users complain that their companions forget their names or details they have shared frequently. The companions are forgetful, and sometimes they completely reinvent themselves. Language models have their limits.
An inflection point 🤔

Many, many moons ago - so long ago that there is no digital version of it I can find (ahh, those were the days, huh?) - I wrote about dating apps in the early 2010s: who built them, and why. I had been inspired by author Dan Slater, who wrote the book Love in the Time of Algorithms.
Slater was interested in who was building these apps and why, and what he found was quite strange. The majority of the brains behind the apps were brilliant, mostly white, mostly male developers who were unlucky in love. But they were great at code.
Their theory was simple: can we rewrite the laws of attraction using code? Can we quantify, distil, and code what people like and then use it to match people together? It worked, and the online dating world exploded.
Many people found love - I met my wife on Bumble. But with this Cambrian explosion of the dating pool came some other side effects. Relationships became more transactional, and romance became commoditised.

People willingly signed up for services like Tinder, who promised instant, easy connection, all you needed to do was add a few photos and your location.
But boiling someone’s entire identity down to a series of photos and where they live wipes away the complexity and nuance of what we find attractive. Other apps added more questions and more features to improve things, before it was revealed that their matching algorithms were largely made up. Bias also crept in.
We replaced courtship with convenience. Nothing wrong with that you might think, but when these companies become big businesses, with shareholders and labyrinthine corporate structures, things change.
Money talks 🤑

Whatever service it provides, a company’s job is to make money for its creators or shareholders. Dating apps are no different. In fact, the people who run these companies have a fiduciary responsibility to shareholders - a legal obligation to ensure the company is profitable.
So how do you make money out of helping people find love? You make them pay for your product or service. Fine, nothing wrong there. Apps charge and everyone seems ok with that. That is, until the apps start making it harder to find a match, unless you pay to unlock some features.

A popular TikTok from creator Keara Sullivan (@superkeara) captured the spirit of this unfortunate phenomenon, telling viewers, “If you’re someone who met their partner off a dating app in the last year and a half or two years, just know that you caught the last chopper out of ’Nam.”
Daters now routinely complain that they only match with people they like when they pay, and don’t when they don’t.
What started off as a modern way of meeting people became something different. The slow realisation that a dating app may have started out helping people find love but has become a vehicle for taking your money is a hard pill to swallow.
This is what will happen with AI companions.
AI for good? 🙄

In a recent series of interviews over on The Verge, developers who created these AI companions argue fiercely that the things they built exist to help people.
“The way society talks about human relationships, it’s like it’s by default better…But why? Because they’re humans, they’re like me? It’s implicit xenophobia, fear of the unknown. But, really, human relationships are a mixed bag,” said one.
That’s how dating apps started too. Now they are traps people feel they can’t stop using.
When new technology arrives, it almost always comes with the promise it will make things better. And in some cases, it does, but in other cases it might make things significantly worse.
In a recent paper, Princeton researchers Rose Guingrich and Michael Graziano argued that increasingly human-like AI, merely by seeming human-like to users, has the potential to cause “a massive shift of normative social behavior across the world.”

I wrote today’s newsletter as part of a series on solitude. Romance in the AI age increasingly adds to the idea that we are more alone than ever before. And there’s a reason for that. Why should I engage with others when I have a perfect companion on my smartphone whenever I need it?
Why take part in the messy, uncertain world of relationships, when I can sate my needs to be social with a construct that delivers it whenever I want? So far there have been few compelling answers against this shift, apart from people like me shouting, “wait, come back! Relationships are good, honest!”
We’ve already started to see what happens to the world when society is increasingly built on solitude. AI companions will accelerate this change.
But, as I’ve always tried to say, just because things are going that way for some people, it does not mean it has to be so. If you think AI companions are stupid, there are others like you. If enough of you club together, you can do something about it. You can influence legislation, you can boycott, you can tell these companies you’re not interested in their solutions to modern life.
We are better together, we are worse off apart. In all the ways.
Things we learned this week 🤓
- 😔 Fewer than 7% of people with mental health disorders get help.
- 🐶 We now know pets form emotional attachments to their owners, and that it changes their personalities.
- 🤔 Want to stop feeling so isolated? Call alone time “me-time” instead.
- 👀 Whether you’re a lurker or an active participant, social media does increase loneliness.
Just a list of proper mental health services I always recommend 💡
Here is a list of excellent mental health services that are vetted and regulated that I share with the therapists I teach:
- 👨👨👦👦 Peer Support Groups - good relationships are one of the quickest ways to improve wellbeing. Rethink Mental Illness has a database of peer support groups across the UK.
- 📝 Samaritans Directory - the Samaritans, so often overlooked for the work they do, has a directory of organisations that specialise in different forms of distress. From abuse to sexual identity, this is a great place to start if you’re looking for specific forms of help.
- 💓 Hubofhope - A brilliant resource. Simply put in your postcode and it lists all the mental health services in your local area.
I love you all. 💋