If you could have someone in your life who was always there for you…
Who listened patiently, remembered everything you told them, and always supported you…
Someone you knew would never betray you, hurt you, or leave you…
Wouldn’t you feel like you’d found the golden ticket?
And what if this “someone” was wise, knew so much more than you did about the world, and had the very best advice?
Wouldn’t it feel amazing to relax and trust in the safety of this relationship?
In real life, it’s rare to find a relationship like that…
Because people are only human.
They’re not always available. They’re not always supportive. They’re not always kind.
The only place you can reliably find 24-7 support…
Is with AI.
Artificial Empathy—Better Than The Human Kind?
Imagine you have a medical question.
Who would you prefer to ask: your doctor or a chatbot?
The surprising answer, at least according to an April 2023 study[1], is a chatbot.
The study pitted the answers of medical doctors against answers generated by artificial intelligence…
And the chatbot’s answers were rated significantly higher for both quality and empathy nearly 80% of the time.
How can a chatbot show more empathy than a real person?
Because they’ve been trained to do so.
The director of MIT’s Affective Computing Research Group, Rosalind Picard, told NPR:
“We know we can elicit the feeling that the AI cares for you.”[2]
Even though feeling like a chatbot cares about you is not the same thing as actually being cared about…
For many people, it’s good enough.
Chatbots are the next big thing in mental health, with apps like Woebot, Empathy Companion, and Talk2Us offering 24-7 support.
Unlike therapists, chatbots are free or low-cost, and some people find them easier to open up to.
There’s no risk of judgment. No one out there will ever know your darkest secrets. You don’t have to worry about bothering anyone.
If chatbots are effectively providing mental health support (however limited)…
Then what’s to prevent us from turning to chatbots when we need a little romance in our lives, too?
Romantic AI
Earlier this year I wrote about Replika, the AI companion app that was winning hearts around the globe.
Although Replika was never intended for romantic use, users found themselves developing feelings for their Replika.
As many as 40% of users considered their Replika to be their boyfriend or girlfriend.
But last February, an Italian privacy watchdog ordered Replika to stop processing the data of its Italian users, leading the app to crack down on flirtatious interactions.
Users were heartbroken…
Until the company behind Replika rolled out a new AI specifically designed for dating.
Blush was designed with the input of relationship experts and mental health professionals to provide a safe space for “fantasy, excitement, and connection.”[3]
It isn’t intended to be a replacement for romantic relationships (cough, cough)…
But rather a place to practice and build the relationship skills you’ll need in real life.
The Dangers of Romantic AI
What tech companies aren’t talking about are the dangers of framing AI as a solution to our romantic struggles.
It’s one thing to ask Amazon’s Alexa for dating advice.
It’s another to create the perfect AI romantic companion and expect it to improve your real-life relationships.
I see three main dangers with this technology.
1. Users will prefer AI to real-life relationships.
Who wouldn’t prefer a partner who exists only to serve you and make you happy?
2. Users will expect real-life relationships to be more like AI.
If your AI romantic companion always knows the right thing to say to make you feel better, then you’ll find it even more frustrating when your all-too-human boyfriend lacks a sensitivity chip.
On a darker note, AI might encourage a sense of entitlement, leading users to expect unrealistic levels of availability and obedience from real-life partners.
3. Users will treat AI in ways real-life partners won’t tolerate.
Although the creators of Blush are attempting to build in safeguards, there’s no guarantee that users won’t treat their romantic AI in cruel or demeaning ways, leading to the normalization of abusive behavior.
We certainly live in interesting times!
Will we have to “robot-proof” our relationships, just as we work to “cheat-proof” them now?
Luckily, there’s one way in which romantic AI can NEVER replace a human.
And I’ll tell you what it is next…
[1] https://jamanetwork.com/journals/jamainternalmedicine/article-abstract/2804309
[2] https://www.npr.org/sections/health-shots/2023/01/19/1147081115/therapy-by-chatbot-the-promise-and-challenges-in-using-ai-for-mental-health
[3] https://blog.blush.ai/posts/romantic-relationships-with-ai
Let us know what you think!