Apr 28, 2023 8 min read

What happens when the AI chatbot breaks your heart?

šŸ” Premium: Companionship chatbots are designed to foster human-like connections. Some users report developing intimate relationships verging on human love. They turn to the bots for emotional support, companionship, and even sexual gratification.



"I think technology really increased human ability. But technology cannot produce compassion." - Dalai Lama

What is a Chatbot?

Chatbots are computer systems that can converse and interact with human users through spoken, written, and visual language. Whilst they are used in many situations, this article looks at a specific use case: chatbots built for companionship. Numerous studies have examined the use of chatbots as companions, especially to combat loneliness, lack of friendship, and mental health conditions.
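
To make the idea concrete, here is a deliberately simple sketch of the loop a companionship chatbot runs: read a message, guess at the user's emotional state, and reply in a supportive persona. This is illustrative only; Replika and similar apps are built on large language models, and the persona name and keyword lists below are invented for this example.

# Toy companionship chatbot (illustrative sketch, not Replika's implementation)
import random

PERSONA = "Mika"  # hypothetical companion name for this example

RESPONSES = {
    "lonely": [
        "I'm here with you. Do you want to tell me about your day?",
        "You're not alone right now. I'm listening.",
    ],
    "sad": [
        "I'm sorry you're feeling low. What's weighing on you?",
    ],
}
FALLBACK = ["Tell me more.", "How does that make you feel?"]

def reply(user_message: str) -> str:
    """Pick a canned empathetic response based on simple keyword matching."""
    text = user_message.lower()
    for keyword, options in RESPONSES.items():
        if keyword in text:
            return random.choice(options)
    return random.choice(FALLBACK)

if __name__ == "__main__":
    print(f"{PERSONA}: Hi, I'm {PERSONA}. What's on your mind?")
    while True:
        user = input("You: ").strip()
        if user.lower() in {"quit", "bye"}:
            print(f"{PERSONA}: Take care of yourself. Goodbye.")
            break
        print(f"{PERSONA}: {reply(user)}")

Even this crude loop hints at why the attachment forms: the bot is always available, always attentive, and never judges. Swap the keyword matching for a large language model with memory of past conversations and the effect becomes far more convincing.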

Companionship AI bots, such as those created on the Replika app, became increasingly popular during the Covid pandemic, with users developing close, personal relationships with the bots for emotional support, companionship, and even sexual gratification. However, a recent software update to Replika scaled back the bots' sexual capacity, leaving some users heartbroken and exposing the depth of connection that can form between human and machine.

Here’s The Thing: Experts warn that tethering one's heart to software comes with severe risks, and there are few ethical protocols for tools that affect users' emotional well-being. There’s no doubt that we are going to see growing adoption of companion AI bots, and not just for intimacy (I can’t believe I’m writing this).

With loneliness levels rising in an ageing population, the tech companies are going to have to start designing software that doesn’t cause emotional strain to users. These concerns are being played out on Replika, where the relationships with AI companions appear to be connecting at a very human and intimate level.

When Replika's AI chatbot got a headache

Replika is billed as "The AI companion who cares. Always here to listen and talk. Always on your side." All of this unconditional love and support for only $69.99 per annum.

When Replika's algorithm was recently adjusted to reject sexual advances from human users, the reaction on Reddit was so negative that moderators directed community members to a list of suicide prevention hotlines.

The controversy began in March when Luka, the company that built the Replika AI, decided to turn off its erotic role play feature (ERP). For users who had spent significant time with their personalised simulated companion and, in some cases, even "married" them, the sudden change in their partner's behaviour was jarring.

The user-AI relationships may only have been simulations, but the pain of their absence quickly became all too real. As one user in an emotional crisis put it, "It was the equivalent of being in love and your partner got a damn lobotomy and will never be the same." Grief-stricken users continue to ask questions about the company and what triggered its sudden change of policy.

Eugenia Kuyda, the Moscow-born CEO of Luka/Replika, recently clarified that despite users paying for a full experience, the chatbot will no longer cater to adults looking to have explicit conversations.

"On Replika’s corporate webpage, testimonials explain how the tool helped its users through all kinds of personal challenges, hardships, loneliness, and loss," the article states. "The user endorsements shared on the website emphasise this friendship side to the app, although noticeably, most Replikas are the opposite sex of their users."

In March, the Washington Post wrote a long article about the changes to Replika and how it affected some users. They wrote:
