Feb 25, 2022

Is Replika the AI cure for Loneliness?

Wiser! Essay: Half of Americans experience loneliness, which increases mortality rates by a quarter. It's a problem. So, are the likes of Replika and their AI chatbots the solution to loneliness? And what happens if they start to go wrong?




"I think technology really increased human ability. But technology cannot produce compassion." - Dalai Lama
Augmented and Virtual Reality is going to be the next major computing platform.” - Mark Zuckerberg

BackStory:  There's an online chat service called Replika. The thing is that the "person" you chat with is not actually a human being, but an AI chatbot that the user has created. It is simply lines of computer code and data that use machine learning algorithms to get to know you better.

When you want to chat with someone, you can chat with your Replika chatbot, who is always available, always responds and never gets tired. That's a good thing, right? Because loneliness is a problem, isn't it?

Loneliness is linked to a 26% increase in the likelihood of mortality, rising to 45% amongst seniors.

A 2019 YouGov survey estimated that 52% of Americans "sometimes or always felt alone". According to the OECD, almost 1 in 10 Americans "do not have a friend or relative they can count on". For comparison, in the UK that figure is 6%.

The evidence suggests that Social Media has made the problem of loneliness worse. According to this 2020 survey by Cigna, 73% of "very heavy social media" users felt lonely compared to 52% of "light users of social media".

And the pandemic has made it even worse. In a 2021 YouGov poll, over a third of Americans reported being "more lonely than before Covid". The same poll found a greater impact on men (30.6% of men compared to 25.7% of women).


Replika:  There's a subreddit called /replika with 42k users, where members share examples of their chats with their Replika chatbots. Most are harmless and humorous. However, sadly and inevitably, some users push the conversations to extremes and post examples of abusing their online buddies.

It's a sad and depressing side of humanity that social media and anonymous technologies like Replika have enabled to fester. Just as the debate over content moderation rightly points out, it wasn't Facebook or Twitter that wrote the articles calling for an insurrection. Nor is Replika abusing its chatbots. But that doesn't excuse the tech platforms from taking some action. The question is what?

This article in Futurism raises the question: "is abusing a machine the same as abusing a human?".

Men Are Creating AI Girlfriends and Then Verbally Abusing Them
A grisly trend has emerged: users who create AI partners, act abusive toward them, and post the toxic interactions online.

Does AI need more regulation?

So What?:  Like so many areas of the Tech Economy, regulations are absent when it comes to AI chatbots. The flip side of the many good things that come from tech is that they also open up the possibility of abuse and exploitation by bad people.

Particularly when it comes to tackling loneliness, given the vulnerability of the users in these scenarios, there are four key areas where regulation is needed.


Data Privacy

AI chatbots can and will collect vast amounts of personal data. Do they need all of it? What do they do with it? Do they keep it? Who has access to it? These are the questions that need to be asked and answered.


Misinformation and Bias

We will all come across people who get a kick out of spreading bias, sharing harmful or incorrect information, or feeding negative sentiment. If it can happen in real life, it can happen with AI. The chatbot may even be doing it without it being obvious. It's not a lot different from the absence of regulation that would prevent Instagram or TikTok from feeding teenage girls content that fuels their feelings of inadequacy.

Or opaque Facebook ads that manipulate readers for a political agenda.

The issue is that AI Chatbots could be spreading negative sentiment about any number of issues (race, sexuality, politics, religion) that feed off the user's biases.

Data Security

Personal companions are privy to many secrets: bank account details, passwords, and where the family silver is kept. This isn't just an AI chatbot problem; real-life personal assistants have been known to steal from the people they look after. The difference is that a chatbot will be designed to "look" harmless, not "shifty". Regulation would add protections for vulnerable users against deceptively friendly-looking, but ill-intentioned, chatbots that build fake relationships.


Accountability

Who or what is behind the app that a user has just downloaded to their phone is a key question. Clever marketing and a great UI don't mean that the AI is "qualified" to do what it promises to do, especially if the chatbot gives medical advice about health symptoms, or offers support and guidance to someone with mental health issues. When someone is unwell, they are particularly vulnerable as they search desperately for a cure that puts an end to their suffering.

For more reading on Artificial Intelligence, check out the 50 Articles, Essays and Insights in the Wiser! AI Collection.

Collection of Artificial Intelligence Insights
Collection of Insights, Essays and Articles from the world of Artificial Intelligence.

Sources: SocialPro, /replika

w/Further Reading

Virtual Humans; the Quest for Realism, Experience & Influence
Wiser! #9 (Premium): Virtual Humans. Mark Zuckerberg says that Mixed Reality is the next big thing.
Is Instagram to blame for the rise in self-harm amongst teenage girls?
Wiser! Essay: Rates of self-harm, depression and social anxiety continue to rise amongst teenage girls. The question is why?
What makes TikTok so addictive?
Wiser! Essay: TikTok is the most addictive platform on Social Media. Leaked documents reveal the how and why.

w/Make your next conversation a better one.

Wiser! is free because of the subscriptions and donations from readers like you. The purpose is simple: to save you time and keep you informed. Through the insights and information in every issue of the Wiser! Newsletter, you know what's happening and what's next in the Tech Economy.

Membership Options: Sign up for Free and get unrestricted access to 75% of the content on Wiser! Or take the Premium Option and get the really good stuff with unlimited access to all content.
Buy Me A Coffee: Virtual coffees cost €2 each and are the easiest way to donate to the running of the newsletter.

