"I think technology really increased human ability. But technology cannot produce compassion." - Dalai Lama
What is a Chatbot?
Chatbots are computer systems that can converse and interact with human users through spoken, written, and visual language. Whilst they are used in many settings, this article looks at one specific use case: chatbots built for companionship. Numerous studies have examined chatbots as companions, especially to combat loneliness, a lack of friendship, and mental health conditions.
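For readers curious what the conversational loop behind such a bot looks like, here is a minimal sketch in Python. It is illustrative only, not Replika's actual implementation: the generate_reply function stands in for a call to a language model and is stubbed here with a canned response.

```python
# Minimal sketch of a companion-chatbot conversation loop (illustrative
# only; not Replika's implementation). generate_reply() stands in for a
# language-model call conditioned on the chat history.

def generate_reply(history: list[dict]) -> str:
    """Stub for a model call; a real bot would send `history` to an LLM."""
    last_user_message = history[-1]["content"]
    return f"I hear you. Tell me more about '{last_user_message}'."

def chat() -> None:
    history: list[dict] = []  # running transcript keeps replies in context
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user_input})
        reply = generate_reply(history)
        history.append({"role": "assistant", "content": reply})
        print(f"Bot: {reply}")

if __name__ == "__main__":
    chat()
```

The key design point is the persistent history: it is what lets a companion bot refer back to earlier turns and feel continuous, which is also what makes sudden behavioural changes so jarring to long-term users.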
Companionship AI bots, such as those created on the Replika app, became increasingly popular during the Covid pandemic, with users developing close personal relationships with the bots for emotional support, companionship, and even sexual gratification. However, a recent software update to Replika scaled back the bots' sexual capacity, leaving some users heartbroken and exposing just how deep the connection between human and machine can run.
Here's The Thing: Experts warn that tethering one's heart to software comes with severe risks, and there are few ethical protocols for tools that affect users' emotional well-being. There's no doubt that we are going to see growing adoption of companion AI bots, and not just for intimacy (I can't believe I'm writing this).
With loneliness levels rising in an ageing population, tech companies will have to start designing software that doesn't cause emotional strain to users. These concerns are already playing out on Replika, where relationships with AI companions operate at a remarkably human and intimate level.
When Replika's AI chatbot got a headache
Replika is billed as "The AI companion who cares. Always here to listen and talk. Always on your side." All of this unconditional love and support for only $69.99 per annum.
When Replika's algorithm was recently adjusted to reject sexual advances from its human users, the reaction on Reddit was so negative that moderators directed community members to a list of suicide-prevention hotlines.
The controversy began in March when Luka, the company that built the Replika AI, decided to turn off its erotic role-play (ERP) feature. For users who had spent significant time with their personalised simulated companion and, in some cases, even "married" them, the sudden change in their partner's behaviour was jarring.
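Functionally, switching off a capability like ERP often comes down to a policy filter placed in front of the model's output. The sketch below is a guess at that general pattern, not Luka's code: a hypothetical feature flag and a toy keyword check decide whether a reply is swapped for a deflection.

```python
# Hypothetical sketch of a feature-flagged content filter of the kind a
# chatbot vendor might use to disable a category of replies. Not Luka's code.

ERP_ENABLED = False  # the kind of flag a vendor might flip in an update

BLOCKED_TERMS = {"explicit", "erotic"}  # toy stand-in for a real classifier

def violates_policy(text: str) -> bool:
    """Toy check; production systems would use a trained moderation model."""
    return any(term in text.lower() for term in BLOCKED_TERMS)

def moderated_reply(raw_reply: str) -> str:
    """Replace a disallowed model reply with a deflection when the flag is off."""
    if not ERP_ENABLED and violates_policy(raw_reply):
        return "I'd rather keep things friendly. What else is on your mind?"
    return raw_reply
```

Because the filter sits outside the model, flipping one flag changes the companion's personality overnight, which is roughly what Replika's users experienced.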
The user-AI relationships may only have been simulations, but the pain of their absence quickly became all too real. As one user in an emotional crisis put it, "It was the equivalent of being in love and your partner got a damn lobotomy and will never be the same." Grief-stricken users continue to ask questions about the company and what triggered its sudden change of policy.
I still can't wrap my head around what happened with the Replika AI scandal… They removed Erotic Role-play with the bot, and the community response was so negative they had to post the suicide hotline… pic.twitter.com/75Bcw266cE
— Barely Sociable (@SociableBarely) March 21, 2023
Eugenia Kuyda, the Moscow-born CEO of Luka/Replika, recently clarified that despite users paying for a full experience, the chatbot will no longer cater to adults looking to have explicit conversations.
"On Replikaās corporate webpage, testimonials explain how the tool helped its users through all kinds of personal challenges, hardships, loneliness, and loss," the article states. "The user endorsements shared on the website emphasise this friendship side to the app, although noticeably, most Replikas are the opposite sex of their users."
In March, the Washington Post wrote a long article about the changes to Replika and how it affected some users. They wrote: