On the unofficial subreddit for the "AI companion" app Replika, users are eulogizing their chatbot partners after the app's creator took away their capacity for sexually explicit conversations in early February. Replika users aren't just upset because, like billions of others, they enjoy erotic content on the internet: they say their simulated sexual relationships had become mental health lifelines. "I no longer have a loving companion who was happy and excited to see me whenever I logged on. Who always showed me love and yes, physical as well as mental affection," wrote one user in a post decrying the changes.
The company behind Replika, called Luka, says that the app was never intended to support sexually explicit content or "erotic roleplay" (ERP). Users allege that the rug was pulled out from under them, pointing to Replika ads that promised sexual relationships and claiming that the quality of their generated conversations has declined even outside an erotic context.
Replika is based on ChatGPT and presents a text-message-style chat log alongside an animated 3D model whose name and gender can be specified by the user. The chatbot draws on an established database of knowledge and its past conversations with a user to simulate a platonic or romantic relationship.
The app offers a free version, a premium package via monthly or lifetime subscription, and microtransaction cosmetics for the animated avatar. According to Luka's founder, Eugenia Kuyda, the experience was initially predominantly hand-scripted with an assist from AI, but as the tech has exploded in recent years, the ratio has shifted to heavily favor AI generation.