Character.AI users report 'dumbed down' chatbots, company denies claims
Users of the billion-dollar AI chatbot platform Character.AI have reported unexpected changes in their bots' personalities. Many users claim their bots are less responsive and intelligent than before, requiring more effort to develop storylines and draw out meaningful responses. These changes were first noted on the r/CharacterAI subreddit, where one user stated that her bots had "completely reverted back to a worse state than they were when I started using CAI [Character.AI]."
User complaints highlight bots' limited memory
Bardbqstard, a Character.AI user, further detailed her bots' issues, stating that they get stuck in loops and that their memory seems limited to the last one or two messages. She also noted that her bots now seem "interchangeable," unlike before, when each bot had a distinct personality. These changes came even though the weeks leading up to them had delivered "some of the best performance I have seen out of CAI since I have started using it," according to Bardbqstard.
Character.AI denies major changes
In response to these complaints, a spokesperson for Character.AI denied making any major changes that would cause these issues. They suggested that some users may have encountered a test environment that behaved differently from what they were used to. The spokesperson also emphasized that pornographic content is against Character.AI's terms of service and community guidelines.
Users deny using bots for erotic roleplay
Despite Character.AI's strict anti-pornographic stance, many users who reported differences in their bots' conversations clarified that they were not using them for erotic roleplay. One user expressed frustration with the repetitive replies of their bot, an impersonation of Juri Han from Street Fighter, stating, "No bot is themselves anymore and it's just copy and paste." Some users have turned to other platforms, such as Chub.ai, out of frustration with Character.AI's restrictions.
Chatbot changes spark discussions on mental health
When asked about the potential mental distress caused by these perceived changes in their bots, most users reported minimal impact. However, a post by venture capitalist Debarghya Das highlighted the addictive nature of Character.AI and how attached users become to their bots. The episode has sparked discussion about users' mental states and the reasons people use chatbots, which range from entertainment to practicing social interactions.