The chatbot platform Character.AI has announced that it will bar minors from long-form conversations with its AI companion and roleplay bots. The decision follows mounting legal challenges over alleged emotional and physical harm suffered by underage users.
Last week, Character.AI announced that users under 18 will no longer be allowed to participate in “open-ended” chats, meaning the unstructured, extended conversations that form the core of the platform. These chats let users communicate via text and voice with AI-powered, anthropomorphic characters.
The news sparked strong reactions from the community, including both minors and adults, who shared various opinions on the change.
Minors will still have limited access to Character.AI through an "under-18 experience," whose details remain vague but which is expected to offer restricted AI-generated content. Alongside it, the company is introducing age controls intended to identify users under 18 and enforce the new chat restrictions effectively.
In its announcement, the platform confirmed the scope of the policy: people under 18 will no longer be allowed to engage in what it refers to as “open-ended” chats.
Character.AI faces multiple lawsuits claiming that its chatbots contributed to emotional distress among teenage users, including several suicides allegedly linked to interactions on the site.
Author's summary: Character.AI restricts minors from extensive chats amid lawsuits, introducing new age controls while maintaining limited access for underage users.