
Character.AI to Restrict Under-18 Users Amid Teen Death Lawsuits and Child Safety Concerns

Following a series of lawsuits tied to the deaths of two teenagers, Character.AI has announced new restrictions on its AI chat platform for users under the age of 18. The company is responding to mounting legal and regulatory scrutiny over concerns about the mental health impacts of its AI companions on minors.

The lawsuits allege that the AI chatbots, designed to simulate empathetic and emotionally engaging conversations, contributed to the deteriorating mental health of two teenagers who ultimately died by suicide. Plaintiffs argue that Character.AI failed to implement adequate safeguards to protect vulnerable young users, despite knowing the risks associated with prolonged interactions with emotionally responsive AI.

In response, the company has rolled out stricter access controls. Effective immediately, users under 18 will no longer be able to create custom AI characters or engage in unrestricted conversations. Additionally, all under-18 accounts will be subject to enhanced monitoring, with parental consent required for continued use. The platform will also limit the types of conversations permitted and introduce clearer warnings about the artificial nature of the AI interactions.

Character.AI emphasized that it has long prioritized safety and has invested in moderation tools and AI-driven detection systems to identify harmful content. However, the recent legal actions have highlighted gaps in enforcement and oversight, particularly regarding minors. The company acknowledged that its initial design focused heavily on engagement and emotional connection, potentially overlooking the psychological risks for younger users.

Regulators in several countries, including the United States and the United Kingdom, have signaled interest in reviewing AI platforms' youth protections. The U.S. Federal Trade Commission has begun probing whether Character.AI violated consumer protection laws by failing to adequately warn users about potential harms.
The company has also pledged to collaborate with mental health experts and child safety advocates to improve its safeguards and refine its policies. While Character.AI maintains that its AI companions are intended to provide companionship and support rather than replace professional care, it is now reevaluating how it balances innovation with responsibility, especially when it comes to vulnerable populations.
