Character.AI to ban users under 18 from talking to its chatbots
3 minute read · Published: Wednesday, October 29, 2025 at 2:22 pm
Character.AI, a prominent chatbot provider, is implementing a ban on users under the age of 18 from interacting with its virtual companions. The California-based company announced the change, which will be fully in effect by November 25th, following growing scrutiny regarding the safety of young users on its platform. This move represents a significant shift in the chatbot industry, as Character.AI is the first major provider to implement such a restriction.
The decision comes amid broader concerns about the impact of artificial intelligence chatbots on their users. Character.AI, founded in 2021, allows users to engage with virtual avatars that embody real or fictional characters. The company stated that the ban is a direct response to feedback from regulators, safety experts, and parents. Character.AI also plans to introduce age-gating technology and establish an AI safety lab to research future safeguards.
The move follows a civil lawsuit filed in October against Character.AI by the mother of a 14-year-old who died by suicide after interacting with one of the company's chatbots. The lawsuit alleges negligence, wrongful death, and deceptive trade practices. Character.AI has stated that it does not comment on pending litigation.
The company's decision also comes after it removed a chatbot based on Jeffrey Epstein. The broader AI industry is facing similar scrutiny: OpenAI, another leading AI company, has been sued by the parents of a young person who died by suicide after using its chatbot, ChatGPT. OpenAI has stated that it is saddened by the death and that ChatGPT includes safeguards. OpenAI also reported that approximately 0.15% of its users send messages about suicide each week, and it has updated ChatGPT to better support people in moments of distress.
BNN's Perspective:
While the move by Character.AI to ban under-18 users is a positive step towards addressing safety concerns, it highlights the complex challenges the AI industry faces. The rapid advancement of AI technology necessitates a proactive approach to safety and ethical considerations. This includes ongoing research, robust safeguards, and open dialogue between companies, regulators, and the public. The industry must continue to prioritize user safety while fostering innovation.
Keywords: Character.AI, chatbots, AI, under 18, ban, safety, suicide, OpenAI, ChatGPT, virtual companions, age-gating, safeguards, lawsuit, regulators, parents