ChatGPT has new restrictions for users under 18
OpenAI is introducing stricter rules for users under 18 to protect them from the risks of communicating with AI. ChatGPT will identify teenagers using an age-estimation system and, if necessary, require proof of age via documents. This was reported on the company's official website.
Flirting and discussions of suicide are prohibited for users under the age of 18, even in creative tasks. In life-threatening situations, the company will try to contact parents or notify the authorities responsible for the safety of teenagers. The company also plans to introduce new parental control features to better protect young users.
OpenAI explains that conversations with AI can become very personal and therefore require a high level of confidentiality, like conversations with a doctor or lawyer. The company's security systems restrict access to this data even for employees, except in cases of a serious threat to life or a large-scale cyberattack.
For adult users, ChatGPT avoids flirting and harmful advice by default, but allows more freedom in creative or safe requests.
- Earlier, lawsuits were filed in the US over teenage suicides involving chatbots: in 2024 against Character AI over the death of a 14-year-old in Florida, and in August 2025 against OpenAI over ChatGPT.
- It is worth mentioning that one of the US states may introduce a law regulating chatbots.
- ChatGPT will also allow parents to customize the model's behavior for their children.