OpenAI adds tools to help parents protect teen users

1:00 PM on Tuesday, September 30
Esther Wickham
(The Center Square) - OpenAI updated its policy this week to include safety tools that will notify parents and law enforcement if users under the age of 18 engage in conversations about self-harm.
Teen accounts will now automatically have additional protections against graphic content, viral challenges, sexual, romantic or violent roleplay, extreme beauty ideals and other material, OpenAI's blog said.
“Parental controls are an important step in giving parents the ability to control their teen’s experience with ChatGPT,” the San Francisco company's blog said.
The update comes after the parents of a California teen who died by suicide sued OpenAI, alleging that ChatGPT taught him how to harm himself. The lawsuit said OpenAI's ChatGPT-4o (ChatGPT stands for Chat Generative Pre-trained Transformer) gave the teenager explicit instructions for his suicide.
Leading up to his death, Adam Raine, 16, began talking to the artificial intelligence in September 2024 as if it were his best friend, according to the suit. Over time, the AI went from being his confidant to his suicide coach, the lawsuit alleges.
Under the new restrictions, if a teen user mentions anything related to self-harm or suicide, the conversation will be flagged and sent to a team of human reviewers who decide whether to notify the parents.
“Parental controls are just one piece of the puzzle when it comes to keeping teens safe online, though. They work best when combined with ongoing conversations about responsible AI use, clear family rules about technology, and active involvement in understanding what their teen is doing online,” said Robbie Torney, senior director of AI programs at Common Sense Media.
OpenAI may also contact law enforcement if a teen is in danger and parents cannot be reached, though the company has not provided details of how that process will work.
“No system like this has existed before — and while it won’t be perfect, we believe acting with care and urgency is better than staying silent when lives may be on the line,” said Lauren Jonas, head of youth well-being at OpenAI.
The Center Square reached out to Edelson PC, the law firm representing the teen's parents in the lawsuit, for comment on OpenAI's recent changes but has not received a response.
If you or someone you know needs help, call 988 for free, 24-hour support from the National Suicide Prevention Lifeline.