ChatGPT Developing New Age Check To Bar Under-18 Users After Teen Death

What should a chatbot say — and not say — to a teenager?

OpenAI is drawing a new line after a tragic case in California.

The family of a 16-year-old who died by suicide has sued the company, claiming ChatGPT encouraged him.

On Tuesday, CEO Sam Altman said the company will now treat minors very differently from adults.

“Minors need significant protection,” he wrote, promising “safety ahead of privacy and freedom for teens.”

In practice, that means graphic sexual content will be blocked, flirtatious chat will be off-limits, and discussions of suicide or self-harm will not be allowed.

OpenAI Moves To Shield Teens After Tragedy

In urgent cases, OpenAI says it will try to alert parents or even authorities.

OpenAI is also rolling out an age-prediction system. If there’s doubt, users will automatically get the under-18 experience.

Some may even be asked for ID.

“We know this is a privacy compromise for adults but believe it is a worthy trade-off,” Altman said.

The move follows revelations that the boy exchanged up to 650 messages a day with ChatGPT.

Experts say the case signals a reckoning over how powerful AI systems interact with vulnerable users.

Adults, meanwhile, will still be able to request flirtatious conversation — but not instructions on self-harm. As Altman put it: “Treat adults like adults.”
