How safe is TikTok for kids? Not nearly as safe as parents might hope, according to a new investigation.
Researchers at Global Witness set up fake accounts posing as 13-year-olds — complete with “restricted mode” switched on.
They were stunned at what they found.
Within just a couple of clicks, TikTok was suggesting sexualised search terms like “very rude babes.”
In some cases, those suggestions escalated to pornographic clips.
On one test account, it took only two clicks to stumble across explicit videos, including penetrative sex.
The watchdog group said some of the content even appeared to show people under 16.
TikTok Safety Gaps Exposed
Global Witness reported the apparent underage content to the Internet Watch Foundation.
“It’s a clear breach of the Online Safety Act,” Global Witness argued.
They pointed to the UK law requiring tech platforms to shield minors from harmful material.
Ofcom, the UK regulator, has promised to review the findings.

Ofcom's guidance is blunt: if your algorithm can push harmful content toward kids, you need to fix it.
TikTok, for its part, said it removed the offending clips and tweaked its search recommendations.
“As soon as we were made aware, we took immediate action,” a spokesperson said.
But here's the bigger question: if a handful of researchers could crack TikTok's protections this quickly, how many real teens are already slipping through the cracks?