EU Pressures Meta Over Failure To Keep Children Under 13 Off Platforms

EU asks Meta to block access to under-13s.

Is social media really doing enough to protect children online—or just saying it is? That’s the question now hanging over Meta, after European regulators accused it of breaking major EU digital rules with Facebook and Instagram.

In Brussels, the European Commission says the platforms are failing to stop children under 13 from accessing their services.

Under the Digital Services Act, tech giants are required to actively reduce harmful content and protect minors—but officials say Meta isn’t going far enough.

Their preliminary findings are blunt: age checks are weak, enforcement is inconsistent, and too many underage users are still slipping through.

Child Safety Rules Tighten

Regulators estimate that around 10–12% of children under 13 in Europe may still be using these platforms.

As EU tech chief Henna Virkkunen put it, terms of service can’t just sit on paper anymore: “They must be the basis for concrete action to protect users, including children.”

Meta disagrees, calling the figures outdated and arguing that age verification is an “industry-wide challenge” that needs broader solutions.

The company says it already removes underage accounts and is preparing new measures.

But here’s the bigger issue: should platforms built for connection be responsible for drawing the age line more strictly than parents or schools?

For now, no fines are final—but the message from Brussels is clear: rules aren’t just guidelines anymore.
