PARIS (BN24) — A French parliamentary commission investigating the mental health impact of TikTok and other social media platforms is calling for a national ban on access for children under the age of 15, alongside the introduction of a digital curfew for older teenagers.

The commission, which began its work in March following lawsuits by families who say social media contributed to their children’s suicides, released its findings Thursday. It recommends that teens aged 15 to 18 be restricted from using social media between 10 p.m. and 8 a.m., and that platforms be held accountable if they fail to enforce safeguards.

Laure Miller, a member of parliament and the commission’s rapporteur, told AFP that the proposed under-15 ban would serve as a strong message to parents and children alike that “social media is not harmless” before that age.

The inquiry was triggered after seven families filed a lawsuit against TikTok in late 2024, accusing the platform of promoting harmful content. During the investigation, the commission heard testimony from families of victims, social media executives, and influencers. TikTok, owned by China-based tech firm ByteDance, is widely used by French youth and has faced scrutiny over its content moderation practices.

Among those who spoke to the commission was Geraldine, 52, whose daughter Penelope died by suicide at the age of 18. Geraldine later discovered self-harm videos on her daughter’s TikTok account. “It’s difficult for us as parents to moderate all this,” she told AFP, requesting anonymity beyond her first name.

In response to criticism, TikTok has maintained that protecting young users is its “top priority,” claiming to remove more than 95% of inappropriate content within 24 hours—and 90% before it is even viewed.

The commission’s report warns that if platforms fail to meet legal obligations outlined in the European Union’s Digital Services Act, France should consider extending the full ban to all minors under 18 within the next three years.

Further recommendations include a nationwide information campaign on social media risks and the potential creation of a legal offense termed “digital negligence” for parents who fail to oversee their children’s online activity responsibly.

Miller emphasized the need for robust age verification systems during account registration. She cited recent European Commission guidance as opening the path to national regulation, though she acknowledged persistent barriers, including technical limitations and privacy concerns.

The initiative comes amid growing pressure from multiple EU countries—among them France, Spain, and Greece—for stronger regulatory measures targeting social media’s impact on minors, particularly regarding addiction, cyberbullying, and online hate speech.