France Moves to Ban Social Media for Under‑15s, Setting Up a Major EU Test

For millions of French teenagers, scrolling through TikTok or Instagram before bed is as routine as brushing their teeth. From September 2026, those late-night sessions could become something very different: a service that social media companies are legally barred from offering them at all.

The French government has finalized a draft bill that would prohibit social media platforms from providing their services to anyone under 15, starting Sept. 1, 2026, in what would be one of the toughest child online-safety rules in Europe.

The text, leaked to French media and confirmed by officials on Dec. 31, would make it illegal for platforms to offer an “online social-networking service” to a minor under 15 in France. The obligation would fall on companies, not on children themselves, and would apply regardless of parental consent.

If adopted, the measure would force global platforms including TikTok, Instagram, Snapchat, Facebook, X and YouTube to either block under-15s in France or face potentially heavy fines.

“This is a public health issue, a mental health issue,” Digital Affairs Minister Anne Le Hénanff said in a television interview this month, describing the plan as the creation of a “majorité numérique à 15 ans” — a digital age of majority at 15.

The bill is due to be presented to education unions on Jan. 7, then to the Conseil d’État, France’s top administrative court, on Jan. 8 for legal review before being submitted to parliament.

From parental consent to outright ban

The proposal marks a sharp escalation from France’s existing rules.

In 2023, lawmakers passed a “digital majority” law requiring social networks to obtain explicit parental consent before allowing under-15s to create accounts. The law empowered France’s digital regulator, Arcom, to oversee age verification and set fines of up to 1% of a platform’s global turnover for violations.

That regime has never been fully implemented. An implementing decree was delayed after European Union officials warned that some provisions risked clashing with the bloc’s Digital Services Act (DSA), which sets harmonized rules for online platforms across the EU.

In practice, pre-teens and young teens have remained active on social networks. Child-protection organizations say many platforms do not meaningfully enforce their own minimum age of 13 and that children can circumvent age checks by entering a false birthdate.

A 2024 report by e-Enfance, a French child-protection group, found that a large majority of minors are exposed to harmful content online, including images of violence, self-harm and drug use, and that one in four families surveyed had faced cyberbullying. More than half of bullied children reported sleep, eating or school difficulties.

President Emmanuel Macron has repeatedly linked heavy screen use and social media to falling school performance, anxiety and depression among young people.

“The more screen time there is, the more school achievement drops,” Macron said at a public event in Saint-Malo this year. “The more screen time there is, the more mental health problems go up.” He has compared letting a child loose on social media to putting them behind the wheel of a Formula One car before they have learned to drive.

Polls suggest a broad public appetite for tougher action. A 2024 survey by Harris Interactive found that 73% of respondents supported a ban on social media access for children under 15.

What the bill would change

The draft bill is short, with just two articles.

The first would amend France’s 2004 Law on Confidence in the Digital Economy to state that, from Sept. 1, 2026, online platforms may not provide a social-networking service to any user under the age of 15.

The ban would rest on an existing legal definition of “online social network” introduced in 2023 and aligned with EU law. That definition covers services that allow users to connect, communicate, share content and discover others via feeds, recommendations or similar features — a description that clearly applies to mainstream platforms such as Facebook, Instagram, TikTok, Snapchat, X, YouTube, Reddit, Threads and Twitch.

Government officials say the measure is not intended to apply to strictly professional networks, such as LinkedIn, or to educational tools used in schools, such as digital homework platforms.

The obligation would be enforced by Arcom, which already oversees age-verification rules for pornography websites and is tasked with supervising existing social-media rules once they take effect. Companies that fail to comply could face administrative sanctions, including fines that may be modeled on the 1% of global revenue benchmark set in the 2023 law, although the exact penalty levels have not yet been made public.

The second article of the bill would extend France’s ban on mobile phones in schools, in place since 2018 from kindergarten through middle school, to high schools (lycées). Education unions have already expressed doubts about how such a restriction would be enforced on teenagers and whether teachers would be turned into “phone police” in already-strained classrooms.

How platforms would have to enforce it

The bill does not spell out in detail how platforms must verify users’ ages, a long-running technical and political challenge.

Under the shelved 2023 law, social networks were required to put in place “reliable and proportionate” age-verification mechanisms and to ensure parental consent for under-15s. Arcom was authorized to evaluate whether platforms took adequate steps, not to demand absolute accuracy.

This time, the government says it wants a framework that is compatible with the Digital Services Act. Le Hénanff has emphasized that the new text is deliberately concise and designed “to be compatible with European law, mainly the DSA.”

One emerging tool is a prototype EU age-verification application launched in 2025 in several member states, including France, which allows users to prove they are above a certain age without disclosing their identity or exact birthdate. Other methods used or tested in different countries include scanning official ID documents, AI-based age assessment via selfies, and attestations from mobile operators.
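To make the idea concrete, the sketch below is a heavily simplified, purely illustrative flow — not the EU prototype's actual protocol, and the issuer and claim format are hypothetical. It shows the core principle: a trusted issuer that has already checked a user's documents signs an "over 15" claim, and a platform can verify that signature without ever seeing a name or birthdate.

```python
# Illustrative sketch only: a minimal "age attestation" flow in the spirit of the
# EU prototype described above. Issuer, claim format and key handling are
# hypothetical; real systems add revocation, session binding and anti-replay.
from cryptography.hazmat.primitives.asymmetric import ed25519

# 1. A trusted issuer (e.g. a mobile operator or ID provider) holds a signing key.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# 2. After verifying the user's documents once, the issuer signs a claim that
#    contains only the age assertion -- no name, no birthdate, no identifier.
claim = b'{"claim": "over_15", "value": true}'
signature = issuer_key.sign(claim)

# 3. The platform checks the claim against the issuer's public key. It learns
#    that the user is over 15, and nothing else.
issuer_public_key.verify(signature, claim)  # raises InvalidSignature if forged
print("age attestation accepted")
```

Even in this reduced form, the hard problems are visible: the token must be bound to a single user and session so it cannot simply be passed to a younger sibling, which is one reason deployed systems are far more complex than the sketch suggests.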

Privacy advocates warn that robust age checks can easily slide into de facto digital ID systems and raise concerns about data security and anonymity.

A European test case, with an Australian model

France is not the first country to move toward hard age limits on social media. It is, however, the first major EU state to attempt a general ban for younger teens.

Officials in Paris have explicitly cited Australia’s Online Safety Amendment (Social Media Minimum Age) Act as an inspiration. That law, which came into force on Dec. 10, 2025, requires designated “age-restricted” social media platforms to take “reasonable steps” to ensure no one under 16 has an account, on pain of multimillion-dollar fines.

Early reports from Australia suggest that many teenagers have shifted their activity to messaging apps, gaming platforms and smaller services, and that some are bypassing controls by using VPNs or registering under false ages. Legal challenges have been mounted in Australian courts over freedom of expression and privacy.

In the European Union, the Digital Services Act already imposes obligations on very large online platforms to assess and mitigate risks to minors, including addictive design features and exposure to harmful content. The General Data Protection Regulation, or GDPR, sets a default age of 16 for a child’s consent to data processing, but allows member states to set it between 13 and 16. France has set the threshold at 15, a figure Le Hénanff uses to justify the age chosen for the proposed social-media ban.

France’s earlier attempt to tighten rules in 2023 ran into concerns in Brussels. Since then, the European Commission has issued guidance on child protection under the DSA, and French officials argue that national minimum-age laws are possible as long as they do not conflict with EU-wide platform obligations.

Whether that view is shared by EU regulators and courts is likely to be tested if the French bill becomes law and is challenged.

Supporters, skeptics and what comes next

Child-protection groups in France have generally welcomed tougher regulation. e-Enfance has long argued that platforms fail to enforce even their own minimum ages and that many children stumble into graphic or violent content within days of joining.

Families who have sued platforms, including TikTok, over alleged links between social-media use and youth suicides have also pressed for stricter limits, including outright bans for younger teens.

Technology companies have reacted more cautiously. TikTok has rejected as “misleading” a 2025 French parliamentary commission report that accused it of causing psychological harm to minors, and has pointed to its existing youth-safety tools. Meta, which owns Facebook and Instagram, has said it wants to work with regulators on “simple, effective solutions” to help parents and has promoted new teen account settings and parental controls.

Industry officials generally argue that blanket age bans are difficult to enforce, risk pushing teenagers to less moderated or offshore services, and may infringe on young people’s rights to access information and express themselves.

Digital-rights organizations have raised similar concerns. Groups such as La Quadrature du Net have previously warned that aggressive age-verification rules could erode online anonymity and create databases of sensitive identity information. They also point to international conventions that recognize minors’ rights to participate in public life, including online.

For now, the French government is betting that public concern about mental health, cyberbullying and youth violence will outweigh such objections.

If lawmakers pass the bill in the coming months and the Conseil constitutionnel, France’s top constitutional authority, lets it stand, platforms would have until Sept. 1, 2026, to design and deploy systems that reliably keep under-15s in France off their services.

That would redraw the digital landscape for a generation that has never known adolescence without social media — and test whether a national law, even in a country of 68 million, can meaningfully unplug children from an online world that does not stop at borders.

Tags: #france, #socialmedia, #childsafety, #digitalpolicy, #ageverification