U.K. Launches Consultation on Under-16 Social Media Ban and Limits on ‘Addictive’ Features

The U.K. government has opened a national consultation on sweeping new curbs to children’s use of social media, including the option of banning under‑16s from major platforms and forcing tech companies to strip out “addictive” design features such as infinite scroll and streaks.

Launched Jan. 19 by Technology Secretary Liz Kendall and Education Secretary Bridget Phillipson, and backed by Prime Minister Keir Starmer, the review marks the clearest signal yet that Britain is willing to consider following Australia’s lead in imposing some of the world’s toughest age‑based restrictions on social media.

In a media appearance trailing the plans, Starmer said “no option is off the table” when asked whether the government would support an outright ban on under‑16s using social media.

A broader push to “reset” children’s digital lives

The consultation sits at the center of a wider package on children’s digital lives that also includes tougher guidance for schools to become effectively phone‑free and new, evidence‑based screen‑time advice for parents of 5‑ to 16‑year‑olds.

Ministers say the goal is to “reset” children’s relationship with smartphones and social platforms amid rising concern about mental health, bullying and exposure to harmful content.

“Through the Online Safety Act, this government has already taken clear, concrete steps to deliver a safer online world for our children and young people,” Kendall said in a statement announcing the review. “These laws were never meant to be the end point, and we know parents still have serious concerns. That is why I am prepared to take further action.”

What the consultation asks—and when a decision could come

The nationwide process, with events for parents, young people and civil society groups, asks whether the current minimum age to use social media is too low and what “the right minimum age” should be. While the official document does not fix a number, ministers and lawmakers have repeatedly framed the central option as an under‑16 bar.

Officials are also seeking views on:

  • Tightening age assurance so any legal age limits can be enforced.
  • Removing or limiting features that “drive addictive or compulsive use,” specifically naming infinite scrolling feeds and streak‑based engagement tools.

The government has promised to publish its response in summer 2026.

Political pressure—and competing views on enforcement

The move comes amid mounting pressure on Starmer from within Labour and across Parliament. More than 60 Labour MPs signed a letter urging him to back a legal ban on social media for under‑16s, arguing Britain risks being “left behind” as other countries move faster.

Separately, an amendment to the Children’s Wellbeing and Schools Bill tabled in the House of Lords by former Conservative schools minister Lord Nash would require platforms to prevent under‑16s from using social media within one year of the bill becoming law. The proposal has cross‑party backing, including from Labour and Liberal Democrat peers.

Supporters of a ban point to bereaved families who say harmful online content contributed to their children’s deaths. Esther Ghey, the mother of murdered teenager Brianna Ghey, and Ian Russell, whose daughter Molly died by suicide after viewing self‑harm material online, have both urged ministers to go further in restricting children’s access and holding platforms to account.

At the same time, the government faces questions over how any age limit could realistically be enforced and what it would mean for privacy and free expression.

Age checks could expand far beyond adult content

Under the Online Safety Act—which received royal assent in 2023 and is being phased in through 2026—platforms that host pornography and other high‑risk content already have to put strict age verification in place, including identity checks and facial age estimation.

Kendall’s department has framed that system as a model for wider age assurance, saying millions of users now encounter age checks when accessing adult sites.

Extending similar checks to general social media, however, would dramatically expand their reach. Civil‑liberties campaigners argue it would amount to near‑universal age‑gating of major parts of the internet, forcing teenagers and adults alike to prove how old they are to maintain access.

The Open Rights Group, a U.K. digital‑rights organization, called the proposed under‑16 ban “a damaging and ineffective response,” saying platforms have “no way to enforce age‑based access without forcing users to prove their age.” The group warned that expanded age checks could drive the collection of sensitive data such as ID documents and biometric scans and risk excluding young people—particularly those from marginalized groups—from online discussion and support networks.

Health and children’s organizations are more divided than the digital‑rights groups: most support tougher regulation while warning against the unintended harms of a blanket ban.

Barnardo’s chief executive Lynn Perry said the charity welcomed the consultation, noting that “while social media can offer connection, community and learning, it is clear that platforms are failing to protect children from harm.” She urged ministers to act quickly, warning that “the flood of harmful content cannot be allowed to continue unchecked.”

The Royal College of General Practitioners described the review as a “welcome step forward,” with chair Professor Victoria Tzortziou Brown saying family doctors were seeing the impact of digital harms on children’s “mental health, sleep, neurodevelopment, behaviour, social relationships and family functioning.”

By contrast, the Mental Health Foundation cautioned that a ban “has potential but also carries risks.” Chief executive Mark Rowland said such a measure could “block vulnerable children from seeking help and accessing supportive online communities, or drive them to riskier online spaces.”

Australia’s example—and unresolved definitions

International precedents loom large. In December 2025, Australia brought into force an amendment to its Online Safety Act that effectively bans minors under 16 from holding accounts on most major social‑media platforms, including Facebook, Instagram, TikTok and Snapchat.

Companies must take “reasonable steps” to keep under‑16s off their services or face fines that can reach tens of millions of Australian dollars.

Australian authorities say about 4.7 million child accounts have been deactivated, removed or restricted since the law took effect. But platforms have told Australian lawmakers the rules are difficult to enforce, and campaigners there have lodged a High Court challenge on privacy and free‑speech grounds.

U.K. ministers say they will study the Australian model closely. Any decision to follow it would force the U.K. to confront questions about what counts as “social media”—including whether gaming chats, encrypted messaging apps and community functions on sites such as YouTube or Wikipedia would be included—and how far to push age checks.

Targeting platform design—and phone-free schools

Alongside age limits, the British consultation raises the possibility of targeting platform design. Lawmakers in several U.S. states have sought to curb “addictive” feeds and late‑night notifications for minors, including laws in New York and California that restrict algorithmic recommendations and push alerts to minors without parental consent. Some of those measures have been challenged in U.S. courts on free‑speech grounds.

Britain already has an Age‑Appropriate Design Code, enforced by the Information Commissioner’s Office, requiring services likely to be used by children to default to high privacy and limit data collection. New curbs on infinite scroll and streaks would go further by directly intervening in core engagement mechanisms used by platforms worldwide.

In schools, the government is moving more quickly. Updated guidance says pupils should not have access to phones during lessons, breaks or lunchtimes, and instructs Ofsted inspectors to examine whether schools are enforcing phone‑free policies.

“My message to headteachers is you now have all the backing—and the backing of my inspectors—to ban mobile phones in schools immediately,” Ofsted chief inspector Sir Martyn Oliver said. “They chip away at children’s attention span, distract from learning and can be detrimental to children’s wellbeing.”

Evidence remains contested

The evidence base for sweeping restrictions remains disputed. Some studies have linked heavy social‑media use to poorer mental health outcomes among adolescents, while other large‑scale research has found no simple, direct relationship between time spent online and increases in anxiety or depression.

A major U.K. trial under way in Bradford is testing whether limiting secondary school pupils’ access to TikTok, Instagram and other platforms to one hour per day, with a night‑time curfew, improves wellbeing. Results are not expected until 2027.

For now, ministers insist the consultation is an open process rather than a prelude to a predetermined ban. The outcome—and how far the government decides to go on age limits and design changes—will shape how millions of British children experience the internet and how far the state is willing to intervene in their digital lives.

Tags: #uk, #socialmedia, #onlinesafety, #children, #ageverification