
The Information Commissioner's Office (ICO) has today published an open letter to social media and video-sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children can't access services that are not designed for them.

The open letter sets out the ICO's expectation that platforms with a minimum age must move beyond relying on children to self-declare their age, a check children can easily bypass.

Instead, platforms should make use of the viable technology that is now readily available to enforce their own minimum ages and prevent underage children from accessing their services, the ICO said.

The regulator has also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X, to ask them to demonstrate how their age assurance measures meet these expectations.

Paul Arnold, ICO CEO, said: "Our message to platforms is simple: act today to keep children safe online. There's now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place.

"Platforms need to be ready to demonstrate what they're doing to keep underage children out and safeguard those children that are old enough to access their services."

This call to action forms part of the next phase of the ICO's Children's code strategy, which has already made significant progress in improving children's privacy standards across social media and video-sharing platforms. The ICO now wants companies to go further on age assurance.

Platforms must be able to tell which users are children so that those users can benefit from the protections they're entitled to.

The ICO recently fined Reddit £14.47 million and MediaLab (owner of Imgur) £247,590 for failing to implement age assurance measures and for processing children's personal information unlawfully in a way that potentially exposed children to inappropriate, harmful content.


The ICO also remains concerned about how social media and video-sharing platforms process children's data to generate recommendations, especially when this leads to harmful content or increases the risk of addiction to platforms.

In March 2025, the ICO opened an investigation into TikTok's processing of children's data in its recommender systems. In December 2025, it requested information from Meta about the processing of children's data in Instagram's recommender systems.

Protecting children online requires coordinated action across the regulatory system. The ICO continues to work closely with Ofcom, which enforces the Online Safety Act.

Both regulators will publish an updated joint statement in March 2026, outlining the main areas of interaction between online safety and data protection as they relate to age assurance.

The ICO said it also supports Ofcom's call for platforms to enforce minimum ages and make sure their algorithms are configured to prevent children from encountering harmful content.
