ICO Open Letter to Tech Firms on Age Checks and Child Data Protection
Summary
The UK's Information Commissioner's Office (ICO) has issued an open letter to social media and video-sharing platforms, urging them to strengthen age assurance measures to prevent underage children from accessing services. The ICO expects platforms to move beyond self-declaration and utilize available technology to enforce minimum age requirements.
What changed
The ICO has published an open letter to technology firms, specifically social media and video-sharing platforms operating in the UK, calling for enhanced age assurance measures. The letter emphasizes the need to move beyond self-declaration of age, which can be easily bypassed by children, and to implement viable technological solutions to enforce minimum age limits. This initiative is part of the ICO's Children's Code strategy and follows recent fines issued to Reddit and MediaLab for data protection failures related to children's privacy.
Platforms are expected to demonstrate how their age assurance measures meet these expectations. The ICO is also concerned about the processing of children's data for recommendation systems, which can lead to exposure to harmful content or addiction. Companies should review and strengthen their age verification processes to comply with data protection regulations and safeguard children online. Failure to do so may result in further enforcement action, as evidenced by recent fines.
What to do next
- Review and strengthen age assurance measures to prevent underage access to services.
- Implement technological solutions beyond self-declaration for age verification.
- Be prepared to demonstrate compliance with age assurance expectations to the ICO.
Source document (simplified)
Open letter issued to tech firms to strengthen age checks and protect children’s data
- Date: 12 March 2026
- Type: News
We have today published an open letter to social media and video‑sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children can’t access services that are not designed for them.
The open letter sets out our expectations that platforms with a minimum age must move beyond relying on children to self-declare their ages, which they can easily bypass.
Instead, platforms should make use of the viable technology that is now readily available to enforce their own minimum ages and prevent these children from accessing their services.
We have also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X to ask them to demonstrate how their age assurance measures meet these expectations.
“Our message to platforms is simple: act today to keep children safe online. There’s now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place.
“Platforms need to be ready to demonstrate what they’re doing to keep underage children out and safeguard those children that are old enough to access their services.”
- Paul Arnold, ICO Chief Executive Officer
This call to action forms part of the next phase of our Children’s code strategy, which has already made significant progress in improving children’s privacy standards across social media and video-sharing platforms, but we want companies to go further on age assurance. Platforms must be able to tell which users are children so they can benefit from the protections they’re entitled to.
We recently fined Reddit £14.47 million and MediaLab (owner of Imgur) £247,590 for failing to implement age‑assurance measures and for processing children’s personal information unlawfully in a way that potentially exposed children to inappropriate, harmful content.
We also remain concerned about how social media and video‑sharing platforms process children’s data to generate recommendations, especially when this leads to harmful content or increases the risk of addiction to platforms. In March 2025, we opened an investigation into TikTok’s processing of children’s data in its recommender systems. In December 2025, we requested information from Meta about the processing of children’s data on Instagram’s recommender systems.
Protecting children online requires coordinated action across the regulatory system. We continue to work closely with Ofcom, which enforces the Online Safety Act.
Both regulators will publish an updated joint statement in March 2026, which outlines the main areas of interaction between online safety and data protection as they relate to age assurance.
We also support Ofcom’s call today for platforms to enforce minimum ages and make sure their algorithms are configured to prevent children from encountering harmful content.
Notes to editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
- The ICO can take action to address and change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
- To report a concern to the ICO telephone our helpline 0303 123 1113 or go to ico.org.uk/concerns.