
Platform Risk Assessment Requirements Under Online Safety Act

Ofcom News Centre
Published April 1st, 2026
Detected April 4th, 2026

Summary

Ofcom has issued legally binding notices to more than 40 platforms requesting over 70 risk assessments under the UK's Online Safety Act. 30 providers covering 43 services must submit their Year 2 illegal harms risk assessments and children's risk assessments by 31 July 2026. The regulator reviewed over 100 risk assessments in 2025, raising serious concerns with 11 platforms that subsequently submitted revised versions.

What changed

Ofcom is requiring platforms to conduct and submit Year 2 risk assessments covering illegal content harms and child safety risks under the Online Safety Act. The formal information requests have been issued to 30 providers operating 43 services, with a compliance deadline of 31 July 2026. Ofcom will use responses to identify gaps in risk assessments and assess whether platforms have put appropriate safety measures in place. Failure to provide sufficient responses could result in enforcement action, including fines.

Platforms subject to these notices must prepare and submit their illegal harms risk assessments and children's risk assessments by 31 July 2026. Services likely to be accessed by children must additionally assess and mitigate risks of under-18s being exposed to harmful material. Later in 2026, 'categorised' services will be required to publish summaries of their risk assessments. Ofcom's enforcement team previously secured material improvements from Snapchat in response to concerns about its Year 1 risk assessment.

What to do next

  1. Submit Year 2 illegal harms risk assessment to Ofcom by 31 July 2026
  2. Submit children's risk assessment to Ofcom by 31 July 2026 if service is likely to be accessed by children
  3. Review and update risk assessments before making significant changes to service design or operation

Penalties

Failure to provide sufficient responses on time could result in enforcement action, including fines.

Source document (simplified)



Pressing platforms to prioritise safety by design: Scrutinising Year 2 risk assessments



Published: 1 April 2026

Dozens of tech firms have been told to submit their latest risk assessments to Ofcom by summer, as the regulator keeps up pressure on platforms to put safety by design front and centre of their operating models.

The online safety watchdog has today issued legally binding notices to more than 40 of the largest and riskiest sites and apps in the world, formally requesting more than 70 risk assessments from them. Failure to provide a sufficient response, on time, could result in enforcement action.

Assessing risk is central to safety by design

Risk assessments are fundamental to keeping users safer online. In order to put in place appropriate safety measures to protect people, especially children, providers must first understand how harm could take place on their platforms, and how their features and functionalities could increase those risks of harm.

Under the UK’s Online Safety Act, tech firms must assess and mitigate the risk of people in the UK encountering illegal content, and platforms likely to be accessed by children must also assess and mitigate the risk of under-18s being exposed to certain types of harmful material.

Providers should review their risk assessments at least once a year, and must update them before making any significant change to their service’s design or operation, or if Ofcom makes any significant change to its assessment of risks.

Later this year, ‘categorised’ services – which we expect to include some of the most widely-used social media and search services – will have to publish summaries of their risk assessments, forcing them to be transparent about their view of the risks they pose.

Holding platforms to account

Part of Ofcom’s job is to make sure firms carry out suitable and sufficient risk assessments.

To monitor industry compliance with their duties, we routinely issue formal information requests. Firms are required, by law, to respond to all such requests from Ofcom in an accurate, complete and timely way. We have issued several fines for failures to do this, and taken action regarding the suitability of platforms’ risk assessments.

In 2025, we requested providers’ first risk assessments. We reviewed more than 100 of these from a range of large and small services, spanning over 10,000 pages. We told 11 platforms that we had serious concerns with their risk assessments, and all submitted revised versions or supplementary information.

This included Snapchat materially improving its illegal content risk assessment, in direct response to action from our enforcement team. As a result, it must put in place a broad range of safety measures, commensurate with the risks to UK users that it has identified.

What happens next

We have issued formal information requests to 30 providers, covering 43 services, which have until 31 July to submit their Year 2 illegal harms risk assessments and children’s risk assessments to us.

We will use the responses we receive to identify gaps in risk assessments and drive improvements.

Related content

### Protecting people online from self-harm content and cyberflashing

People in the UK will be better protected online from illegal self-harm material and unsolicited nude images, under new proposals published today by Ofcom.

### 4chan fined £450,000 for not protecting children from online pornography

Ofcom has fined 4chan £450,000 for not having age checks in place to prevent children from seeing pornography on its site.

### Ofcom investigates online forums hosting image-based sexual abuse

Ofcom has today launched an investigation into whether the provider of two online image boards has failed to comply with duties to protect people in the UK from illegal content.

Named provisions

Risk Assessment Duties under the Online Safety Act
Illegal Content Duties
Children's Risk Assessments

Source

Analysis generated by AI. Source diff and links are from the original.

Classification

Agency
Ofcom
Published
April 1st, 2026
Compliance deadline
July 31st, 2026 (118 days)
Instrument
Notice
Legal weight
Binding
Stage
Final
Change scope
Substantive
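The "(118 days)" figure in the compliance deadline field can be reproduced with a short calculation. Assuming the count runs from the detection date (4 April 2026) rather than the publication date to the 31 July 2026 deadline, a minimal sketch:

```python
from datetime import date

# Hypothetical reconstruction of the countdown shown in the deadline field:
# days from the detection date (4 April 2026) to the compliance
# deadline (31 July 2026).
deadline = date(2026, 7, 31)
detected = date(2026, 4, 4)
print((deadline - detected).days)  # → 118
```

Counting from the publication date (1 April 2026) would instead give 121 days, which is why the detection date is the assumed start point here.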

Who this affects

Applies to
Technology companies
Industry sector
5112 Software & Technology
Activity scope
Online Safety Compliance, Content Moderation, Child Safety
Threshold
Services likely to be accessed by children must assess child safety risks; providers must update risk assessments annually or before significant service changes
Geographic scope
United Kingdom (GB)

Taxonomy

Primary area
Consumer Protection
Operational domain
Compliance
Topics
Data Privacy, Cybersecurity
