Ofcom Proposes Online Safety Act Updates for Self-Harm and Cyberflashing
Summary
Ofcom is consulting on updates to its codes of practice and guidance to reflect two new priority offences under the UK's Online Safety Act: cyberflashing and encouraging or assisting serious self-harm. If adopted, the changes would require tech firms to assess the risk of these harms on their services and put protective measures in place for users.
What changed
Ofcom has published a consultation proposing updates to its codes of practice and guidance to incorporate two new priority offences under the UK's Online Safety Act: cyberflashing and encouraging or assisting serious self-harm. If adopted, the updates would require online platforms to assess the risks these offences pose on their services and to implement appropriate mitigation measures, including content moderation, reporting mechanisms, and user controls.
The consultation seeks feedback on applying existing code measures to the new offences, including easy-to-find and easy-to-use reporting processes, adequately resourced and trained content moderation, swift takedown of illegal content, algorithm testing, user blocking and muting features, and crisis prevention information in response to self-harm-related search queries. Responses are due by 5pm on 24 April 2026, with final decisions expected in summer 2026.
What to do next
- Review proposed updates to Ofcom's codes of practice and guidance.
- Submit feedback on the proposed changes by 5pm on 24 April 2026.
Source document (simplified)
Protecting people online from self-harm content and cyberflashing
Published: 24 March 2026
People in the UK will be better protected online from illegal self-harm material and unsolicited nude images, under new proposals published today by Ofcom.
The regulator is consulting on updates to its codes of practice and guidance to reflect the Government’s recent creation of new priority offences under the UK’s Online Safety Act.
Duties on platforms
The Act lists over 130 priority offences. Under the Act, tech firms must assess the risk of these offences occurring on their sites and apps, put appropriate measures in place to mitigate the risk of them occurring, and take down priority illegal content quickly when they become aware of it.
Ofcom’s codes of practice and guidance set out ways platforms can comply with these duties.
New priority offences
In December 2025, the Government added cyberflashing and encouraging or assisting serious self-harm to the list of priority offences in the Act. To reflect this change in the law, we are consulting on updates to our Risk Assessment Guidance, Risk Profiles, Register of Risks, Illegal Content Judgements Guidance and Illegal Content Codes of Practice.
This means that providers will have to assess the risk of unsolicited nude images and illegal self-harm content appearing on their services. They will also have to take appropriate safety measures to protect users from these harms. We are proposing that various existing measures in our codes should apply to these offences, including:
- allowing users to report illegal content through reporting and complaints processes that are easy to find, access and use;
- making sure content moderation functions are appropriately resourced and individuals working in moderation are trained to identify illegal content;
- having content moderation systems and processes designed to take down illegal content swiftly when a platform becomes aware of it;
- when testing their algorithms, checking whether and how design changes impact the risk of illegal content being recommended to users;
- enabling users to block or mute other users and disable comments on their content;
- providing crisis prevention information in response to search queries regarding self-harm; and
- enabling users to easily report predictive search suggestions they believe may direct people towards priority illegal content.
Next steps
We are inviting responses to our consultation by 5pm on Friday 24 April 2026. We will take all feedback into account before making our final decisions, which we expect to publish in summer 2026.