Ofcom Investigates Image Boards for Illegal Content Failures
Summary
Ofcom has launched an investigation into two online image boards for alleged failures to protect UK users from illegal content, including non-consensual intimate images and child sexual abuse material. This action falls under the Online Safety Act 2023, which imposes duties on user-to-user service providers.
What changed
Ofcom, the UK's communications regulator, has initiated a formal investigation into the provider of two image-based online forums suspected of failing to comply with their duties under the Online Safety Act 2023. The investigation will assess whether the provider conducted adequate risk assessments and implemented proportionate measures to prevent users from encountering priority illegal content, specifically non-consensual intimate images (NCII) and child sexual abuse material (CSAM). Ofcom has chosen not to name the specific services or their provider due to the sensitive nature of the content hosted.
This investigation marks a substantive enforcement action by Ofcom and underlines the importance of compliance with the Online Safety Act for user-to-user service providers. Regulated entities must ensure robust systems are in place to identify and mitigate risks associated with illegal content. While no specific compliance deadline or penalty is mentioned in this announcement, past enforcement actions under the Act have resulted in fines and service withdrawals. Companies operating such platforms should review their current risk assessments and mitigation measures to ensure alignment with their legal obligations.
What to do next
- Review risk assessments for illegal content, particularly NCII and CSAM.
- Evaluate and update measures to prevent users from encountering priority illegal content.
- Ensure compliance with duties under the Online Safety Act 2023 regarding user protection.
Source document (simplified)
Ofcom investigates online forums hosting image-based sexual abuse
Published: 6 March 2026
Ofcom has today launched an investigation into whether the provider of two online image boards has failed to comply with duties to protect people in the UK from illegal content.
Due to the nature of these sites, we have decided not to name them or their provider.
Tackling online harms against women and girls
It is illegal in the UK to share non-consensual intimate images (NCII) or child sexual abuse material (CSAM). Under the UK’s Online Safety Act, providers of ‘user-to-user’ services are required to assess and mitigate the risk of UK users encountering this type of content on their platforms. [1]
This is something that disproportionately impacts women and girls, and making sure sites and apps tackle this is one of Ofcom’s highest priorities. [2]
When the new duties on tech firms came into force last year, we immediately launched a programme of enforcement action against services that are used to distribute CSAM. As a result, some have deployed automated tools to detect and swiftly remove this vile content, while others have withdrawn from the UK.
In total, under the Act we have launched investigations into nearly 100 platforms – including X, when Grok was used to create and share demeaning sexual deepfakes of women and children. We have issued nearly a dozen fines for non-compliance, including against a nudification site, which has withdrawn from the UK.
We also recently announced that we will be fast-tracking our decision on proposed new requirements for tech firms to use technology to block non-consensual intimate images at source, bringing it forward to May.
New investigation into image boards
Our job is to judge whether platforms have taken appropriate steps to comply with their legal obligations – it’s not to tell platforms which specific posts or accounts to take down.
We have engaged extensively with victims, survivors and advocacy groups, and carried out an initial assessment of two sites used to facilitate image-based sexual abuse. Today, we have opened a formal investigation to establish whether the provider of these sites has failed to comply with its duties under the Act:
- to conduct a suitable and sufficient illegal content risk assessment;
- to use proportionate measures to prevent individuals encountering priority illegal content – including NCII and CSAM; [3]
- to use proportionate systems and processes to minimise the length of time priority illegal content is present;
- to swiftly take down illegal content when it becomes aware of it;
- to specify in its terms of service how individuals are to be protected from illegal content; and
- to operate content reporting and complaints procedures in relation to illegal content.
We will provide an update on this investigation as soon as possible.
Ofcom’s investigation process
The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations. [4]
Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, which will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
Enforcement powers
If our investigation finds that a company has broken the law, we can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK.
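The penalty cap described above (the greater of £18 million or 10% of qualifying worldwide revenue) works out as a simple maximum. A minimal illustrative sketch; the function name is ours, and actual penalties are determined by Ofcom case by case, not by this formula alone:

```python
# Illustrative only: the Online Safety Act caps fines at the GREATER of
# £18 million or 10% of qualifying worldwide revenue. This computes that
# statutory cap, not the fine Ofcom would actually impose.

def max_penalty(qualifying_worldwide_revenue_gbp: float) -> float:
    """Return the statutory maximum fine in GBP for a given revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# A provider with £500m qualifying worldwide revenue faces a cap of £50m;
# for smaller providers the £18m floor applies.
print(max_penalty(500_000_000))  # → 50000000.0
print(max_penalty(50_000_000))   # → 18000000
```

Note that the £18 million floor means even small or pre-revenue services face a substantial maximum penalty for non-compliance.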
UK jurisdiction
As in other industries, companies that provide an online service to people in the UK must comply with UK laws. The Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see. [5]
Notes to editors
- User-to-user services are those where people may encounter content – including images, videos, messages or comments – that has been generated, uploaded or shared by other users. See section 66D, subsections 5-7, of the Sexual Offences Act 2003, as inserted by the Online Safety Act, for the definition of intimate image abuse; Schedule 6 sets out the child sexual exploitation and abuse offences that are priority offences under the Act.
- In November, we launched new industry guidance demanding that tech firms step up to deliver a safer online experience for women and girls in the UK.
- Ofcom’s illegal harms codes of practice set out safety measures providers can implement to comply with their duties, such as: having user reporting and complaints processes for illegal content that are easy to find, access and use; adequately resourcing and training content moderation teams as appropriate to deal with illegal content; and having content moderation systems that are designed to take down illegal content swiftly when they become aware of it. The responsibility is on platforms to decide whether content is illegal, and they can use Ofcom’s Illegal Content Judgements Guidance when making these decisions.
- Our Online Safety Enforcement Guidance is published on the Ofcom website.
- More information on jurisdiction is also available there.
Related content
### Ofcom fast-tracks decision on measures to block illegal intimate images
Ofcom has announced that it will be fast-tracking its decision on proposed new requirements for tech firms to use technology to block illegal intimate images at source.
### Ofcom launches investigation into X over Grok sexualised imagery
Ofcom has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK.
### Ofcom and IWF reinforce partnership in fight against online child sexual abuse
A new agreement between Ofcom and the Internet Watch Foundation has strengthened the UK’s commitment to cracking down on online child sexual abuse imagery.