
Ofcom Research on Online Child Abuse and Exploitation

Ofcom News Centre
Published March 18th, 2026
Detected March 19th, 2026

Summary

Ofcom has published new global research on online child abuse and exploitation, conducted by Protect Children. The findings provide insights into offender behavior, access methods for CSAM, and the impact of platform design features, aiming to inform regulatory efforts.

What changed

Ofcom, the UK's online safety regulator, has released findings from a global research report on online child sexual abuse and exploitation. The research, conducted by Protect Children, surveyed offenders and reveals that early exposure to pornography and child sexual abuse material (CSAM) is a significant risk factor, with many perpetrators accessing CSAM across multiple online environments, including both the dark and open web. The study also highlights that design features such as age checks influence offenders' platform choices, and that barriers to AI-generated CSAM are low, with a notable percentage of respondents having viewed or created it.

This research is intended to equip Ofcom with better evidence and insights to inform its work in protecting children online. While this is a research publication and not a direct regulatory mandate, it signals Ofcom's focus on understanding offender behavior to design targeted safety measures. Companies operating online platforms should be aware of these findings, particularly regarding platform design, anonymity, and the emerging threat of AI-generated CSAM, as this evidence will likely inform future regulatory expectations and enforcement priorities under the UK's online safety framework.

What to do next

  1. Review Ofcom's research findings on online child abuse and exploitation.
  2. Assess platform design features for potential impact on offender behavior.
  3. Evaluate measures for detecting and mitigating AI-generated CSAM.

Source document (simplified)



New research to support global efforts to tackle child sexual abuse and exploitation online



Published: 18 March 2026

Marking National Child Exploitation Awareness Day (18 March 2026), Ofcom has today published the findings from a new global research report into online child abuse and exploitation, carried out by Protect Children – a leading NGO in advocating for the right of the child to be free from sexual violence.

Each year, millions of images and videos depicting child sexual abuse circulate online, causing profound and long-lasting harm to victims and survivors. Despite efforts to tackle the threat, the scale and accessibility of child sexual abuse material (CSAM) online continues to pose a major global challenge.

Given the enormity and urgency of tackling these appalling online crimes, Ofcom, as the UK’s online safety regulator, must equip itself with the best possible evidence, intelligence and insights to inform our work.

First and foremost, that means listening to victims and survivors of harm, whose brave testimonies are invaluable in informing our policies and approach. It also means commissioning and carrying out research to explore how harms manifest online. That includes understanding how offenders behave and operate online, so we can design targeted safety measures to protect users, reduce risks and identify early opportunities for prevention of harm.

What Protect Children found

Ofcom identified a notable gap in research about how active offenders are using online services to exploit children. To develop the strongest possible evidence base, we commissioned Protect Children, a global leader in this particular field of research, to carry out an anonymous survey among offenders who have used known keyword terms to search for CSAM on the dark web.

Protect Children’s findings offer a rare and direct insight into CSAM perpetrator behaviour, attitudes, and use of technology in their offending. In summary, Protect Children found that:

  • Early exposure to pornography and CSAM content is a major risk factor. Many respondents say they were children when they were first exposed to both pornography and CSAM. By age 18, two in three respondents (65%) had seen pornography, and three in five (59%) had seen CSAM. Around a quarter (24%) of respondents first encountered CSAM accidentally, without searching for it.
  • Perpetrators access CSAM across multiple online environments. Respondents report preferences for using dark web (63%) and open web (61%) platforms at similar levels to search for CSAM. A third (33%) felt CSAM has become harder to access, particularly in the past five years, due to site shutdowns, moderation, policing, and paywalls, although 44% perceived no change, and 23% believed access had become easier.
  • Design features shape offenders’ platform choices. Many perpetrators avoid platforms with age checks or strict sign-up requirements, and actively choose platforms where they can maintain anonymity.
  • Barriers to AI-generated CSAM are low. Nearly three in ten respondents (29%) report that they have viewed AI-generated CSAM, while one in ten (10%) have created it. Perpetrators report that, with minimal effort, they can access AI tools and experiment with prompts to rapidly produce harmful content. AI-CSAM is not only being generated individually, but is also being commissioned, exchanged, and monetised through interactions between users, creating strong incentives for production and circulation.
  • Well-designed deterrence messages can reduce engagement with CSAM. One in three respondents (34%) recall encountering a warning message when searching for CSAM. Although many respondents report being indifferent to the messages or ignoring them, a significant number said the messages prompted them to reflect on or change their behaviour. One in five respondents (19%) reported having been sanctioned or banned from a platform.

About the report

Protect Children’s report captures 20,592 survey responses. Participants took part voluntarily and did not receive any payment. The survey was also offered in multiple languages to ensure global reach. At the end of the survey, participants were signposted to perpetration prevention resources; more than 2,200 respondents clicked through to the ReDirection programme, while many other respondents said that the survey prompted them to reconsider their behaviour or motivated them to seek support.

Protect Children’s survey was developed in association with a wide range of law enforcement agencies, civil society, academics, policymakers, regulators, and other expert organisations working to prevent child sexual abuse and exploitation. They include the Canadian Centre for Child Protection (C3P), the Joint Research Centre of the European Commission, and the Moore Centre for the Prevention of Child Sexual Abuse.

As the research was conducted in a global and borderless online context, it did not assess the effectiveness of any single national, legal or regulatory online safety regime.

How Ofcom is tackling CSAM

Tackling the spread of CSAM online is one of Ofcom’s top priorities, under the Online Safety Act.

During the last year, we’ve set clear expectations through our industry Codes on what online services need to do to protect children from sexual abuse and exploitation. The Act requires that services reduce the risk of child sexual abuse material appearing on their sites, and they must act quickly to remove it when they become aware of it. That includes using automated technology – known as hash matching – to detect known child sexual abuse images, and URL detection to identify links that contain known CSAM content. In addition, our Codes recommended applying warning messages on search services when users search for content that explicitly relates to CSAM.
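In broad terms, hash matching of this kind works by computing a digest of each uploaded file and checking it against a list of digests for known illegal images supplied by hash-list providers. A minimal sketch, assuming exact (cryptographic) matching with SHA-256; the hash set and function names here are illustrative, not any platform's actual implementation:

```python
import hashlib

# Illustrative hash list. In practice, digests of known images are
# supplied by hash-list providers such as the IWF, not hard-coded.
KNOWN_HASHES = {
    "0" * 64,  # placeholder digest
}

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """True if the file's digest appears in the known-hash list."""
    return sha256_of(data) in KNOWN_HASHES
```

Exact matching like this only catches byte-identical copies; a re-encoded or resized image produces a completely different digest, which is why services increasingly pair it with perceptual hashing.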

Our Protection of Children Code measures are also designed to ensure that services put in place robust age-checks to prevent children from accessing harmful content, such as pornography, which we know can act as a pathway to viewing further extreme material.

In addition, we recently consulted on additional safety measures, which we are now in the process of finalising. These include new proposals to: ban users who share child sexual exploitation or abuse content; improve detection of previously unhashed CSAM; protect children in livestreamed environments; and boost existing anti-grooming measures by requiring services to use highly effective age checks, so that children cannot be contacted online by adult strangers.

We have also taken enforcement action against some of the worst offending filesharing and storage services. Both 1fichier.com and gofile.io – regularly highlighted in our analysis for hosting significant amounts of CSAM content – have agreed to strengthen their protections by implementing perceptual hash matching technology.
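Perceptual hash matching differs from exact cryptographic matching in that it tolerates small alterations such as resizing or re-encoding: two similar images produce hashes that differ in only a few bits. A simplified sketch of the comparison step, assuming 64-bit perceptual hashes have already been computed; the threshold value is illustrative:

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(h1 ^ h2).count("1")

def perceptual_match(h1: int, h2: int, threshold: int = 10) -> bool:
    """Treat two images as matching if their hashes differ in at most
    `threshold` bits, so near-duplicates survive minor edits."""
    return hamming_distance(h1, h2) <= threshold
```

The threshold trades recall against false positives: too low and edited copies slip through, too high and unrelated images collide, which is why deployed systems tune it against labelled data.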

Almudena Lara, Online Safety Policy Development Director, Ofcom said:

“Preventing the abuse of children and the creation and sharing of child sexual abuse material is a top priority. Our work is rooted in the devastating impact this crime has on victims and survivors, whose experiences continually reinforce the urgency of tackling this harm.

“Working closely with partners both at home and internationally, we know that preventing this abuse requires a deep understanding of the motivations of perpetrators and the ways technology can be exploited to enable these crimes.

“This research will help inform and strengthen the global effort to protect children online. Given the scale of the challenge, we must equip ourselves with the best possible evidence, intelligence, and insights to guide our work.”

Jess Phillips MP, Minister for Safeguarding and Violence Against Women and Girls, said:

“As technology evolves, so do the risks and this government is taking swift action to protect children from sexual abuse and exploitation online.

“The UK is proud to be leading the global crackdown on this vile trade. Soon, anyone who possesses, creates or shares tools for generating child sexual abuse material, publishes guidance on how legitimate technologies can be twisted to this purpose, or operates platforms that spread this filth will face tough prison sentences.

“I would like to thank Protect Children and Ofcom for this vital research. Such insight is another lever at our disposal to prevent harm online and offline, support victims and survivors, and put predators behind bars.”

Kerry Smith, CEO of the Internet Watch Foundation, said:

“What is clearly shown by this report is just how dangerous exposure, even accidentally, to child sexual abuse material can be, and how the availability of such material puts children at greater and greater risk both on and offline.

“And things are getting worse. The wide availability of AI tools, and the ease with which they are being abused is creating a world where more children will face greater threats than ever before.

“We cannot ignore how AI child sexual abuse material reinforces sexual interest in children, contributes to the normalisation of violent abuse, and may increase the risk of contact offending. Tech companies must make sure new tools and platforms are built with safety at their core. And Governments and regulators must now act to enforce this principle.”

Mark Bevan, Head of Prevention of Crimes Against Children Unit, INTERPOL said:

“INTERPOL fully supports the publication of this report which provides an unprecedented insight into the psyche, offending patterns and behaviours of perpetrators who target children. The data will help international law enforcement agencies around the globe to combat child sexual abuse and exploitation.”

ENDS

Related content

### Remarks by Melanie Dawes, Ofcom Chief Executive, at the NSPCC

Remarks by Melanie Dawes, Ofcom Chief Executive, at the NSPCC, 12 March 2026.

### Keep underage children off your platforms, Ofcom tells tech firms

Major sites and apps must enforce their minimum age rules with highly effective age checks, Ofcom warns, as the online safety regulator examines continued failings by services most popular among children.

### Ofcom fines porn company £1.35m for not having age checks

Ofcom has fined porn company 8579 LLC £1.35m for not having age checks in place, plus £50,000 for failing to respond to an information request.

Source

Analysis generated by AI. Source diff and links are from the original.

Classification

Agency
Ofcom
Published
March 18th, 2026
Instrument
Guidance
Legal weight
Non-binding
Stage
Final
Change scope
Substantive

Who this affects

Applies to
Technology companies
Geographic scope
United Kingdom (GB)

Taxonomy

Primary area
Public Health
Operational domain
Compliance
Topics
Online Safety Child Protection Cybercrime
