Changeflow GovPing · Telecom & Technology

Letters of Caution to X and TikTok for Online Safety Failures

Source: Singapore IMDA Press Releases (www.imda.gov.sg)
Filed March 31st, 2026
Detected March 31st, 2026

Summary

IMDA issued Letters of Caution to X and TikTok placing both platforms under Enhanced Supervision after finding serious weaknesses in their content moderation. X saw a 120% increase in CSEM cases with Singapore nexus (33 to 73 cases), while TikTok had 17 terrorism content cases from Singapore-based accounts in 2025. Both platforms must demonstrate effectiveness of rectification measures by 30 June 2026.

What changed

IMDA's second Online Safety Assessment Report 2025 revealed critical failures by X and TikTok in proactively detecting harmful content. For X, the 73 detected CSEM cases all violated X's own policies and were only removed after IMDA flagged them—despite IMDA sharing analysis of indicators in 2024. For TikTok, the 17 terrorism content cases (edited footage of transnational terrorist organizations) were initially deemed non-violating when user-reported through TikTok's own mechanism, demonstrating systematic assessment failures.

Both platforms are now under Enhanced Supervision and must provide supporting data to demonstrate rectification effectiveness by 30 June 2026. They must regularly account for progress until IMDA is satisfied issues are resolved. This enforcement action applies to all Designated Social Media Services under the SMS Code, which requires proactive detection and swift removal of CSEM and terrorism content before user exposure.

What to do next

  1. Submit supporting data and information demonstrating effectiveness of CSEM/terrorism content rectification measures by 30 June 2026
  2. Implement robust proactive detection technologies for CSEM content with Singapore nexus
  3. Establish accurate terrorism content assessment processes for user-reported content

Penalties

Continued Enhanced Supervision until IMDA is satisfied with rectification; potential further enforcement action for non-compliance

Source document (simplified)



IMDA Issues Letters of Caution to X and TikTok for Serious Weaknesses in their Measures to Detect and Remove CSEM and Terrorism Content Respectively

31 MAR 2026
- Both platforms have been placed under Enhanced Supervision and must regularly account for progress in implementing rectification measures until IMDA is satisfied that issues have been resolved
- They are also required to provide supporting data and information to demonstrate effectiveness of their rectification measures by 30 June 2026
SINGAPORE – 31 MAR 2026

  1. The Infocomm Media Development Authority (IMDA) has issued Letters of Caution to X and TikTok for serious weaknesses in their measures to proactively detect and remove child sexual exploitation and abuse material (CSEM) and terrorism content respectively. They have also been placed under Enhanced Supervision. IMDA found a 120% increase in cases of CSEM on X originating from or targeting Singapore users, up from 33 cases in 2024 to 73 cases in 2025. On TikTok, IMDA found 17 cases of terrorism content shared by Singapore-based accounts for the first time in 2025. CSEM and terrorism content are egregious harms, and the Code of Practice for Online Safety – Social Media Services (the “SMS Code”) requires designated Social Media Services (DSMSs) to proactively detect and swiftly remove CSEM and terrorism content, through the use of technologies and processes, before users encounter such content.

  2. These findings are part of the second Online Safety Assessment Report (the “Report”) 2025 on DSMSs, which assesses the presence, comprehensiveness and effectiveness of the online safety measures implemented by DSMSs to mitigate risks from harmful content, as required by the SMS Code. The inaugural Report published last year assessed that DSMSs had put in place the baseline safety measures. This second Report builds upon this baseline and highlights the areas of weakness DSMSs need to address, as well as improvements they have made over the past year. The Report allows users, including parents, to make informed decisions for themselves and their children about the risks and available safety measures on the various DSMSs.

X and TikTok placed under Enhanced Supervision

  1. As part of its assessment for the Report, IMDA conducted tests for CSEM and terrorism content to obtain an indicative sample of such content across the DSMSs. The 73 CSEM cases detected on X all had a Singapore nexus, and involved content sharing or linking to CSEM, as well as self-generated CSEM. This occurred despite IMDA sharing its analysis of the CSEM cases and their indicators with X in 2024. All 73 cases also violated X’s own policies against CSEM and were only removed by X when IMDA flagged the cases to X.

  2. IMDA detected 17 cases of terrorism content shared by Singapore-based accounts on TikTok. These cases primarily comprised videos with edited footage or audio related to known transnational terrorist organisations. In addition, when some of these were reported to TikTok via its in-app user reporting mechanism, TikTok found that the content did not violate its community guidelines. This demonstrated that TikTok did not accurately assess the terrorism content when it was reported by users. TikTok only removed the content when IMDA flagged the cases to it.

  3. Both X and TikTok have accepted IMDA’s findings and committed to putting in place specific measures to rectify these serious weaknesses, in particular enhancing their automated detection systems through the use of AI and the incorporation of additional signals to improve their proactive detection of CSEM and terrorism content respectively. To ensure accountability for the effective implementation of these rectification measures, IMDA has issued Letters of Caution to X and TikTok, placing both services under Enhanced Supervision and requiring them to:

  • Provide regular updates to IMDA on their progress in implementing the rectification measures they have committed to, until IMDA is satisfied that the issues are adequately resolved.
  • Provide supporting data and information to IMDA in their next annual online safety report, due on 30 June 2026, to demonstrate the effectiveness of their implementation of the rectification measures.

Should X or TikTok fail to satisfy IMDA that they have improved the effectiveness of their measures to address the specific types of CSEM and terrorism content that IMDA has detected, IMDA will not hesitate to explore further options, including potential regulatory action under the Broadcasting Act.

More can be done by DSMSs to improve child safety measures

  1. While the 2025 Report notes improvement in some areas highlighted in the 2024 Report, IMDA also identified areas of weakness that the DSMSs will need to account for. IMDA urges the DSMSs to continue strengthening these measures.

Overview of DSMSs' Online Safety Ratings

  1. In particular, more can be done by some DSMSs to improve their safety measures for all users and children, their user reporting and resolution mechanisms, as well as their data accountability. Facebook, YouTube and HardwareZone were found to have weaknesses in the effectiveness of their child safety measures, which could lead children to easily access age-inappropriate content. The comprehensiveness of child safety measures across different DSMSs also varied greatly. Instagram and TikTok reported the most comprehensive child safety measures, while HardwareZone and X only had a few baseline measures. Given the rapidly evolving online safety risk landscape, especially for children, DSMSs must continue to prioritise enhancing the comprehensiveness and effectiveness of their measures to minimise children’s exposure to harmful and age-inappropriate content.

  2. Most DSMSs improved the effectiveness and timeliness of their responses to user reports. All DSMSs, except TikTok, took action on a greater proportion of legitimate user reports on content that violated their own community guidelines in 2025. Their action rates ranged from 54% to 93% in 2025, compared to approximately 50% or less in 2024. TikTok was the only DSMS whose user reporting mechanisms declined in effectiveness, with its action rate for legitimate user reports falling from 39% in 2024 to 25% in 2025. All DSMSs also reduced the time taken to act on such user reports.

Overview of DSMSs’ Action Rates on Legitimate User Reports

Online safety of users, especially children, of utmost priority

  1. IMDA’s main priority as Singapore’s online safety regulator is to ensure a safe online environment for users in Singapore and to protect children, in particular, from harmful content. Throughout the year, IMDA has engaged the DSMSs on weaknesses in their online safety measures, flagged harmful content, and raised concerns when risks were detected. While IMDA adopts a collaborative approach to engage with the DSMSs, we will hold the DSMSs accountable when we assess that their online safety measures do not adequately achieve the outcomes of the Code. Under the Broadcasting Act, IMDA also has powers to direct social media services to block access to egregious content found on their services. Our overriding objective remains to ensure the online safety of Singapore users, especially children.

Next steps for online safety in Singapore

  1. All DSMSs will need to provide IMDA with updates on the steps taken to improve on their areas of weakness in their next annual online safety report. At the same time, IMDA will continue to engage the DSMSs regularly throughout the year to highlight emerging online safety risks and ensure the DSMSs have the required measures in place to protect Singapore users.

  2. Online safety risks continue to evolve, as technology has made it easier to create and disseminate harmful content. DSMSs will have to remain vigilant and continually improve the effectiveness of their online safety measures, especially for children.

  3. IMDA is constantly monitoring the rapidly evolving online safety risk landscape and reviewing the relevance of its regulations including the SMS Code. In 2025, IMDA made it a requirement for Designated App Distribution Services to implement age assurance measures to ensure children do not download apps that are inappropriate for their age. To ensure that online safety measures are effectively and accurately applied to children, IMDA plans to extend age assurance requirements to DSMSs. We are also studying how online safety requirements for children can be further enhanced. IMDA is currently in discussions with DSMSs and more details will be announced later this year.

  4. IMDA’s Online Safety Assessment Report and the DSMSs’ annual online safety reports are published in full on IMDA’s website at www.imda.gov.sg/online-safety for public reference.

Issued by the Infocomm Media Development Authority

About Infocomm Media Development Authority

The Infocomm Media Development Authority (IMDA) leads Singapore’s digital transformation by developing a vibrant digital economy and an inclusive digital society. As Architects of Singapore’s Digital Future, we foster growth in Infocomm Technology and Media sectors in concert with progressive regulations, harnessing frontier technologies, and developing local talent and digital infrastructure ecosystems to establish Singapore as a digital metropolis.

For more news and information, visit www.imda.gov.sg or follow IMDA on LinkedIn (IMDAsg), Facebook (IMDAsg), and Instagram (@imdasg).

For media enquiries, please contact: Felicia Goh
Manager
(Communications and Marketing)
IMDA
Email: media@imda.gov.sg


Named provisions

  • Code of Practice for Online Safety – Social Media Services (SMS Code)
  • Second Online Safety Assessment Report 2025
  • Enhanced Supervision

Source

Analysis generated by AI. Source diff and links are from the original.

Classification

Agency
IMDA
Filed
March 31st, 2026
Compliance deadline
June 30th, 2026 (91 days)
Instrument
Enforcement
Legal weight
Binding
Stage
Final
Change scope
Substantive

Who this affects

Applies to
Technology companies
Industry sector
5112 Software & Technology
Activity scope
Online Content Moderation, Child Sexual Exploitation Material Detection, Terrorism Content Removal
Threshold
Designated Social Media Services (DSMSs) under the SMS Code
Geographic scope
Singapore (SG)

Taxonomy

Primary area
Consumer Protection
Operational domain
Compliance
Topics
Cybersecurity, Public Health
