Content Moderation Framework for Social Media Platforms Working Paper CC2025/08
Summary
The Competition Commission of South Africa published Working Paper CC2025/08 seeking public input on three complementary regulatory models for content moderation on social media platforms: the AAVMS Draft White Paper framework, amendments to Chapter XI of the Electronic Communications and Transactions Act to introduce an Industry Representative Body model, and a proposed Online Integrity Ombud mechanism. The paper invites stakeholder feedback on feasibility, legal sufficiency, institutional design, and risk mitigation.
What changed
The Competition Commission of South Africa released Working Paper CC2025/08 presenting three complementary regulatory models for content moderation on social media platforms. Model 1 draws from the DCDT's AAVMS Draft White Paper for short-term governance of video-sharing platforms and on-demand services. Model 2 proposes expanding Chapter XI of the Electronic Communications and Transactions Act to explicitly include social media platforms and introduce an Industry Representative Body with 'safe harbour' conditions. Model 3 advocates for a dedicated Online Integrity Ombud to address digital harms.

Technology companies operating social media platforms in South Africa should monitor these developments as the Commission signals intent to recommend legislative reforms. The paper indicates the models are designed to be sequential and complementary rather than competing, with immediate steps proposed to formalize the IRB model under ECTA. Stakeholders have an opportunity to shape the final recommendations through the consultative process.
What to do next
- Technology companies are invited to review Working Paper CC2025/08 and provide comments on the three proposed content moderation models
- Stakeholders should consider the targeted questions on feasibility, legal sufficiency, institutional design, and risk mitigation outlined in the paper
- Respondents may submit feedback to inform the Commission's final recommendation on platform accountability frameworks
Archived snapshot
Apr 18, 2026: GovPing captured this document from the original source. If the source has since changed or been removed, this is the text as it existed at that time.
Reviewing South Africa's Content Moderation for Social Media Platforms:
The Draft White Paper on Audio and Audiovisual Media Services and Online Content Safety Policy Framework, Electronic Communications and Transactions Act 25 of 2002, and an Online Integrity Ombudsman Model

WORKING PAPER CC2025/08

Noluthando Jokazi, Kuhle Majola and Ezile Njisane

Abstract

This discussion paper explores potential regulatory frameworks for addressing content moderation and platform liability on social media platforms in South Africa. It arises from recommendations made in the Competition Commission's Media and Digital Platforms Inquiry ("MDPMI") Provisional Report published in February 2025. The Paper also draws from the Draft White Paper on Audio and Audiovisual Media Services and Online Safety ("AAVMS Draft White Paper") by the Department of Communications and Digital Technologies ("DCDT"), which was recently re-published for public comments in July 2025. It also draws from submissions made by civil society, particularly Media Monitoring Africa ("MMA"), which has advocated for a more robust legal framework by creating a model that centres on a digital harms ombud, the Online Integrity Ombud ("OIO"); the proposal can be found on the Competition Commission's website.

The paper is part of a broader consultative process intended to gather public input on how best to balance rights, safety, and accountability on social media platforms. Following engagements and consultations with affected stakeholders, the Commission is undertaking an inclusive policy dialogue to build on the MDPMI's recommendation outlined in paragraphs 17.3-17.7 of the Main Provisional Report Findings, Remedies and Recommendations. Part of the recommendation for addressing challenges related to harmful content, misinformation, and disinformation on social media platforms proposed in the Provisional Report is to consider amending the Electronic Communications and Transactions Act 25 of 2002 ("ECTA") by expanding Chapter XI of ECTA to explicitly include social media platforms and to introduce an Industry Representative Body (IRB) model to regulate content moderation responsibilities and "safe harbour" conditions.

The discussion outlines in greater detail the Commission's rationale for recommending changes to ECTA and how the models broadly align with one another. We set out detailed guidance on how these build on one another and can be designed, considered, and phased in over the short, medium, and long term. Rather than presenting competing models, the AAVMS framework, the ECTA-based IRB, and the MMA Ombud proposal can be understood as complementary and sequential. The AAVMS White Paper proposes using existing legislation to govern video-sharing platforms (VSPs) and on-demand services in the short term, with an Ombud as a possible medium- and long-term institutional mechanism. This aligns with the MDPMI's recommendation to use ECTA's IRB framework, particularly as ECTA's Chapter XI provides a liability limitation scheme for intermediaries who subscribe to an approved code of conduct.

Stakeholder insights will inform the Commission's final recommendation and may shape legislative reform. To guide stakeholders in engaging with the paper, we have included targeted questions on feasibility, legal sufficiency, institutional design, and risk and mitigation (including a question on a safeguard adopted in the EU through the European Media Freedom Act ("EMFA"), which limits the overbroad takedown of media or news content). The paper outlines a roadmap for potential implementation, including immediate legal steps to formalise the IRB model under ECTA, a scoping process for the development of the OIO model, and coordination with the DCDT's AAVMS Draft White Paper to ensure policy alignment and avoid duplication. Given the evolving nature of global regulation, South Africa has an opportunity to lead with a hybrid model that addresses short-term gaps in platform accountability for online platforms and develops long-term legislation that adapts international best practice to local conditions.

Footnotes:
[1] MDPMI Main Report, https://www.compcom.co.za/wp-content/uploads/2025/02/CC_MDPMI-Provisional-Report_Findings-Recomendations-1.pdf, para 17.6-17.7.
[2] Add HYPERLINK to MMA Proposal.
[3] Competition Commission of South Africa, Main Report Findings, Remedies and Recommendations, paragraphs 17.3-17.6, page 12, https://www.compcom.co.za/wp-content/uploads/2025/02/CC_MDPMI-Provisional-Report_Findings-Recomendations-1.pdf.
Table of Contents
- MODEL 1: The Draft Audio and Audiovisual Media Services and Online Content Safety ("AAVMS Draft White Paper") Policy Framework White Paper as a Complementary Tool for Content Moderation on Social Media
  a) General objectives and Purpose of the AAVMS Draft White Paper
  b) Intended Scope of the AAVMS Draft White Paper
  i) The type of content the AAVMS may target
  c) Overlap between the AAVMS and MDPMI Recommendation
  d) The Type of content the AAVMS will target
  e) Overlap between the AAVMS and the EU's Digital Services Act
  f) Conclusion on AAVMS
- MODEL 2: Chapter XI of the Electronic Communications and Transactions Act in holding Social Media Platforms liable for disinformation
  a) ECTA's scope of Application
  b) Chapter XI of ECTA and Social Media Platforms: Current Applications and Future Reforms
  c) Application of Section 77 take-down notice
  d) Recognition of Industry Representative Body
  e) Conclusion on ECTA
- MODEL 3: Content Moderation using the Ombudsman Model
  a) Evaluating Regulatory Authorities and Ombud Mechanisms in Digital Governance
  b) Evaluating the proposed Online Intermediary Ombud
    i) Core components of the Proposed Ombud Model
    ii) The guiding principles and the core values
    iii) The practical functions, processes, and procedures
- Conclusion

- Table 1: ECTA Liability Framework
- Table 2: Key Institutional Features of the Proposed Online Integrity Ombud
- Table 3: Practical Examples of Complaint Process
- Annexure A: Framework Comparison on Statutory Regulators and Soft Law Ombudsman Approaches

MODEL 1: The Draft Audio and Audiovisual Media Services and Online Content Safety ("AAVMS Draft White Paper") Policy Framework White Paper as a Complementary Tool for Content Moderation on Social Media
a) General objectives and Purpose of the AAVMS Draft White Paper

1.1. The first version of the White Paper on Audio and Audiovisual Media Services and Online Content Safety ("AAVMS Draft White Paper" or "White Paper") was published in July 2023. It came during a critical shift marked by the rise of global streaming platforms, user-generated content, and non-linear media consumption. The Policy was drafted in response to a needs gap, as existing traditional regulatory frameworks were no longer fit for purpose. The Policy draws from leading frameworks in other jurisdictions, such as the UK, Europe, and Australia, and seeks to strike a balance between constitutional media freedoms, local content production, and platform accountability through risk mitigation measures. One of the main objectives of the White Paper is to update South Africa's media regulatory framework to reflect the realities of digitisation, specifically for audio and audiovisual content provided through Video-on-Demand ("VOD") services, Video-Sharing Platform Services ("VSPs"), and Very Large Online Platforms ("VLOPs").

1.2. Whereas the Audio and Audiovisual Content Services White Paper ("AAVCS") was more focused on new statutory definitions, the AAVMS expands this regulatory framework to include online content safety and introduces measures to target 1) harmful content, 2) user protection, 3) algorithmic accountability, and 4) transparency. This second version aligns with the EU's Audiovisual Media Services Directive ("AVMSD"), which includes a broader focus on platform risk mitigation measures for harmful content and user safety, under a regulatory umbrella.

Footnotes:
[5] Department of Communications and Digital Technologies ("DCDT"), 2025, Draft Audio and Audiovisual Media Services and Online Safety Policy. Available at White Paper on Audio and Audiovisual Media Services and Online Content Safety: A New Vision for South Africa: Comments invited [accessed 13 June 2025].
[6] Previously known as the Audio and Audiovisual Content Services White Paper ("AAVCS"); hereinafter referred to as the AAVMS or White Paper.
[7] Ibid at page 7.
[8] These are services that let users watch video content on demand, such as YouTube, DSTV Catch Up.
[9] Shows such as Netflix, Disney+, Amazon Prime, Showmax.
[10] AAVCS refers to Audio and Audiovisual Content Services. The 2020 White Paper introduced this term to update and expand beyond traditional broadcasting, grouping together: linear broadcasting services; non-linear, on-demand content services (OCS); and video-sharing platform services (VSPS).
b) Intended Scope of the AAVMS Draft White Paper

1.3. The AAVMS does not appear to be principally designed to regulate social media platforms. While it references VLOPs and forecasts general measures to mitigate online harm, its primary regulatory focus is on broadcasting, on-demand content services, and VSPs such as YouTube. The Draft White Paper's emphasis is on aligning with international audiovisual media services frameworks, particularly those of the EU and Australia, and includes proposals for: 1) updating licensing structures under the ECA for VODs and VSPs; 2) co- and self-regulatory models for content governance; and, in the first phase or short term, 3) the creation of an online safety ombud (potentially housed within existing structures like the Film and Publications Board ("FPB")). The approach is intended to expedite the creation of the ombud while leveraging existing legal frameworks like the ECA, the Films and Publications Act, as well as ECTA.

1.4. Although VLOPs are mentioned, particularly in reference to risks such as algorithmic manipulation, child safety, and synthetic content, these references are marginal and lack definitional clarity. Social media platforms like Facebook, X, or TikTok do not appear in core figures or explanatory diagrams and are discussed only in a limited sense in terms of moderation obligations.

1.5. While expanding the definition of VLOPs would clarify the scope question and could ensure platforms fall within the regulatory framework (possibly to include social media platforms to align with the DSA's VLOPs), this amendment alone would not be sufficient to address the structural limitations of the Draft White Paper. The AAVMS would need to be expanded in structure, enforcement tools, and possibly greater institutional collaboration. For example, the inclusion of other co-regulatory bodies may hamper the Draft White Paper's ability to respond to content regulation across multiple instruments, duplicating oversight or leaving gaps in some areas.

1.6. Additionally, the AAVMS places primary enforcement responsibility on ICASA, with enforcement tools centred on licensing, spectrum management, and broadcast content quotas. These tools are largely framed around video-sharing obligations, leaving gaps in enforcement and governance for social media platforms that do not distribute audiovisual content formats, particularly those that unfold as hybrid formats, where users share memes, text-image combinations, deepfakes, and encrypted messages. The framework may also be limited on platforms whose business model is not audiovisual content at all.

1.7. This structure may limit the White Paper's ability to effectively regulate non-audiovisual platforms, particularly those that are text-based or hybrid, such as X, Facebook, or WhatsApp, where a large part of South Africa's mis- and disinformation circulates. Because these platforms fall outside the audiovisual scope (which includes VSPs only, such as YouTube) and may not operate with a local presence, they do not fit neatly into the licensing oversight mechanisms envisioned under the White Paper, potentially leaving significant regulatory gaps.

1.8. Given the limitations in scope of the AAVMS, particularly its focus on audiovisual services rather than social media platforms, the MDPMI proposes an Industry Representative Body ("IRB") model under Chapter XI of ECTA as an immediate and complementary regulatory mechanism. While the AAVMS notes the need for ECTA reform, it does not specify which provisions should be amended or how such amendments would support content moderation for non-audiovisual platforms.

1.9. The MDPMI has therefore proposed targeted amendments to Chapter XI of ECTA and the formation of an IRB that could fulfil a similar function to the ombud in the short term. This approach recognises the AAVMS' layered, phased strategy and seeks to operationalise an alternative route to address immediate online harms. The formation of an IRB offers a viable mechanism for regulatory oversight in the short term, without the need for entirely new legislation, and may serve as a stepping stone towards the more formal establishment of a statutory body.

Footnotes:
[11] https://www.ellipsis.co.za/wp-content/uploads/2014/11/5_policy-options-audio-and-audio-visual-content-services-1.pdf, accessed 20 June 2025, para 5.1.
[12] Draft White Paper on Audio and Audiovisual Media Services and Online Safety, Govt. Gazette No. 52972 (11 July 2025), Paragraph 5.3.3.
[13] Films and Publications Act 65 of 1996.
[14] Ibid, Paragraph 5.3.5.
[15] Ibid, Figure 1 (value chain).
[16] Ibid., Section 5.5.3 (Platform Definition and Obligations - VSPs and VLOPs).
[17] Ibid., Stakeholder Inputs Summary (FPB on co-regulatory codes).
[18] Ibid., Section 4.4.2 (BCCSA's code recognised by ICASA).
[19] Department of Communications and Digital Technologies ("DCDT"), 2025, Draft Audio and Audiovisual Media Services and Online Safety Policy. Available at White Paper on Audio and Audiovisual Media Services and Online Content Safety: A New Vision for South Africa: Comments invited, page 5.

i. The type of content the AAVMS may target
1.10. In our view, the Draft White Paper leaves some ambiguity regarding the precise scope of content it intends to regulate. It explicitly states that it targets harmful content and aligns with the mandates of existing institutions that act against harmful content, like the South African Human Rights Commission ("SAHRC") and the Film and Publication Board ("FPB"). The White Paper emphasises the protection of constitutional rights and the protection of children from harmful content that may impair physical and mental development. It includes measures for platform users to report and flag harmful content.

1.11. The Draft White Paper lists mis- and disinformation as examples of online harms, alongside hate speech and harmful content. This appears under broader discussions of risks, the digital environment, and the need for content regulation. It also suggests that platforms such as VSPs have a role to play in addressing online harms such as disinformation, but again, this is not expanded upon to give a legal or operational definition. The paper does not define misinformation or the threshold for disinformation; it does, however, provide for the establishment of an Online Safety Ombud mandated to address complaints based on online harms.

1.12. Like VLOPs, the Draft White Paper's mention of misinformation and disinformation is conceptual rather than normative and does not indicate which legislative amendment or enforcement mechanism would apply to these categories. Both mis- and disinformation are included as part of online harms and, from a reading of the paper, would be regulated under the Ombud framework.

Footnotes:
[20] Ibid paragraph 5.5.3.

c) Overlap between the AAVMS and MDPMI Recommendation

1.13. The Draft White Paper advances several principles that are compatible with a broader content moderation framework and that coincide with the MDPMI's recommendation in three ways:

1.13.1. Firstly, through the use of existing legislation: the AAVMS advocates using ECTA and other legacy laws to fill regulatory gaps. This is consistent with the Commission's proposal to amend ECTA, and this discussion paper sets out in detail below under Model 2 what this would entail.

1.13.2. Secondly, through a three-stage regulatory approach: the Draft White Paper proposes both immediate and long-term solutions, using current instruments in the short term while exploring institutional mechanisms such as the Ombud. This sequencing is in line with the MDPMI's vision to target immediate content moderation interventions in the short term through ECTA and an Industry Representative Body, evolving over time towards a more formal institutional framework, such as new legislation, in the long term. However, the two recommendations differ in that the MDPMI's recommendation is to establish an IRB model, while the AAVMS seeks to establish an ombud in the short term. This discussion paper will highlight the advantages of considering an IRB model rather than an Ombud.

1.13.3. Lastly, through regulatory flexibility: the AAVMS allows for a co-regulatory or self-regulatory system, which aligns with the ECTA-IRB framework. Under Chapter XI, platforms may qualify for limited liability protections if they adhere to approved codes of conduct. This principle could also be applied to the governance of content on social media platforms.
d) The Type of content the AAVMS will target

1.14. The Draft White Paper recognises the growing threat of online mis- and disinformation and, to address this, proposes the creation of an Online Safety Media Ombudsman ("Ombud") in the short term under stage 1 of the policy implementation roadmap. The Ombud is intended to operate alongside existing institutions like the Films and Publications Board, ICASA, and the South African Human Rights Commission to fill the regulatory gaps between outdated legislation and the exponential growth of digital platform services.

1.15. The ombud is envisioned to play a central role in strengthening the protection of users, particularly in relation to harmful content, misinformation, and disinformation, and platform accountability. The Ombud would be mandated to develop new rules and standards and, where necessary, to enhance the safe consumption of online media services' content. Importantly, it would not function in isolation but would work collaboratively with self-regulating and co-regulating bodies to prevent duplication and ensure coherence in the industry. Comparatively, other international approaches (which will be discussed in detail under Model 3 below) have been considered in designing this model. The Ombud would identify regulatory gaps, assist in the formulation of new unified online content codes, and ensure consistent implementation across platforms and service types.

1.16. However, this is not far from other models that have been suggested, such as Model 2 or Model 3, and does not circumvent the issue of creating further bodies in South Africa's regulatory ecosystem. In other words, even if the Commission recommends expanding the AAVMS to include social media platforms, efficient oversight will still require creating additional regulatory bodies. To fast-track implementation, however, the DCDT proposes exploring the possibility of locating a functioning ombud within an existing independent or co-regulatory institution, which may align with Media Monitoring Africa's proposal of incorporating the South African Human Rights Commission, thereby leveraging existing infrastructure to avoid unnecessary delays or compounded regulatory structures.

1.17. This may assist in centralising complaints and harmonising enforcement processes, and can bridge regulators like ICASA or the FPB where they lack jurisdiction or capacity to act. The ombud could either be established through dedicated legislation or formed as a voluntary mechanism. Current legal frameworks that allow for the creation of such a body include the ECA, the FPA, and ECTA. The least time-consuming option would be to establish the ombud under existing legislation. This is discussed in more detail below under Model 3.

1.18. For example, the Draft White Paper and ECTA could be interpreted as short to medium-term solutions, enabling immediate oversight using existing frameworks.

e) Overlap between the AAVMS and the EU's Digital Services Act

1.19. The Digital Services Act ("DSA") is an EU regulation adopted in 2022 that addresses illegal content, transparent advertising, and disinformation. It applies to online platforms and intermediaries such as social networks, marketplaces, app stores, and others. The DSA contains key requirements such as disclosing to regulators how algorithms work, providing users with explanations of content moderation decisions, and implementing stricter controls on targeted ads. Both the DSA and the AAVMS White Paper emphasise the need to modernise regulatory liability for digital intermediaries. Similarly, the AAVMS framework proposes extending traditional broadcasting regulations (such as "must carry" codes) and foreign ownership rules to encompass online, on-demand, and user-generated platforms.

1.20. In terms of content moderation, the DSA has introduced stricter rules for online platforms, requiring them to enforce measures that prevent the spread of illegal content. VLOPs and search engines are subject to stricter rules under the DSA. According to Articles 34 and 35, VLOPs and Very Large Online Search Engines ("VLOSEs") are required to conduct annual risk assessments, to adjust their services and algorithms to minimise harm, and to submit to independent audits. Additionally, they are required to give relevant authorities and vetted researchers access to data and algorithms. The AAVMS requires the same in principle, proposing that VLOPs and other regulated services implement risk mitigation measures, including transparency obligations around content curation systems, a code of conduct for harmful content, and potential for oversight through co-regulatory bodies such as ICASA, the FPB, the BCCSA, and an Online Safety Ombud.

f) Conclusion on AAVMS

1.21. The AAVMS Draft White Paper by the DCDT represents a much-needed intervention to modernise online safety in South Africa. Its focus on streaming services, VOD, and VSPs reflects a legitimate attempt to bridge the regulatory gap between traditional broadcasting and emerging digital services. The White Paper introduces significant regulatory tools, risk mitigation duties, co- and self-regulation, platform accountability measures, and a proposed Online Safety Ombud, which all resonate with international best practice, including the EU's DSA and AVMSD. However, the AAVMS Draft White Paper falls short of fully capturing the regulatory needs posed by social media platforms, and an audiovisual-centric framework limits its application to text-based or hybrid content formats where much of the mis- and disinformation and online harm occurs. Social media platforms such as Facebook, TikTok, WhatsApp, or X are notably absent from the White Paper's main regulatory schema and its enforcement architecture.

1.22. Despite this, the White Paper does open a valuable pathway for regulatory coherence by proposing the use of existing legislation, including ECTA, to address online harms. This proposal aligns well with the MDPMI's recommendation to expand Chapter XI of ECTA as an interim solution for regulating content moderation on social media platforms. The shared emphasis on co-regulatory frameworks, conditional liability protections, and integration of an Ombud mechanism suggests that the two initiatives are not contradictory but potentially connected.

1.23. A layered and sequential approach is set out in the White Paper: the initial first-stage approaches are the amendment of ECTA and the establishment of an Ombud using existing legislation and frameworks, which can provide short to medium-term tools to incentivise immediate responsible behavioural changes by social media platforms through Industry Representative Bodies and codes of conduct. In the long term, the AAVMS suggests enacting new legislation where there are gaps in the first and second stages (medium to long term).

1.24. Together, the AAVMS, ECTA, and an Ombud model could present a pragmatic and flexible roadmap that balances urgency and durability for content moderation, offering immediate legal mechanisms while preserving the space to design more robust legislation. The challenge lies in coordinating the models coherently to ensure responsive, immediate protection.

Footnotes:
[21] Page 32, figure 4: FPB Strategic plan (indication of mandates); page 35, para 4.8 - para 5.3.5.
[22] https://www.gov.za/sites/default/files/gcis_document/202507/52972gen3369.pdf, pages 52-54.
[23] Ibid., Paragraph 6.4.3.
[24] Ibid., Executive Summary point 15 (Online Safety Ombud proposal).
[25] Ibid., Section 5.3.3 (Ombudsman and existing bodies).
[26] Digital Services Act (Regulation (EU) 2022/2065): Article 27 (transparency reporting obligation for online platforms); Article 34 (risk assessments by very large online platforms and very large online search engines); Article 42 (audits of risk mitigation measures, often involving algorithmic transparency); Article 40 (grants Digital Services Coordinators access to data, including information about algorithms and compliance monitoring). https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32022R2065, accessed 21 June 2025.
[27] https://www.ensafrica.com/news/detail/3447/new-white-paper-proposing-changes-to-south-af
[28] Fact-checking and content moderation.
[29] Draft White Paper on Audio and Audiovisual Media Services and Online Safety, Govt. Gazette No. 52972 (11 July 2025), Section 2.3 (Digital Platforms and Online Harms).
The objective is to bring social media under a co-regulatory model aligned with global best practices like the EU's DSA.
- Should the AAVMS Draft White Paper be expanded to explicitly cover social media platforms?
- Would it make sense to expand the definitions of VLOPs to include social media platforms?
- What challenges might arise from applying a media-services framework originally designed for audio and audio-visual content to text-based and image-based platforms like social media platforms?
- Does the tiered approach proposed by the AAVMS Draft White Paper (e.g., different rules for broadcasters, VODs, and VSPs) work for regulating social media?
- How can enforcement be structured to avoid duplication among ICASA, the FPB, and other regulators?
- Should a new complaints-handling body be created under the AAVMS framework, or should platforms be required to integrate with existing regulators?
- Do you consider misinformation to fall within the ambit of harmful content?
- Should misinformation fall under the Ombud's scope, or would this risk overregulation and unintended constraints on freedom of expression, as some stakeholders have pointed out?
- What safeguards or thresholds would you propose to ensure that any regulatory treatment of mis- or disinformation is proportionate, rights-based, and does not undermine legitimate journalistic work?

MODEL 2: Chapter XI of the Electronic Communications and Transactions Act in holding Social Media Platforms liable for disinformation
2.1 Chapter XI of the Electronic Communications and Transactions Act 25 of 2002 ("ECTA") deals with the limitation of liability of service providers in cases where they would otherwise be found liable for third-party content or data hosted on their platforms. For example, if the service provider's platform was a mere transmitter of offensive third-party data, the service provider would enjoy partial immunity, subject to satisfying the requirements set out in s 71 and s 72 of ECTA. This Section will explore whether harmful content can be moderated within the legal framework of ECTA. The Section is divided into the following subsections: (i) ECTA's scope of application; (ii) Chapter XI of ECTA and social media platforms: current applications and future reforms; (iii) application of Section 77 take-down notice; (iv) recognition of an industry representative body; (v) conclusion.
a) ECTA's scope of Application

2.2 To determine whether ECTA applies to content moderation on social media networks, one must first examine the scope of the Act. Section 4 states that ECTA applies to "any electronic transaction or data message". This necessitates an analysis of what constitutes both a "data message" and an "electronic transaction". ECTA defines "data message" in Section 1 as "data generated, sent, received, or stored by electronic means and includes (a) voice, where the voice is used in an automated transaction; and (b) a stored record". The use of the phrase "and includes" indicates an open-ended, non-exhaustive definition, suggesting that various forms of online content, such as posts, comments, videos, and messages on social media, may reasonably fall within the ambit of a "data message".

2.3 Although ECTA does not explicitly define "electronic transaction", it does refer to "transactions" in general terms as being either commercial or non-commercial. This language is broad and supports an inclusive interpretation. Communications and interactions on social media platforms can, therefore, arguably qualify as non-commercial electronic transactions, especially where user-generated content is exchanged or disseminated. In cases involving monetised content, targeted advertising, or platform services, these may also constitute commercial electronic transactions. Further, ECTA defines "data" as "any electronic representation of information in any form." This supports a broad reading of the kind of content that may be regulated, encompassing the text, images, audio, and video content common on social media platforms.

2.4 Taken together, these definitions suggest that ECTA can extend to the regulation of social media content, particularly where the moderation of such content involves the generation, transmission, or storage of data messages. Moreover, since social media platforms typically serve both commercial (advertising, user analytics) and non-commercial (user communication) purposes, their operations arguably fall within the scope of ECTA.

Footnotes:
[30] Section 4 of the Electronic Communications and Transactions Act 25 of 2002.
[31] Section 1 of the Electronic Communications and Transactions Act 25 of 2002.
[32] Ibid.
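To make the scope analysis in paragraphs 2.2-2.4 concrete, the sketch below models ECTA's open-ended "data message" definition and the commercial/non-commercial transaction distinction as a small data structure. This is purely an illustrative reading aid under our own assumptions: the class and function names are hypothetical constructs, not statutory terms or any existing library's API.

```python
# Illustrative sketch only: a toy model of ECTA's "data message" (s 1) and the
# commercial/non-commercial "transaction" distinction discussed in para 2.3.
# All names here are hypothetical reading aids, not statutory terms of art.
from dataclasses import dataclass
from enum import Enum, auto

class TransactionKind(Enum):
    COMMERCIAL = auto()      # e.g. monetised content, targeted advertising
    NON_COMMERCIAL = auto()  # e.g. ordinary user-to-user communication

@dataclass
class DataMessage:
    # s 1: data "generated, sent, received, or stored by electronic means";
    # the definition is open-ended ("and includes"), so any content type fits.
    content_type: str  # "post", "comment", "video", "voice note", ...
    monetised: bool = False
    carries_targeted_ads: bool = False

def classify_transaction(msg: DataMessage) -> TransactionKind:
    # Para 2.3: monetised content or targeted advertising points to a
    # commercial electronic transaction; plain user exchange is
    # non-commercial. Either way the interaction stays within s 4's scope.
    if msg.monetised or msg.carries_targeted_ads:
        return TransactionKind.COMMERCIAL
    return TransactionKind.NON_COMMERCIAL

if __name__ == "__main__":
    post = DataMessage("post")
    sponsored = DataMessage("video", monetised=True)
    print(classify_transaction(post))       # TransactionKind.NON_COMMERCIAL
    print(classify_transaction(sponsored))  # TransactionKind.COMMERCIAL
```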
b) Chapter XI of ECTA and Social Media Platforms: Current Applications and Future Reforms

2.5 Chapter XI is a protective harbour for service providers: those service providers who meet the specified conditions may enjoy partial immunity from liability, provided they restrict their role to certain technical or passive functions (such as acting as a mere conduit, caching, or hosting) without editorial control over content. A service provider is defined in Section 70 of ECTA as "any person providing information system services". The term "information system services" is further defined as "a system for generating, sending, receiving, storing, displaying or otherwise processing data messages and includes the Internet". Similarly, an "information system" is defined using similar language. Notably, both definitions provided in ECTA are significant: they are designed to include a wide range of digital service functions, including those of current and future technologies.

Footnotes:
[33] Ibid.
[34] Chapter XI of the Electronic Communications and Transactions Act 25 of 2002.
[35] Section 70 of the Electronic Communications and Transactions Act 25 of 2002.
[36] Ibid.
[37] Section 1 of the Electronic Communications and Transactions Act 25 of 2002.
2.6 Moreover, the legislator intended to include future technological advancements in Internet technology, minimising the need to promulgate new legislation after each development. This inclusive approach, which aligns with the objective of technology neutrality embedded in Section 2(f) of ECTA, affirms that the legislation should be interpreted in a manner that is neutral about the technology used. This legislative intention ensures that new forms of digital communication, including social media platforms, are not excluded from ECTA's scope merely because they were not envisaged at the time of drafting. Before assessing whether ECTA confers authority to moderate content on social media networks, it is essential to characterise "social media" within this framework. Social media refers to "forms of electronic communication (such as websites for social networking and microblogging) through which users create online communities to share information, ideas, personal messages, and other content (such as videos)".

2.7 Social media networks typically operate on a dual business model that includes both user participation and targeted advertising. By focusing on user retention and engagement, platforms encourage more time spent on their sites, which directly leads to revenue through advertising exposure. Algorithms improve engagement by presenting users with tailored content. Social media feeds are essential for user engagement, serving as curated streams of content according to individual preferences and previous interactions. Feeds are frequently tailored based on individuals' followings, likes, and interests to give the most relevant content to them. An example of this is TikTok and X's "For You" pages, where users only see content based on previous post engagements and the algorithm. These algorithmically designed feeds catch and hold user attention while providing advertising interleaved with organic information (interstitial ads). The technological functions and economic models of social media networks/platforms fit squarely within the definitions of "information system" and ISPs as outlined in ECTA. The networks generate, receive, process, and store data messages in a manner contemplated by ECTA. Accordingly, social media networks qualify as service providers under Chapter XI and may be eligible for safe harbour, subject to compliance with the conditions laid down in ECTA.

2.8 This interpretation is consistent with the legislature's purposive intent to future-proof ECTA, ensuring that it remains relevant and adaptive to evolving digital technologies, i.e., social media networks, without requiring the need to constantly revise and alter the law.

2.9 The liability framework under Sections 73 to 76 of ECTA applies to different categories of intermediary service providers, including those acting as (i) mere conduits under Section 73 of ECTA; (ii) caches under Section 74; (iii) hosts under Section 75; and (iv) providers of information tools under Section 76. Each of these functions is summarised below in Table 1. However, for the purposes of assessing content moderation within the existing legal framework of ECTA, this Section will focus on the functions and legal obligations of host intermediaries as set out in Section 75 of ECTA.

Footnotes:
[39] Ibid.
[40] Coetzee SA (2019) "A Legal Perspective on Social Media Use and Employment: Lessons for South African Educators". Available at: A Legal Perspective on Social Media Use and Employment: Lessons for South African Educators. Accessed on 18 June 2025.
[41] Annexure 4 of the Media and Digital Platforms Market Inquiry, pg 9.
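The engagement-driven curation described in paragraph 2.7 can be made concrete with a toy sketch. The scoring formula, weights, and names below are entirely hypothetical; real platform ranking systems are proprietary and far more complex. The sketch simply illustrates the two mechanics the paragraph describes: ranking by prior engagement and interleaving ads into organic content.

```python
# Illustrative sketch only: a toy engagement-ranked feed with interstitial
# ads, loosely mirroring the "For You"-style curation described in para 2.7.
# The weights and structure are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    prior_engagements: int  # user's past likes/shares tied to author/topic

def score(post: Post, followed: set, interests: set) -> float:
    # Boost content from followed authors and topics the user engages with.
    s = float(post.prior_engagements)
    if post.author in followed:
        s *= 2.0
    if post.topic in interests:
        s *= 1.5
    return s

def build_feed(posts, ads, followed, interests, ad_every=3):
    # Rank organic posts by predicted engagement, then interleave an ad
    # after every `ad_every` posts (the "interstitial ads" of para 2.7).
    ranked = sorted(posts, key=lambda p: score(p, followed, interests),
                    reverse=True)
    feed, ad_index = [], 0
    for i, post in enumerate(ranked, start=1):
        feed.append(post)
        if i % ad_every == 0 and ad_index < len(ads):
            feed.append(ads[ad_index])
            ad_index += 1
    return feed

if __name__ == "__main__":
    posts = [Post("alice", "news", 5), Post("bob", "sport", 12),
             Post("carol", "news", 2), Post("dan", "music", 9)]
    feed = build_feed(posts, ads=["ad-1", "ad-2"],
                      followed={"alice"}, interests={"news"})
    print(feed)
```

The point of the sketch for the legal analysis that follows is that curation of this kind is an active design choice, which is why the "passive intermediary" question in paragraphs 2.11-2.13 matters for social media platforms.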
Table 1: ECTA Liability Framework

| No. | Category | Section | Function | Liability Condition |
| --- | --- | --- | --- | --- |
| 1 | Mere conduit | Section 73 | Transmits or routes data without alteration; passive intermediary | Not liable if data is transmitted automatically and not modified |
| 2 | Caching | Section 74 | Temporarily stores content to improve transmission efficiency | Not liable if storage is automatic and the content is not altered |
| 3 | Hosting | Section 75 | Stores content at the request of users (e.g., social media platforms) (this is where the recommended IRB comes in) | Not liable unless they had actual knowledge and failed to act (but we know that's not the case because of algorithms) |
| 4 | Information tools | Section 76 | Provide automated links or references (e.g., search engines) | Not liable if links are provided automatically and neutrally |

Source: Competition Commission

Footnotes:
[43] The Act defines an intermediary as "a person who, on behalf of another person, whether as agent or not, sends, receives or stores a particular data message or provides other services with respect to that data message." Section 1 of the Electronic Communications and Transactions Act 25 of 2002.
[44] A mere conduit under Section 73 of the Electronic Communications and Transactions Act 25 of 2002: these are intermediary service providers who merely transmit data without initiating it, selecting the recipient, or modifying the content.
[45] Caching is defined in Section 74 of the Electronic Communications and Transactions Act 25 of 2002 as the automatic, intermediate, and temporary storage of data by a service provider for the sole purpose of making the onward transmission of that data more efficient.

2.10 Section 75 of ECTA states the following:

"(1) A service provider that provides a service that consists of the storage of data provided by a recipient of the service, is not liable for damages arising from data stored at the request of the recipient of the service, as long as the service provider--
(a) does not have actual knowledge that the data message or an activity relating to the data message is infringing the rights of a third party; or
(b) is not aware of facts or circumstances from which the infringing activity or the infringing nature of the data message is apparent; and
(c) upon receipt of a take-down notification referred to in Section 77, acts expeditiously to remove or to disable access to the data.
(2) The limitations on liability established by this Section do not apply to a service provider unless it has designated an agent to receive notifications of infringement and has provided through its services, including on its websites in locations accessible to the public, the name, address, phone number, and e-mail address of the agent.
(3) Notwithstanding this Section, a competent court may order a service provider to terminate or prevent unlawful activity in terms of any other law.
(4) Subsection (1) does not apply when the recipient of the service is acting under the authority or the control of the service provider."

2.11 As discussed earlier, Section 75 of ECTA operates as a "safe harbour" provision, protecting intermediaries from liability for third-party content stored on their networks, provided that they act in a technical, passive, and automatic manner. From a plain reading of the provision, Section 75 protects host intermediaries whose function is to store content at the request of users but who may vary in their relationship to that content depending on the operational context. A host intermediary has been described as an intermediary that often performs its functions automatically, requiring minimal human intervention, if any. The intermediary does not have real knowledge of the nature of the content it distributes. It therefore does not exercise real and effective editorial control, even if it deploys automatic screening software. Due to the sheer volume of content disseminated on its platform, a host intermediary is not expected to proactively monitor all hosted content. The DSA defines online platforms as a type of hosting service that stores and discloses information to the public at the request of a recipient of the service. Under the DSA, online platforms include social media platforms and content-sharing sites.

Footnotes:
[46] Marx, F.E. and O'Brien, N., 2011. To regulate or to over-regulate? Internet Service Provider liability: the Industry Representative Body in terms of the ECT Act and regulations. Obiter, 32(3), p. 551. Available at: To regulate or to over-regulate? Internet Service Provider liability: the Industry Representative Body in terms of the ECT Act and regulations | Obiter. Accessed on 19 June 2025.
[47] Ibid.
[48] Ibid.
[49] Ibid.
[50] The DSA defines an online platform as "a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public." Under the Organisation for Economic Co-operation and Development (OECD), an online platform is defined as "a digital service that facilitates interactions between two or more distinct but interdependent sets of users (whether firms or individuals) who interact through the service via the Internet".
[51] The EU e-Commerce Directive (2000/31/EC).

2.12 While the term "passive" does not appear in the text of Section 75 of ECTA, guidance on its meaning can be drawn from Recital 42 of the EU e-Commerce Directive (2000/31/EC). The recital
provides one of the earliest and most authoritative definitions of passive intermediary:
"The exemptions from liability established in this Directive cover only cases where the activity
of the information society service provider is limited to the technical process of operating and giving access to a communication network over which information made available by third parties is transmitted or temporarily stored, for the sole purpose of making the transmission more efficient; this activity is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of nor control over the
information which is transmitted or stored."

Footnotes:
[52] Ibid.

2.13 This recital has informed much of the legal understanding around intermediary liability and has often been used to distinguish between "passive" and "active" intermediaries. By contrast, an active intermediary may involve itself in content moderation, curation, or promotion in such a way that it exercises control over the information. Therefore, Section 75 of ECTA, read in its plain language, may suggest that active host intermediaries fall outside the protection of the safe harbour provisions. For this reason, it is necessary to interpret Section 75 in a manner that gives effect to the legislature's intention as discussed above, and which allows the legislation governing traditional passive hosts to govern active hosts.

2.14 Applying a purposive interpretation of Section 75 of ECTA supports the inclusion of social media platforms within the scope of "hosting service provider". While the literal language appears to be aimed at traditional web hosts or passive intermediaries, the provision was enacted to remedy the legal uncertainty surrounding the liability of service providers that store user-generated content. The mischief it aimed to address was the risk of unfairly holding intermediaries liable for third-party content which they did not create or control, thereby stifling digital innovation and undermining the right to freedom of expression.

2.15 In Bertie Van Zyl (Pty) Ltd v Minister for Safety and Security CCT 77/08, the Court affirmed that our Constitution requires a purposive approach to statutory interpretation. Further, in her judgment, Mokgoro J states that "the purpose of a statute plays an important role in establishing a context that clarifies the scope and intended effect of the law". Mokgoro J endorses the purposive statutory approach, citing Bato Star Fishing (Pty) Ltd v Minister of Environmental Affairs and Tourism and Others, in which Ngcobo J correctly reiterated the dissenting judgment of Schreiner JA in Jaga v Dönges, NO and Another:
"Certainly no less important than the oft repeated statement that the words and expressions used
in a statute must be interpreted according to their ordinary meaning is the statement that they must be interpreted in the light of their context. But it may be useful to stress two points in relation
to the application of this principle. The first is that 'the context', as here used, is not limited to
the language of the rest of the statute regarded as throwing light of a dictionary kind on the part to be interpreted. Often of more importance is the matter of the statute, its apparent scope and
purpose, and within limits, its background."

2.16 In Investigating Directorate: Serious Economic Offences v Hyundai Motor Distributors (Pty) Ltd, Langa DP affirmed that where legislation is reasonably capable of more than one interpretation, the Court must prefer interpretations that fall within constitutional bounds over those that do not. This approach was further emphasised in Cool Ideas 1186 CC v Hubbard, where the Court held that:

"A fundamental tenet of statutory interpretation is that the words in a statute must be given their ordinary grammatical meaning, unless to do so would result in an absurdity. There are three important interrelated riders to this general principle, namely:
(a) that statutory provisions should always be interpreted purposively;
(b) the relevant statutory provision must be properly contextualised; and
(c) all statutes must be construed consistently with the Constitution, that is, where reasonably possible, legislative provisions ought to be interpreted to preserve their constitutional validity. This proviso to the general principle is closely related to the purposive approach referred to in (a)."

Footnotes:
[53] Bertie Van Zyl (Pty) Ltd v Minister for Safety and Security CCT 77/08, para 21. Accessed on 20 June 2025.
[54] Ibid. And Thornton Legislative Drafting 4ed (1996) at 155, cited in JR de Ville above n 18 at 244.
[55] Ibid. And [2004] ZACC 15; 2004 (4) SA 490 (CC); 2004 (7) BCLR 687 (CC).
[56] Ibid. And Jaga v Dönges NO and Another; Bhana v Dönges NO and Another 1950 (4) SA 653 (A) at 662-3.
[57] Investigating Directorate: Serious Economic Offences v Hyundai Motor Distributors (Pty) Ltd 2001 (1) SA 545 (CC), para 23.
[58] Cool Ideas 1186 CC v Hubbard and Another (CCT 99/13) [2014] ZACC 16; 2014 (4) SA 474 (CC); 2014 (8) BCLR 869 (CC) (5 June 2014).
2.17 Therefore, one should deploy an interpretation that avoids absurdity and promotes justice, even where the interpretation departs from a strict literal reading. Social media platforms are technologically more complex than traditional intermediaries; functionally, however, they store data provided by users and do not exercise prior editorial control over each post. Where a social media platform expeditiously removes unlawful content once notified, as contemplated in Section 77 of ECTA, it mirrors the passive activity envisaged by Section 75 of ECTA. Interpreting Section 75 to include social media platforms not only aligns with the legislative purpose but also promotes the spirit, purport, and objectives of the Bill of Rights, such as freedom of expression, the right to information, and the right to dignity. This interpretation, extended to social media platforms, ensures a balanced, just, and future-oriented application of ECTA.
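Read together, paragraphs 2.10-2.17 treat Section 75 as a set of cumulative conditions. The checklist below restates those conditions as executable logic, purely as a reading aid: the field names are our own hypothetical labels for the statutory requirements, and nothing here is legal advice or an authoritative codification of the section.

```python
# Illustrative sketch only: the cumulative hosting safe-harbour conditions of
# s 75 of ECTA, restated as a checklist. Field names are hypothetical labels.
from dataclasses import dataclass

@dataclass
class HostConduct:
    actual_knowledge_of_infringement: bool   # s 75(1)(a)
    aware_of_facts_making_it_apparent: bool  # s 75(1)(b)
    acted_expeditiously_on_notice: bool      # s 75(1)(c), via a s 77 notice
    agent_designated_and_published: bool     # s 75(2)
    recipient_under_provider_control: bool   # s 75(4) carve-out

def section_75_safe_harbour(c: HostConduct) -> bool:
    # s 75(4): no protection where the user acts under the provider's
    # authority or control.
    if c.recipient_under_provider_control:
        return False
    # s 75(2): the limitation only applies once an agent is designated and
    # its contact details are published.
    if not c.agent_designated_and_published:
        return False
    # s 75(1)(a)-(c): no knowledge, no awareness, and expeditious take-down.
    return (not c.actual_knowledge_of_infringement
            and not c.aware_of_facts_making_it_apparent
            and c.acted_expeditiously_on_notice)

if __name__ == "__main__":
    compliant = HostConduct(False, False, True, True, False)
    print(section_75_safe_harbour(compliant))  # True
```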
c) Application of Section 77 take-down notice

2.18 Under Chapter XI, members of an Industry Representative Body ("IRB") are exempt from liability when transmitting, caching and storing, hosting, and linking or referring to unlawful content, provided that they were not aware of the content, were not active in creating the content, and did not select the receiver or modify the content. This exemption is part of ECTA's "safe harbour" protections and is conditional on the service providers' compliance with the "notice and take-down" procedures outlined in Section 77 of ECTA. When a user or an entity becomes aware of unlawful content or action taking place on a platform, they may notify the platform of the unlawful content and require it to remove or disable access to the unlawful content. In order to retain their exemption, service providers must act "expeditiously" upon receipt of such a notice and follow the procedure set out in Section 77(1)-(5).

2.19 In terms of Section 77, any person or organisation may submit a valid take-down notice to a service provider or its designated agent, i.e. a recognised IRB. Service providers that are members of the IRB generally appoint the IRB as their designated agent to receive take-down requests, although some may deal with take-down notices themselves. The IRB would forward the notice to the relevant service provider, after checking that the content is actually hosted on the accused organisation's network and that the remedial action specified by the complainant is feasible. The complainant should generally receive a response as to the status of the request within three working days. Once a service provider has responded to the notification, either by removing the content concerned or by refusing to remove the content for some reason, the complainant receives further notification from the IRB or directly from the service provider concerned.

Footnotes:
[59] Ibid. Para 28.
[60] Section 77 of the Electronic Communications and Transactions Act 25 of 2002.
[61] Section 75 of the Electronic Communications and Transactions Act 25 of 2002.
[62] Chapter XI of the Electronic Communications and Transactions Act 25 of 2002.
[63] Section 77 of the Electronic Communications and Transactions Act 25 of 2002.
[64] Ibid.
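The notice-and-take-down flow described in paragraph 2.19 is essentially a small state machine: a notice is received, vetted by the IRB, forwarded, and resolved. The sketch below models that flow under our own assumptions; the states and the three-working-day response target come from the paragraph above, while the structure and names are hypothetical.

```python
# Illustrative sketch only: the s 77 notice-and-take-down flow of para 2.19
# (complainant -> IRB as designated agent -> service provider -> outcome),
# modelled as a simple state machine with an invented audit trail.
from enum import Enum, auto

class NoticeStatus(Enum):
    RECEIVED = auto()         # valid notice lodged with the IRB
    REJECTED = auto()         # content not hosted / remedy not feasible
    FORWARDED = auto()        # IRB passed the notice to the service provider
    CONTENT_REMOVED = auto()  # provider removed or disabled access
    REMOVAL_REFUSED = auto()  # provider refused, with notice to complainant

def process_notice(hosted_on_member_network: bool,
                   remedy_feasible: bool,
                   provider_removes: bool) -> list:
    trail = [NoticeStatus.RECEIVED]
    # The IRB first checks that the content is actually hosted on the accused
    # member's network and that the requested remedial action is feasible.
    if not (hosted_on_member_network and remedy_feasible):
        trail.append(NoticeStatus.REJECTED)
        return trail
    trail.append(NoticeStatus.FORWARDED)
    # Para 2.19: the complainant should generally hear back on the status of
    # the request within three working days.
    trail.append(NoticeStatus.CONTENT_REMOVED if provider_removes
                 else NoticeStatus.REMOVAL_REFUSED)
    return trail

if __name__ == "__main__":
    print(process_notice(True, True, True))    # ... CONTENT_REMOVED
    print(process_notice(True, True, False))   # ... REMOVAL_REFUSED
    print(process_notice(False, True, True))   # ... REJECTED
```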
d) Recognition of Industry Representative Body

2.20 In terms of section 72, a service provider only enjoys partial immunity from liability under Chapter XI of ECTA if the service provider is a member of an IRB recognised by the Minister of Communications in accordance with Section 71 of ECTA. To qualify for this immunity, service providers must have adopted and implemented the official code of conduct of that IRB at the time of notice and take-down. Section 71 states that:

"(1) The Minister may, on application by an industry representative body for service providers by notice in the Gazette, recognise such body for purposes of Section 72.
(2) The Minister may only recognise a representative body referred to in subsection (1) if the Minister is satisfied that--
(a) its members are subject to a code of conduct;
(b) membership is subject to adequate criteria;
(c) the code of conduct requires continued adherence to adequate standards of conduct; and
(d) the representative body is capable of monitoring and enforcing its code of conduct adequately."

Footnotes:
[65] Section 77(1) to (5) of the Electronic Communications and Transactions Act 25 of 2002.
[66] Section 77 of the Electronic Communications and Transactions Act 25 of 2002.
[67] Ibid.
[69] Ibid.
[70] Ibid.
[71] Section 72 of the Electronic Communications and Transactions Act 25 of 2002.
[72] Section 71 of the Electronic Communications and Transactions Act 25 of 2002.
[73] Section 72(b) of the Electronic Communications and Transactions Act 25 of 2002.
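Section 71(2) makes recognition turn on four cumulative criteria. The sketch below expresses that conjunctive test directly; as with the earlier sketches, the structure and names are hypothetical reading aids rather than anything prescribed by ECTA or the Guidelines.

```python
# Illustrative sketch only: the four cumulative recognition criteria of
# s 71(2)(a)-(d) of ECTA as a conjunctive checklist. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class IRBApplication:
    members_subject_to_code: bool            # s 71(2)(a)
    adequate_membership_criteria: bool       # s 71(2)(b)
    code_requires_continued_adherence: bool  # s 71(2)(c)
    can_monitor_and_enforce_code: bool       # s 71(2)(d)

def eligible_for_recognition(app: IRBApplication) -> bool:
    # "The Minister may only recognise ... if the Minister is satisfied
    # that" all four conditions hold; failing any one defeats recognition.
    return all([app.members_subject_to_code,
                app.adequate_membership_criteria,
                app.code_requires_continued_adherence,
                app.can_monitor_and_enforce_code])

if __name__ == "__main__":
    app = IRBApplication(True, True, True, False)
    print(eligible_for_recognition(app))  # False: cannot monitor or enforce
```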
2.21 To be recognised by the Minister as an IRB, interested parties must comply with the guidelines for the recognition of IRBs, found under GN 1283 in GG 29474 of 2006-12-14 (the Guidelines). Although the Guidelines were initially prepared specifically for ISPs, they also expressly state that they serve as a guideline for other categories of information system service providers. Further, the Guidelines must be applied mutatis mutandis, allowing for alteration in line with new developments.

2.22 The Guidelines include the following: (i) a description of ECTA's background; (ii) the Guidelines' founding principles, objective, and scope; (iii) a Best Practice Code of Conduct that contains minimum and preferred standards of conduct by an IRB's members; (iv) a checklist of the adequate criteria necessary for membership; and (v) the considerations the Minister is to take into account when determining whether an IRB is capable of adequately monitoring and enforcing its own Code of Conduct and therefore eligible for recognition in terms of ECTA. Section 5 of the Guidelines sets out the minimum standards required of an IRB for recognition by the Minister. Further, under Section 6, members do not need to strictly adhere to the preferred requirements; rather, members should strive to meet those standards.

2.23 As per Section 2.1 of the Guidelines, it was the legislature's intention to create a self-regulated sector wherein the regulation of content and conduct on the internet is carried out by the industry rather than the Government. The key objective of the Guidelines is to ensure that IRBs and their members comply with ECTA before enjoying protection under Chapter XI. The intention is for the internet to be self-regulated by industry players rather than through Government intervention. Industry players can voluntarily accept the Guidelines. This self-regulation must be effective, using practical and realistic methods that respect and promote South Africa's constitutional values. It is not the legislature's intention for IRBs to assume the role of the law enforcement authorities, and they therefore do not have a general obligation to monitor for illegal conduct or content. While IRBs are not expressly required under ECTA or the Guidelines to report illegal activity to law enforcement, they are obliged to implement effective take-down procedures and ensure their members comply with the legal framework enshrined in ECTA. In serious or persistent cases, IRBs may choose to escalate matters to authorities, but this is not a statutory obligation under current law. The Guidelines contain the international Best Practice Code of Conduct, which reflects one of the aims of Chapter XI, that is, to provide users with remedies against unlawful content.

2.24 Section 71(2)(a) of ECTA requires members of the IRB to adhere to a Code of Conduct. Section 5 of the Guidelines sets out the industry Best Practice Code of Conduct, the standards of which have been laid down in Sections 5 and 6 of the Guidelines. The said Sections 5 and 6 of the Guidelines contain [16] minimum and preferred requirements based on international best practice. The minimum requirements are mandatory for IRBs to comply with in order to be recognised by the Minister, while the preferred requirements are optional and set out the standards that industry players should aim to achieve.

2.25 Section 71(2)(d) of ECTA requires IRBs to monitor and enforce their Code of Conduct. Part 3 of the Guidelines gives effect to Section 71(2)(d) of ECTA. A summary of Part 3 of the Guidelines is as follows:

2.25.1 Nature and independence of the IRB: The Minister must consider whether the IRB is appropriately structured and constituted. The Minister must therefore consider how representative the IRB is of the industry; its independence and whether it is unbiased; and whether it has a proper constitution making provision for a variety of aspects.

2.25.2 Complaints, disciplinary, and take-down procedures: The IRB's complaints, disciplinary, and take-down procedures will only be considered effective where there is widespread knowledge of the Code of Conduct and the aforementioned procedures amongst the IRB's members and the public. The procedures must be effective and binding.

2.25.3 Monitoring procedures: The Minister should consider whether there is an effective monitoring and enforcement policy in place. The policy should include procedures for: regular compliance spot checks; initiating investigations or following up complaints; and checking compliance with conditions set down as a result of complaints or disciplinary proceedings. It should also include annual compliance statements from members and compulsory reporting by members of take-down notices.

2.25.4 Reporting duties: To receive continued recognition, an IRB must report any changes to the IRB's Constitution, Articles of Association, and Code of Conduct to the Minister. The Minister must then evaluate whether the IRB is still eligible for recognition. The IRB must also provide an annual report by 28 February each year on the following:
2.25.4.1 Membership of the IRB;
2.25.4.2 Statistics on take-down notices and complaints received;
2.25.4.3 Disciplinary proceedings against members; and
2.25.4.4 Any other information the Minister may require.

2.26 Albeit one might ask, "What compels social media platforms to voluntarily subscribe to

Footnotes:
[74] Section 71 of the Electronic Communications and Transactions Act 25 of 2002.
[75] GN 1283 in GG 29474 of 2006-12-14. Available at: Government Notice 1283.
[76] Ibid.
[77] Ibid.
[78] Marx F and O'Brien N (2011) To Regulate or to Over-regulate? Internet Service Provider Liability: The Industry Representative Body in Terms of the ECT Act and Regulations. Accessed 1 June 2025.
[79] Ibid.
[80] Ibid.
[81] Ibid.
[82] Ibid.
[83] Ibid.
[84] Section 71(2)(a) of the Electronic Communications and Transactions Act 25 of 2002.
[85] GN 1283 in GG 29474 of 2006-12-14. Available at: Government Notice 1283, p 7-17.
[86] Marx F and O'Brien N (2011) To Regulate or to Over-regulate? Internet Service Provider Liability: The Industry Representative Body in Terms of the ECT Act and Regulations. Accessed 1 June 2025.
[87] Ibid.
[88] Section 71(2)(d) of the Electronic Communications and Transactions Act 25 of 2002.
GN 1283 in GG 29474 of 2006-12-14. Available at: Government Notice 1283. And Marx F and O'Brien N (2011) To Regulate or to 89over regulate? Internet Service Provider Liability: The Industry Representative Body In Terms of The ECT Act and Regulations. Accessed 1 June 2025. Ibid. 90 Ibid.
an IRB model?" A key limitation in adopting such a model lies in incentivising platforms when
their liability is not yet clearly established in law. The lack of legal compulsion exposes a risk; social media platforms may choose not to voluntarily adhere to an IRB or disregard the recommendation altogether. This limitation is further compounded by the lack of binding obligations under Chapter XI, potentially weakening enforcement prospects. Therefore, an IRB model may offer a flexible framework for local accountability; its success hinges on the willingness of platforms to cooperate, a condition that is not guaranteed. As such, the model carries a risk of limited or uneven uptake, which may weaken its intended impact on content moderation and digital rights enforcement. 2.27 However, the Digital Company (Pty) Ltd v Meta Platforms Inc case illustrates a potential shift in judicial willingness to bring large social platforms to take into account and take down harmful content. While the judgment did not create a precedent for liability, it underscored the reputational risks companies face when they fail to act against harmful content. Public litigation, adverse media coverage, and transparency reporting may therefore serve as soft levers, pressuring platforms toward voluntary alignment with local accountability norms. Still, the effectiveness of such mechanisms remains uncertain, and the absence of formal enforcement may continue to undermine the IRB model's legitimacy and uptake.
Referring complaints
2.28 In the instance that the relevant social media platform refuses or fails to act on takedown notices or evades responsibility to address harmful content, users may turn to alternative avenues for recourse. The South African Human Rights Commission (SAHRC) remains a key institution empowered to investigate complaints involving violations of constitutional rights, including those related to dignity, privacy, and freedom of expression online. In addition, individuals may lodge complaints with the Information Regulator, particularly where the harmful content involves the misuse of personal data in contravention of privacy protections. Other routes include the Film and Publication Board (FPB) in matters involving illegal or harmful digital content, such as hate speech or child sexual abuse material. While these institutions do not currently have direct jurisdiction over social media platforms in the same
way a dedicated digital regulator might, their involvement can still trigger public inquiries,
enforcement actions, or legal pressure that encourages platform accountability. In the absence of comprehensive legislation, these bodies play a complementary role in bridging the regulatory gap.
- Conclusion on ECTA 2.29 Section 72 of ECTA grants partial immunity to service providers, but only if they are members of an IRB that has been formally recognised by the Minister and if they have adopted and implemented that IRB's Code of Conduct. Section 71 sets out the framework for such recognition, requiring the Minister to be satisfied that the IRB has an enforceable Code of Conduct, adequate membership criteria, continuous adherence to standards, and the capacity to monitor and enforce compliance. These requirements are further expressed in the Guidelines, which were initially designed for ISPs but now extend, mutatis mutandis, to other categories of information system service providers, including social media platforms. 2.30 The Guidelines outline ECTA's background, the purpose and scope of the IRB system, and minimum as well as preferred standards. A key principle is the promotion of effective industry self-regulation rather than direct government oversight. The Guidelines emphasize that while service providers are not expected to act as law enforcement, they are still required to take reasonable steps to report and address unlawful activities. IRBs must implement a Code of Conduct grounded in international best practice, particularly Sections 5 and 6 of the Guidelines, which set out the minimum mandatory requirements and optional preferred standards. 2.31 The ECTA framework, particularly Chapter XI, can be interpreted as a tool to address the spread of disinformation, especially where such content results in harm or violates constitutional rights. While ECTA does not expressly mention "disinformation," the structure of Chapter XI is built on the premise of preventing the infringement of the rights of others, a principle that aligns with the constitutional duty to safeguard dignity, equality, and freedom of expression. In this context, disinformation may be considered harmful where it incites violence, promotes discrimination, or undermines public health and safety, thereby infringing on constitutionally protected rights.
Section 77 can be seen as a protective tool against human rights violations, and its proper
application could support a rights-based approach to platform regulation in the absence of comprehensive legislation.
- Legislative and definition reform
Does the current definition of a "service provider" or "internet service provider in ECTA
sufficiently capture the role of platforms that do more than [passively] transmit or host data, i.e., social media platforms, search engines, and marketplaces? Or is the definition outdated and would require an amendment that includes active intermediaries?In your opinion, should the definition be broadened to reflect platform-based or
algorithmic content moderation systems? Provide an example of how the definition that incorporates new technology should be defined.Would it be more effective to distinguish between technical service providers and content
platforms based on their influence over online discourse and user experience, as the EU does under the Digital Services Act ("DSA")? What risks or benefits could that bring?Do you support a legislative basis for content moderation oversight that is modelled on
amending ECTA and using an IRB model?Effectiveness and accountability of IRBs
How accessible and effective are IRBs for ordinary users who encounter harmful online
content? Considering the diverse user group on social media platforms, is the current take- down process usable, or is it too technical/legalistic for the average complainant?Is the current IRB model forward compatible? In other words, would the current IRB model
respond to emerging harms like AI-generated misinformation, cross-platform manipulation, or algorithmic bias?If platforms claim immunity but consistently fail to act on take-down notices, should there
be legal consequences? Kindly propose the consequences. Would the recommended consequence require amending ECTA or simply enforcing what already exists more vigorously through existing legislation?International best practice from the DSA
The DSA introduces the term "Very Large Online Platform" (VLOP) obligations, such
as systemic risk assessment, algorithmic transparency, and external audits. In your view, would these measures be reasonably enforceable in the South African context under a Code of Conduct of an IRB?What aspects of the DSA's tiered, risk-based framework (e.g., due diligence obligations,
trusted flaggers, public transparency reports) could realistically be transplanted into
South Africa's legal framework? Which might clash with our constitutional, institutional,
or resource constraints?
Should platforms be required to assess the societal impact of their algorithms, including
the spread of disinformation, polarisation, and harm to children or vulnerable users? Or would this cross the line into unconstitutional censorship or overregulation?In the DSA, users have appeal rights when content is taken down. Should South Africa
introduce counter-notice mechanisms under ECTA to ensure a fair process for content creators and platform users?Risks and Safeguards
In your view, does the current South African legal framework, particularly the takedown
provisions under the Electronic Communications and Transactions Act (ECTA), sufficiently protect media publishers from the risk of blanket or arbitrary takedown notices issued by online platforms? How would the introduction of a provision similar to Article 18 of the European Media Freedom Act (EMFA), which requires platforms to give prior notice and reasons before removing lawful media content, provide stronger protection for independence and media freedom in South Africa?What safeguards do you think should be introduced in South Africa to ensure fair and
transparent content moderation for media publishers?Structural and Institutional design
Various IRB models exist under ECTA; for example, ISPA is funded by membership fees.
Would this structure work for the recommended IRB for social media platforms? Please elaborate.Should the IRB's functions be housed within existing regulators like the Information
Communications Technology Regulators Forum, ICASA, the FPB, or SAHRC?Could the social media platforms establish their own IRB, and is there an incentive to do
Additional information
Is the Minister's current role in recognising IRBs sufficient to ensure public
accountability and consistency with constitutional standards? Should the Film and Publication Board or the SAHRC be given a stronger role in oversight or appeals of take- down notices?Would you support the establishment of a national Forum to which all recognised IRBs
must report annually or bi-annually?Should the Forum make public recommendations or guidelines based on IRB reporting, or should
it remain a closed consultative body?Could the Forum serve as a neutral venue for resolving public complaints about IRB decisions,
especially in the absence of a formal appeal mechanism in ECTA?Given South Africa's constitutional framework and limited regulatory capacity, do you
believe that a purely self-regulatory model, in other words, led by industry through IRBs, is sufficient to ensure accountable and effective content moderation? Or would a co- regulatory model, where government plays a formal oversight or standard-setting role
alongside industry, be more appropriate? What are the risks and benefits of each in the South African context?
- What risks or limitations do you foresee in implementing the proposed model?
Should the Guidelines be amended, and what clauses/ features should be included or
removed?MODEL 3: Content Moderation using the Ombudsman Model
3.1 In response to the growing threat of online misinformation and the absence of a dedicated regulatory structure in South Africa, the MMA has developed a concept note that proposes the establishment of an OIO as part of a solution to regulating the dissemination of harmful content on online platforms. The MMA's OIO concept is envisioned as an independent, transparent, and accountable body designed to address the legitimacy crisis in content moderation by focusing on platform behaviour rather than targeting individual users. Its mandate includes overseeing moderation decisions, responding to user complaints, and ensuring fairness in addressing misinformation and disinformation. The OIO concept draws on lessons from the London School of Economics and Political Science's ("LSE") "Tackling the Information Crisis report" and the 92 EU and Australia. 93 3.2 This section of the paper is structured in three parts. First, it explores the respective roles and regulatory strengths of a statutory regulatory authority versus an ombuds model in addressing harmful content disseminated on online platforms, with particular attention to issues of independence, accountability, and extra-territorial jurisdiction to determine application. Second, it offers a critical analysis of the MMA's proposed OIO, evaluating its design, objectives, and the potential risks and limitations associated with its implementation. Finally, the section concludes with an assessment of whether the OIO is a long-term solution for content moderation in South Africa, drawing comparative lessons from international regulatory approaches referenced in the
Beckett, C. and Livingstone, S., 2018. Tackling the information crisis: a policy framework for media system resilience-the report of the 92LSE Commission on Truth Trust and Technology. Media Monitoring Africa concept note on "Briefing note: Pathways to an Online Integrity Ombud". Pg 15.
MMA's concept note to inform the feasibility and effectiveness of the model.
- Evaluating Regulatory Authorities and Ombud Mechanisms in Digital Governance 3.3 The Annexure 1 below informs the observations in this section. Due to the complex collection of interconnected difficulties and constraints associated with digital platforms' rapid expansion and societal effect, jurisdictions around the world such as Australia, United Kingdom ("UK") and the European Union ("EU") are progressively establishing dedicated regulators and while academic scholars and interested persons explore the potential of an ombuds systems. The establishment of such regulatory organisations addresses the need for accountability, user protection, fair competition, transparency, and confidence in the digital ecosystem. In contrast, ombud models for this purpose remain mostly proposals rather than established mechanisms. 3.4 These comparator regions or jurisdictions face substantial issues from harmful content, such as misinformation, hate speech, child sexual abuse imagery, and child-harming content, all of which require rigorous regulation. Australia's Online Safety Act puts a statutory duty of care on digital services to proactively assess, reduce, and manage content-related hazards. Similarly, as established by the Online Safety Act, the eSafety Commissioner oversees mandated risk assessments. 3.5 Additionally, Annexure 1 reveals that the UK's Online Services Act requires digital platforms to proactively monitor and remove harmful content and establishes duties that apply based on tiered classifications of services enforced by OfCom. The EU's DSA sets a cross-border, harmonized notice-and-action regime for content moderation, focused on procedural fairness and transparency, with strong penalties for non-compliance mechanisms to guarantee freedom of speech. 3.6 Regulatory models establish unambiguous statutory authority to require compliance, impose sanctions and enforce corrective action. This legal clarity is crucial given leading platforms' huge influence and economic power. An ombud model often lacks substantial enforcement capabilities, constraining them to mediation or recommendation roles, which regulators may consider as
insufficient to compel effective action against harmful online content.
3.7 One other reason for comparator regions opting to adopt a regulatory body as opposed to an ombud, is because regulatory procedures place an emphasis on proactive, risk-based systematic oversight rather than simply responding to individual complaints as they arise. For example, The UK and EU regimes compel platforms to perform ongoing risk assessments and implement preventative measures that authorities can enforce. Australia's legislation also requires platform 94
accountability through enforced compliance systems overseen by the eSafety Commissioner. 95
3.8 Additionally, these regulatory frameworks incorporate safeguards through explicit definitions and standards of practice that seek to strike a balance between media freedom and the need to prevent online harm. Ombud models, on the other hand, frequently struggle to strike a balance between these competing interests due to their limited authority. Ombud models often function as dispute resolution and mediation bodies with limited authority to enforce any binding powers. This limits their capacity to compel platforms to change their behaviour or issue substantial punishments, which regulators believe are vital to address online harms. 3.9 Australia, the UK, and the EU have established statutory authorities with enforcement powers to address the substantial, complex, and systemic concerns posed by harmful internet content, with a focus on proactive oversight, legal accountability, and balancing freedom of expression with public safety. Ombud models remain proposals primarily because they lack the authoritative legislative authority, enforcement skills, and size required to successfully regulate massive digital platforms in the face of fast-expanding online harms. While ombuds can supplement regulatory frameworks by resolving disputes and fostering public trust, current international trends favour strong statutory regulation to address the difficulties of digital content governance. 3.10 According to the LSE report, ombud models are more like proposals than full regulatory mechanisms because they lack the statutory enforcement powers required to compel large platforms to act decisively. Instead, these approaches prioritize monitoring, evaluation, and trust-
building through transparency and counsel, complementing rather than replacing legal authorities.
This approach reflects the complexities of balancing free expression and combating harmful content, as well as the need for layered governance involving multiple actors, including platform self-regulation, independent monitoring such as the Independent Platform Agency (IPA), and statutory regulators with enforcement authority. Thus, the LSE advocates a hybrid framework in which monitoring agencies such as the proposed IPA promote accountability and media ecosystem resilience without taking on full regulatory control, leaving enforcement to statutory regulators, a trend seen in Australia, the UK, and the EU. 96
Evaluating the proposed Online Intermediary Ombud
3.11 This section provides an overview of the proposed OIO model and provides an assessment of its application while anticipating potential risks observed by the Inquiry. The concept of the proposed OIO is embedded on three main components, namely, (i) its structure and functions; (ii) the guiding principles and the core values; and (iii) the practical functions, processes and procedures. The concept note also provides a roadmap for establishing the envisioned OIO. The discussion will follow that sequence, interweaving the Inquiries own analysis and recommendations at each stage.Core components of the Proposed Ombud Model
3.12 The concept note proposes that the OIO should be established as a statutory body under new legislation, akin to the Community Schemes Ombud Service Act 9 of 2011 (SA) and the City of 97 Johannesburg--Ombudsman By-laws, 2023. The primary objective of the OIO would be to 98
address the rapid dissemination of disinformation on social media platforms and to ensure these platforms play an active role in removing harmful content. While the concept note acknowledges 99 the presence of an existing legal framework to mitigate online harms, it identifies a regulatory gap
London School of Economics and Political Sciences Truth, Trust and Technology Commission, "Tackling the Information Crisis: A 96Policy Framework for Media System Resilience". Community Schemes Ombud Service Act 9 of 2011 (SA). 97 the City of Johannesburg--Ombudsman By-laws, 2023, promulgated on 28 February 2024 under the Municipal Systems Act 32 of 2000 98(SA). Media Monitoring Africa concept note on "Briefing note: Pathways to an Online Integrity Ombud". Pg 19.
concerning platform accountability for the spread of such content. The OIO is thus envisioned 100
as the body to fulfil this oversight function. 3.13 To support extra-territorial application, the concept note draws inspiration from the UK's Online
Safety Act 2023 (UK), c. 50, proposing a three-tier test to establish jurisdiction. This test 101
includes: (i) the number of users in South Africa; (ii) whether South African users constitute a target market of the platform; and (iii) whether the content on the platform poses a serious risk to users in South Africa. A jurisdictional nexus must be established by demonstrating a connection 102 between the harmful content and at least one of the factors in the criteria. 103
3.14 Concerning membership, the concept note provides that an online platform's membership in the
OIO will be determined by whether it meets the extra-territoriality test briefly outlined above. Membership may take one of three forms: (i)voluntary, (ii) automatic, or (iii) mandatory, each of which carries its limitations and risks. Regardless of the form chosen, additional legislative steps 104 would be necessary. For instance, platforms could be permitted to offer services in South Africa only if they are members of the OIO and comply with its code of conduct, upon which they would be granted a licence to operate within the market.
2Table 2: Key Institutional Features of the Proposed Online Integrity Ombud
Ibid. pg 10. Powers of the OIO Based on the incentive-disincentive model. Three categories of 100 Online Safety Act 2023 (UK), c. 50. And Media Monitoring Africa concept note on "Briefing note: Pathways to an Online Integrity 101Complaints mechanisms Ombud". Pg 20. Under the proposed complaints process, users are first required to powers:
Ibid. 102i. The complaint mechanisms. submit their grievances directly to the relevant digital Ibid. 103 Ibid. ii. Its accountability function; and platform. Should the platform fail to resolve the complaint in
- Its digital literacy and empowerment initiatives a timely or satisfactory manner, the matter may then be Source: Media Monitoring Africa concept note 3.15 The above table 2, provides an overview of the remaining structural and functional components of
Financial foundation & Platforms should fund the OIO through annual contributions based the envisioned OIO. While the proposed model offers a strong framework to hold large online Funds escalated to the Online Integrity Ombud (OIO) for on revenue. In line with best practices, the OIO will maintain platforms accountable for the spread of harmful content, it also presents certain limitations. A
independent intervention and resolution. Proposed remedies financial controls and audits. Two additional funds are
include (i) Order to pay a fineproposed: (1) a user support or compensation fund for harm ; (ii) Referral mechanism. 105 106 The concept note proposes that the fines should be tiered according to the severity and frequency of the violations. Refer to page 21 of 105Accountability function the concept note. Further the collected fines can be paid into a digital literacy caused by inadequate platform moderation, and (2) a digital The OIO may offer tax, financial, or certification incentives for The concept suggests that in the instance that it does not have jurisdiction to hear a complaint, it should have powers to refer the 106Digital literacy & The OIO will promote user empowerment by supporting or fund where the funds can be used to compensate aggrieved platforms demonstrating responsible content moderation, literacy fund to support education initiatives. Both would be complaint to an appropriate regulator or body to hear the complaint.
Ibid. Empowerment managed by the OIO with strong governance and users and in conjunction can be used to support media and verified through audits and annual transparency reports. developing digital literacy initiatives and requiring platforms
digital literacy campaigns. These reports will inform incentives and guide OIO oversight. to report on their own efforts in this area. transparency mechanisms
primary limitation concerns the enactment of new legislation to establish the OIO's legal
personality, a process that is often complex and lengthy in South Africa. The Parliamentary Monitoring Group conducted a study on the time frame of publishing legislation. The data 107 indicate that, on average, it takes approximately 410 days from the introduction of a bill to its commencement, with highly controversial bills taking several years due to factors such as complexity, stakeholder engagement, and legislative priorities. Although urgent bills can be expedited, however, such instances are rare. 108 3.16 There is significant scope to develop and clarify existing legislation, such as the ECTA, to squarely address content moderation. This approach could provide a faster, more practical solution compared to creating entirely new laws, which often take years to enact and leave many users vulnerable in the interim. The Digital Companies v Meta Platforms Inc case illustrates the immense financial and resource burden required to compel platforms to remove harmful content, with platforms frequently using tactics to avoid accountability. This underscores the urgent need 109 to leverage existing laws for an imminent solution while longer-term regulatory frameworks are developed. 3.17 In this regard, the proposed OIO model draws clear parallels with the provisions of Chapter XI of the ECTA, which imposes obligations on service providers to remove harmful content from their Under Chapter XI, Internet Service Providers ("ISP" in the context of this paper, the platforms. 110term ISP shall be interpreted to mean online platforms) are afforded partial indemnity from
liability, provided they comply with specific requirements, including adherence to an established code of conduct. Similarly, the IRB model incorporates detailed guidelines that mandate the 111 prompt removal of harmful content and require ISPs to submit regular transparency reports detailing content that has been removed. These transparency measures serve to enhance accountability and create an evidentiary basis for the application of incentives or sanctions.
How Long Does It Take To Pass And Enact Bills? | PMG. 107 Ibid. 108 Miller N (2025) Landmark Ruling In Johannesburg High Court: Meta Ordered To Combat Online Child Sexual Abuse Material and the 109Evolving Role of AI. Available at: Landmark Ruling In Johannesburg High Court: Meta Ordered To Combat Online Child Sexual Abuse Material And The Evolving Role Of AI - Tech4Law. Chapter XI of the Electronics Communications and Transactions Act 25 of 2002. 110 Section 72 of the Electronics Communications and Transactions Act 25 of 2002.
- The guiding principles and the core values 3.18 The proposed guiding values for the OIO emphasise a rights-based approach, ensuring that its
activities are informed by the promotion and protection of human rights and fundamental freedoms. The OIO is intended to function as an independent and impartial institution, free from undue influence, to effectively carry out its mandate. Furthermore, accessibility is also a listed 112 core value, with the OIO committed to providing inclusive and respectful support to individuals from diverse backgrounds, languages, abilities, literacy levels, and ages. Finally, the concept note proposes accountability to anchor the OIO, with the OIO required to submit both functional and administrative reports to a parliamentary committee to ensure oversight, alongside maintaining internal mechanisms to address complaints about its own performance. 113
- The practical functions, processes, and procedures 3.19 The proposed functions, processes, and procedures of the OIO are envisaged to be those of an
independent and unbiased authority focusing on ensuring information integrity, enabling digital literacy, and enforcing ethical standards in the digital public realm. Practically, the Minister of Communications and Digital Technologies will have to confirm the appointment and tenure of the OIO, similarly to the Ombud Offices in South Africa. The appointment provisions may also address the OIO's core competencies and credentials. Furthermore, under the current tenure trends, the OIO would be appointed for five years(renewable). The proposed legislation will also address the suspension and removal of the OIO. Furthermore, the OIO's Office should consist of the OIO, and other staff members deemed necessary for the efficient and effective execution of the OIO's rights, responsibilities, and duties. Some practical examples of the OIO's functions are shown below. 114
Media Monitoring Africa concept note on "Briefing note: Pathways to an Online Integrity Ombud". Pg 23. 112 Ibid. 113 Ibid, pg 24.
3Table 3: Practical examples of the complaint process
Source: Media Monitoring Africa concept note 3.20 Regarding the above table 3, in scenario one, the ombud demonstrates referral mechanisms and procedural safeguards. The OIO's preliminary assessment and referral to the SAHRC exemplifies Practical Platform P submits an annual report to the OIO outlining different efforts it has responsible gatekeeping by ensuring that cases outside the ombud's jurisdiction, such as those Scenario 2 made, including revising its community standards to provide clear guidance on involving hate speech inciting violence (which may violate constitutional or criminal laws), are Practical User A approaches the OIO with a video posted on V, a social media platform disinformation and how to report it. P improved its user controls, giving users properly escalated. This demonstrates the ombud's ability to provide timely triage, procedural Scenario 1 affiliated with the OIO. The video depicts someone inciting religious hatred and more control over the moderating settings that apply to their profiles. P also gives fairness, respect for complainant consent, and collaboration with statutory bodies. However, it encouraging the use of violence against members of a religious organization. The statistics, backed up by evidence of its efforts to respond quickly to concerns exposes a fundamental limitation: when content explicitly violates civil or criminal law (e.g., OIO's complaints office conducts a preliminary evaluation and concludes that it against deception. P also provides access to empowering tools that help children hatred propaganda and encouragement to violence), the OIO has no investigative or prosecuting does not fall within the OIO's mission and jurisdiction, and in collaboration with and young people recognize and respond to deception. The OIO evaluates the authority and may only refer cases. Real-time harmful content may fall through the cracks if User B, immediately submits the matter to the South African Human Rights report and determines that P has taken reasonable steps to improve its processes
Commission (SAHRC). and responses. It certifies P as a "responsible platform" and reduces their
membership price by 25% to acknowledge their efforts.
jurisdiction is unclear or complex events emerge that cross legal boundaries.
3.21 Further, the second scenario fosters positive platform behaviour. Recognising and certifying responsible platforms, such as Platform P, and offering actual benefits (membership fee reductions) demonstrate the OIO's ability to use reputation and economic incentives to encourage platform reforms. The OIO's evaluation of annual reports encourages accountability, transparency, and ongoing development in moderation policy, user empowerment, and responsiveness, all of which are critical components of a healthy information ecosystem. However, reliance on voluntary compliance and soft laws demonstrates the ombud's reliance on platform collaboration, as its "certification" and fee incentives lack the coercive authority of legislative regulation. Non-member platforms and uncooperative actors may be unaffected, reducing the system's reach and efficacy. The OIO methodology excels in promoting systemic improvement through audits, reports, and industry standards. Ombuds, on the other hand, lack the enforcement teeth of regulators when it comes to urgent, specific internet harms, particularly those demanding immediate material removal or punitive punishment. The ombud's ability to rapidly mitigate genuine harm remains limited. 3.22 As previously noted, jurisdictions such as Australia, the UK, and the EU have moved toward statutory regulatory frameworks for reasons discussed above. One such reason includes prioritising proactive risk management and systemic supervision while maintaining strong legal authority and unambiguous enforcement authorities (e.g., Ofcom, eSafety Commissioner). Further regulatory agencies have the authority to enforce platform compliance, levy fines, and act on illegal or harmful content on a large scale. On the other hand, ombud models are frequently advocated in various jurisdictions as complementary watchdogs/monitors, with a focus on transparency and advice rather than statutory enforcement power. Furthermore, the LSE has stated that, while ombuds and watchdogs promote trust and openness, they are unable to address high-stakes content hazards, particularly in light of increasing digital damages and large global platforms. 3.23 An ombud model, as seen in the practical scenarios above, can be effective in improving platform responsibility, procedural fairness, and overall systemic improvement. It promotes transparency, best practices, and provides a low-barrier entry channel for complaints. However, its limitations,
such as a lack of statutory enforcement, reliance on voluntary compliance, restricted jurisdiction, and significant referral delays, make it ineffective as the sole governance instrument for addressing serious or systemic online problems. Hate speech, disinformation, and resolute platforms highlight these flaws. 3.24 Considering this, the immediate establishment of the OIO as a statutory regulatory body may not be feasible. Therefore, a two-pronged, phased approach is recommended. In the short term, the launch of the OIO as a soft law or voluntary mechanism (such as a code of conduct or IRB style model). While in the long term, the transition to a fully empowered statutory regulator will occur through dedicated legislation once there is sufficient stakeholder consensus, institutional readiness, and legislative capacity. This approach balances the need for urgent action with the benefits of legal certainty and durability, reflecting international best practice. A. Institutional Design & Legal Framework
What are your views on establishing the OIO through dedicated legislation versus
embedding it within existing laws or regulatory bodies?Many stakeholders have the view that South Africa should not be creating new structures
but using structures that already exist. Are there existing structures that can be used in place of the creation of a new OIO?Considering South Africa's regulatory landscape, what potential overlaps or conflicts with
other bodies (SAHRC, ICASA, the Commission) should be addressed?What lessons from the abovementioned international models are relevant to South Africa?
How can the OIO's mandate be clearly delineated to avoid confusion or duplication with
the existing SAHRC's functions?Independence, Credibility & Public Trust
What measures do you believe are necessary to ensure the ombud's independence?
How should the OIO be funded and structured to ensure independence and long-term
viability?What funding models would ensure the OIO's independence and operational sustainability?
Should platforms contribute financially to the OIO's activities, and if so, how should
contributions be structured?What risks would affect the credibility, neutrality, or public trust of the OIO?
Mandate & Powers
What role should this ombud play: advisory, investigative, enforcement, or mediation?
Do you think an OIO can effectively handle online harm issues?
How can platforms be incentivised or compelled with rulings or recommendations made by
the OIO?Rights, Responsibilities & Compliance
How can the OIO best balance platform accountability with the protection of constitutional
rights such as freedom of expression and access to information?How can the OIO ensure compliance from large, multinational platforms that may have
limited physical presence in South Africa?Conclusion:
4.1 This discussion paper has explored the feasibility and complementary nature of the three potential regulatory pathways to address content moderation on social media platforms in South Africa: The AAVMS, altogether limited in its scope, has proposed stages in targeting immediate protection of
online harm on digital platforms in South Africa, the first stages of the Draft White Paper aligns with the MDPMI's recommendations for short to medium term solution for content moderation through the expansion of ECTA, under Chapter XI provisions and the MMA's proposal for the establishment of an OIO. Drawing on both domestic policy and international best practices, the paper concludes that no single model, in isolation, is sufficient to address the complexity, scale, and urgency of online content moderation. 4.2 Instead, a layered, phased regulatory framework is recommended, one that leverages legislative instruments in the short term, such as amending ECTA to establish an IRB, while building institutional capacity and stakeholder consensus. ECTA, through Chapter XI, already offers a legally established intermediary liability regime that could be adapted to include social media platforms under a clarified definition of "service providers". This provides a constitutional and future-oriented avenue for incentive platform accountability through a code of conduct administered by recognised IRBs. The existing IRB is expanded to include platform-based services and offers pragmatic first steps to foster co-regulatory and co-regulation while ensuring the government retains oversight. 4.3 Considering this, our position (also reflected in the paper) is to now recommend that the OIO be considered medium-term relief, while the government works towards enacting legislation to establish a dedicated regulatory authority. The paper does highlight, however, that a statutory Ombud must be approached cautiously, acknowledging legislative timeframes, regulatory overlaps, and questions of jurisdiction and enforceability. A soft law or voluntary implementation in the interim, suggested by the AAVMS, possibly housed within existing regulatory instructions, could accelerate its impact while laying the groundwork for eventual statutory powers. 4.4 Ultimately, these three models should not be seen as mutually exclusive, but as sequential and complementary. South Africa's regulatory framework must evolve in step with technological development and societal risks, ensuring that platform accountability mechanisms are responsive, rights-based and locally relevant.
4.5 The Commission encourages stakeholders to engage meaningfully with the proposals outlined in
this discussion paper and provide input on the feasibility, design and implementation pathways. This will ensure that South Africa adopts a fit-for purpose regulatory regie that balances constitutional freedoms with the urgent need to address online harm and disinformation online. -End-
4Annexure A: Framework Comparison on Statutory Regulators and Soft Law Ombudsman Approaches
Mandate Has a broad This proposed Regulate online The function of the IPA Ensure platform Promote human rights
mandate to ombuds model safety, media, would be observing and accountability, online, including investigate and would aim to telecoms, and playing a policy protect users' rights, freedom of expression enforce provide users illegal/harmful advisory role. Its promote transparency and access to remedy. takedowns of with a right to online content. purpose would be to in content cyber abuse, recourse where Enforce provisions establish an institution moderation, and harmful content platforms have of the UK Online that supports initiatives regulate [declared] Scope of and ensure the failed to address Safety Act. improving information Gatekeepers under Regulates all Applies to large Regulates It would be anticipated Applies to Currently applies to 46 Proposed, this model is still in its theoretical stages and has been proposed across many other jurisdictions including SA. 115Regulator/ Application safety of all harmful content reliability. the DMA. eSafety Digital Platform OfCom Independent Platform EU Appeals Council of Europe - digital platforms digital platforms platforms to apply to platforms intermediaries, EU member states and Ibid. 116 Ombud Commissioner Ombuds Scheme Agency (proposed mechanism Internet Ombudsman digital platform or misleading accessible in operating in accessible in the generating substantial especially Very Large their obligations under (proposed model)users. information Australia, Australia (if UK, particularly advertising revenue in Online Platforms the ECHR. Australia UK EU model)adequately. the UK. (VLOPs) and Very
Remedies • Issue takedown • Mediation, • Fines, takedowns, Information not • Restoration of • Mediation outcome,
notices to apology orders, service blocks, available. content, improved recommendations, soft platforms or transparency transparency transparency, judicial enforcement. CoE may individuals; demands; notices, and appeals, and fines for escalate to Courts.
Issue civil • Referral to • Focus on non-compliance
penalties; regulator (if prevention and DMA allows forIndustry codes; created); and deterrence. structural remedies
and • No criminal or against [declared]
Transparency Empowering • Refer matters to civil gatekeepers. Platforms must Information not Platforms must Information not VLOPs must publish States must provide Enhancing Proposed as part Communications Would be established by Articles 20, 21, 53 of ECHR, including declared by the those hosting user-Large Online Search obligations legislation file Basic Online available. publish available. detailed transparency transparent procedures law enforcement enforcement Online Safety of co-regulatory Act 2003, and legislation. the DSA. CM/Rec(2018)2, and foreign-based. Minister). generated content. Engines (VLOSEs) Act 2015 or self-Online Safety Act other CoE standards agencies. powers unless Safety transparency reports every 6 to meet freedom of operating in the EU. regulatory legally Expectations reports and risk months; all platforms expression standards
initiatives empowered. (BOSE) reports, assessments. must disclose
Source: Competition Commission
and transparency moderation rules & (non-binding soft law reporting is decisions (Arts. 15, guidance). required for 34).
systemic harms.
Named provisions
Related changes
Get daily alerts for South Africa Competition Commission
Daily digest delivered to your inbox.
Free. Unsubscribe anytime.
About this page
Every important government, regulator, and court update from around the world. One place. Real-time. Free. Our mission
Source document text, dates, docket IDs, and authority are extracted directly from SA CompCom.
The summary, classification, recommended actions, deadlines, and penalty information are AI-generated from the original text and may contain errors. Always verify against the source document.
Classification
Who this affects
Taxonomy
Browse Categories
Get alerts for this source
We'll email you when South Africa Competition Commission publishes new changes.
Subscribed!
Optional. Filters your digest to exactly the updates that matter to you.