The Meta Oversight Board’s Advisory Opinion on Global Community Notes Rollout
By Yohannes Eneyew Ayalew & Maria O’Sullivan
Meta is a powerful global company, operating social media platforms that shape public opinion and influence elections. However, the company’s attempts to counter false or misleading information on its platforms have been the subject of widespread criticism from academic experts and civil society. These deficiencies have significant implications for the role that social media plays in information manipulation and the dissemination of extremist content. Despite these concerns, Meta decided in 2025 to end its third-party fact-checking program and move to a “community notes” model. Against this background, a recent advisory opinion handed down by Meta’s Oversight Board in March is important, as it assesses the potential human rights impacts of deploying the “community notes” program on the company’s platforms on a global scale.
In its advisory opinion, the Meta Oversight Board found that while community notes may enhance users’ freedom of expression and improve online discourse, a global “one-size-fits-all” approach could pose real-world harms in crisis and conflict zones, repressive regimes, and electoral contexts. This blog post therefore analyses this important advisory opinion.
“Community Notes” and content moderation
At present, Meta’s approach to counter misinformation consists of three strategies, one of which is to provide additional information or context through mechanisms such as community notes. In simple terms, community notes are a form of crowd-sourced content moderation in which users write brief assessments of potentially misleading or inaccurate posts.
This “crowd-sourcing” approach to content moderation is currently utilised on X (formerly Twitter). The problem is that it relies on the community’s ability to distinguish between “truth” and misinformation, and may not be seen by the public as being as legitimate as professional fact-checking.
It is important to note that community notes systems are aimed at contextualisation; they do not result in the takedown of material. Thus, in seeking to expand the use of community notes globally, it could be said that Meta is prioritising contextualisation over content removal.
An example of the implications of this for containing extremism is the use of Community Notes in regulating the harmful false claims that fuelled the UK’s 2024 Southport anti-immigration riots. These riots included the targeting of mosques and hotels housing asylum seekers, driven in part by false claims that spread on social media platforms relating to the killing of three children in Southport. The UK think tank Demos found that ‘Community Notes, as deployed in July and August 2024, failed to mitigate the harmful, inaccurate information that fuelled the crisis’. One particular problem it identified was that Community Notes were largely invisible to users during the riots, and so could not prevent false and harmful information from spreading.
Meta’s Request and the Oversight Board’s Advisory Opinion
The Community Notes Advisory Opinion was written in response to a request by Meta for the Board to provide a policy advisory opinion on the factors Meta should consider when deciding which countries, if any, to omit when expanding the community notes framework outside of the United States.
In its Advisory Opinion, the Board did not make any recommendation about the legitimacy or wisdom of extending community notes outside the United States as a policy. The Board explicitly states that it “neither endorses nor opposes this approach.” Rather, it assumes that Meta will “make community notes available everywhere” and “addresses what considerations should preclude or condition launching the product in a particular country.”
However, the Board did find as an overall matter that community notes are inadequate as a standalone solution for addressing harmful misinformation.
Instead, it outlines the considerations that should preclude or condition launching the product in a particular country, recommending that Meta:
- Initially omit countries with a history of coordinated disinformation networks;
- Not introduce community notes during crises or protracted armed conflict conditions;
- Delay introducing notes where there is language complexity that Meta cannot technically and operationally accommodate;
- Exercise extreme caution where social division and disagreement that drives political violence cannot be simply clarified; and
- Omit countries facing persistent obstacles to Internet access.
Implications for Human Rights in the Digital Age and Countering Extremism
The Meta Oversight Board has begun to emerge as a global human rights adjudicator in matters of online freedom of expression since its inception in 2020. The rollout of community notes could have impacts on human rights in at least the following ways. First, community notes may enhance freedom of expression under article 19 of the International Covenant on Civil and Political Rights (ICCPR) by enabling direct user participation, counter-speech, and the dissemination of contextual information. However, as the Board cautions, their deployment outside the United States can also pose significant human rights risks—including exposure to retaliation, manipulation, and the amplification of violent extremism—triggering responsibilities for Meta under Principle 13 of the UN Guiding Principles on Business and Human Rights to prevent and mitigate harm.
Second, the community notes system may place individuals at risk of retaliation if anonymity is compromised in contexts where governments suppress dissent both offline and online. As the Board notes, such risks may implicate a range of civil and political rights, including contributors’ rights to privacy (Article 17, ICCPR), to security of the person (Article 9, ICCPR), and, in extreme cases, even the right to life (Article 6, ICCPR).
Third, it may also pose risks to the right to participate in public affairs under Article 25 of the ICCPR, as coordinated disinformation networks can game the system, by brigading ratings or stripping context, to elevate misleading narratives and even amplify violent extremism, especially in fragile information environments. Concerns along these lines have also been raised by the U.N. Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan, who in a September 2025 report underscored that community notes are susceptible to capture through manipulated ratings and agenda-setting, alongside inconsistent standards and limited expertise.
Conclusion
We believe that community notes place a considerable burden on the public to regulate social media. This is because they require ordinary users to devote time, expertise, and labour to evaluating misleading content at scale, often in polarised environments marked by conflict, disinformation campaigns, linguistic inequalities, or fear of retaliation. As such, their effectiveness and legitimacy must be assessed in light of the geopolitical contexts in which they operate. We therefore agree with the cautions and recommendations set out in the Oversight Board’s advisory opinion.
We also observe that there are particular weaknesses inherent in the use of Community Notes in the context of extremist content and political violence, as their performance during the 2024 UK Southport riots demonstrated. We therefore hope that Meta implements the recommendations of the Oversight Board Advisory Opinion and heeds the cautions outlined in the Board’s insightful and important decision.
Dr Yohannes Eneyew Ayalew is a European Research Council (ERC) Postdoctoral Fellow at the Faculty of Law, The Hebrew University of Jerusalem. Dr Ayalew was an inaugural Majority World Initiative (MWI) Scholar at Yale Law School in 2023/24. His work has been published in leading academic journals at the intersection of law, technology and human rights. His most recent publication offers a third-world critique of content moderation: ‘A third-world critique of the international human rights-based approach to content moderation’ (2025) Transnational Legal Theory.
Dr Maria O’Sullivan is an Associate Professor and the Theme Lead of the Technology-based Harms Research Stream of the Centre for Law as Protection at Deakin Law School in Melbourne, Australia. Maria is the author of a number of international and national publications on the subject of human rights, public law and technology. Her most recent publication analyses electoral disinformation: ‘The Role of Electoral Commissions in Countering Disinformation: Implications for Neutrality and Trust’ [2025] Public Law.
Author note: The authors sent a written submission to the Oversight Board and joined one of the stakeholder events to provide additional expert evidence to the Board as part of the consultation process.
Image credit: Photo by Mariia Shalabaieva on Unsplash