Meta’s Bold Shift: Fact-Checking Program Ends as Community Notes Take Over

Meta, the parent company of Facebook, Instagram, and Threads, has announced a major shift in its content moderation strategy. The tech giant revealed plans to end its third-party fact-checking program, replacing it with a Community Notes initiative modeled on the approach used on Elon Musk’s X (formerly Twitter).
Joel Kaplan, Meta’s Chief Global Affairs Officer, broke the news on Tuesday in a blog post. He explained that the new system aims to allow users to freely comment across Meta’s platforms while empowering the community to provide context for potentially misleading posts. The transition will begin in the United States.
Why Meta is Moving On
Initially, the fact-checking program was designed to give users more information about viral hoaxes and misinformation through independent experts. However, Kaplan admitted that the program didn’t work as intended.
“Over time, we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” Kaplan said. He added that the program, intended to inform users, often became a tool for censorship.
The process led to intrusive labels on posts and reduced their distribution, sparking criticism from users who saw it as limiting free expression.
Enter Community Notes
Kaplan highlighted the success of Community Notes on X, describing it as a model that relies on diverse perspectives to decide when posts need additional context.
“We’ve seen this approach work on X—where they empower their community to decide when posts are potentially misleading and need more context,” Kaplan explained. He believes that Community Notes can help Meta provide accurate information without falling into the trap of bias or censorship.
Once implemented, Meta will not write or select Community Notes. Instead, contributing users will create and rate the notes, with consensus required from people holding diverse viewpoints. This feature is intended to ensure fairness and prevent one-sided ratings.
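Meta has not published the mechanics behind this consensus requirement. Purely as an illustration of the idea described above, the diverse-viewpoint check could be sketched along these lines, where the group labels, threshold, and function name are all hypothetical and much simpler than the bridging-based ranking X actually uses:

```python
from collections import defaultdict

def note_reaches_consensus(ratings, min_helpful_per_group=2):
    """Hypothetical sketch: a note is shown only when raters from
    at least two distinct viewpoint groups have marked it helpful.

    ratings: list of (rater_group, is_helpful) tuples.
    """
    helpful_by_group = defaultdict(int)
    for rater_group, is_helpful in ratings:
        if is_helpful:
            helpful_by_group[rater_group] += 1
    # Count the groups that independently cleared the threshold.
    groups_agreeing = [g for g, n in helpful_by_group.items()
                       if n >= min_helpful_per_group]
    return len(groups_agreeing) >= 2
```

In this toy version, a note rated helpful only by members of one group would not appear, while a note rated helpful by enough members of two different groups would; the real system scores raters and notes jointly rather than using fixed groups.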
Kaplan emphasized that transparency will be a key component of the program. Meta plans to share how different perspectives influence the displayed notes and is working on ways to make this information accessible.
Contextual History and Controversies
Meta’s content moderation policies have faced significant backlash in the past. One of the most notable incidents involved former U.S. President Donald Trump, whose Facebook account was temporarily suspended following the January 6, 2021, Capitol riot.
Trump criticized Meta as “an enemy of the people,” accusing the company of targeting conservative voices. His account was reinstated in 2023, but the controversy highlighted growing dissatisfaction with Meta’s approach to moderation.
In December 2024, Meta disclosed that it had removed millions of pieces of content daily. However, it acknowledged errors, stating that “one to two out of every 10 of these actions may have been mistakes.”
A Shift Toward Community Empowerment
The move to Community Notes marks a significant departure from Meta’s previous policies. By handing control to its user base, the company aims to foster a more inclusive and unbiased system of content moderation.
Kaplan believes this approach will better align with Meta’s original intention of providing users with accurate and balanced information about the content they encounter.
This change reflects a broader trend in social media, where platforms are increasingly turning to community-driven solutions to manage misinformation and promote transparency.