Meta has announced the discontinuation of its US-based fact-checking program, replacing it with a community-driven system akin to the “Community Notes” feature on X. The company has also eased restrictions on discussions around controversial topics such as immigration and gender identity, signaling a major shift in its content moderation approach.
This policy reversal contrasts with CEO Mark Zuckerberg’s long-standing advocacy for active content moderation, a stance that has drawn criticism from conservatives accusing Meta of censorship. The change follows the appointment of Republican policy executive Joel Kaplan as the company’s global affairs head and Dana White, CEO of Ultimate Fighting Championship and an ally of President-elect Donald Trump, to Meta’s board of directors.
Zuckerberg stated that the decision reflects a renewed focus on free expression, acknowledging past mistakes and overreach in content moderation. He emphasized that the company would simplify its policies, minimize errors, and ensure that content removal requires a higher level of confidence.
Meta’s fact-checking initiative, launched in 2016, was discontinued without prior notice to its partner organizations. Jesse Stiller, managing editor at Check Your Fact, expressed surprise and concern over the abrupt decision. Other partners, including Reuters, AFP, and USA Today, have yet to comment, though Meta’s Oversight Board has endorsed the shift.
The changes will affect Facebook, Instagram, and Threads, which collectively serve over 3 billion users worldwide. Zuckerberg has recently voiced regret about past content moderation decisions, including those related to Covid-19. Separately, the company's $1 million donation to Trump's inaugural fund reflects a broader pivot in its political posture.
Critics argue that this move undermines efforts to combat misinformation and harmful content. Ross Burley, co-founder of the Centre for Information Resilience, described it as a step backward, driven more by political considerations than sound policy.
Meta’s community-based model allows users to flag misleading posts for additional context rather than relying on independent fact-checking organizations. The rollout of Community Notes will begin in the US within the next few months, with plans for iterative improvements throughout the year. Unlike under the previous program, Meta itself will not decide which notes appear on posts.
The shift draws comparisons to X’s Community Notes; X itself is under investigation by the European Commission over its handling of illegal content and information manipulation. Meta’s new approach has received praise from X CEO Linda Yaccarino, who predicts other platforms will adopt similar measures.
As part of this transformation, Meta will relocate its trust and safety teams from California to Texas and other US locations. The company also plans to narrow its automated systems’ focus to severe violations, such as terrorism and drug-related content.