(C) Common Dreams
This story was originally published by Common Dreams and is unaltered.
. . . . . . . . . .
Meta Ditches Fact-Checkers Ahead of Trump’s Second Term [1]
['David Gilbert']
Date: 2025-01-07 13:50:32.467000+00:00
Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram, and Threads and replacing its army of paid moderators with a Community Notes model that mimics X’s much-maligned volunteer program, which allows users to publicly flag content they believe to be incorrect or misleading.
In a blog post announcing the news, Meta’s newly appointed chief global affairs officer, Joel Kaplan, said the decision was taken to allow more topics to be openly discussed on the company’s platforms. The change will first impact the company’s moderation in the US.
“We will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations,” Kaplan said, though he did not detail what topics these new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content returning to people’s feeds as well as posts on other issues that have inflamed the culture wars in the US in recent years.
“We're going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse,” Zuckerberg said.
Meta is significantly rolling back fact-checking and getting rid of the content moderation policies it put in place in the wake of 2016 revelations that influence operations conducted on its platforms were designed to sway elections and, in some cases, to promote violence and even genocide.
Ahead of last year’s high-profile elections across the globe, Meta was criticized for taking a hands-off approach to content moderation related to those votes.
Echoing comments Zuckerberg made last year, Kaplan said that Meta’s content moderation policies had been put in place not to protect users but “partly in response to societal and political pressure to moderate content.”
Kaplan also blasted fact-checking experts for their “biases and perspectives,” which he said led to over-moderation: “Over time we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” Kaplan wrote.
WIRED reported last year, however, that dangerous content like medical misinformation has flourished on the platform while groups like anti-government militias have utilized Facebook to recruit new members.
Zuckerberg meanwhile blamed the “legacy media” for forcing Facebook to implement content moderation policies in the wake of the 2016 election. “After Trump first got elected in 2016 the legacy media wrote nonstop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming arbiters of truth, but the fact-checkers have just been too politically biased and have destroyed more trust than they've created.”
Meta’s decision could have a direct negative impact on media organizations in the US that partner with the company for fact-checking, including Reuters and USA Today. Meta’s fact-checking partners did not immediately respond to requests for comment.
[END]
---
[1] Url:
https://www.wired.com/story/meta-ditches-fact-checkers-in-favor-of-x-style-community-notes/
Published and (C) by Common Dreams
Content appears here under this condition or license: Creative Commons CC BY-NC-ND 3.0.
via Magical.Fish Gopher News Feeds:
gopher://magical.fish/1/feeds/news/commondreams/