Facebook owner Meta has ended its third-party fact-checking program and will instead rely on its users to report misinformation, as the social media giant prepares for Donald Trump’s return as president.
The $1.6 trillion company said Tuesday that it “will allow greater expression by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal, high-risk violations” and “taking a more personalized approach to political content.”
“It’s time to get back to our roots around free expression on Facebook and Instagram,” Mark Zuckerberg, Meta’s CEO and co-founder, said in a video announcing the changes.
Trump strongly criticized Zuckerberg during last year’s US presidential election campaign, suggesting that if Meta interfered in the 2024 election, the Meta chief would “spend the rest of his life in prison.”
But the Facebook founder sought to rebuild relations with the president-elect after his victory in November, including visiting him at his residence in Mar-a-Lago, Florida.
On Monday, Meta moved to make further inroads with the incoming US administration by appointing Dana White, the head of the Ultimate Fighting Championship and a prominent Trump supporter, to its board of directors.
White will sit on Meta’s board alongside another Trump ally, technology investor Marc Andreessen, who has long pushed the company to ease its censorship of online content.
Zuckerberg said the complexity of its content moderation system, which was expanded in December 2016 after Trump’s first election victory, had introduced “a lot of errors and a lot of censorship.”
Starting in the US, Meta will move to a so-called “community notes” model, similar to the one used by Elon Musk’s X, in which users add context to controversial or misleading posts. Meta itself will not write the notes.
Meta said it has no “immediate plan” to end third-party fact-checking and introduce community notes outside the United States. It is unclear how the new system would sit with regulations such as the European Union’s Digital Services Act and the UK’s Online Safety Act, which require online platforms to put in place measures to tackle illicit content and protect users.
Zuckerberg added that Meta will also change its systems to “significantly reduce” the amount of content its automated filters remove from its platforms.
This includes lifting restrictions on topics such as immigration and gender, to focus its systems on “illegal and high-severity violations” such as terrorism, child exploitation and fraud, as well as content related to suicide, self-harm and eating disorders.
He acknowledged that the changes would mean Meta would “catch less bad stuff”, but said the trade-off was worthwhile to reduce the number of “innocent people’s” posts removed.
The changes bring Zuckerberg into closer alignment with Musk, who scaled back content moderation after buying the social media platform, then called Twitter, in 2022.
“Just like they do on X, community notes will require agreement between people with a range of perspectives to help prevent biased ratings,” Meta said in a blog post.
“This is cool,” Musk said in an X post referencing the Meta changes.
Joel Kaplan, a prominent Republican who announced last week that he would take over as the company’s head of global affairs from Sir Nick Clegg, told Fox News on Tuesday that third-party fact-checkers were “extremely biased.”
Referring to Trump’s return to the White House on January 20, Kaplan added: “We have a real opportunity now. We have a new administration and a new president coming in, who are great advocates of free speech and that makes a difference.”
As part of the changes announced on Tuesday, Meta also said it would move its US-based content moderation staff from California to Texas. “I think that will help us build trust to do this work in places where there is less concern about the bias of our teams,” Zuckerberg said.
Meta’s changes have been criticized by online safety campaigners. Ian Russell, whose 14-year-old daughter Molly took her own life after viewing harmful content on sites including Instagram, said he was “appalled” by the plans.

“These moves could have dire consequences for many children and young adults,” he said.
Zuckerberg first introduced a third-party fact-checking service as part of a set of measures in late 2016 designed to address criticism of rampant misinformation on Facebook.
At the time, he said the company needed “stronger detection” of misinformation and would work with the news industry to learn from the fact-checking systems used by journalists.
Meta said it now spends billions of dollars annually on its safety and security systems, and employs or contracts with tens of thousands of people around the world.
But on Tuesday, Zuckerberg blamed governments and “legacy media” for pushing his company to “impose more and more censorship.”
He said Meta would work with the Trump administration “to deter governments around the world that go after American companies and push for more oversight.”
He pointed to restrictive regulations in China and Latin America, as well as highlighting what he called the “increasing number” of European laws that “institutionalize censorship and make it difficult to build anything innovative there.”
Meta shares were down 2 percent Tuesday morning to $616.11.
2025-01-07 16:28:00