META | Published January 8, 2025
Meta’s platforms are built to be places where people can express themselves freely. That can be messy. On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression.
In his 2019 speech at Georgetown University, Mark Zuckerberg argued that free expression has been the driving force behind progress in American society and around the world and that inhibiting speech, however well-intentioned the reasons for doing so, often reinforces existing institutions and power structures instead of empowering people. He said: “Some people believe giving more people a voice is driving division rather than bringing us together. More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice. I think that’s dangerous.”
In recent years we’ve developed increasingly complex systems to manage content across our platforms, partly in response to societal and political pressure to moderate content. This approach has gone too far. As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable. Too much harmless content gets censored, too many people find themselves wrongly locked up in “Facebook jail,” and we are often too slow to respond when that happens.
We want to fix that and return to that fundamental commitment to free expression. Today, we’re making some changes to stay true to that ideal.
Ending Third Party Fact Checking Program, Moving to Community Notes
When we launched our independent fact checking program in 2016, we were very clear that we didn’t want to be the arbiters of truth. We made what we thought was the best and most reasonable choice at the time, which was to hand that responsibility over to independent fact checking organizations. The intention of the program was to have these independent experts give people more information about the things they see online, particularly viral hoaxes, so they were able to judge for themselves what they saw and read.
That’s not the way things played out, especially in the United States. Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact check and how. Over time we ended up with too much content being fact checked that people would understand to be legitimate political speech and debate. Our system then attached real consequences in the form of intrusive labels and reduced distribution. A program intended to inform too often became a tool to censor.
We are now changing this approach. We will end the current third party fact checking program in the United States and instead begin moving to a Community Notes program. We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see. We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing – and one that’s less prone to bias.
SOURCE: www.about.fb.com
RELATED: Meta abruptly ends US fact-checks ahead of Trump term
This photo illustration created on January 7, 2025, in Washington, DC, shows an image of Mark Zuckerberg, CEO of Meta, and an image of the Meta logo. Drew Angerer, AFP
ABS-CBN NEWS | Published January 8, 2025
Social media giant Meta on Tuesday slashed its content moderation policies, including ending its US fact-checking program, in a major shift that conforms with the priorities of incoming president Donald Trump.
“We’re going to get rid of fact-checkers (that) have just been too politically biased and have destroyed more trust than they’ve created, especially in the US,” Meta founder and CEO Mark Zuckerberg said in a post.
Instead, Meta platforms, including Facebook and Instagram, “would use community notes similar to X (formerly Twitter), starting in the US,” he added.
Meta’s surprise announcement echoed long-standing complaints made by Trump’s Republican Party and X owner Elon Musk about fact-checking that many conservatives see as censorship.
They argue that fact-checking programs disproportionately target right-wing voices, a complaint that has fueled proposed laws in states like Florida and Texas to limit content moderation.
Zuckerberg said that “recent elections feel like a cultural tipping point towards, once again, prioritizing speech” over moderation.
The shift came as the 40-year-old tycoon has been making efforts to reconcile with Trump since the November election, including donating one million dollars to Trump’s inauguration fund.
Trump has been a harsh critic of Meta and Zuckerberg for years, accusing the company of bias against him.
The Republican was kicked off Facebook following the January 6, 2021, attack on the US Capitol, though the company restored his account in early 2023.
Zuckerberg dined with Trump at his Mar-a-Lago resort in November in a sign of strengthening ties.
SOURCE: www.abs-cbn.com