Meta’s content moderation policies have come under scrutiny, with a senior executive admitting to excessive removal of user content across its platforms. Nick Clegg, former British deputy prime minister and Meta’s president of global affairs, recently acknowledged the company’s high error rates in content moderation. He emphasized the need for greater precision and accuracy in applying its rules, stating that the current system too often hampers free expression.
During a recent press briefing, Clegg expressed regret over the company’s aggressive removal of posts related to the COVID-19 pandemic. “We know that when enforcing our policies, our error rates are still too high, which gets in the way of the free expression that we set out to enable,” the former leader of Britain’s Liberal Democrats said. “Too often, harmless content gets taken down or restricted, and too many people get penalized unfairly.”
He explained that the company’s pandemic-era decisions were driven by uncertainty and, in hindsight, proved overly strict. Clegg noted that users have voiced concerns about over-enforcement leading to the removal or restriction of innocuous content.
In recent months, Threads—another Meta-owned social media platform—has been notably affected by erroneous takedowns. For instance, Meta’s systems mistakenly suppressed a photo of President-elect Donald Trump, prompting a public apology. The company’s Oversight Board has also raised alarms about the risk of excessive removal of political speech, especially ahead of the U.S. presidential election.
Despite these issues, Meta, the parent company of Facebook led by billionaire Mark Zuckerberg, has not implemented significant changes to its content rules since the election.
Clegg indicated that updates might be forthcoming, describing the rules as a “living, breathing document.” When asked about Zuckerberg’s recent meeting with Trump and Meta’s stance on government pressure to moderate content, Clegg declined to offer specifics.