Facebook’s Leaked Content Moderation Documents Reveal Serious Problems

A significant part of Facebook’s moderation is done from places like Morocco and the Philippines

Highlights
  • Facebook’s rules are reportedly filled with gaps, biases, and errors
  • Moderators get 8 to 10 seconds to review Facebook posts
  • Moderation rules are set by Facebook employees over breakfast meetings

Facebook's thousands of content moderators worldwide rely on a disorganised collection of PowerPoint presentations and Excel spreadsheets to decide what content to allow on the social network, a report has revealed. These guidelines, used to police billions of posts every day, are apparently riddled with gaps, biases, and outright errors. The unnamed Facebook employee who leaked the documents reportedly feared that the social network was exercising too much power with too little oversight, and making too many mistakes.

The New York Times reports that an examination of 1,400 of Facebook's documents revealed serious problems not just with the guidelines, but also with how the actual moderation is carried out. Facebook confirmed the authenticity of the documents, but added that some of them have since been updated.

Here are the key takeaways from the story.

Who sets the rules?
According to the NYT report, although Facebook consults outside groups when drawing up its moderation guidelines, the rules are mainly set by a group of its own employees over breakfast meetings every other Tuesday. This group consists largely of young engineers and lawyers who have little to no experience of the regions they are writing guidelines for. The rules also appear to be written for English-speaking moderators, who reportedly use Google Translate to read non-English content. Machine translation often strips out context and nuance, pointing to a clear shortage of local moderators, who would be better placed to understand their own language and local context.

Biases, gaps, and errors
The moderation documents accessed by the publication are often outdated, lack critical nuance, and are sometimes plainly inaccurate. For example, Facebook moderators in India were apparently told to remove any comments critical of a religion by flagging them as illegal, even though such comments are not actually illegal under Indian law. In another case, a paperwork error allowed a known extremist group from Myanmar to remain on Facebook for months.

Moderators often find themselves frustrated by the rules, saying they sometimes make no sense and even force them to leave up posts that may end up leading to violence.

“You feel like you killed someone by not acting,” one unnamed moderator told NYT.

“We have billions of posts every day, we're identifying more and more potential violations using our technical systems,” Monika Bickert, Facebook's head of global policy management, said. “At that scale, even if you're 99 percent accurate, you're going to have a lot of mistakes.”
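
To put that error rate in perspective, here is a minimal back-of-the-envelope sketch in Python; the daily post volume used below is an assumed figure for illustration only, not a number from the report:

```python
# Back-of-the-envelope sketch of the scale problem Bickert describes.
# The post volume is an assumed illustrative figure, not from the report.
posts_per_day = 2_000_000_000   # hypothetical "billions of posts every day"
accuracy = 0.99                 # 99 percent of moderation decisions correct

mistakes_per_day = posts_per_day * (1 - accuracy)
print(f"Estimated mistaken decisions per day: {mistakes_per_day:,.0f}")
# Under these assumptions, a 1 percent error rate still means
# roughly 20 million mistaken decisions every day.
```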

The moderators who actually review the content said they have no mechanism to alert Facebook to holes in the rules, flaws in the process, or other threats.

Seconds to decide
While the real-world implications of hateful content on Facebook may be massive, moderators spend barely seconds deciding whether a particular post can stay up or must be taken down. The company is said to employ over 7,500 moderators globally, many of whom are hired through third-party agencies. These moderators are largely unskilled workers in dull offices in places like Morocco and the Philippines, in sharp contrast to the social network's fancy offices.

As per the NYT piece, content moderators face pressure to review about a thousand posts per day, leaving them only 8 to 10 seconds per post; video reviews may take longer. For many, pay is tied to meeting these quotas. Under such pressure, moderators feel overwhelmed, and many burn out within months.
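
As a rough sketch of what that quota implies (the per-post figures are those reported; the arithmetic is purely illustrative):

```python
# Rough arithmetic behind the reported quota: ~1,000 posts at 8-10 seconds each.
posts_per_day = 1000
for seconds_per_post in (8, 10):
    hours = posts_per_day * seconds_per_post / 3600
    print(f"{seconds_per_post} s/post -> about {hours:.1f} hours of non-stop reviewing")
# At 8-10 seconds per post, 1,000 posts amount to roughly 2.2-2.8 hours
# of uninterrupted decision-making each day.
```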

Political matters
Facebook's secret rules are extremely extensive and make the company a far more powerful judge of global speech than is commonly understood. No other platform in the world has so much reach or is so deeply entangled with people's lives, including important political matters.

The NYT report notes that Facebook is becoming more decisive in barring groups, people, or posts that it believes may lead to violence, but in countries where extremism and the mainstream are growing dangerously close, the social network's decisions end up regulating what many see as political speech.

Facebook reportedly asked moderators in June to allow posts praising the Taliban if they included details of its ceasefire with the Afghan government. Similarly, the company directed moderators to actively remove posts wrongly accusing an Israeli soldier of killing a Palestinian medic.

Around Pakistan's elections, the company asked moderators to apply extra scrutiny to Jamiat Ulema-e-Islam while treating Jamaat-e-Islami as benign, even though both are religious parties.

These examples show the power Facebook holds in shaping conversations, and because it all happens in the background, users are not even aware of these moves.

Little oversight and growth concerns
With moderation largely taking place in third-party offices, Facebook has little visibility into actual day-to-day operations, which can sometimes lead to corner-cutting and other issues.

One moderator divulged an office-wide rule to approve any post if no one on hand could read the language it was written in. Facebook claims this is against its rules and blamed the outside companies. The company also says moderators are given enough time to review content and do not have quotas, but it has no real way of enforcing these practices. Since the third-party companies are left to police themselves, Facebook has at times struggled to control them.

Another major problem Facebook faces in controlling hateful and inflammatory speech on its platform is the company itself. Its own algorithms highlight the most provocative content, which can overlap with exactly the kind of content it is trying to avoid promoting. The company's growth ambitions also push it to avoid unpopular decisions or steps that might drag it into legal disputes.
