Facebook Found to Repeatedly Trend Fake News Since Firing Human Editors

Highlights
  • Facebook promoted an iPhone story from the site fakingnews.firstpost.com
  • On August 26, Facebook laid off its editorial team
  • Mark Zuckerberg has insisted that Facebook is "not a media company"

The Megyn Kelly incident was supposed to be an anomaly. An unfortunate one-off. A bit of (very public, embarrassing) bad luck. But in the six weeks since Facebook revamped its Trending system - and a hoax about the Fox News Channel star subsequently trended - the site has repeatedly promoted "news" stories that are actually works of fiction.

As part of a larger audit of Facebook's Trending topics, The Washington Post logged every news story that trended across four accounts during the workdays from August 31 to September 22. During that time, we uncovered five trending stories that were indisputably fake and three that were profoundly inaccurate. On top of that, we found that news releases, blog posts from sites such as Medium and links to online stores such as iTunes regularly trended.

"I'm not at all surprised how many fake stories have trended," one former member of the team that used to oversee Trending told The Post. "It was beyond predictable by anyone who spent time with the actual functionality of the product, not just the code."

(Also see: This Is the News Facebook Chooses for You to Read)

Our results shouldn't be taken as conclusive: Since Facebook personalizes its trends to each user, and we tracked results only during work hours, there's no guarantee that we caught every hoax. But the observation that Facebook periodically trends fake news still stands - if anything, we've underestimated how often it occurs.

There was the thinly sourced story, on August 31, of a Clemson University administrator who kicked a praying man off campus. (The sordid tale, aggregated by a right-wing outlet, has been soundly debunked by the school.)

The next week, on September 8, Facebook promoted a breathless account of the iPhone's new and literally magical features, sourced from the site fakingnews.firstpost.com. The day after, Facebook trended a news release from the "Association of American Physicians and Surgeons" - a discredited libertarian medical organization - as well as a tabloid story claiming that the September 11 attacks were a "controlled demolition."

But if users thought the outrage about Facebook's 9/11 truthering would prompt some reform in Trending, they were mistaken: Less than a week later, Facebook boosted a story about the Buffalo Bills from the well-established satirical site SportsPickle.

"I'd like to say I expect more from Facebook in advocating truth and informing the citizenry," said DJ Gallo, the founder and editor of SportsPickle. "But I think we've seen with this election that much of what is posted on Facebook - and all social media - is not accurate."

Facebook's Trending feature is supposed to serve as a snapshot of the day's most important and most-discussed news, made possible by a partnership between its algorithms and a team of editors. One algorithm surfaces unusually popular topics, a human examines and vets them, and another algorithm surfaces the approved stories for people who will be most interested.
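For readers who want a concrete picture, here is a minimal sketch of that three-stage division of labour - surface, vet, personalize - in Python. Every function and data structure here is a hypothetical stand-in for illustration, not Facebook's actual code.

```python
from collections import Counter
from typing import Iterable

# Hypothetical stand-ins for the three stages described above; this is an
# illustration of the division of labour, not Facebook's actual system.

def surface_popular_topics(posts: Iterable[str], min_mentions: int = 3) -> list[str]:
    """Stage 1: an algorithm flags topics with an unusual burst of posts."""
    counts = Counter(posts)
    return [topic for topic, n in counts.items() if n >= min_mentions]

def human_review(topic: str, approved_by_editors: set[str]) -> bool:
    """Stage 2: a human examines and vets the candidate topic."""
    return topic in approved_by_editors

def rank_for_user(topics: list[str], user_interests: set[str]) -> list[str]:
    """Stage 3: another algorithm surfaces approved topics for interested users."""
    return sorted(topics, key=lambda t: t not in user_interests)

posts = ["iPhone 7"] * 3 + ["Buffalo Bills"] * 2
candidates = surface_popular_topics(posts)                       # ['iPhone 7']
vetted = [t for t in candidates if human_review(t, {"iPhone 7"})]
print(rank_for_user(vetted, {"iPhone 7"}))                       # ['iPhone 7']
```

Remove the middle step and whatever the first algorithm surfaces flows straight through to users - which is roughly the situation the rest of this piece describes.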

Without any piece of that process, Trending doesn't really work - an observation readily illustrated by a Facebook product called Signal, which shows popular topics before and after they're approved. The after list is overlong, and it's difficult to see how any of the topics could be relevant; the before list is an indecipherable sea of place names, sports teams and conspiracies.

Last May, however, Facebook faced a torrent of high-profile accusations about political bias on the Trending editorial team - so much so that, in the aftermath, the company decided to tweak the role humans play in approving Trending topics. On August 26, Facebook laid off its editorial team and gave the engineers who replaced them a much different mandate when it came to vetting news. Where editors were told to independently verify trending topics surfaced by the algorithm, even "cross-referenc(ing) Google News and other news sources," engineers were told to accept every trending topic that was linked to at least three recent articles from any source, or to a single recent article with at least five related posts.
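To see how permissive that threshold is, here is a minimal sketch of the reported acceptance rule. The function name and its inputs are hypothetical; the point is simply that nothing in the check asks whether the linked articles are reliable.

```python
# Sketch of the reported acceptance rule: a topic passes if it is linked to
# at least three recent articles from any source, or to one recent article
# with at least five related posts. Names and inputs are hypothetical.

def meets_trending_bar(recent_articles: int, related_posts: int) -> bool:
    return recent_articles >= 3 or (recent_articles >= 1 and related_posts >= 5)

# A hoax aggregated by a handful of sites clears the bar easily:
print(meets_trending_bar(recent_articles=3, related_posts=0))  # True
print(meets_trending_bar(recent_articles=1, related_posts=5))  # True
print(meets_trending_bar(recent_articles=1, related_posts=2))  # False
```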

The previous editorial team could also influence which specific news stories were displayed with each topic, rejecting the story selected by the algorithm if it was "biased," "clickbait" or irrelevant. Trending's current quality review team does not vet URLs.

It's a bar so low, it's almost guaranteed to allow rumors about Megyn Kelly - if not, you know, the announcement of the Third World War. Facebook admitted as much in a statement during the Kelly aftermath, when it said the story "met the conditions for acceptance at the time because there was a sufficient number of relevant articles."

"On re-review," the statement said, "the topic was deemed as inaccurate."

Although these review guidelines appear largely to blame, Facebook hasn't indicated any plans to change them. Rather, the social network maintains that its fake news problem can be solved by better and more robust algorithms.

At a recent conference, Adam Mosseri - Facebook's vice president of product management - indicated that efforts were underway to add automated hoax- and parody-filtering technologies to the Trending algorithm, like those that exist in News Feed. (News Feed makes guesses about content based on user behavior around it.) Another solution might be something like the system Google News uses to rank top stories, which gives approved publishers the means to flag notable content.
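Facebook has not published how the News Feed classifier works beyond that description, but the general idea of scoring a link by the behavior around it can be sketched roughly as follows. The signals and weights below are invented purely for illustration.

```python
# Toy illustration of a behavior-based hoax signal, in the spirit of the
# News Feed approach described above. The signals and weights are invented;
# Facebook's actual model is not public.

def hoax_score(shares: int, deletes_after_share: int, hoax_comments: int) -> float:
    """Return a 0..1 score; higher means behavior around the link looks hoax-like."""
    if shares == 0:
        return 0.0
    delete_rate = deletes_after_share / shares    # people retracting a share
    flag_rate = min(hoax_comments / shares, 1.0)  # comments calling it fake
    return min(1.0, 0.6 * delete_rate + 0.4 * flag_rate)

print(hoax_score(shares=1000, deletes_after_share=300, hoax_comments=450))  # ~0.36
```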

It's worth noting, of course, that even Google News has been fooled before - all social platforms, not just Facebook, struggle with the complex and overwhelming task of identifying hoaxes and other sorts of misinformation. Still, Facebook is a special case: About 40 percent of all American adults turn to it for news, which - despite chief executive Mark Zuckerberg's insistence that Facebook is "not a media company" - makes its handling of things like Trending really important.

Walter Quattrociocchi, an Italian computer scientist who studies the spread of misinformation online, points out that Facebook is a ripe environment for hoaxes and conspiracies: Its users tend to cluster into like-minded bubbles, and they receive highly personalized news in News Feed and through services such as Trending. When Facebook injects fake news into those highly personalized news diets, Quattrociocchi said, it risks further polarizing and alienating its more conspiracy-minded users.

"This is becoming a Pandora's box," he said.

And Facebook hasn't figured out just how to close it.

© 2016 The Washington Post
