We ran a little experiment a few weeks ago: During the work day, we'd check in with Facebook each hour, on the hour, and record which topics were trending for us on the platform. We're exploring some of what we learned in a series of pieces in the coming weeks. This is the first in the series.
Facebook's trending topics, taken without context, are a word salad. The company used to solve that problem by employing human editors, who described each trending topic in bland but helpful language.
For instance: When Japanese Prime Minister Shinzo Abe trended on Facebook after emerging from a green pipe, dressed as Mario, in the middle of the Rio Olympics closing ceremony, Facebook might have noted that he'd appeared in a "surprise performance" there.
Not perfect, but adequate. And then, a month ago, everything changed.
Now, the job of disentangling the context for Facebook's trends is left to algorithms, which automatically choose a "top article" for each topic; that article pops up when users hover over the trending topic for more information. The process is guided by a team of human engineers who try, with mixed results, to stop things from going horribly wrong - but the new-ish system has made some pretty egregious errors.
Facebook has long characterised itself as a neutral platform that simply connects its users to the rest of the world. But over the past several months, the company has faced greatly increased scrutiny, including accusations of bias, over the news it shows its users and where that news comes from.
Handing even more of the process over to the algorithm was Facebook's response to all that criticism, a move the company appeared to hope would quell accusations of human bias in its news recommendations. Instead, the early high-profile mistakes of the new trending regime only seemed to highlight how much work the company still has to do.
In the first weeks of the new Trending bar, Facebook trended conspiracy theories, old news, fake news - including one story from a site that had "Fakingnews" in its domain name - and was generally slow to pick up on major developing news stories (with the very notable exception of its swift pickup of the Brad Pitt and Angelina Jolie divorce).
Some of its bigger mistakes even became news themselves: Three days after the new system went into effect, it promoted a false trending story about Megyn Kelly claiming Fox News had "Kick(ed) her out for backing Hillary." Later, it explained why the Sept. 11, 2001, anniversary trended by promoting a "truther" article that falsely claimed to contain evidence that the World Trade Center buildings fell because of "controlled explosions."
But day to day, the site is matching dozens of stories with trending topics, often without major incident. So, when it comes to these more low-key pairings, we started to wonder whether there were sites that we were seeing more often than others - and if so, what they were.
While Mark Zuckerberg says the company he founded is not a "media company," others have said that Facebook is indeed one, whether Zuckerberg likes it or not. Sixty-six percent of Facebook users - amounting to 44 percent of the general American population - get news from Facebook, according to a recent Pew study.
Trending topics are personalised for each user, so the trends we logged for our experiment wouldn't be identical to the ones you'd see if you had done the same over the same period of time. That's fine for our purposes. We were much more interested in what Facebook actually showed us than in the overall distribution of all available trending topics at any given time.
But to get a wider selection of stories into our logs, we also included the non-personalised trending list from Facebook's Mentions app, along with the results from two dummy accounts, male and female. The male account was also accessed through a VPN to set the location outside of the Beltway.
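(For the technically inclined: the logging routine amounts to something like the minimal Python sketch below. The fetch_trending() helper is a hypothetical stand-in - Facebook exposes no public API for the trending module, so in practice the snapshots were recorded by checking in by hand.)

    # A minimal sketch of the hourly logging described above.
    # fetch_trending() is a hypothetical placeholder, not a real
    # Facebook API; it stands in for checking the trending module
    # by hand and writing down what appears.
    import csv
    import time
    from datetime import datetime

    ACCOUNTS = ["personal", "mentions_app", "dummy_male_vpn", "dummy_female"]

    def fetch_trending(account):
        """Hypothetical: return a list of (topic, top_article) pairs."""
        raise NotImplementedError("stand-in for manual observation")

    def log_once(path="trending_log.csv"):
        """Append one timestamped row per topic, per account, to a CSV log."""
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for account in ACCOUNTS:
                for topic, top_article in fetch_trending(account):
                    writer.writerow([datetime.now().isoformat(), account,
                                     topic, top_article])

    if __name__ == "__main__":
        while True:  # one snapshot each hour during the work day
            log_once()
            time.sleep(3600)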
We noticed that trending topics sometimes got new "top" articles over time. But we didn't find definite instances where Facebook showed different articles to different users at the same time for the same trending topic, suggesting that the trending story isn't personalised to the user at this point.
We also separated out sites that were entirely or mostly dedicated to one area of interest, such as gaming, tech, sports and entertainment, and compared those sites against one another within each category.
There were a few surprises. We noticed - in the general-interest category - that Buzzfeed had fewer than five stories trend for us over the course of our entire experiment.
So we asked NewsWhip, a social analytics firm that tracks the performance of publishers and content on social media platforms, to send us data on overall engagements for the sites on our lists over the same period.
There's a decent amount of divergence between the two rankings. Facebook showed us 31 articles each for Yahoo and USA Today through its trending topics - the top counts for all of the publications we saw - yet both lagged behind in overall engagements for the same period. Only CNN and the Huffington Post made it into the top six on both lists.
And some sites, such as the New York Times, fared much better. The Times barely squeaked into our own tracking with five articles, but did much better comparatively on overall engagements. Meanwhile, Buzzfeed would have been second in overall engagements among these outlets had it had enough stories trend to make it into our initial set.
What can we conclude from this? Not a ton, although it's interesting, especially given that Facebook has struggled for months to figure out how to define which sources should speak for each of its trending topics. Not all popular stories are authoritative, and different sites are authoritative about different things.
Facebook's old editorial team used to serve as a bit of a corrective on some of this. They could elevate "important" topics that weren't naturally trending when five of 10 mainstream outlets on an agreed-upon list were reporting the story. But as Gizmodo's reporting revealed last spring, that process was hardly perfect.
When, through that reporting, Facebook was accused of introducing anti-conservative bias into the system with these interventions, the company placed more limitations on the editorial team's scope before eliminating the team altogether at the end of the summer.
Weeks later, Facebook just might be realising that it put too much pressure on the algorithm, too soon. Right at the end of our experiment, Facebook made one subtle but important change to how it displays the news it selects for people to read: Now, instead of simply displaying a headline and blurb from the algorithmically chosen article meant to explain each trend, Facebook clearly labels each blurb as a sample from a "Popular article."
© 2016 The Washington Post