Facebook Study Finds It's Not Too Polarizing

For years, political scientists and other social theorists have fretted about the Internet's potential to flatten and polarize democratic discourse.

Because so much information now comes through digital engines shaped by our own preferences - Facebook, Google and others suggest content based on what consumers previously enjoyed - scholars have theorized that people are building an online echo chamber of their own views.

But in a peer-reviewed study published Thursday in the journal Science, data scientists at Facebook report that the echo chamber is not as insular as many might fear - at least not on the social network. While independent researchers said the study was important for its scope and size, they noted several significant limitations.

After analyzing how 10.1 million of the most partisan American users of the social network navigated the site during a six-month period last year, researchers found that people's networks of friends and the stories they see are in fact skewed toward their ideological preferences. But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.

On average, about 23 percent of users' friends are of an opposing political affiliation, according to the study. An average of almost 29 percent of the news stories displayed by Facebook's News Feed also appear to present views that conflict with the user's own ideology.

In addition, researchers found individuals' choices about which stories to click on had a larger effect than Facebook's filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.

"This is the first time we've been able to quantify these effects," Eytan Bakshy, a data scientist at Facebook who led the study, said in an interview.

He said that he began the work in 2012 out of an interest in the way social networks shape how the public gets news. "You would think that if there was an echo chamber, you would not be exposed to any conflicting information," he added, "but that's not the case here."

Facebook's findings run counter to a long-standing worry about the potential for digital filtering systems to shape our world. For Facebook, the focus is on the algorithm that the company uses to determine which posts people see, and which they do not, in its News Feed.

Eli Pariser, chief executive of the viral content website Upworthy, labeled this effect the "Filter Bubble." Some Facebook users have said they unfollow those who post content with which they disagree. And with political discussion growing increasingly pitched in the run-up to next year's presidential election, in which the Internet will be used as a primary campaign tool, the problem appeared to be getting worse.

"This shows that the effects that I wrote about exist and are significant, but they're smaller than I would have guessed," Pariser said in an interview about Facebook's study.

Natalie Jomini Stroud, a professor of communications studies at the University of Texas at Austin, who was not involved in the study, said the results were "an important corrective" to the conventional wisdom.

The study adds to others that debate whether the Internet creates an echo chamber. A Pew Research Center report last year found that the media outlets people name as their main sources of political news are strongly correlated with their political views. Another study late last year, published as a National Bureau of Economic Research working paper, analyzed Twitter usage during the 2012 election and found that social media often exposed users only to opinions that matched their own.

Stroud and several other researchers noted that the Facebook study has limitations. It arrived at 10.1 million users by screening only for Americans older than 18 who logged on to Facebook at least four out of seven days a week and who interacted with at least one news link during the second half of 2014. Importantly, all self-identified as liberal or conservative in their profiles.

Most of Facebook's users do not post their political views, and Stroud cautioned that those users might be either more or less accepting of conflicting political views.

Criticism of the study was swift. In an article responding to the study, Zeynep Tufekci, a professor at the University of North Carolina, Chapel Hill, said, "People who self-identify their politics are almost certainly going to behave quite differently, on average, than people who do not." She added, "The study is still interesting, and important, but it is not a study that can generalize to Facebook users."

Facebook's researchers said studying those who self-report their politics was the most technically feasible way to determine users' political affiliations, and that trying to infer those affiliations by other means would have introduced greater uncertainty.

The findings are convenient for Facebook. With more than 1.3 billion users, the social network is effectively the world's most widely read daily newspaper. About 30 percent of U.S. adults get their news from the site, according to the Pew Research Center.

But those editorial decisions, made by its News Feed algorithm, are carried out with little transparency. Facebook could use the study's results to argue that its secret algorithm is not ruining national discourse.

Facebook said its researchers were given wide latitude to pursue their research interests and to publish whatever they found. The results were peer-reviewed before publication in Science, with the journal selecting an anonymous panel of scholars unaffiliated with Facebook. Science does not disclose its reviewers' identities and asks reviewers to declare any financial ties that might be perceived as a conflict of interest with the study under review.

Facebook also noted that this study was substantively different from one that caused an outcry last year, in which the company's scientists altered the number of positive and negative posts that some people saw to examine the effects on their mood. This study did not involve an experiment that changed users' experience of Facebook; researchers analyzed how people use Facebook as it stands today.

For Facebook's study, researchers first determined the point of view of a given article by looking at whether liberals or conservatives had shared it most. They found unsurprising partisan patterns for well-known news sources: Fox News stories were shared mainly by conservatives, while Huffington Post articles were shared mainly by liberals.

Then they measured how often users' feeds, stripped of identifying details, displayed stories that conflicted with their professed ideologies, and how often those users clicked on such stories.
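The article does not reproduce the researchers' actual formulas, but a minimal, hypothetical sketch of the two steps described above might look like the following Python. The function names, scoring scale, and data structures are illustrative assumptions, not Facebook's code: an article is scored by the partisan split of the users who shared it, and a user's cross-cutting exposure is the fraction of feed stories whose score conflicts with that user's self-reported ideology.

```python
# A minimal sketch, not Facebook's actual pipeline: it illustrates the two steps
# described in the article using made-up data structures. The alignment score is
# the conservative share of an article's sharers minus the liberal share, so
# scores near +1 read as conservative-leaning and scores near -1 as liberal-leaning.

from collections import Counter

def article_alignment(sharer_affiliations):
    """Score an article from the self-reported politics of the users who shared it."""
    counts = Counter(sharer_affiliations)          # e.g. {"conservative": 80, "liberal": 20}
    partisan_total = counts["conservative"] + counts["liberal"]
    if partisan_total == 0:
        return 0.0                                 # no partisan sharers: treat as neutral
    return (counts["conservative"] - counts["liberal"]) / partisan_total

def cross_cutting_share(user_ideology, feed_article_scores, threshold=0.0):
    """Fraction of feed stories whose alignment conflicts with the user's ideology."""
    if not feed_article_scores:
        return 0.0
    if user_ideology == "liberal":
        conflicting = [s for s in feed_article_scores if s > threshold]
    else:  # conservative
        conflicting = [s for s in feed_article_scores if s < -threshold]
    return len(conflicting) / len(feed_article_scores)

# Example: a liberal user whose feed contains three liberal-leaning stories and
# one conservative-leaning story sees a cross-cutting share of 0.25.
scores = [-0.8, -0.5, -0.6, 0.7]
print(cross_cutting_share("liberal", scores))      # 0.25
```

In the actual study the researchers worked with continuous alignment scores averaged over millions of shares, so this toy version only conveys the general shape of the calculation.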

Some academics said Facebook was always tweaking the News Feed and could easily make changes that would create a more sealed echo chamber.

"A small effect today might become a large effect tomorrow," David Lazer, a political scientist at Northeastern University who studies social networks, wrote in a commentary on the Facebook study also published in Science. "The deliberative sky is not yet falling, but the skies are not completely clear either."

The study - which was also written by Solomon Messing and Lada Adamic, data scientists at Facebook - presented other findings on how people of different political persuasions use the world's largest social network.

One is that liberals live in a more tightly sealed echo chamber than conservatives, but that conservatives are more selective about what they click on when they see ideologically challenging views. About 22 percent of the news stories that Facebook presents to liberals have a conservative bent, while 33 percent of the stories shown to conservatives present a liberal point of view. The difference, researchers said, is that liberal users are connected to fewer friends who share views from the other side.

But liberals were only 6 percent less likely to click on ideologically challenging articles than on ideologically consistent ones that appeared in their feeds. Conservatives were 17 percent less likely to click, meaning they appeared more reluctant to engage with opposing views.

The study also raised - but did not answer - the question of what happens after people click on an article that presents an opposing view: Are they being informed and persuaded by its arguments, or are they dismissing it out of hand?

"People who are really into politics expose themselves to everything," said Diana C. Mutz, a political scientist at the University of Pennsylvania. "So they will expose to the other side, but it could be to make fun of it, or to know what they're saying to better argue against it, or just to yell at the television set."

A click, in other words, is not necessarily an endorsement, or even a sign of an open mind.

© 2015 New York Times News Service
