In an effort to curb anti-vaccination conspiracy theories and misinformation, Facebook announced Thursday that it will no longer recommend pages and groups that spread such content and will block advertisements that include false claims about vaccines. The company will also stop recommending anti-vaccination content on Instagram.
The tech giant rolled out its plan to combat anti-vaccine content after mounting public pressure culminated in a Capitol Hill hearing this week, when a Senate panel issued a dire warning about the public health danger that vaccine misinformation poses. There, 18-year-old Ethan Lindenberger testified that his mother, an anti-vaccine evangelist, relies on Facebook or Facebook-linked sites for all of her information on the subject. And she's certainly not alone.
In a blog post, Monika Bickert, Facebook's head of global policy management, said the company is "working to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic."
"Leading global health organisations . . . have publicly identified verifiable vaccine hoaxes," Bikert wrote. "If these vaccine hoaxes appear on Facebook, we will take action against them."
Facebook also said it would be "exploring" ways to surface "educational information" about vaccines when users do come across false content.
Renee DiResta, who researches computational propaganda at the analysis firm New Knowledge and has closely followed the spread of deceptive health information since she started a pro-vaccine Facebook page in 2015, endorsed the company's move on Twitter.
"I'm happy to see @Facebook's thoughtful application of remove/reduce/inform to health misinformation," she said. "It strikes a balance between expression and amplification."
The changes to Facebook's and Instagram's recommendation systems, along with the company's planned educational counteroffensive, may also ease the concerns of a growing number of researchers who have documented the rapid spread of misinformation online, especially on social media.
The World Health Organization recently named "vaccine hesitancy" one of its top 10 threats to global health in 2019, a warning punctuated by one of the worst measles outbreaks in decades, which has sickened at least 75 people across the Pacific Northwest, most of them unvaccinated children under 10 years old.
In the face of this burgeoning crisis, studies and news reports have indicated that Facebook's echo chambers have made the problem worse.
One group of scientists recently published a study finding that the majority of the most-viewed health stories on Facebook in 2018 were outright false or contained significant amounts of misleading information. Vaccination ranked among the three most popular story topics.
An investigation by the Guardian newspaper found that Facebook search results for information about vaccines were "dominated by anti-vaccination propaganda."
In a statement to The Washington Post last month, Facebook said that most anti-vaccination content didn't violate its policies against inciting "real-world harm." Simply removing such material, the company said, wouldn't be an effective way to counter fiction with fact.
"While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them," the company's statement read.
On Thursday, Bickert and Facebook appeared to reaffirm that stance: the new statement made no mention of removing groups or pages altogether, something Facebook has done in the past, notably with content relating to conspiracy theorist Alex Jones and his show Infowars.
A Facebook spokeswoman told The Post that the company is not removing the pages or the groups because it's trying to strike a balance between "reach and speech." Facebook is attempting to reduce the number of people who see the content, she said, without censoring it outright.
The effort means these groups and pages will have a harder time recruiting new members. However, the spokeswoman said, users who already belong to the groups or pages will be able to access them as usual.
In February, Rep. Adam Schiff, D-Calif., sent letters to the heads of Facebook and Google, which also has been under fire for YouTube's role in promoting misinformation, asking how they plan to protect their users from potentially dangerous hoaxes.
"As Americans rely on your services as their primary source of information, it is vital that you take responsibility with the seriousness it requires, and nowhere more so than in matters of public health and children's health," Schiff wrote to Facebook chief executive Mark Zuckerberg.
After Facebook's Thursday announcement, Schiff struck a cautious note on Twitter, writing, "The ultimate test will be if these measures reduce the spread of anti-vaccine content on their platforms, to the benefit of public health."
In a later statement, Schiff added that he's "pleased that all three companies are taking this issue seriously and acknowledged their responsibility to provide quality health information to their users."
Lindenberger, who famously vaccinated himself against his mother's wishes, spoke to the Senate Committee on Health, Education, Labor and Pensions on Tuesday and echoed Schiff's calls for reliable information, not the kind of content his mother was reading on social media.
During Lindenberger's testimony, one senator asked the Ohio teen if his mother got most of her information online.
"Yes," Lindenberger replied. "Mainly Facebook."
"And where do you get most of your information?" the lawmaker asked.
Laughing, Lindenberger said, "Not Facebook."
© The Washington Post 2019