
Facebook will consider removing or demoting anti-vaccination recommendations amid backlash

February 15, 2019 at 11:48 a.m. EST
Robert Kennedy Jr., right, speaks at a rally held in opposition to a proposed bill that would remove parents' ability to claim a philosophical exemption to opt their school-age children out of the combined measles, mumps and rubella vaccine, Friday, Feb. 8, 2019, at the Capitol in Olympia, Wash. Amid a measles outbreak that has sickened people in Washington state and Oregon, lawmakers earlier Friday heard public testimony on the bill. (AP Photo/Ted S. Warren)

As public pressure intensifies over how Facebook promotes misinformation about vaccines, the social media giant is considering removing anti-vaccination content from its recommendation systems.

Facebook has become something of a haven for a small but vocal community of parents who reject facts about immunizations, often citing junk science or conspiracy theories, and opt out of having their children vaccinated.

This week, Facebook came under fire for promoting anti-vaccination material, especially ads targeting women in regions with high numbers of measles cases, according to reporting from the Daily Beast. The outcry intensified after Rep. Adam B. Schiff (D-Calif.) wrote a letter to founder and chief executive Mark Zuckerberg asking how Facebook planned to protect users from misleading material about vaccinations. Schiff sent a similar letter to Sundar Pichai, chief executive of Google, which is also under scrutiny over how its search engine and subsidiary YouTube promote potentially dangerous misinformation.

Despite the evidence, the anti-vaccination movement is gaining strength. (Video: Luis Velarde/The Washington Post)

“We’ve taken steps to reduce the distribution of health-related misinformation on Facebook, but we know we have more to do,” Facebook said Friday in a statement emailed to The Washington Post, after Bloomberg News first reported the company might remove misleading or harmful content in the wake of Schiff’s letter. “We’re currently working on additional changes that we’ll be announcing soon.”

Specifically, Facebook is looking into cutting back or removing this content from recommendations, including “Groups you should join,” according to a Facebook spokesperson. It’s also considering demoting it in search results.

These tensions come as the United States faces a troubling resurgence of measles, a disease the Centers for Disease Control and Prevention declared eliminated in 2000 thanks to extensive use of the measles, mumps and rubella vaccine. The first measles vaccine was licensed in the U.S. in 1963, and it was later combined with vaccines for mumps and rubella.

But more than 100 cases of measles have been confirmed in 10 states this year, according to the CDC, surpassing the total number of cases confirmed in 2016. Last month, Washington Gov. Jay Inslee (D) declared a state of emergency after 25 cases of measles were reported in a single county, where nearly a quarter of kids attend school without having had measles, mumps and rubella immunizations.

Facebook has contended that most anti-vaccination content doesn’t violate its community guidelines against inciting “real-world harm.” The company told The Washington Post this week that it didn’t believe removing such material would help raise awareness of the facts about vaccinations, saying it thinks accurate counter-speech is a more productive safeguard against misinformation.

“While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them,” Facebook said in a statement emailed to The Post on Wednesday.

The platform has a spotty record when it comes to the quality of popular health content seen by its users. A recent study from the Credibility Coalition and Health Feedback, a group of scientists who evaluate the accuracy of health media coverage, found that a majority of the most-clicked health stories on Facebook in 2018 were fake or contained a significant amount of misleading information. The researchers gathered the 100 health stories with the most engagements on social media and had a network of experts assess their credibility; fewer than half were rated “highly credible.” Vaccinations ranked among the three most popular story topics.

Health-related content is eligible for review by Facebook’s fact-checking partners, meaning content found to be misleading or false is demoted in users’ feeds and appears alongside related articles from fact-checkers. But that system doesn’t extend to the social network’s groups, where the bulk of anti-vaccination material is spread.

The World Health Organization recently named “vaccine hesitancy” one of the top global health threats of 2019. But a recent investigation by the Guardian found that Facebook search results for vaccines were “dominated by anti-vaccination propaganda.” Facebook did not respond to the Guardian’s questions about its plans for dealing with the issue. A separate Guardian investigation found Facebook had accepted advertising revenue from groups including Vax Truther, Anti-Vaxxer, Vaccines Revealed and Michigan for Vaccine Choice.

In his letter to Zuckerberg, Schiff asked how the company plans to address the fact that frightened and confused parents may be making decisions based on misinformation about vaccinations on its platform, making the population at large more vulnerable to a deadly disease.

“I acknowledge that it may not always be a simple matter to determine when information is medically accurate, nor do we ask that your platform engage in the practice of medicine, but if a concerned parent consistently sees information in their Newsfeed that casts doubt on the safety or efficacy of vaccines, it could cause them to disregard the advice of their children’s physicians and public health experts and decline to follow the recommended vaccination schedule,” Schiff wrote.

Google’s YouTube has already begun changing its algorithms to try to curb the spread of misinformation. Last month, YouTube said it would stop recommending videos with “borderline content” that could “misinform users in harmful ways.”

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to our users,” the company wrote in a blog post.

Correction: An earlier version of this story had the wrong date for when the measles, mumps and rubella vaccine was introduced. This story has been corrected with the date the measles vaccine was licensed in the U.S.