Facebook announced a series of new policies aimed at combating the spread of anti-vaccine content on its platform. The new policies, which will make it harder to find false information about vaccines, come after mounting public pressure on the company to take a more active role in fighting viral misinformation that critics say has contributed to outbreaks of preventable diseases.
In some ways, the efforts are similar to Facebook’s fight against fake news. The company isn’t banning anti-vaccine content outright, but it is making that content more difficult to find and reducing the chance it will go viral. At the same time, Facebook is looking for ways to surface accurate information to people who search for such content or are invited to groups that promote conspiracy theories related to vaccines.
With the new policy changes, Facebook will demote pages and groups that share false information about vaccines. The company will also stop surfacing these pages as search recommendations on its app and website. Similarly, anti-vaccination content will no longer appear in Instagram’s Explore page, or in the app’s hashtag pages.
The company said it will rely on information from organizations like the Centers for Disease Control and Prevention and the World Health Organization, which have “publicly identified verifiable vaccine hoaxes.”
Facebook says it will also begin rejecting ads with anti-vaccination content and has disabled advertisers’ ability to target based on categories like “vaccine controversies.” (A recent report in The Guardian found the social network had accepted thousands of dollars in advertising from anti-vaccination groups.)
Finally, the company is “exploring ways to share educational information about vaccines when people come across misinformation on this topic.” It’s not clear exactly what this will look like, but Facebook has attempted similar tactics in the past. The company previously introduced “disputed” tags, which would appear when people shared stories that had been debunked by Facebook’s fact checkers. Notably, the social network abandoned “disputed” tags in 2017, because it found them ineffective.
The new crackdown comes as calls for Facebook to act have grown louder in recent days. Besides the report in The Guardian, a teenager — who previously made headlines for his decision to defy the wishes of his parents and get vaccinated — recently told Congress that Facebook was his mother’s primary source of anti-vaccine misinformation.