The story of an unvaccinated 6-year-old boy in Oregon who contracted tetanus, survived with an $800,000 medical bill, and whose parents still chose not to vaccinate him went viral last month. It perfectly illustrates the power of social media to spread information about vaccines, both good and bad.
This isn’t the first vaccine story to go viral in recent months, either. It was recently revealed that Amazon was allowing users to donate to anti-vaccine organizations through a program by the AmazonSmile Foundation. Doctors are reporting that teens are sneaking out to get vaccinated, and one teen even spoke at a US Senate hearing in March, condemning the anti-vax movement — saying that his mother’s beliefs were the result of misinformation spread through Facebook. Measles stories continue to dominate the news cycle, as global outbreaks reach new highs.
Social media can be used to spread all kinds of information, including misinformation. Unfortunately, the impact that false information can have on essential health initiatives can be devastating, especially when it comes to vaccination efforts.
The Spread — and Promotion — of Misinformation
The spread of misinformation across social media could trigger a massive disease outbreak, according to a 2013 World Economic Forum report that warned “digital wildfires” could cause the “viral spread” of misleading information.
Years later, that prediction might be coming true.
Canada, the US, Madagascar, and the Philippines are all experiencing measles outbreaks, and measles cases tripled in Europe between 2017 and 2018. The Centers for Disease Control and Prevention suggested in a recent study that the anti-vaccination movement was to blame for the increasing number of outbreaks. The World Health Organization (WHO) even listed vaccine hesitancy as one of 2019’s global health threats.
We are declaring a public health emergency in Williamsburg due to the 300 cases of measles reported in our city — primarily concentrated in Brooklyn. There's no room for misinformation when it comes to protecting our children. Vaccines are safe and effective. They work. pic.twitter.com/Mf6k31X0Fk
— Mayor Bill de Blasio (@NYCMayor) April 10, 2019
The anti-vaccination movement gained notoriety in the late ’90s, when former doctor Andrew Wakefield published now-discredited findings in the Lancet that linked the measles, mumps, and rubella (MMR) vaccine to autism.
Wakefield’s medical license was revoked, and researchers have never been able to replicate his findings, but the damage was done: the public took his false findings as fact, and pro-vaccine advocates have been fighting that misinformation ever since.
The spread of information — true or false — has been made easier in recent years thanks to social media.
“Fake news” flies around the internet every day, and forums on every topic make it easy for people to share whatever information they want without vetting.
The ability to promote misleading content has been especially problematic in recent years, as algorithms on platforms like Facebook and YouTube often direct users searching for vaccine information toward anti-vax content. The Guardian reported in February, for example, that anti-vax content ranked high in search results for vaccine-related groups and pages, and that YouTube’s recommendations also steered searchers toward anti-vax misinformation.
Perhaps more concerning is that, for years, individuals and groups could pay for anti-vaccine ads on Facebook. Earlier this year, data compiled by the Daily Beast indicated that 147 anti-vax ads had been bought via seven Facebook pages. The ads were viewed between 1.6 million and 5.2 million times and were targeted at women over the age of 25, the group statistically most likely to have children who need to be vaccinated.
A 2017 study titled “Polarization of the vaccination debate on Facebook” examined how Facebook users consumed vaccine content online and found that consumption was “dominated by the echo-chamber effect.”
Echo chambers create environments in which users only consume ideas that are the same as their own, which solidifies their stance.
“Vaccine hesitancy is not new — it always has been around,” Eve Dubé, who leads the Social Sciences and Humanities Network of the Canadian Immunization Research Network, and who is a former member of the WHO working group on vaccine hesitancy, told Global Citizen. “With social media and internet, rumors or anti-vaccine content can spread much faster than it used to.”
But Dubé said the echo chamber effect gives the “false impression of consensus.”
In Canada, for instance, she said that less than 2% of parents actually refuse all vaccines for their children — but social media can make it seem like there are many more anti-vaxxers than that.
The Turning Tide
There is light at the end of the misinformation tunnel.
Facebook announced recently that it will no longer allow misinformation about vaccines to be promoted through ads or recommendations, and will limit the reach of groups and pages that spread misinformation about vaccines.
“We also believe in providing people with additional context so they can decide whether to read, share, or engage in conversations about information they see on Facebook,” Monika Bickert, VP of global policy management at Facebook, said in a statement. “We are exploring ways to give people more accurate information from expert organizations about vaccines at the top of results for related searches, on Pages discussing the topic, and on invitations to join groups about the topic.”
Since then, Facebook has also announced new steps in its “Remove, Reduce, Inform” initiative for managing difficult content.
“This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share,” the statement reads.
YouTube also unveiled a new fact-check alert to help users identify misinformation on issues like vaccines, and Pinterest has made it clear where it stands, too.
“We want Pinterest to be an inspiring place for people, and there's nothing inspiring about misinformation,” a Pinterest spokesperson said. “That's why we continue to work on new ways of keeping misleading content off our platform and out of our recommendations engine.”
In other words, big social media platforms are fighting back.
And individuals can fight back, too.
Dubé acknowledges that anti-vaccination campaigns in some countries, like the United States, Australia, and Italy, have a stronger and more organized presence, but she stresses that fighting their messaging head-on isn’t necessarily the best approach.
“I think we can do a better job in public health … [using social] media to share information in a better way, that is a bit less formal in the way we usually see,” she said, pointing out that public health information is often very factual and dry.
She thinks public health officials can use social media to their advantage and help encourage those who are on the fence.
“I think there’s some who are really opposed [to vaccines],” she said. “But it’s a tiny minority and of course we don’t want them to be more vocal.”
Because anti-vaxxers often rely on negative stories to convey their message, it can be hard to counter them from the other angle: there’s no news to report when a child is vaccinated and nothing happens.
But as they say in health care: no news is good news.