An unpublished study found that people who source information from Facebook are more likely to oppose the Covid-19 vaccine.
Amy Morahan, a student at the University of Canterbury, researched the impact of traditional media and Facebook on attitudes and intentions toward a vaccine by surveying 484 people selected nationwide.
The study was conducted in August and September of this year, and preliminary results were made available ahead of peer review.
Morahan showed the participants seven messages in favor of vaccination and three messages against vaccination.
Participants reported frequently seeing the positive messages on traditional media platforms such as newspapers and online news sites.
Conversely, Morahan found that the more people got their vaccine information from Facebook, the more likely they were to hold a negative view of the vaccine.
“The more knowledge you have from Facebook, the less likely you are to get the vaccine.”
The research also showed no demographic differences among the 25 percent of respondents who did not want to be vaccinated.
Morahan was surprised that age and education level made no difference for those with negative opinions, saying it’s easy to assume that people with less education would be more likely to oppose the vaccine.
“This makes the effects of the media source even more significant.”
Morahan is now collecting further data and will release a final report in February next year. She hoped the research would help communicators effectively target audiences reluctant to be vaccinated.
She said she originally started the research after noticing opposing trends: positive messages about vaccines in traditional media versus negative posts on Facebook.
She said the research was worrying because people had no way to verify messages on the social media platform.
“Anyone can say whatever they like.”
Back in March, Facebook employees revealed that they had found a way to help stop the spread of misinformation about the virus on the platform.
By changing how posts about vaccines were ranked in people's newsfeeds, the company's researchers realized they could reduce the misinformation users saw about Covid-19 vaccines and instead surface posts from legitimate sources such as the World Health Organization.
“Given these results, I suppose we are hoping to launch ASAP,” a Facebook employee wrote in response to the internal memo about the study.
Instead, Facebook shelved some of the study's suggestions and made no further changes until April, with critics saying they believed the tech giant feared the change could hurt the company's earnings.
“Why don’t you remove the comments? Because engagement is the only thing that matters,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a UK-based internet watchdog.
“It attracts attention and attention equals eyeballs and eyeballs equal advertising earnings.”
However, in an emailed statement, Facebook said it had made “significant progress” this year by reducing misinformation about vaccines in users’ feeds.