In our newly published study “From Facebook to YouTube: The Potential Exposure to COVID-19 Anti-Vaccine Videos on Social Media” (by Anatoliy Gruzd, Deena Abul-Fottouh, Melodie YunJu Song, and Alyssa Saiphoo), we explored the role of Facebook and YouTube in exposing people to COVID-19 vaccine misinformation.
Our research examined a specific but common information-sharing pathway: a Facebook user encounters a vaccine-related post containing a YouTube video, follows the link to YouTube, and then views the list of related videos that YouTube recommends.
The findings showed that despite efforts by both platforms to combat misinformation, anti-vaccine videos continued to spread on both Facebook and YouTube. Our study found that 66% (37 out of 56) of the most viral Facebook entities (i.e., groups and pages) in our dataset promoted anti-vaccine videos, and 57% (276 out of 484) of the YouTube videos shared on Facebook were anti-vaccine.
This reveals a gap in platform-led initiatives to remove misinformation and underscores the need for public health agencies to redouble their efforts to counteract the spread of anti-vaccine content. The study offers practical insights into why public health agencies need to collaborate with influential YouTube channels to share accurate information about the benefits and safety of vaccination.
Our study demonstrates once again that despite changes Facebook and YouTube have made to their recommendation algorithms, COVID-19 vaccine misinformation in the form of anti-vaccine content persists on these widely used platforms.