Journal of Medical Internet Research (Sep 2023)

Exploring YouTube’s Recommendation System in the Context of COVID-19 Vaccines: Computational and Comparative Analysis of Video Trajectories

  • Yee Man Margaret Ng,
  • Katherine Hoffmann Pham,
  • Miguel Luengo-Oroz

DOI: https://doi.org/10.2196/49061
Journal volume & issue: Vol. 25, p. e49061

Abstract


Background: Throughout the COVID-19 pandemic, there has been concern that social media may contribute to vaccine hesitancy due to the wide availability of antivaccine content on social media platforms. YouTube has stated its commitment to removing content that contains misinformation on vaccination. Nevertheless, such claims are difficult to audit, and more empirical research is needed to evaluate the actual prevalence of antivaccine sentiment on the internet.

Objective: This study examines recommendations made by YouTube’s algorithms to investigate whether the platform may facilitate the spread of antivaccine sentiment on the internet. We assess the prevalence of antivaccine sentiment in recommended videos and evaluate how real-world users’ experiences differ from the personalized recommendations obtained through synthetic data collection methods, which are often used to study YouTube’s recommendation system.

Methods: We trace trajectories from a credible seed video posted by the World Health Organization (WHO) to antivaccine videos, following only video links suggested by YouTube’s recommendation system. First, we gamify the process by asking real-world participants to intentionally find an antivaccine video in as few clicks as possible. Having collected crowdsourced trajectory data from respondents from (1) the WHO and United Nations system (n=33) and (2) Amazon Mechanical Turk (n=80), we then compare the recommendations seen by these users to recommended videos obtained from (3) the YouTube application programming interface’s RelatedToVideoID parameter (n=40) and (4) clean browsers without any identifying cookies (n=40), which serve as reference points. We develop machine learning methods to classify antivaccine content at scale, enabling us to automatically evaluate 27,074 video recommendations made by YouTube.

Results: We found no evidence that YouTube promotes antivaccine content; the average share of antivaccine videos remained well below 6% at all steps in users’ recommendation trajectories. However, users’ watch histories significantly affect video recommendations, suggesting that data from the application programming interface or from a clean browser do not offer an accurate picture of the recommendations that real users see. Real users saw slightly more provaccine content as they advanced through their recommendation trajectories, whereas synthetic users were drawn toward irrelevant recommendations. Rather than antivaccine content, videos recommended by YouTube are likely to contain health-related content that is not specifically related to vaccination; such videos tend to be longer and to feature more popular content.

Conclusions: Our findings suggest that the common perception of YouTube’s recommendation system as a “rabbit hole” may be inaccurate, and that YouTube may instead be following a “blockbuster” strategy that attempts to engage users by promoting other content that has been reliably successful across the platform.
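As an illustration of the synthetic-data baseline described in the Methods, the sketch below constructs a YouTube Data API v3 `search.list` request URL using the `relatedToVideoId` parameter (the "RelatedToVideoID" sampling method referenced above). This is a minimal, hypothetical example, not the authors' code: the seed video ID and API key are placeholders, and no network request is made. Note that YouTube has since deprecated this parameter.

```python
from urllib.parse import urlencode

def related_videos_url(video_id: str, api_key: str, max_results: int = 40) -> str:
    """Build a YouTube Data API v3 search.list URL requesting videos
    related to a given seed video via the relatedToVideoId parameter."""
    base = "https://www.googleapis.com/youtube/v3/search"
    params = {
        "part": "snippet",           # return basic video metadata
        "type": "video",             # relatedToVideoId requires type=video
        "relatedToVideoId": video_id,
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{base}?{urlencode(params)}"

# Placeholder seed ID and key; a real study would page through results
# and repeat the query at each step of the recommendation trajectory.
url = related_videos_url("SEED_VIDEO_ID", api_key="YOUR_API_KEY")
```

Because such API queries carry no watch history or cookies, they approximate a history-free "synthetic user", which is precisely the baseline the study compares against real users' personalized trajectories.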