Computers in Human Behavior Reports (Aug 2022)

Self-imposed filter bubbles: Selective attention and exposure in online search

  • Axel G. Ekström
  • Diederick C. Niehorster
  • Erik J. Olsson

Journal volume & issue
Vol. 7, p. 100226

Abstract


It is commonly assumed that algorithmic curation of search results creates filter bubbles, in which users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, it has been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages. This “self-imposed filter bubble hypothesis” has, however, remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited this bias more strongly than those who identified as left-wing (p < .001). In addition, we found that both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust: links associated with less trusted sources were attended to less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.

Keywords