Frontiers in Cognition (Jun 2024)
Emotional modulation of statistical learning in visual search
Abstract
Introduction: Visual search is facilitated when participants encounter targets in repeated display arrangements. This “contextual-cueing” effect is attributed to incidental learning of spatial distractor-target relations, which subsequently guides visual search more effectively toward the target location. Conversely, behaviorally significant, though task-irrelevant, negative emotional stimuli may involuntarily capture attention and thus hamper performance in visual search. This raises the question of how these two attention-guiding factors interact.

Methods: To address this question, we investigated how an emotional alerting state, induced by different classes of emotional (face, scene) pictures presented prior to the search task, relates to memory-related plasticity. We tested 46 participants who were presented with repeated and non-repeated search layouts, preceded at variable intervals (50, 500, 1,000 ms) by emotional vs. neutral faces or scenes.

Results: We found that contextual learning was increased with emotional compared to neutral scenes, for which no contextual cueing was observed at all, whereas no modulation of the cueing effect was found for emotional (vs. neutral) faces. This modulation occurred independently of the interval between the emotional stimulus and the search display.

Discussion: We conclude that emotional scenes are particularly effective at withdrawing attentional resources, biasing participants to perform the visual search task in a passive, i.e., receptive, manner, which, in turn, improves automatic contextual learning.
Keywords