Concurrent contextual and time-distant mnemonic information co-exist as feedback in the human visual cortex
Javier Ortiz-Tudela,
Johanna Bergmann,
Matthew Bennett,
Isabelle Ehrlich,
Lars Muckli,
Yee Lee Shing
Affiliations
Javier Ortiz-Tudela
Department of Psychology, Goethe University Frankfurt, Frankfurt am Main, Hessen, Germany (corresponding author)
Johanna Bergmann
Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Matthew Bennett
Université Catholique de Louvain, Louvain-la-Neuve, Belgium
Isabelle Ehrlich
Department of Psychology, Goethe University Frankfurt, Frankfurt am Main, Hessen, Germany
Lars Muckli
School of Psychology and of Neuroscience, University of Glasgow, United Kingdom
Yee Lee Shing
Department of Psychology, Goethe University Frankfurt, Frankfurt am Main, Hessen, Germany; IDeA Center for Individual Development and Adaptive Education, Frankfurt am Main, Germany; Brain Imaging Center, Goethe University Frankfurt, Frankfurt am Main, Germany (corresponding author)
Efficient processing of the visual environment necessitates the integration of incoming sensory evidence with concurrent contextual inputs and mnemonic content from our past experiences. To examine how this integration takes place in the brain, we isolated different types of feedback signals from the neural patterns of non-stimulated areas of the early visual cortex in humans (i.e., V1 and V2). Using multivariate pattern analysis, we showed that both contextual and time-distant information co-exist in V1 and V2 as feedback signals. In addition, we found that the extent to which mnemonic information is reinstated in V1 and V2 depends on whether the information is retrieved episodically or semantically. Critically, this reinstatement was independent of the retrieval route in the object-selective cortex. These results demonstrate that early visual processing carries not just direct and indirect information from the visual surroundings, but also memory-based predictions.
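The multivariate pattern analysis referred to above can be illustrated with a minimal, hypothetical sketch: a linear classifier is trained to discriminate two conditions from voxel activity patterns and evaluated with cross-validation, so that above-chance decoding accuracy indicates that condition-specific information is present in the region. The data here are simulated for illustration only; the condition labels, voxel counts, and classifier choice are assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50

# Simulated trial labels: two feedback conditions (e.g., two scene contexts).
labels = np.repeat([0, 1], n_trials // 2)

# Simulated voxel patterns: a small condition-specific signal plus noise.
signal = np.where(labels[:, None] == 1, 0.8, -0.8)
patterns = signal + rng.normal(size=(n_trials, n_voxels))

# 5-fold cross-validated decoding; accuracy above 0.5 (chance for two
# balanced classes) implies the patterns carry condition information.
scores = cross_val_score(LogisticRegression(max_iter=1000),
                         patterns, labels, cv=5)
mean_accuracy = scores.mean()
print(f"mean decoding accuracy: {mean_accuracy:.2f}")
```

In the study's logic, applying such a classifier to patterns from non-stimulated (occluded) portions of V1/V2 means any decodable information must arrive via feedback rather than feedforward stimulation.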