Scientific Reports (Mar 2024)

Visual features are processed before navigational affordances in the human brain

  • Kshitij Dwivedi,
  • Sari Sadiya,
  • Marta P. Balode,
  • Gemma Roig,
  • Radoslaw M. Cichy

DOI
https://doi.org/10.1038/s41598-024-55652-y
Journal volume & issue
Vol. 14, no. 1
pp. 1–7

Abstract

To navigate through their immediate environment, humans process scene information rapidly. How does the cascade of neural processing elicited by scene viewing unfold over time to facilitate navigational planning? To investigate, we recorded human brain responses to visual scenes with electroencephalography and related them to computational models that operationalize three aspects of scene processing (2D, 3D, and semantic information), as well as to a behavioral model capturing navigational affordances. We found a temporal processing hierarchy: navigational affordances are processed later than the other scene features investigated (2D, 3D, and semantic information). This reveals the temporal order in which the human brain computes complex scene information and suggests that the brain leverages these pieces of information to plan navigation.
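The comparison of EEG responses with computational and behavioral models is commonly done with time-resolved representational similarity analysis (RSA). The sketch below illustrates that general approach only; it is not the authors' code, and all array sizes, feature spaces, and variable names are hypothetical, with synthetic data standing in for real EEG recordings and model features.

```python
# Minimal sketch (not the authors' pipeline): time-resolved RSA comparing EEG
# representational dissimilarity matrices (RDMs) with model RDMs, to estimate
# when each kind of scene information (2D, 3D, semantic, affordance) is
# reflected in the EEG signal. All data below are synthetic placeholders.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_scenes, n_channels, n_times = 50, 64, 100   # hypothetical dataset sizes

# Synthetic stand-ins: EEG responses (scenes x channels x timepoints)
# and one feature vector per scene for each model.
eeg = rng.normal(size=(n_scenes, n_channels, n_times))
model_features = {
    "2D": rng.normal(size=(n_scenes, 20)),
    "3D": rng.normal(size=(n_scenes, 20)),
    "semantic": rng.normal(size=(n_scenes, 20)),
    "affordance": rng.normal(size=(n_scenes, 20)),
}

# Model RDMs: pairwise dissimilarities between scenes in each feature space.
model_rdms = {name: pdist(feats, metric="correlation")
              for name, feats in model_features.items()}

# Time-resolved RSA: at each timepoint, correlate the EEG RDM with each
# model RDM (Spearman correlation over the vectorized RDM entries).
rsa = {name: np.zeros(n_times) for name in model_rdms}
for t in range(n_times):
    eeg_rdm = pdist(eeg[:, :, t], metric="correlation")
    for name, m_rdm in model_rdms.items():
        rsa[name][t] = spearmanr(eeg_rdm, m_rdm).correlation

# The paper's central claim corresponds to the affordance model peaking
# later in time than the 2D, 3D, and semantic models.
for name, series in rsa.items():
    print(name, "peaks at timepoint", int(np.argmax(series)))
```

With real data, the peak (or onset) latency of each model's RSA time course would be compared statistically across models to establish the reported temporal hierarchy.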