Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Department of Psychology, Friedrich-Alexander University Erlangen-Nuernberg, Erlangen, Germany
Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, United Kingdom
Oliver Stegle
Max Planck Institute for Intelligent Systems, Tübingen, Germany; European Molecular Biology Laboratory, Genome Biology Unit, Heidelberg, Germany; Division of Computational Genomics and Systems Genetics, German Cancer Research Center (DKFZ), Heidelberg, Germany
Uta Noppeney
Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, United Kingdom; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
To form a more reliable percept of the environment, the brain needs to estimate its own sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. We evaluated this assumption in four psychophysical experiments, in which human observers localized auditory signals that were presented synchronously with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, either continuously or with intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory uncertainty estimates that combine information from past and current signals, consistent with an optimal Bayesian learner that can be approximated by exponential discounting. Our results challenge leading models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experiences with new incoming sensory signals.
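To make the reliability-weighted integration and exponential-discounting account concrete, the sketch below simulates one minimal observer of this kind: visual variance is estimated by exponentially discounting recent noise samples, and the auditory and visual location cues are then fused with weights proportional to their inverse variances. The function names, the discounting constant `tau`, and the example numbers are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

def exponential_discount_variance(noise_samples, tau=5.0):
    """Estimate current sensory variance by exponentially discounting past samples.

    noise_samples: past visual noise levels (e.g., dot-cloud variances), oldest first.
    tau: hypothetical discounting time constant in trials (illustrative choice).
    """
    n = len(noise_samples)
    lags = np.arange(n - 1, -1, -1)        # lag 0 = most recent sample
    weights = np.exp(-lags / tau)          # exponentially decaying weights
    weights /= weights.sum()
    return float(np.dot(weights, noise_samples))

def fuse_audiovisual(x_a, x_v, var_a, var_v):
    """Reliability-weighted (maximum-likelihood) fusion of auditory and visual cues."""
    r_a, r_v = 1.0 / var_a, 1.0 / var_v    # reliability = inverse variance
    return (r_a * x_a + r_v * x_v) / (r_a + r_v)

# Example: visual noise jumps upward mid-stream; the discounted estimate lags
# behind the true change, so past experience still shapes the visual weight.
past_visual_variances = np.array([1.0, 1.0, 1.0, 4.0, 4.0])
var_v_hat = exponential_discount_variance(past_visual_variances, tau=2.0)
estimate = fuse_audiovisual(x_a=-3.0, x_v=2.0, var_a=2.0, var_v=var_v_hat)
print(f"discounted visual variance: {var_v_hat:.2f}, fused location: {estimate:.2f}")
```

Under these assumptions, an observer who computed uncertainty instantaneously would use only the most recent noise sample, whereas the discounted estimate blends past and current noise levels, which is the signature the abstract attributes to human observers.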