eLife (Oct 2022)

Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila

  • Maxwell H Turner,
  • Avery Krieger,
  • Michelle M Pang,
  • Thomas R Clandinin

DOI
https://doi.org/10.7554/eLife.82587
Journal volume & issue
Vol. 11

Abstract

Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here, we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and that a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement: a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.