Nature Communications (Dec 2024)
Ants integrate proprioception as well as visual context and efference copies to make robust predictions
Abstract
Forward models are mechanisms enabling an agent to predict the sensory outcomes of its actions. They can be implemented through efference copies: copies of motor signals that inhibit the expected sensory stimulation, literally canceling the perceptual outcome of the predicted action. In insects, efference copies are known to modulate optic flow detection for flight control in flies. Here we investigate whether forward models account for the detection of optic flow in walking ants, and how this information is integrated for locomotion control. We mounted Cataglyphis velox ants in a virtual reality setup and manipulated the relationship between the ants’ movements and the optic flow they perceived. Our results show that ants compute predictions of the optic flow expected from their own movements. However, the prediction is not based solely on efference copies: it also involves proprioceptive feedback and is fine-tuned by the visual structure of the panorama. Mismatches between prediction and perception are computed for each eye, and the resulting error signals are integrated to adjust locomotion through the modulation of internal oscillators. Our work reveals that insects’ forward models are non-trivial and compute predictions based on multimodal information.
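The scheme described above can be summarized as a minimal computational sketch: a prediction of optic flow built from an efference copy and proprioceptive feedback, tuned by the panorama's visual structure, with per-eye prediction errors integrated to modulate internal oscillators. The Python below is an illustrative sketch under these assumptions; all function names, weights, and the update rule are hypothetical and are not quantities or equations reported in the paper.

```python
def predicted_flow(motor_command, proprioception, visual_gain):
    """Predicted optic flow from an efference copy and proprioceptive feedback,
    scaled by a gain reflecting the panorama's visual structure (assumed form)."""
    return visual_gain * (0.5 * motor_command + 0.5 * proprioception)

def per_eye_errors(perceived_flow, prediction):
    """Mismatch between perceived and predicted optic flow, computed per eye."""
    return {eye: perceived_flow[eye] - prediction for eye in ("left", "right")}

def modulate_oscillator(base_frequency, errors, gain=0.1):
    """Adjust an internal locomotor oscillator from the integrated error signals
    (here simply summed across eyes; the integration rule is an assumption)."""
    integrated_error = errors["left"] + errors["right"]
    return base_frequency * (1.0 + gain * integrated_error)

# Example: perceived flow exceeds the prediction (e.g. a manipulated VR gain),
# so the error is positive and the oscillator frequency is nudged upward.
prediction = predicted_flow(motor_command=1.0, proprioception=0.9, visual_gain=1.0)
errors = per_eye_errors({"left": 1.2, "right": 1.1}, prediction)
print(round(modulate_oscillator(base_frequency=2.0, errors=errors), 3))
```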