eLife (Jul 2021)
Scanned optogenetic control of mammalian somatosensory input to map input-specific behavioral outputs
Abstract
Somatosensory stimuli guide and shape behavior, from immediate protective reflexes to longer-term learning and higher-order processes related to pain and touch. However, somatosensory inputs are challenging to control in awake mammals due to the diversity and nature of contact stimuli. Application of cutaneous stimuli is currently limited to relatively imprecise methods and subjective behavioral measures. The strategy we present here overcomes these difficulties, achieving ‘remote touch’ with spatiotemporally precise and dynamic optogenetic stimulation by projecting light onto a small defined area of skin. We mapped behavioral responses to specific nociceptor and low-threshold mechanoreceptor inputs in freely behaving mice. In nociceptors, sparse recruitment of single action potentials shaped rapid protective pain-related behaviors, including coordinated head orientation and body repositioning that depend on the initial body pose. In contrast, activation of low-threshold mechanoreceptors elicited slow-onset and more subtle whole-body behaviors. The strategy can be used to define specific behavioral repertoires, examine the timing and nature of reflexes, and dissect the sensory, motor, cognitive, and motivational processes guiding behavior.
Keywords