NeuroImage (Nov 2021)

Touching events predict human action segmentation in brain and behavior

  • Jennifer Pomp,
  • Nina Heins,
  • Ima Trempler,
  • Tomas Kulvicius,
  • Minija Tamosiunaite,
  • Falko Mecklenbrauck,
  • Moritz F. Wurm,
  • Florentin Wörgötter,
  • Ricarda I. Schubotz

Journal volume & issue
Vol. 243
Article 118534

Abstract


Recognizing the actions of others depends on segmentation into meaningful events. After decades of research in this area, it still remains unclear how humans do this and which brain areas support the underlying processes. Here we show that a computer vision-based model of touching and untouching events can predict human behavior in segmenting object manipulation actions with high accuracy. Using this computational model and functional magnetic resonance imaging (fMRI), we pinpoint the neural networks underlying this segmentation behavior during an implicit action observation task. Segmentation was announced by a strong increase of visual activity at touching events, followed by the engagement of frontal, hippocampal and insula regions signaling expectation updating at subsequent untouching events. Brain activity and behavior show that touching-untouching motifs are critical features for identifying the key elements of actions, including object manipulations.
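
For concreteness, the following is a minimal, hypothetical sketch of how touching and untouching events could be flagged from tracked object masks in video. It is not the authors' computer-vision model; the mask representation, dilation margin, and function names are illustrative assumptions only.

# Illustrative sketch (not the authors' implementation): detect touching/untouching
# events between two tracked objects from per-frame binary segmentation masks.
# Assumes masks are boolean numpy arrays of shape (n_frames, H, W).

import numpy as np
from scipy.ndimage import binary_dilation

def touching(mask_a: np.ndarray, mask_b: np.ndarray, margin: int = 2) -> bool:
    """Two objects 'touch' if mask_a, dilated by a small margin, overlaps mask_b."""
    structure = np.ones((2 * margin + 1, 2 * margin + 1), dtype=bool)
    return bool(np.any(binary_dilation(mask_a, structure=structure) & mask_b))

def touch_events(masks_a: np.ndarray, masks_b: np.ndarray, margin: int = 2):
    """Return (frame_index, event) pairs, where event is 'touch' or 'untouch'.

    Changes in the touching relation are candidate segmentation boundaries,
    in the spirit of the touching/untouching motifs described in the abstract.
    """
    events = []
    prev = touching(masks_a[0], masks_b[0], margin)
    for t in range(1, len(masks_a)):
        curr = touching(masks_a[t], masks_b[t], margin)
        if curr != prev:
            events.append((t, "touch" if curr else "untouch"))
        prev = curr
    return events

if __name__ == "__main__":
    # Toy example: a 'hand' mask approaches, grasps, and releases an 'object'.
    n_frames, H, W = 6, 20, 20
    hand = np.zeros((n_frames, H, W), dtype=bool)
    obj = np.zeros((n_frames, H, W), dtype=bool)
    obj[:, 8:12, 14:18] = True                   # object stays put
    for t, x in enumerate([0, 4, 8, 12, 12, 4]):  # hand moves toward and away
        hand[t, 8:12, x:x + 4] = True
    print(touch_events(hand, obj))                # [(3, 'touch'), (5, 'untouch')]

In this toy run, the relation change at frame 3 marks a touching event (grasp) and the change at frame 5 an untouching event (release); such relation changes would serve as candidate event boundaries for comparison with human segmentation responses.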

Keywords