School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, Australia; Hunter Medical Research Institute, Newcastle, Australia
Matthew Hyett
School of Psychological Sciences, University of Western Australia, Perth, Australia
Gordon Parker
School of Psychiatry, University of New South Wales, Kensington, Australia
School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, Australia; Hunter Medical Research Institute, Newcastle, Australia; School of Medicine and Public Health, College of Medicine, Health and Wellbeing, University of Newcastle, Newcastle, Australia
Facial affect is expressed dynamically – a giggle, grimace, or an agitated frown. However, the characterisation of human affect has relied almost exclusively on static images. This approach cannot capture the nuances of human communication or support the naturalistic assessment of affective disorders. Using the latest in machine vision and systems modelling, we studied dynamic facial expressions of people viewing emotionally salient film clips. We found that the apparent complexity of dynamic facial expressions can be captured by a small number of simple spatiotemporal states – composites of distinct facial actions, each expressed with a unique spectral fingerprint. Sequential expression of these states is common across individuals viewing the same film stimuli but varies in those with the melancholic subtype of major depressive disorder. This approach provides a platform for translational research, capturing dynamic facial expressions under naturalistic conditions and enabling new quantitative tools for the study of affective disorders and related mental illnesses.