Fractal and Fractional (Apr 2022)

Tool Embodiment Is Reflected in Movement Multifractal Nonlinearity

  • Yvan Pratviel,
  • Véronique Deschodt-Arsac,
  • Florian Larrue,
  • Laurent M. Arsac

DOI: https://doi.org/10.3390/fractalfract6050240
Journal volume & issue: Vol. 6, No. 5, p. 240

Abstract

Recent advances in neuroscience have linked dynamical systems theory to cognition. The main contention is that extended cognition relies on a unitary brain-body-tool system showing the expected signatures of interaction-dominance, reflected in multifractal behavior. This might be particularly relevant to understanding how the brain is able to embody a tool to perform a task. Here, we applied the multifractal formalism to the dynamics of hand movement while participants performed a computer task (the herding task) using either a mouse or their own hand as the tool to move an object on the screen. We applied a focus-based multifractal detrended fluctuation analysis to acceleration time series. Multifractal nonlinearity was then assessed by comparing the original series to a finite set of surrogates obtained by the Iterated Amplitude Adjusted Fourier Transformation (IAAFT), a method that removes nonlinear multiscale dependencies while preserving the linear structure of the time series. Both hand and mouse task execution demonstrated multifractal nonlinearity, a typical form of across-scales interactivity in cognitive control. In addition, a wider multifractal spectrum was observed in the mouse condition, which might reflect a richer set of interactions when the cognitive system is extended to the embodied mouse. We conclude that the emergence of multifractal nonlinearity from a brain-body-tool system supports recent theories of radical tool embodiment. Multifractal nonlinearity may be a promising metric for appreciating how physical objects, as well as virtual tools and potentially prosthetics, are efficiently embodied by the brain.
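
As a rough illustration of the surrogate step described in the abstract, the sketch below (Python with NumPy; not the authors' code) generates IAAFT surrogates and compares the multifractal spectrum width of an original series against the surrogate set. The multifractal_width argument stands in for a focus-based MFDFA routine and is a hypothetical placeholder, as is the t-like statistic returned.

import numpy as np

def iaaft_surrogate(x, n_iter=100, seed=None):
    # IAAFT surrogate: preserves the value distribution and the amplitude
    # spectrum (linear structure) of x while destroying nonlinear,
    # multiscale dependencies.
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    sorted_x = np.sort(x)                  # target value distribution
    target_amp = np.abs(np.fft.rfft(x))    # target Fourier amplitudes
    s = rng.permutation(x)                 # start from a random shuffle
    for _ in range(n_iter):
        # impose the original amplitude spectrum, keep the current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
        # restore the original value distribution by rank ordering
        s = sorted_x[np.argsort(np.argsort(s))]
    return s

def multifractal_nonlinearity(x, multifractal_width, n_surrogates=30):
    # Compare the multifractal spectrum width of x against IAAFT surrogates.
    # multifractal_width is a hypothetical helper that would return the
    # spectrum width of a series (e.g., from focus-based MFDFA).
    w_orig = multifractal_width(x)
    w_surr = np.array([multifractal_width(iaaft_surrogate(x, seed=i))
                       for i in range(n_surrogates)])
    # t-like statistic: how far the original width lies outside the
    # surrogate distribution; large values suggest multifractal nonlinearity
    return (w_orig - w_surr.mean()) / (w_surr.std(ddof=1) / np.sqrt(n_surrogates))

In this kind of surrogate test, a spectrum width well beyond what the IAAFT surrogates produce points to genuine across-scales interactivity rather than linear correlation combined with a non-Gaussian value distribution.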

Keywords