51. Tool Embodiment Is Reflected in Movement Multifractal Nonlinearity
- Author
Yvan Pratviel, Veronique Deschodt-Arsac, Florian Larrue, and Laurent M. Arsac
- Subjects
cognitive system, nonlinear dynamics, embodiment, human-machine interface, Thermodynamics, QC310.15-319, Mathematics, QA1-939, Analysis, QA299.6-433
- Abstract
Recent advances in neuroscience have linked dynamical systems theory to cognition. The main contention is that extended cognition relies on a unitary brain-body-tool system showing the expected signatures of interaction-dominance reflected in multifractal behavior. This might be particularly relevant for understanding how the brain embodies a tool to perform a task. Here we applied the multifractal formalism to the dynamics of hand movement while participants performed a computer task (the herding task) using either a mouse or their own hand as a tool to move an object on the screen. We applied a focus-based multifractal detrended fluctuation analysis to acceleration time series. Then, multifractal nonlinearity was assessed by comparing the original series to a finite set of surrogates obtained by Iterated Amplitude Adjusted Fourier Transformation (IAAFT), a method that removes nonlinear multiscale dependencies while preserving the linear structure of the time series. Both hand and mouse task execution demonstrated multifractal nonlinearity, a typical form of across-scales interactivity in cognitive control. In addition, a wider multifractal spectrum was observed in the mouse condition, which might highlight a richer set of interactions when the cognitive system is extended to the embodied mouse. We conclude that the emergence of multifractal nonlinearity from a brain-body-tool system supports recent theories of radical tool embodiment. Multifractal nonlinearity may be a promising metric to appreciate how physical objects, as well as virtual tools and potentially prosthetics, are efficiently embodied by the brain.
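The surrogate test described in the abstract rests on IAAFT: generate surrogates that keep the value distribution and linear (spectral) structure of the acceleration series but destroy nonlinear multiscale dependencies, then compare the multifractal spectrum width of the original series against the surrogate set. Below is a minimal Python/NumPy sketch of that surrogate step, assuming a 1-D acceleration series; the function name `iaaft_surrogate`, the iteration count, and the width comparison in the usage comment are illustrative and not taken from the authors' code.

```python
import numpy as np

def iaaft_surrogate(x, n_iter=100, seed=None):
    """One IAAFT surrogate of the 1-D series x.

    Preserves the value distribution and (approximately) the power spectrum
    of x while destroying nonlinear, multiscale dependencies -- the null
    model against which multifractal nonlinearity is tested.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    target_amp = np.abs(np.fft.rfft(x))   # amplitude spectrum to preserve
    sorted_x = np.sort(x)                 # value distribution to preserve

    s = rng.permutation(x)                # start from a random shuffle
    for _ in range(n_iter):
        # Impose the original amplitude spectrum, keeping the current phases.
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=x.size)
        # Rank-order remap back onto the original value distribution.
        s = sorted_x[np.argsort(np.argsort(s))]
    return s

# Hypothetical usage: compare the multifractal spectrum width (delta-alpha)
# of the original series to its distribution over a set of surrogates.
# `mfdfa_width` stands in for any focus-based MFDFA implementation and is
# not defined here.
# widths = [mfdfa_width(iaaft_surrogate(accel, seed=k)) for k in range(30)]
# nonlinearity = mfdfa_width(accel) - np.mean(widths)
```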
- Published
- 2022