
Neural representation of action features across sensory modalities: a multimodal fMRI study

Laura Marras; Lorenzo Teresi; Francesca Simonelli; Francesca Setti; Alessandro Ingenito; Giacomo Handjaras; Emiliano Ricciardi
2025

Abstract

Action representation and the sharing of feature coding within the Action Observation Network (AON) remain debated, and our understanding of how the brain consistently encodes action features across sensory modalities under variable, naturalistic conditions is still limited. Here, we introduce a theoretically grounded taxonomic model of action representation that categorizes action-related features into six conceptual domains: Space, Effector, Agent & Object, Social, Emotion, and Linguistic. We assessed the model's ability to predict human brain activity by acquiring functional MRI (fMRI) data from participants exposed to audiovisual, visual-only, or auditory-only versions of the same naturalistic movie. Using multi-voxel encoding analysis and variance partitioning, we demonstrated that the model significantly predicts cortical activity within the AON, with comparable effect sizes across modalities. The Effector and Social domains contributed most to the model's predictions, and domain-specific representations were largely stable across sensory modalities. This study elucidates how the human brain robustly encodes action-related information across different sensory modalities, revealing that certain action domains exert a stronger, modality-general influence on neural representation. Overall, this work enhances our understanding of how the brain integrates complex action information from multiple sensory inputs, offering insights into the generalized nature of action representation in human cognition and paving the way for further exploration of multisensory integration.
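For readers unfamiliar with the approach named in the abstract, the sketch below illustrates the general logic of a voxel-wise encoding analysis with variance partitioning. It is not the authors' pipeline (the study also uses canonical correlation analysis, not shown here): the variable names (X_effector, X_social, Y), the toy data, and the choice of ridge regression are assumptions made purely for illustration.

```python
# Illustrative sketch (not the authors' code): a voxel-wise encoding model
# with variance partitioning over two hypothetical feature domains.
# Assumed, made-up inputs: X_effector and X_social are time-by-feature design
# matrices built from stimulus annotations; Y is a time-by-voxel BOLD matrix.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split


def encoding_r2(X, Y, alphas=np.logspace(-1, 4, 10)):
    """Fit a ridge encoding model and return per-voxel R^2 on held-out data."""
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, shuffle=False)
    model = RidgeCV(alphas=alphas).fit(X_tr, Y_tr)
    pred = model.predict(X_te)
    ss_res = np.sum((Y_te - pred) ** 2, axis=0)
    ss_tot = np.sum((Y_te - Y_te.mean(axis=0)) ** 2, axis=0)
    return 1.0 - ss_res / ss_tot


# Toy data standing in for real fMRI responses and annotations.
rng = np.random.default_rng(0)
n_time, n_voxels = 400, 50
X_effector = rng.standard_normal((n_time, 8))
X_social = rng.standard_normal((n_time, 6))
Y = X_effector @ rng.standard_normal((8, n_voxels)) + 0.5 * rng.standard_normal((n_time, n_voxels))

# Variance partitioning: compare the full model against single-domain models.
r2_full = encoding_r2(np.hstack([X_effector, X_social]), Y)
r2_eff = encoding_r2(X_effector, Y)
r2_soc = encoding_r2(X_social, Y)

unique_effector = r2_full - r2_soc   # variance explained only by Effector features
unique_social = r2_full - r2_eff     # variance explained only by Social features
shared = r2_eff + r2_soc - r2_full   # variance shared between the two domains
print(unique_effector.mean(), unique_social.mean(), shared.mean())
```

The unique terms estimate variance explained exclusively by one feature domain, the shared term what the domains explain jointly; in the study this logic extends to six domains and the resulting maps are compared across the audiovisual, visual-only, and auditory-only conditions.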
Keywords: Action representation; canonical correlation analysis; fMRI; naturalistic stimulation; sensory modality
Files in this item:
1-s2.0-S1053811925004422-main_Marras.pdf (open access)
Description: Neural representation of action features across sensory modalities: A multimodal fMRI study
Type: Publisher's version (PDF)
License: Creative Commons
Size: 5.19 MB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11771/35958