HABEMO: an innovative haptic tool for investigating the bodily representation of mental states in individuals with visual impairments

Giada Lettieri
2025

Abstract

The embodiment of emotions constitutes a profound aspect of our psychological experience, shaping not only how we express our feelings but also how we perceive and understand them. Here, we introduce an innovative haptic tool to explore the relationship between emotional states and bodily reactions, with a focus on its applicability to individuals with visual impairments. Our paradigm is highly intuitive, ensuring accessibility while maintaining accuracy comparable to traditional tasks that rely on visual stimuli. Through motion tracking and a 3D human representation, our system captures, in a naturalistic manner, where individuals sense affective and cognitive states within their bodies. To validate this method, we conducted two experiments employing both haptic and visual versions of the same task, revealing a compelling alignment between modalities in capturing individuals’ internalized manifestations of emotional states. Our novel haptic paradigm allows the mapping of emotions in the body in an intuitive way, offering a more inclusive and versatile method for exploring how people connect their emotions to their physical experiences. More importantly, the haptic version of our task holds particular promise for investigating how individuals with limited or no visual capability maintain representations of emotional and cognitive states within their bodies. In addition to its implications for understanding emotions, this tool holds promise for exploring a wide variety of research questions beyond the realm of affective states, thereby broadening its utility as a versatile instrument for investigating various aspects of human perception, cognition, and embodiment, particularly within the context of visual impairment.
ISBN: 9780443235894
Keywords: Blindness; Body representation; Embodiment; Touch; Visual impairment
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11771/34941
Citations
  • PMC: N/A
  • Scopus: 0