ENHANCING MUSIC VISUALIZATION WITH HAPTIC FEEDBACK TO EASE PERCEPTION FOR DHH LISTENERS / Mauri, Noemi; Sacchetto, Matteo; Bagnus, Piera; Nicora, Chiara; Zanoni, Massimiliano; Bianco, Andrea; Rottondi, Cristina. - ELECTRONIC. - (2025), pp. 2913-2920. (11th Convention of the European Acoustics Association Forum Acusticum / EuroNoise 2025, Malaga (Spain), June 23-26, 2025) [10.61782/fa.2025.0344].
ENHANCING MUSIC VISUALIZATION WITH HAPTIC FEEDBACK TO EASE PERCEPTION FOR DHH LISTENERS
Noemi Mauri;Matteo Sacchetto;Massimiliano Zanoni;Andrea Bianco;Cristina Rottondi
2025
Abstract
This study introduces a novel integration of visual and haptic feedback to improve music perception for Deaf and Hard-of-Hearing (DHH) individuals. The proposed prototype, named MusicTanvas, leverages a touchscreen augmented with electroadhesion-based haptic technology, enabling the dynamic overlay of tactile textures on visual content. The system processes pre-recorded audio tracks to extract spectral, harmonic, and rhythmic features, which are mapped to both visual and haptic representations. Specifically, key features such as chroma, Mel-Frequency Cepstral Coefficients (MFCCs), and rhythmic figures are translated into corresponding visual attributes (e.g., color, shape, and size) and tactile textures. Users can explore music in real time through these multisensory outputs, which evolve synchronously with the audio playback. A preliminary user assessment demonstrates the system's potential to enhance musical interaction by combining visual and haptic feedback.
| File | Size | Format |
|---|---|---|
| 000344.pdf (open access; Version of Record; Creative Commons license) | 817.75 kB | Adobe PDF |
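The abstract describes mapping extracted chroma features to visual attributes such as color. The minimal NumPy-only sketch below illustrates one plausible version of such a mapping; it is not the MusicTanvas implementation. The function names, the even spacing of pitch classes around the hue circle, and the single-spectral-peak pitch estimate are all assumptions for demonstration (the actual system processes full chroma vectors, MFCCs, and rhythmic figures).

```python
import numpy as np

# Names of the 12 pitch classes; the mapping to hue is hypothetical.
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def dominant_pitch_class(y: np.ndarray, sr: int) -> int:
    """Return the pitch class (0-11) of the strongest spectral peak."""
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)
    f0 = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    midi = 69 + 12 * np.log2(f0 / 440.0)     # frequency -> MIDI note number
    return int(round(midi)) % 12

def chroma_to_hue(pitch_class: int) -> float:
    """Spread the 12 pitch classes evenly around the 360-degree hue circle."""
    return pitch_class * 360.0 / 12.0

# Example: one second of A4 (440 Hz) at a 22.05 kHz sample rate.
sr = 22050
t = np.arange(sr) / sr
y = np.sin(2 * np.pi * 440.0 * t)
pc = dominant_pitch_class(y, sr)
print(PITCH_CLASSES[pc], chroma_to_hue(pc))  # prints: A 270.0
```

In a real pipeline, a library such as librosa would supply frame-wise chroma and MFCC features rather than a single global peak, so the color (and the tactile texture rendered alongside it) could evolve synchronously with playback, as the abstract describes.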
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/3006707
