Audiovisual recording and reproduction of ecological acoustical scenes for hearing research: a case study with high reverberation / Guastamacchia, Angela; Puglisi, Giuseppina Emma; Albera, Andrea; Shtrepi, Louena; Riente, Fabrizio; Masoero, Marco Carlo; Astolfi, Arianna. - ELECTRONIC. - (2024), pp. 1765-1772. (Paper presented at Forum Acusticum 2023, held in Torino, 11-15 September 2023) [10.61782/fa.2023.0666].
Audiovisual recording and reproduction of ecological acoustical scenes for hearing research: a case study with high reverberation
Guastamacchia, Angela; Puglisi, Giuseppina Emma; Albera, Andrea; Shtrepi, Louena; Riente, Fabrizio; Masoero, Marco Carlo; Astolfi, Arianna
2024
Abstract
Recent research has focused on validating methods and procedures for ecological tests that assess hearing sensitivity under the complex acoustical conditions of everyday environments. Virtual Reality (VR) has been used extensively to reproduce immersive acoustical scenes in combination with visual cues, in order to account for the multisensory perception of the physical environment that occurs in real-life situations. However, owing to the complexity of recording and reproduction procedures, most studies focus either on audiovisual rendering of simulated scenarios or on in-field audio recordings without real visual contextualization. This work proposes a case study of a challenging listening environment (a conference hall with a mid-frequency reverberation time of 3.2 s), where 360° audiovisual scenes were recorded and then reproduced in the laboratory using a 16-loudspeaker array and a VR headset. Multiple scenarios involving different target- and noise-source positions were acquired through third-order Ambisonics recordings of room impulse responses and 360° stereoscopic video footage. Speech intelligibility tests were auralized for these scenarios, considering informational masking noise at different signal-to-noise ratios, and administered to a panel of normal-hearing subjects to validate the proposed VR methodology, which will also be applied in future studies involving hearing-impaired listeners.
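The auralization step described in the abstract — convolving dry speech material with a measured room impulse response and presenting a masker at a controlled signal-to-noise ratio — can be sketched as below. This is a minimal illustrative outline, not the authors' processing chain: the signals are synthetic stand-ins, the single-channel convolution ignores the Ambisonics encoding/decoding used in the study, and all names and values are hypothetical.

```python
# Hypothetical sketch of the auralization step: convolve dry "speech" with a
# room impulse response (RIR), then scale a masker to a target SNR.
# Synthetic signals only; not the study's actual recordings or pipeline.
import numpy as np
from scipy.signal import fftconvolve


def auralize(dry_speech, rir):
    """Convolve dry (anechoic) speech with a measured RIR."""
    return fftconvolve(dry_speech, rir)


def scale_to_snr(speech, noise, target_snr_db):
    """Scale the noise so the RMS speech-to-noise ratio equals target_snr_db."""
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    gain = rms(speech) / (rms(noise) * 10 ** (target_snr_db / 20))
    return noise * gain


fs = 16000                              # assumed sampling rate, 16 kHz
rng = np.random.default_rng(0)
dry = rng.standard_normal(fs)           # 1 s of synthetic "speech"
# Exponentially decaying noise tail as a toy stand-in for a reverberant RIR
rir = np.exp(-np.arange(fs) / (fs * 0.46)) * rng.standard_normal(fs)
noise = rng.standard_normal(2 * fs - 1)  # masker, matched to the wet length

wet = auralize(dry, rir)                 # reverberant speech
masker = scale_to_snr(wet, noise, target_snr_db=-5.0)
mix = wet + masker                       # stimulus at -5 dB SNR
```

In the study itself this convolution would be applied per Ambisonics channel before decoding to the 16-loudspeaker array; the single-channel version above only shows the principle.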
File | Type | License | Size | Format | Access
---|---|---|---|---|---
000666.pdf | 2a Post-print editorial version / Version of Record | Public - All rights reserved | 669.98 kB | Adobe PDF | Open access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2986449