Design of a system to be used in clinical contexts for the reproduction of ecological audiovisual scenes aimed at hearing assistive devices users / Guastamacchia, Angela; Riente, Fabrizio; Astolfi, Arianna. - ELECTRONIC. - (2023). (Paper presented at the CeLyA Summer School "Hearing in noise", held in Lyon, France, June 12-14, 2023).

Design of a system to be used in clinical contexts for the reproduction of ecological audiovisual scenes aimed at hearing assistive devices users

Guastamacchia, Angela; Riente, Fabrizio; Astolfi, Arianna
2023

Abstract

Despite continuous advances in hearing assistive devices, these devices often remain less supportive than users need. Many users report limited benefit when using them in acoustically complex real-world conditions, such as one-to-one or multi-talker conversations in time-varying noisy environments with adverse room acoustics. One contributing factor is the inefficacy of the in-laboratory and in-field listening tests currently used to predict the real-life impact of hearing disorders during the fitting of hearing assistive devices: the former lack ecological validity and visual cues, while the latter lack repeatability of the tested conditions. To close the gap between these methods, hearing research has started exploiting virtual reality (VR) to auralize complex acoustic environments (CAEs) in which hearing assistive devices can be tested. This project proposes a system for ecological auditory tests that can be easily replicated and used in clinical contexts, through which clinicians can create customized audiovisual scenes in which speech intelligibility tests are auralized within multiple CAEs with different noise types and positions. The system is installed in a small sound-insulated room and consists mainly of a spherical array of 16 active loudspeakers synchronized with a VR headset to reproduce immersive 3D, 360° audiovisual scenes. The acoustic scenes are collected in the field through 3rd-order Ambisonics recordings of room impulse responses made with the Zylia ZM-1 microphone. The visual scenes are captured as 3D and 2D footage at different resolutions using either the 8K Insta360 Pro or the 4K Insta360 ONE X2 camera. Objective and perceptual tests on normal-hearing subjects are planned to evaluate the reproduction accuracy of the system against real environments, as well as the degree of plausibility, sense of presence, and immersion it affords. Eventually, the system will be validated with hearing-impaired subjects.
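
To make the signal flow concrete, the sketch below shows in Python how a speech-intelligibility stimulus could be auralized in one of the recorded CAEs: the dry speech signal is convolved with the 16 channels of a 3rd-order ambisonic room impulse response (3rd-order Ambisonics uses (3+1)² = 16 channels, matching the 16-loudspeaker array) and then decoded to loudspeaker feeds. This is a minimal illustration of the general Ambisonics workflow, not the authors' implementation; the file names, the placeholder identity decoder, and the normalization step are assumptions.

```python
# Minimal auralization sketch -- NOT the authors' code; file names,
# the decoder, and all parameters are illustrative assumptions.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

# Dry (anechoic) speech and a 3rd-order ambisonic RIR recorded in the field.
speech, fs = sf.read("dry_speech.wav")      # mono speech, shape (T,)
rir, fs_rir = sf.read("cae_rir_3oa.wav")    # ambisonic RIR, shape (L, 16)
assert fs == fs_rir, "speech and RIR must share one sample rate"

# Auralize: convolve the dry speech with every ambisonic RIR channel to
# place the talker inside the measured complex acoustic environment (CAE).
ambi = np.stack(
    [fftconvolve(speech, rir[:, ch]) for ch in range(rir.shape[1])], axis=1
)

# Decode the 16 ambisonic channels to 16 loudspeaker feeds. The identity
# matrix is a placeholder; a real system would use a decoder (e.g. AllRAD
# or mode matching) computed from the measured loudspeaker positions.
D = np.eye(16)                              # placeholder decoding matrix
feeds = ambi @ D.T                          # shape (T + L - 1, 16)

# Normalize and store one multichannel file with the loudspeaker signals.
sf.write("loudspeaker_feeds.wav", feeds / np.max(np.abs(feeds)), fs)
```

In practice the decoding matrix would be derived from the actual positions of the 16 loudspeakers in the spherical array, and playback would be synchronized with the 360° video shown in the VR headset.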

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2980451