Automatic Detection of Cognitive Impairment Through Facial Emotion Analysis / Bergamasco, Letizia; Lorenzo, Federica; Coletta, Anita; Olmo, Gabriella; Cermelli, Aurora; Rubino, Elisa; Rainero, Innocenzo. - In: APPLIED SCIENCES. - ISSN 2076-3417. - Electronic. - 15:16 (2025). [DOI: 10.3390/app15169103]

Automatic Detection of Cognitive Impairment Through Facial Emotion Analysis

Bergamasco, Letizia; Lorenzo, Federica; Coletta, Anita; Olmo, Gabriella; Cermelli, Aurora; Rubino, Elisa; Rainero, Innocenzo
2025

Abstract

Altered facial expressivity is frequently observed in cognitively impaired individuals, making facial emotion identification a promising tool to support the diagnostic process. We propose a novel, non-invasive approach for detecting cognitive impairment based on facial emotion analysis. We design an emotion elicitation protocol using standardized visual and auditory stimuli, and collect facial video recordings from 32 cognitively impaired and 28 healthy control subjects. To track the evolution of emotions during the experiment, we train a deep convolutional neural network on the AffectNet dataset for emotion recognition from facial images. Emotions are described using a dimensional affect model, namely the continuous dimensions of valence and arousal, rather than discrete categories, enabling a more nuanced analysis. The collected facial emotion data are used to train a classifier to distinguish cognitively impaired from healthy subjects. Our k-nearest neighbors model achieves a cross-validation accuracy of 76.7%, demonstrating the feasibility of automatic cognitive impairment detection from facial expressions. These results highlight the potential of facial expressions as early markers of cognitive impairment, which could enhance non-invasive screening methods for early diagnosis.
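
The abstract names the main components of the pipeline but not their implementation. As an illustration of the valence/arousal recognition step, the following is a minimal PyTorch sketch of a CNN that regresses the two continuous affect dimensions from a face image. The ResNet-18 backbone, the tanh output squashing, and the MSE loss are our assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch (assumptions ours, not the authors' architecture) of a
# CNN regressing continuous valence and arousal from a face crop, in the
# spirit of training on AffectNet with a dimensional affect model.
import torch
import torch.nn as nn
from torchvision import models

class ValenceArousalNet(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)  # assumed backbone choice
        # Replace the 1000-way classification head with a 2-unit regression
        # head producing [valence, arousal].
        backbone.fc = nn.Linear(backbone.fc.in_features, 2)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squash outputs to [-1, 1], a common range for valence/arousal labels.
        return torch.tanh(self.backbone(x))

model = ValenceArousalNet()
faces = torch.randn(4, 3, 224, 224)                       # batch of face crops
va = model(faces)                                         # shape (4, 2)
loss = nn.functional.mse_loss(va, torch.zeros_like(va))   # regression loss
```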
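The classification stage can be sketched in the same spirit: per-subject valence/arousal trajectories are reduced to fixed-length feature vectors and fed to a k-nearest neighbors classifier evaluated with cross-validation. The summary statistics, k = 5, and the 5-fold scheme below are illustrative assumptions; only the subject counts and the kNN/cross-validation setup come from the abstract.

```python
# Minimal sketch of the classification stage: summarize each subject's
# valence/arousal time series into a fixed-length feature vector, then
# evaluate a kNN classifier with cross-validation. Feature choices and
# hyperparameters here are assumptions, not the authors' pipeline.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def summarize_trajectory(valence: np.ndarray, arousal: np.ndarray) -> np.ndarray:
    """Collapse per-frame valence/arousal signals into summary statistics."""
    feats = []
    for signal in (valence, arousal):
        feats.extend([signal.mean(), signal.std(), signal.min(), signal.max()])
    return np.array(feats)

rng = np.random.default_rng(0)
# Stand-in data: 60 subjects (32 impaired, 28 controls), 500 frames each.
X = np.stack([
    summarize_trajectory(rng.uniform(-1, 1, 500), rng.uniform(-1, 1, 500))
    for _ in range(60)
])
y = np.array([1] * 32 + [0] * 28)  # 1 = cognitively impaired, 0 = control

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validation accuracy: {scores.mean():.1%}")
```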
Files in this record:
File: applsci-15-09103.pdf
Access: open access
Type: 2a Post-print editorial version / Version of Record
License: Creative Commons
Size: 351.4 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/3002575