Machine learning-driven Heckmatt grading in facioscapulohumeral muscular dystrophy: A novel pathway for musculoskeletal ultrasound analysis / Marzola, Francesco; van Alfen, Nens; Doorduin, Jonne; Meiburger, Kristen M. In: Clinical Neurophysiology, ISSN 1388-2457, vol. 172 (2025), pp. 61-69. DOI: 10.1016/j.clinph.2025.01.016

Machine learning-driven Heckmatt grading in facioscapulohumeral muscular dystrophy: A novel pathway for musculoskeletal ultrasound analysis

Marzola, Francesco; Meiburger, Kristen M.
2025

Abstract

Objective: This study introduces a machine learning approach to automate muscle ultrasound analysis, aiming to improve objectivity and efficiency in segmentation, classification, and Heckmatt grading.

Methods: We analyzed a dataset of 25,005 B-mode images from 290 participants (110 with facioscapulohumeral muscular dystrophy, FSHD) acquired with a single Esaote ultrasound scanner using a standardized protocol. Manual segmentation and Heckmatt grading by experienced observers served as ground truth. K-Net was used for simultaneous muscle segmentation and classification. Heckmatt scoring was approached as a texture analysis problem, using a modified scale with three classes (Normal, Uncertain, Abnormal). Radiomics features were extracted with PyRadiomics, and automatic scoring was performed with XGBoost, with explainability provided through SHAP analysis.

Results: K-Net demonstrated high accuracy in skeletal muscle classification and segmentation, with an Intersection over Union (IoU) ranging from 73.40% to 74.03% across folds. Automatic Heckmatt grading achieved an Area Under the Curve (AUC) of 0.95, 0.87, and 0.97 for the Normal, Uncertain, and Abnormal classes, respectively. SHAP analysis highlighted histogram-based features as critical for visual scoring.

Conclusion: This study proposes and validates an automatic pipeline for muscle ultrasound analysis, leveraging machine learning for segmentation, classification, and quantitative Heckmatt grading.

Significance: Automating the visual assessment of muscle ultrasound images improves the objectivity and efficiency of muscle ultrasound, supporting clinical decision-making.
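The abstract names the main building blocks of the scoring pipeline: PyRadiomics for radiomics feature extraction, XGBoost for the three-class Heckmatt score, and SHAP for explainability. Below is a minimal Python sketch of how such a pipeline can be wired together. It is not the authors' code; the file paths, extractor settings, and the synthetic feature table are illustrative assumptions.

# Minimal sketch (not the authors' implementation): radiomics feature extraction
# with PyRadiomics, three-class Heckmatt scoring with XGBoost, and SHAP
# explanations. File paths, settings, and the synthetic data are assumptions.
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from radiomics import featureextractor

# Radiomics extractor: first-order (histogram-based) and GLCM texture features.
extractor = featureextractor.RadiomicsFeatureExtractor()
extractor.disableAllFeatures()
extractor.enableFeatureClassByName('firstorder')
extractor.enableFeatureClassByName('glcm')

def image_features(image_path, mask_path):
    # Extract features for one B-mode image and its muscle mask (hypothetical paths).
    result = extractor.execute(image_path, mask_path)
    return {k: float(v) for k, v in result.items() if not k.startswith('diagnostics')}

# Synthetic stand-in for the per-image feature table and Heckmatt labels
# (0 = Normal, 1 = Uncertain, 2 = Abnormal).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.random((60, 8)), columns=[f'feat_{i}' for i in range(8)])
y = rng.integers(0, 3, size=60)

# Gradient-boosted trees for the three-class score.
model = xgb.XGBClassifier(objective='multi:softprob', eval_metric='mlogloss')
model.fit(X, y)

# SHAP values for tree ensembles show which features drive each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

In the published pipeline, the muscle masks would come from the K-Net segmentation step described above rather than from manually supplied files.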
Files in this product:
File: 2025_HeckmattML.pdf
Access: open access
Type: 2a Post-print, publisher's version / Version of Record
License: Creative Commons
Size: 4.97 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2998708