Tandem: a Confidence-based Approach for Precise Medical Image Segmentation / Monaco, Simone; Petrosino, Lorenzo; Xinaris, Christodoulos; Apiletti, Daniele. - (In press). (Paper presented at the 2024 ACM SIGKDD International Conference on Knowledge Discovery and Data Mining Workshops, held in Barcelona (ESP), August 25-29, 2024).

Tandem: a Confidence-based Approach for Precise Medical Image Segmentation

Simone Monaco; Daniele Apiletti
In press

Abstract

The advent of Healthcare 4.0 has heralded a disruptive change in medical diagnostics, with Artificial Intelligence (AI) playing a central role in improving diagnostic accuracy and treatment efficacy. This work addresses the integration of AI into medical imaging, specifically through the use of deep learning networks to analyze and segment medical images with high precision. Our study builds on the capabilities of U-Net-like models to address the challenges of segmenting clinical targets of different sizes, such as cysts and tumor masses, in medical images. We present a novel deep learning segmentation solution, Tandem (Tandem Analysis for Neural Detection and Evaluation Model), which combines a segmentation model with a classifier to produce a confidence map alongside the model's prediction. This approach aims to improve segmentation accuracy by refining predictions and providing a mechanism to assess the reliability of the model, especially when identifying smaller clinically significant targets. We evaluate our method across various imaging modalities, including 2D and 3D acquisitions. We focus on detecting and segmenting kidney cysts associated with Autosomal Dominant Polycystic Kidney Disease (ADPKD) and tumor masses. The practical effectiveness of Tandem is demonstrated by the generation of reliable confidence maps that help clinicians make informed diagnostic and treatment decisions. This study represents a significant step towards precision medicine by improving the diagnostic capabilities of AI-driven systems in medical imaging.
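The abstract describes pairing a segmentation model with a classifier so that each prediction is accompanied by a confidence map. As a minimal illustrative sketch (not the authors' implementation; the combination rule, function names, and toy values here are assumptions), one simple way to realize this idea is to scale per-pixel segmentation probabilities by a region-level classifier score, so that predictions in regions the classifier deems unreliable are down-weighted:

```python
import numpy as np

def sigmoid(x):
    """Numerically plain logistic function."""
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float)))

def tandem_confidence(seg_logits, cls_logit, threshold=0.5):
    """Hypothetical fusion of a segmentation output and a classifier score.

    seg_logits : (H, W) raw logits from a U-Net-like segmentation model.
    cls_logit  : scalar logit from a companion classifier judging whether
                 the region contains a clinically relevant target.
    Returns a binary mask and a per-pixel confidence map in [0, 1].
    """
    seg_prob = sigmoid(seg_logits)        # per-pixel foreground probability
    cls_prob = float(sigmoid(cls_logit))  # region-level reliability score
    confidence = seg_prob * cls_prob      # down-weight unreliable regions
    mask = (seg_prob > threshold).astype(np.uint8)
    return mask, confidence

# Toy example: a 2x2 logit map and a moderately confident classifier.
logits = np.array([[2.0, -1.0],
                   [0.5, -3.0]])
mask, conf = tandem_confidence(logits, cls_logit=1.5)
```

Here `mask` is the usual thresholded prediction, while `conf` lets a clinician see not just *where* the model predicts a target, but *how much* that prediction can be trusted, which is the role the abstract assigns to Tandem's confidence maps.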
Files in this item:
File: 27_tandem_a_confidence_based_appr.pdf

Open access

Type: 2. Post-print / Author's Accepted Manuscript
License: PUBLIC - All rights reserved
Size: 3.26 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2991387