
Segmentation and Multi-Timepoint Tracking of 3D Cancer Organoids from Optical Coherence Tomography Images Using Deep Neural Networks / Branciforti, Francesco; Salvi, Massimo; D'Agostino, Filippo; Marzola, Francesco; Cornacchia, Sara; De Titta, Maria Olimpia; Mastronuzzi, Girolamo; Meloni, Isotta; Moschetta, Miriam; Porciani, Niccolò; Sciscenti, Fabrizio; Spertini, Alessandro; Spilla, Andrea; Zagaria, Ilenia; Deloria, Abigail J.; Deng, Shiyu; Haindl, Richard; Szakacs, Gergely; Csiszar, Agnes; Liu, Mengyang; Drexler, Wolfgang; Molinari, Filippo; Meiburger, Kristen M. - In: DIAGNOSTICS - ISSN 2075-4418 - 14:12 (2024). DOI: 10.3390/diagnostics14121217

Segmentation and Multi-Timepoint Tracking of 3D Cancer Organoids from Optical Coherence Tomography Images Using Deep Neural Networks

Branciforti, Francesco; Salvi, Massimo; D'Agostino, Filippo; Marzola, Francesco; Mastronuzzi, Girolamo; Moschetta, Miriam; Sciscenti, Fabrizio; Molinari, Filippo; Meiburger, Kristen M.
2024

Abstract

Recent years have ushered in a transformative era in in vitro modeling with the advent of organoids, three-dimensional structures derived from stem cells or patient tumor cells. Still, fully harnessing the potential of organoids requires advanced imaging technologies and analytical tools to quantitatively monitor organoid growth. Optical coherence tomography (OCT) is a promising imaging modality for organoid analysis due to its high-resolution, label-free, non-destructive, and real-time 3D imaging capabilities, but accurately identifying and quantifying organoids in OCT images remains challenging due to various factors. Here, we propose an automatic deep learning-based pipeline with convolutional neural networks that synergistically combines optimized preprocessing steps, a state-of-the-art deep learning model, and ad hoc postprocessing methods, showcasing good generalizability and tracking capabilities over an extended period of 13 days. The proposed tracking algorithm thoroughly documents organoid evolution, using reference volumes, a dual-branch analysis, key attribute evaluation, and probability scoring for match identification. This comprehensive approach enables the accurate tracking of organoid growth and morphological changes over time, advancing organoid analysis and serving as a solid foundation for future studies on organoid-based drug screening and tumor drug sensitivity detection.
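The abstract mentions key attribute evaluation and probability scoring for match identification without giving implementation details. The sketch below is a minimal, hypothetical illustration of that idea: each segmented organoid is reduced to a few descriptors (here, assumed to be volume, centroid, and sphericity), descriptors are compared between a reference timepoint and the current one, and the similarities are combined into a score in [0, 1]. All attribute names, weights, and thresholds are illustrative assumptions and are not taken from the published pipeline.

```python
# Hypothetical sketch of attribute-based organoid matching across timepoints.
# Attribute names, weights, and thresholds are illustrative assumptions,
# not the implementation described in the paper.
from dataclasses import dataclass
import math

@dataclass
class Organoid:
    label: int              # segmentation label at a given timepoint
    volume: float           # e.g., voxel count or mm^3
    centroid: tuple         # (z, y, x) position in voxels
    sphericity: float       # shape descriptor in [0, 1]

def match_probability(a: Organoid, b: Organoid,
                      max_shift: float = 50.0,
                      w_dist: float = 0.5, w_vol: float = 0.3,
                      w_shape: float = 0.2) -> float:
    """Combine normalized attribute similarities into a single score in [0, 1]."""
    dist = math.dist(a.centroid, b.centroid)
    s_dist = max(0.0, 1.0 - dist / max_shift)                  # closer centroids -> higher score
    s_vol = min(a.volume, b.volume) / max(a.volume, b.volume)  # volume ratio, tolerant of growth
    s_shape = 1.0 - abs(a.sphericity - b.sphericity)           # similar shape -> higher score
    return w_dist * s_dist + w_vol * s_vol + w_shape * s_shape

def match_timepoints(reference, current, threshold: float = 0.6):
    """Greedily pair each reference organoid with its best-scoring candidate above threshold."""
    pairs, used = [], set()
    for ref in reference:
        scored = sorted(((match_probability(ref, cur), cur) for cur in current
                         if cur.label not in used), key=lambda t: t[0], reverse=True)
        if scored and scored[0][0] >= threshold:
            best_score, best = scored[0]
            used.add(best.label)
            pairs.append((ref.label, best.label, best_score))
    return pairs

# Example: one organoid tracked between two consecutive timepoints (synthetic values)
day0 = [Organoid(1, 1200.0, (40, 100, 110), 0.82)]
day1 = [Organoid(7, 1350.0, (42, 103, 108), 0.80)]
print(match_timepoints(day0, day1))   # -> [(1, 7, score)]
```

A greedy highest-score assignment with a minimum-score threshold is one simple way to leave newly appeared or disappeared organoids unmatched; the published pipeline may use a different matching strategy.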


Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2990995