
Indocyanine Green Drives Computer Vision Based 3D Augmented Reality Robot Assisted Partial Nephrectomy: The Beginning of “Automatic” Overlapping Era / Amparore, D.; Checcucci, E.; Piazzolla, P.; Piramide, F.; De Cillis, S.; Piana, A.; Verri, P.; Manfredi, M.; Fiori, C.; Vezzetti, E.; Porpiglia, F.. - In: UROLOGY. - ISSN 0090-4295. - 164:(2022), pp. 312-316. [10.1016/j.urology.2021.10.053]

Indocyanine Green Drives Computer Vision Based 3D Augmented Reality Robot Assisted Partial Nephrectomy: The Beginning of “Automatic” Overlapping Era

Vezzetti E.;
2022

Abstract

Augmented reality robot-assisted partial nephrectomy (AR-RAPN) is limited by the need for constant manual overlapping of the hyper-accuracy 3D (HA3D) virtual models onto the real anatomy. We present our preliminary experience with automatic 3D virtual model overlapping during AR-RAPN. To achieve fully automated HA3D model overlapping, we pursued computer vision strategies based on the identification of landmarks to which the virtual model is anchored. Because of the limited field of view during RAPN, we used the whole kidney as the marker. Moreover, to overcome the similarity in color between the kidney and its neighboring structures, we super-enhanced the organ using the NIRF Firefly fluorescence imaging technology. Specifically developed software named “IGNITE” (Indocyanine GreeN automatIc augmenTed rEality) automatically anchored the HA3D model to the real organ, leveraging the enhanced view offered by the NIRF technology. Ten automatic AR-RAPNs were performed. For every patient, an HA3D model was produced and visualized as an AR image inside the robotic console. During all the surgical procedures, the automatic ICG-guided AR technology successfully anchored the virtual model to the real organ without hand assistance (mean anchorage time: 7 seconds), even when the camera was moved throughout the operative field and the organ was zoomed or translated. In 7 patients with totally endophytic or posterior lesions, the renal masses were correctly identified with the automatic AR technology, allowing a successful enucleoresection. No intraoperative or postoperative Clavien >2 complications or positive surgical margins were recorded. Our pilot study provides the first demonstration of computer vision technology applied to AR procedures, with software that automatically establishes visual concordance between the 3D model and the in vivo anatomy. Its current limitations, related to kidney deformation during surgery altering the automatic anchorage, will be overcome by implementing organ recognition with deep learning algorithms.
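The abstract does not detail the internals of the IGNITE software, so the sketch below is only a conceptual illustration of the general idea it describes: because the ICG/NIRF-enhanced kidney appears as a bright green region against its surroundings, it can serve as a marker for automatically anchoring the virtual model. All function names, threshold values, and the simple 2D anchoring step are hypothetical assumptions for illustration; this is not the authors' implementation.

```python
# Minimal sketch, assuming an endoscopic BGR frame in which the ICG-enhanced
# kidney shows up as a saturated green region (hypothetical thresholds).
import cv2
import numpy as np

def segment_fluorescent_kidney(frame_bgr, hsv_low=(40, 80, 80), hsv_high=(85, 255, 255)):
    """Return a binary mask of the fluorescence-enhanced (green) kidney region."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array(hsv_low, dtype=np.uint8),
                       np.array(hsv_high, dtype=np.uint8))
    # Remove small speckles and close holes left by specular highlights.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def estimate_anchor(mask):
    """Estimate a 2D anchor (center, size, angle) from the largest fluorescent blob."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    kidney = max(contours, key=cv2.contourArea)
    (cx, cy), (w, h), angle = cv2.minAreaRect(kidney)
    return {"center": (cx, cy), "size": (w, h), "angle": angle}

def overlay_model(frame_bgr, model_render, anchor, alpha=0.4):
    """Blend a pre-rendered 2D view of the 3D model (BGR, uint8) onto the anchor.

    Note: a real system would estimate a full 3D pose; this 2D similarity
    transform is only meant to show where the anchor information is used.
    """
    h_f, w_f = frame_bgr.shape[:2]
    target_w = max(int(anchor["size"][0]), 1)
    target_h = max(int(anchor["size"][1]), 1)
    render = cv2.resize(model_render, (target_w, target_h))
    M = cv2.getRotationMatrix2D((target_w / 2, target_h / 2), anchor["angle"], 1.0)
    M[0, 2] += anchor["center"][0] - target_w / 2
    M[1, 2] += anchor["center"][1] - target_h / 2
    warped = cv2.warpAffine(render, M, (w_f, h_f))
    return cv2.addWeighted(frame_bgr, 1.0, warped, alpha, 0)
```

Run per frame, the anchor would be re-estimated continuously, which is consistent with the abstract's report that the overlay followed camera movement, zooming, and translation of the organ; handling kidney deformation, as the authors note, would require learned organ recognition rather than color thresholding alone.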
Files in this record:
File: 1-s2.0-S0090429522000292-main.pdf (not available for download)
Type: 2a Post-print editorial version / Version of Record
License: Non-public - Private/restricted access
Size: 122.97 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2970246