In the current era of nephron-sparing "precision" surgery, hyper-accuracy 3D models (HA3D™) have proven useful thanks to their spatial visualization of the surgical anatomy.
In recent years, a further step has been the advent of Augmented Reality (AR) technology, in which the superimposition of 3D virtual images onto the endoscopic view enables intraoperative surgical navigation.
In this study, with the aim of automating the co-registration of 3D virtual and endoscopic images, we developed dedicated software based on computer vision algorithms.
This is a feasibility study of our new dedicated software, named "IGNITE" (Indocyanine GreeN automatIc augmenTed rEality), which automatically anchors the HA3D™ model to the endoscopic view of the real organ. For this project we applied computer vision algorithms, which require clear landmarks to correctly identify the target organ. To overcome the color similarity within the operative field and to give the software a more distinguishable target, we used indocyanine green: in this way the kidney appears as a bright green organ surrounded by a dark operative field.
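The internals of the IGNITE software are not disclosed here; a minimal illustrative sketch of the underlying idea (isolating the bright green fluorescent pixels and locating the organ so the 3D model can be anchored to it) might look like the following, with entirely hypothetical threshold values:

```python
# Illustrative sketch only: after indocyanine green (ICG) injection the kidney
# fluoresces bright green against a dark operative field, so a simple color
# threshold can segment it and yield an anchor point for overlaying the 3D model.

def green_mask(pixel, g_min=120, dominance=1.5):
    """True if a pixel is 'bright green' (hypothetical thresholds)."""
    r, g, b = pixel
    return g >= g_min and g > dominance * max(r, b, 1)

def organ_centroid(image):
    """Centroid (row, col) of green pixels, or None if no organ is visible.

    `image` is a nested list of (r, g, b) tuples, one per pixel.
    """
    total = sum_r = sum_c = 0
    for r_idx, row in enumerate(image):
        for c_idx, px in enumerate(row):
            if green_mask(px):
                total += 1
                sum_r += r_idx
                sum_c += c_idx
    if total == 0:
        return None
    return (sum_r / total, sum_c / total)

# Example: a tiny synthetic 2x2 frame with two fluorescent pixels.
frame = [[(10, 10, 10), (20, 200, 30)],
         [(15, 210, 25), (10, 10, 10)]]
anchor = organ_centroid(frame)  # midpoint of the two green pixels
```

A real pipeline would run such a segmentation per video frame and feed the detected region to a tracking/registration step; this sketch only conveys why a uniformly green organ on a dark background is an easy target for automatic detection.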
We enrolled patients with a renal mass scheduled for AR robot-assisted partial nephrectomy (RAPN). All patients underwent four-phase contrast-enhanced CT in order to create the HA3D™ models, visualized as AR images inside the robotic console. Perioperative and postoperative data were collected and analyzed.
Ten cases were enrolled in this pilot experience. Mean lesion size was 46.6 (±16.3) mm. Median PADUA score was 9 (IQR 8-10). Mean operative and ischemia times were 88.9 (±42.7) and 20.5 (±3.5) min, respectively.
In all cases (even in completely endophytic or posterior tumors) the automatic tracking was successful without manual assistance from the operator, allowing enucleoresection of the renal mass without damage to the tumor's pseudocapsule and without positive surgical margins. Moreover, no intra- or postoperative complications (grade >2 according to Clavien-Dindo) were recorded.
The present findings suggest that this new evolution of our AR platform, based on computer vision algorithms, enables effective automatic AR RAPN.
The automatic overlay of the 3D model led to correct identification of the tumor, even in endophytic and posterior cases, without risk of complications or positive surgical margins.