
Lee, Kong, Kim, Yi, Chai, Lee, and Kim: Preliminary study on application of augmented reality visualization in robotic thyroid surgery

Abstract

Purpose

The increasing use of robotic surgery has been accompanied by increasing reports of complications, largely attributable to the limited operative view and the lack of tactile sense. These obstacles, which seldom occur in open surgery, are particularly challenging for beginner surgeons. To enhance the safety of robotic surgery, we created an augmented reality (AR) model of the organs surrounding the thyroid gland and tested the applicability of the AR model in robotic thyroidectomy.

Methods

We created AR images of the thyroid gland, common carotid arteries, trachea, and esophagus using preoperative CT images of a thyroid carcinoma patient. For a preliminary test, we overlaid the AR images on a 3-dimensional printed model at five different angles and evaluated registration accuracy using the Dice similarity coefficient. We then overlaid the AR images on the real-time operative images during robotic thyroidectomy.

Results

The Dice similarity coefficients at the five angles ranged from 0.984 to 0.9908, with a mean of 0.987. During the entire process of robotic thyroidectomy, the AR images were successfully overlaid on the real-time operative images using manual registration.

Conclusion

We successfully demonstrated the use of AR on the operative field during robotic thyroidectomy. Although there are currently limitations, the use of AR in robotic surgery will become more practical as the technology advances and may contribute to the enhancement of surgical safety.

INTRODUCTION

In 2000, the da Vinci Surgical System (Intuitive Surgical Inc., Sunnyvale, CA, USA) was approved by the U.S. Food and Drug Administration (FDA) for general laparoscopic surgery. Compared to endoscopic remote-access surgery, robotic surgery offers 3-dimensional (3D) imaging, sophisticated EndoWrist movement, tremor reduction, and excellent ergonomics. Following its adoption in 2007, robotic thyroidectomy began to replace endoscopic remote-access approaches [1,2,3,4,5].
Despite its advantages, robotic thyroidectomy is challenging for inexperienced surgeons for several reasons, including high cost, limited training opportunities, and, most importantly, safety issues. During robotic thyroidectomy, surgeons cannot use palpation. Moreover, identifying neighboring structures covered by tissue is often difficult, and failure to do so can cause severe injury. Serious complications associated with robotic thyroidectomy have been reported, including brachial plexus injury, tracheal perforation, and flap perforation [6,7,8,9]. Owing to these safety concerns, the FDA revoked the use of the robot in thyroid surgery in 2011 [10].
Within the medical field, augmented reality (AR) is an image-guided technology in which computer-generated images are superimposed onto the live video feed of the surgical view to provide a composite view in real time. Incorporating AR into laparoscopic or robotic surgery helps the surgeon perceive anatomical structures more intuitively and identify structures covered or hidden by surrounding tissue. Although there have been attempts to introduce AR into laparoscopic surgery, the manufacturers of robotic systems have not yet released an AR-integrated robotic system, so AR is not readily available at present.
In this study, we created AR images using preoperative CT images and developed a program to control the AR images. We then superimposed the generated AR images onto the real-time operative view during robotic thyroidectomy.

METHODS

Patient information

The patient was a 45-year-old female diagnosed with papillary thyroid carcinoma. We performed a preoperative CT scan (IQon Spectral CT, Philips Healthcare, Best, the Netherlands), and the patient underwent left thyroid lobectomy via a bilateral axillo-breast approach with the da Vinci Si system (Intuitive Surgical Inc., Sunnyvale, CA, USA) on August 7, 2017. This study was approved by the Institutional Review Board of Seoul National University (approval number: H-1710-022-889).

3D volume segmentation of the organs

We created a 3D computer-aided design (CAD) model of the target organs (thyroid glands, common carotid arteries, trachea, and esophagus) from the subject's CT DICOM files using the open-source software Seg3D (v.2.2.1, NIH Center for Integrative Biomedical Computing at the University of Utah Scientific Computing and Imaging Institute, Salt Lake City, UT, USA). The functions of this software include visualization, segmentation, 3D reconstruction, and quantification of DICOM data. For 3D volume segmentation, we applied thresholding based on the Hounsfield units (HU) of the organs. The threshold values were 105–452 HU for the thyroid glands and common carotid arteries, −2,676 to −242 HU for the trachea, and −590 to 405 HU for the esophagus. The software segmented the target organs from each CT section to create 3D reconstructed images. Fig. 1 shows the overall procedure for constructing the 3D models, including segmentation, 3D reconstruction, and creation of the 3D-printed and 3D CAD models.
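The thresholding step itself is conceptually simple. The following Python sketch is given for illustration only (the segmentation in this study was performed interactively in Seg3D): it builds binary organ masks from a CT volume assumed to be already loaded as a NumPy array of HU values, and the variable and dictionary names are our own.

```python
import numpy as np

# HU thresholds taken from the text; `ct_hu` is assumed to be a 3D NumPy
# array of Hounsfield units with shape (slices, rows, columns).
THRESHOLDS_HU = {
    "thyroid_and_carotid": (105, 452),
    "trachea": (-2676, -242),
    "esophagus": (-590, 405),
}

def segment_by_hu(ct_hu: np.ndarray, low: float, high: float) -> np.ndarray:
    """Return a binary mask of voxels whose HU value falls within [low, high]."""
    return (ct_hu >= low) & (ct_hu <= high)

# Example: one mask per target organ group.
# masks = {name: segment_by_hu(ct_hu, lo, hi) for name, (lo, hi) in THRESHOLDS_HU.items()}
# A surface mesh for AR or 3D printing could then be extracted from each mask,
# e.g., with skimage.measure.marching_cubes(mask.astype(float), level=0.5).
```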

Graphic user interface for AR

We converted the 3D reconstruction result into the Surface Tessellation Language (STL) file format, the standard for triangulated representation of a 3D CAD model. We developed image registration software using MATLAB (R2017a, MathWorks Inc., Natick, MA, USA) to overlay the 3D CAD model on the intraoperative image. The functions of the image registration software included zoom-in, zoom-out, translation, roll rotation, pitch rotation, yaw rotation, and color/transparency adjustment.
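The manual controls listed above amount to applying a uniform scale, a rigid rotation, and a translation to the STL vertices before rendering them over the video frame. The Python sketch below illustrates that transformation under our own naming conventions; it is not the MATLAB implementation used in this study.

```python
import numpy as np

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Compose R = Rz(yaw) @ Ry(pitch) @ Rx(roll); angles are in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def transform_vertices(vertices: np.ndarray, scale: float,
                       roll: float, pitch: float, yaw: float,
                       translation) -> np.ndarray:
    """Apply zoom (uniform scale), rotation, and translation to Nx3 STL vertices."""
    r = rotation_matrix(roll, pitch, yaw)
    return scale * (vertices @ r.T) + np.asarray(translation)
```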

Validation of the image registration software

To validate the image registration software, we made a 3D-printed model of the target organs with different color filaments using an FDM 3D printer (Cubicon, Seongnam, Korea). The image registration applied to the 3D-printed model is shown in Fig. 2. Registration performance, that is, the similarity between the 3D CAD model and the 3D-printed model, was quantitatively assessed with the Dice similarity coefficient (DSC) on the 2-dimensional camera view. The DSC is calculated as 2 × |A ∩ B| / (|A| + |B|), where A and B are the two objects being compared. The DSC was evaluated at five different viewing angles (30°, 45°, 60°, 75°, and 90°).
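For reference, the DSC can be computed directly from two binary masks, for example the silhouette of the rendered CAD model and the silhouette of the printed model in the camera image. The following Python sketch is illustrative only; the masks and how they are extracted are assumptions, not part of this study's software.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|) for two binary masks of equal shape."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom
```

A value of 1 indicates complete overlap of the two silhouettes, and 0 indicates no overlap.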

Application of AR image onto real-time operative image

To enable the operative field to be visualized simultaneously on the laptop monitor and the da Vinci monitor, we connected the monitor cable from the robotic system to a laptop computer. Using the image registration software, we overlaid the 3D CAD model on the surgical image displayed on the laptop monitor during the surgery. The model was controlled and overlaid by an assistant.
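Conceptually, each captured frame from the robot's video output is alpha-blended with a rendering of the CAD model. The Python/OpenCV sketch below shows this blending step under assumed inputs (a frame grabber exposed as a standard capture device and a pre-rendered model image of the same size); it is not the software used in this study.

```python
import cv2
import numpy as np

def overlay_model(frame: np.ndarray, rendering: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend a rendered CAD-model image onto a captured video frame.

    Both inputs are HxWx3 BGR images of the same size; pure-black pixels of
    `rendering` are treated as transparent background.
    """
    mask = np.any(rendering > 0, axis=2, keepdims=True)
    blended = cv2.addWeighted(frame, 1.0 - alpha, rendering, alpha, 0.0)
    return np.where(mask, blended, frame)

# Usage sketch (the frame grabber's device index is an assumption):
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# composite = overlay_model(frame, rendering)
```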

RESULTS

Validation of the image registration software

As shown in Table 1, the mean DSC was 0.987 ± 0.003, ranging from 0.984 to 0.9908 across the five viewing angles.

Application on the real-time operative images

Fig. 3 shows the 3D CAD model overlaid on the surgical images during the operation, along with the model's position and degree of translation and rotation. During the initial stage of surgery, the model was overlaid on the muscles to show the approximate location of the organs (Fig. 3B). After the trachea was exposed, the model was overlaid on the surgical field more accurately (Fig. 3D). During surgery, the model was displayed intermittently on the laptop monitor at the surgeon's request, and the surgeon performed the procedures while referencing it. After the left thyroid lobectomy, we evaluated the integrity of the adjacent structures using the display on/off function (Fig. 3F).

DISCUSSION

Because AR integrates computer graphics and real images into a single unified view, it has the potential to overcome the obstacles caused by a limited operative view. A number of trials have applied AR in laparoscopic surgery [11,12,13,14]. However, several technical barriers to AR remain in laparoscopic and robotic surgery.
First, recognition of anatomical landmarks, essential for overlaying AR images on the target organs, is often difficult. This difficulty is more pronounced during surgery on intraperitoneal organs because the target organs move continuously with respiration or changes in patient position. Furthermore, applying AR to deformable organs, such as the liver or pancreas, is challenging because the overlaid AR image is likely to become separated from the operative image when deformation occurs. To avoid such difficulties, nondeformable organs such as the kidney or adrenal gland have been used as target organs in AR registration [15]. Alternatively, navigation aids have been used to guide manual registration [16]. For deformable organs, AR is of limited applicability unless the registration can respond in real time to changes in the organs' shape. Several studies have suggested adaptable registration methods for deformable organs; one used the liver edges as landmarks in an automated delineation method to overlay an AR image on the liver during laparoscopic liver surgery [17]. Even so, the application of AR remains limited in clinical settings because it significantly increases surgical time and registration is still inaccurate.
To our knowledge, this is the first study to apply AR in robotic thyroidectomy. We selected thyroidectomy to test AR in robotic surgery because of its popularity in Korea and its technical suitability. Robotic thyroidectomy is one of the most commonly performed robotic procedures owing to its cosmetic advantages over conventional open thyroidectomy [5]. It was also technically straightforward to apply AR to this procedure: although the thyroid gland is deformable, the operation required total resection of a thyroid lobe without partial preservation, so there was no need to trace the gland itself. In addition, the neighboring organs (common carotid arteries, trachea, and esophagus), which often lie beyond the operative view and must be protected during surgery, are nondeformable and easy to visualize in AR.
Before applying the AR images to the real-time surgical image, we superimposed them on the 3D-printed model and quantitatively evaluated the accuracy of manual registration using the DSC. The DSC value was satisfactory compared to a previous study [18]. We then overlaid the AR images on the real-time surgical view and recorded the AR image location that yielded accurate registration. This location information may be useful for developing automatic or semiautomatic registration algorithms.
There are several limitations to this study. First, the AR images were overlaid manually. Manual registration is advantageous because, compared to automatic registration, it is easy to apply and can readily be certified as a clinical product. However, automatic registration is more attractive because it is convenient and reduces surgical time. In laparoscopic surgery, automatic registration methods that recognize landmarks using point-based, surface-based, or volume-based approaches are evolving rapidly. Thus, by recognizing easily identifiable landmarks such as the trachea or thyroid cartilage, automatic registration may become possible in robotic thyroidectomy.
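As a sketch of the point-based idea mentioned above, the Kabsch algorithm computes the rigid rotation and translation that best align corresponding landmark points, for example points marked on the trachea in the CT-derived model and in the camera view. The following Python code is an illustrative implementation of that general technique, not part of the present study's software, and the landmark correspondences are assumed to be given.

```python
import numpy as np

def rigid_register(source: np.ndarray, target: np.ndarray):
    """Kabsch algorithm: find rotation R and translation t minimizing
    sum_i ||R @ source_i + t - target_i||^2 for corresponding Nx3 landmark sets."""
    src_centered = source - source.mean(axis=0)
    tgt_centered = target - target.mean(axis=0)
    h = src_centered.T @ tgt_centered          # 3x3 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = target.mean(axis=0) - r @ source.mean(axis=0)
    return r, t
```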
Second, we used a separate monitor to operate the AR, and the surgeon had to pause the surgery to view the AR image. This was unavoidable because integration of AR into the robotic console display is currently impossible owing to licensing issues. Because several companies are developing robotic surgical systems, such integration will most likely become possible in the future.
Lastly, the current setup lacks a tracking system. We were not able to update the AR image with camera motion and had to reset the image registration whenever the operative angle changed. Integrated movement of the AR image may be achieved with an optical tracking system or by tracking anatomical landmarks [19]. At present, however, such tracking technologies are impractical because the equipment is costly and requires extensive calibration, itself a main cause of inaccurate registration [20].
In conclusion, organs that should be protected during robotic thyroidectomy were clearly visualized using AR based on preoperative CT images. Although current AR in robotic thyroidectomy has limited clinical applicability due to technical and commercial limitations, AR in robotic surgery has several advantages that may enhance surgical safety.

Figures and Tables

Fig. 1

Overall procedure for constructing the 3-dimensional (3D) models: segmentation, 3D reconstruction, and creation of the 3D-printed model and 3D computer-aided design (CAD) model.

Fig. 2

3-Dimensional (3D) printed model and 3D computer-aided design model used to evaluate registration performance with the Dice similarity coefficient at different angles. A rectangular coordinate system is displayed at the center (the x-, y-, and z-axes are shown in red, blue, and green, respectively). (A) 3D printed model at 60°, (B) 3D printed model at 90°, (C) image registration at 60°, and (D) image registration at 90°.

Fig. 3

3-Dimensional computer-aided design model registration on the surgical image during robotic thyroidectomy. (A) Before registration, (B) registration on the muscles, (C) tracheal exposure, (D) registration on the trachea, (E) left thyroid lobectomy, and (F) registration on the trachea with the augmented reality image of the left thyroid turned off.

Table 1

Dice similarity coefficient results for the 3-dimensional (3D) printed model and the 3D computer-aided design model at 5 different angles


ACKNOWLEDGEMENTS

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (Ministry of Science, ICT & Future Planning, NRF-2016R1E1A1A01942072).

Notes

CONFLICTS OF INTEREST No potential conflict of interest relevant to this article was reported.

References

1. Kang SW, Jeong JJ, Nam KH, Chang HS, Chung WY, Park CS. Robot-assisted endoscopic thyroidectomy for thyroid malignancies using a gasless transaxillary approach. J Am Coll Surg. 2009; 209:e1–e7.
2. Terris DJ, Singer MC, Seybt MW. Robotic facelift thyroidectomy: II. Clinical feasibility and safety. Laryngoscope. 2011; 121:1636–1641.
3. Kim HY, Chai YJ, Dionigi G, Anuwong A, Richmon JD. Transoral robotic thyroidectomy: lessons learned from an initial consecutive series of 24 patients. Surg Endosc. 2018; 32:688–694.
4. Chai YJ, Suh H, Woo JW, Yu HW, Song RY, Kwon H, et al. Surgical safety and oncological completeness of robotic thyroidectomy for thyroid carcinoma larger than 2 cm. Surg Endosc. 2017; 31:1235–1240.
5. Chai YJ, Song J, Kang J, Woo JW, Song RY, Kwon H, et al. A comparative study of postoperative pain for open thyroidectomy versus bilateral axillo-breast approach robotic thyroidectomy using a self-reporting application for iPad. Ann Surg Treat Res. 2016; 90:239–245.
6. Landry CS, Grubbs EG, Warneke CL, Ormond M, Chua C, Lee JE, et al. Robot-assisted transaxillary thyroid surgery in the United States: is it comparable to open thyroid lobectomy? Ann Surg Oncol. 2012; 19:1269–1274.
7. Kandil EH, Noureldine SI, Yao L, Slakey DP. Robotic transaxillary thyroidectomy: an examination of the first one hundred cases. J Am Coll Surg. 2012; 214:558–564.
8. Ban EJ, Yoo JY, Kim WW, Son HY, Park S, Lee SH, et al. Surgical complications after robotic thyroidectomy for thyroid carcinoma: a single center experience with 3,000 patients. Surg Endosc. 2014; 28:2555–2563.
9. Lee S. Robotic thyroidectomy: pros and cons of various surgical approaches. Korean J Endocr Surg. 2015; 15:73–78.
10. Inabnet WB 3rd. Robotic thyroidectomy: must we drive a luxury sedan to arrive at our destination safely? Thyroid. 2012; 22:988–990.
11. Hansen C, Wieferich J, Ritter F, Rieder C, Peitgen HO. Illustrative visualization of 3D planning models for augmented reality in liver surgery. Int J Comput Assist Radiol Surg. 2010; 5:133–141.
12. Simpfendorfer T, Baumhauer M, Müller M, Gutt CN, Meinzer HP, Rassweiler JJ, et al. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011; 25:1841–1845.
13. Haouchine N, Dequidt J, Berger MO, Cotin S. Deformation-based augmented reality for hepatic surgery. Stud Health Technol Inform. 2013; 184:182–188.
14. Okamoto T, Onda S, Yanaga K, Suzuki N, Hattori A. Clinical application of navigation surgery using augmented reality in the abdominal field. Surg Today. 2015; 45:397–406.
15. Baumhauer M, Neuhaus J, Fritzsche K, Meinzer HP. The MITK image guided therapy toolkit and its exemplary application for augmented reality guided prostate surgery. In: Dössel O, Schlegel WC, editors. Proceedings of the World Congress on Medical Physics and Biomedical Engineering; 2009 Sep 7-12; Munich, Germany. IFMBE Proceedings, vol. 25/6.
16. Vemuri AS, Wu JC, Liu KC, Wu HS. Deformable three-dimensional model architecture for interactive augmented reality in minimally invasive surgery. Surg Endosc. 2012; 26:3655–3662.
17. Soler L, Nicolau S, Pessaux P, Mutter D, Marescaux J. Real-time 3D image reconstruction guidance in liver resection surgery. Hepatobiliary Surg Nutr. 2014; 3:73–81.
18. Bodenstedt S, Goertler J, Wagner M, Kenngott H, Müller-Stich B, Dillmann R, et al. Superpixel-based structure classification for laparoscopic surgery. In: Medical Imaging 2016: Image-Guided Procedures, Robotic Interventions, and Modeling; 2016 Feb 28-Mar 1; San Diego, CA.
19. Bernhardt S, Nicolau SA, Soler L, Doignon C. The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal. 2017; 37:66–90.
20. Cheng A, Kang JU, Taylor RH, Boctor EM. Direct three-dimensional ultrasound-to-video registration using photoacoustic markers. J Biomed Opt. 2013; 18:066013.
ORCID iDs

Young Jun Chai
https://orcid.org/0000-0001-8830-3433
