
Image-based 3D surface approximation of the bladder using structure-from-motion for enhanced cystoscopy based on phantom data

Quentin Péntek, Simon Hein, Arkadiusz Miernik and Alexander Reiterer

Abstract

Bladder cancer is likely to recur after resection. For this reason, bladder cancer survivors often undergo follow-up cystoscopy for years after treatment to look for recurrence. 3D modeling of the bladder could provide more reliable cystoscopic documentation by giving an overall picture of the organ and of tumor positions. However, 3D reconstruction of the urinary bladder from endoscopic images is challenging because of the endoscope's small field of view, considerable image distortion, and occlusion by urea, blood or particles. In this paper, we demonstrate a method for converting uncalibrated, monocular endoscopic videos of the bladder into a 3D model using structure-from-motion (SfM). First, frames are extracted from the video sequences. Distortions are then corrected using a calibration procedure. Finally, the 3D reconstruction algorithm generates a sparse surface approximation of the bladder lining from the corrected frames. The method was tested on an endoscopic video of a phantom that mimics the rich structure of the bladder. The reconstructed 3D model covered a large part of the object, with an average reprojection error of 1.15 pixels and a relative accuracy of 99.4%.
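To make the pipeline outlined in the abstract concrete, the following is a minimal sketch of a frame-extraction, undistortion and sparse two-view reconstruction step using OpenCV. It is not the authors' implementation (the abstract does not specify their toolchain); the helper names extract_frames, undistort and two_view_reconstruction are hypothetical, and the intrinsic matrix K and distortion coefficients are assumed to come from a separate chessboard-style calibration.

```python
# Illustrative sketch only, not the authors' implementation.
import cv2
import numpy as np


def extract_frames(video_path, step=10):
    """Sample every `step`-th frame from the endoscopic video."""
    cap = cv2.VideoCapture(video_path)
    frames, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            frames.append(frame)
        idx += 1
    cap.release()
    return frames


def undistort(frames, K, dist_coeffs):
    """Correct lens distortion with the calibrated camera model."""
    return [cv2.undistort(f, K, dist_coeffs) for f in frames]


def two_view_reconstruction(img1, img2, K):
    """Sparse 3D points from two undistorted frames (SIFT + RANSAC)."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY), None)
    kp2, des2 = sift.detectAndCompute(cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY), None)

    # Ratio-test matching of SIFT descriptors
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Relative pose via the essential matrix, with RANSAC outlier rejection
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    inl = mask.ravel().astype(bool)
    pts1, pts2 = pts1[inl], pts2[inl]

    # Triangulate a sparse surface approximation
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    pts3d = (pts4d[:3] / pts4d[3]).T

    # Mean reprojection error in the second view
    # (the paper reports an average of 1.15 px on the phantom)
    proj, _ = cv2.projectPoints(pts3d, cv2.Rodrigues(R)[0], t, K, np.zeros(5))
    err = np.linalg.norm(proj.reshape(-1, 2) - pts2, axis=1).mean()
    return pts3d, err
```

A full SfM pipeline would chain many such views, register them incrementally and refine all poses and points with bundle adjustment; the two-view case above is only meant to illustrate the nature of the sparse output and the reprojection-error metric quoted in the abstract.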

Author Statement

Research funding: Authors state no funding involved.

Conflict of interest: Authors state no conflict of interest.

Informed consent: Informed consent is not applicable.

Ethical approval: The conducted research is not related to either human or animal use.


Received: 2016-09-14
Accepted: 2017-05-16
Published Online: 2017-12-04
Published in Print: 2018-07-26

©2018 Walter de Gruyter GmbH, Berlin/Boston