
COMPASS: localization in laparoscopic visceral surgery

Regine Hartwig, Daniel Ostler, Hubertus Feußner, Maximilian Berlet, Kevin Yu, Jean-Claude Rosenthal and Dirk Wilhelm

Abstract

Tracking of surgical instruments is an essential step towards the modernization of the surgical workflow by a comprehensive surgical landscape guidance system (COMPASS). Real-time tracking of the laparoscopic camera used in minimally invasive surgery is required for applications in surgical workflow documentation, machine learning, image-based localization, and intra-operative visualization. In our approach, an inertial measurement unit (IMU) assists the tool tracking in situations when no line-of-sight is available for infrared (IR) based tracking of the laparoscopic camera. The novelty of this approach lies in the localization method adjusted to laparoscopic visceral surgery, particularly when the line-of-sight is lost. It is based on IMU tracking and on the position of the trocar entry point, which acts as a remote center of motion (RCM) and thereby reduces the degrees of freedom. We developed a method to tackle localization and a real-time tool for position and orientation estimation. The main error sources are identified and evaluated in a test scenario, which reveals that for small changes in penetration length (e.g., pivoting), the IMU's accuracy determines the error.

Introduction

Tracking of laparoscopic instruments is an application that many researchers have focused on to improve surgery in fields like visualization [1], workflow registration, planning, and evaluation. New concepts have proven beneficial for visualization, navigation, and planning in neuro, ear-nose-throat, and spinal surgery. Most concepts rely on overlay technologies that have not been established in visceral surgery, since there are fewer applications for guidance as most organs in the abdomen can move freely. Therefore, we do not focus on overlay technologies to avoid risk structures or on guidance to a predefined area of interest. Instead, we define the main motivation as a real-time 3D visualization of the surgical site and even an image-based detection of tumor spreading. Registration of CT images to the patient may not be applicable, but with real-time detection of anatomic landmarks, overlay technologies could become attractive. In addition, our team is conducting research to analyze diagnostic laparoscopy and possible advantages that could arise from technological progress in the field of image-based detection and localization (refer to the talk of Berlet et al.).

With only a few exceptions, researchers have found optical tracking systems hard to use in abdominal surgery. Establishing a line-of-sight to the reflective spheres is more difficult than in other surgical fields due to larger movements and rotations.

In this article, we strive to find a tracking system designed for abdominal surgery. We show that orientation and position estimates can be acquired using only an IMU for timespans in which no IR tracking is available, together with an IR tracking system to find the trocar entry point. These results motivate us to conduct further research on multi-sensor fusion. IMU sensor data can be utilized for at least two sensor inputs: one is the position estimate obtained by the proposed method, and the other is the incremental movement from the last known position and orientation. We further plan to integrate optical flow as a sensor input, using a depth-map reconstruction from the 3D endoscope for more stable results. This document, however, focuses on explaining the algorithm for stable trocar entry point detection and evaluates the prerequisites for and errors of the position estimates.

Related work

Several standard sensor systems have been applied in the operating room for laparoscopic surgery over the last decade. Widely used are infrared tracking systems based on reflective spheres and infrared light detectors. Another common type is the electromagnetic tracking system; its drawback is the distortion commonly caused by the metallic objects used during surgery. Furthermore, robotic arms can determine the position using forward kinematics and measured joint angles. Other approaches focus on matching 3D CT scans to real-time images obtained either by ultrasound sensors [2], 2D cameras [3], [4], or 3D cameras.

Others [5], [6], [7] use SLAM methods for movement detection of the laparoscope and organs, i.e., to detect whether the organs or the camera shift; distortions caused by deformations are the main hazard for this approach. Similar to our approach is [8], which uses the remote center of motion (RCM) constraint for SLAM methods and shows that it outperforms conventional SLAM approaches. Another method [9] uses an IMU and 2D camera information to implement a SLAM algorithm.

Often the use of IMU sensors in laparoscopic navigation is restricted to orientation information. Position trajectories are often computed by double integration of the accelerometer signal, which becomes erroneous as soon as the IMU is tilted [10].

Other researchers have also investigated integrating IMUs into sensor-fusion methods to compensate for the drawbacks of other sensors.

In contrast to previous approaches, we combine the RCM constraint with IMU sensors and produce a position estimate from that information. We use the infrared tracking system for initialization and rely on the IMU-based estimate to bridge timespans when the line-of-sight is lost.

This method is not in conflict with other methods, as we plan to merge several approaches to increase the accuracy of the localization information up to the point where the variance is small enough for our use case.

Methods

Coordinate system definition and representation

The definition of coordinate systems should efficiently cover the laparoscopic setting and is essential for the later formulation of transformation matrices and their computation. Figure 1a depicts the essential coordinate systems. $W_{IMU}$ and $W_{IR}$ denote the native sensor coordinate systems of the IMU and IR sensors, respectively. We further define $C$ to be the camera tip, with its x-axis aligned to the laparoscope axis, and $L$ the coordinate system of the attached sensor. The trocar entry point is the origin of $T$. The coordinate system $W_2$ mounted on the patient is only needed if we expect either the infrared camera or the patient's bed to move. We denote the homogeneous 4×4 transformation from coordinate system $A$ to coordinate system $B$ by $T_B^A$, consisting of a translation in coordinate system $A$ and a rotation, i.e.,

(1) $p_A = T_B^A \, p_B = \mathrm{Trans}_B^A \, \mathrm{Rot}_B^A \, p_B$
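As a minimal illustration (not part of the paper; all names are ours), the following NumPy sketch assembles such a homogeneous transform from a 3×3 rotation matrix and a translation vector and applies it to a point, mirroring Equation (1):

```python
import numpy as np

def make_transform(rot, trans):
    """Homogeneous 4x4 transform T = Trans * Rot from a 3x3 rotation
    matrix and a 3-vector translation, cf. Equation (1)."""
    T = np.eye(4)
    T[:3, :3] = rot
    T[:3, 3] = trans  # translation expressed in the target system A
    return T

def apply_transform(T, p_B):
    """Map a 3D point from system B into system A: p_A = T_B^A p_B."""
    return (T @ np.append(p_B, 1.0))[:3]
```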
Figure 1: (a) System setup, (b) position estimates; errors: (c) total position error, (d) deviation due to erroneous angle, (e) deviation in trocar entry point, (f) deviation due to erroneous penetration length.

Trocar position

When the laparoscope is inside the abdomen, we collect the camera axes over time and compute the trocar position in a least-squares sense.

The tuple $(x^{(j)}, t^{(j)})$ of direction vector and point defines the $j$th laparoscope axis. We use an iterative algorithm to collect new lines of

(2) $\mathrm{Trans}_C^{W_2}(j) = \begin{bmatrix} x^{(j)} & y^{(j)} & z^{(j)} & t^{(j)} \\ 0 & 0 & 0 & 1 \end{bmatrix}$

only when the optical tracking system has a free line-of-sight to the target. Furthermore, to add a new line $(x^{(j)}, t^{(j)})$ to the set of lines $\{(x^{(i)}, t^{(i)})\}_{i=1,\ldots,I}$, we require a minimum angular deviation

(3) $\alpha_{ij} = \arccos\left(\langle x^{(i)}, x^{(j)} \rangle\right) \geq \frac{\pi}{18}$

to all lines $i = 1, \ldots, I$ in order to make the problem well-posed.
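As a minimal sketch, the admission test of Equation (3) can be written as follows (the function name and data layout are our own; the threshold $\pi/18$, i.e., 10°, is taken from the paper, and the axes are assumed to be unit vectors):

```python
import numpy as np

MIN_ANGLE = np.pi / 18  # required angular deviation between axes, Eq. (3)

def admits(new_axis, stored_axes):
    """Accept a new laparoscope axis x^(j) only if it deviates by at
    least MIN_ANGLE from every previously stored axis x^(i)."""
    for x_i in stored_axes:
        # for unit vectors the inner product is the cosine of the angle
        cos_alpha = np.clip(np.dot(x_i, new_axis), -1.0, 1.0)
        if np.arccos(cos_alpha) < MIN_ANGLE:
            return False
    return True
```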

In order to find the trocar entry point $p$ with minimum squared distances $d^{(i)}$ to all lines $(x^{(i)}, t^{(i)})$ for $i = 1, \ldots, I$, we minimize the cost function

(4) $f(p) = \sum_i d^{(i)\,2} = \sum_i \left( \left\| p - t^{(i)} \right\|_2^2 - \left\langle p - t^{(i)}, x^{(i)} \right\rangle^2 \right)$

which yields

(5) $\nabla f(p) = \sum_i 2\left(p - t^{(i)}\right) - 2\left\langle p - t^{(i)}, x^{(i)} \right\rangle x^{(i)} = 0$
(6) $p = \left[ \sum_i A^{(i)} \right]^{+} \left[ \sum_i A^{(i)} t^{(i)} \right],$

with

(7) $A^{(i)} = I - x^{(i)} x^{(i)\,T}$

We can then store the transformation

(8) $\mathrm{Trans}_T^{W_2} = \begin{bmatrix} I & p \\ 0^T & 1 \end{bmatrix}.$
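A compact NumPy sketch of the closed-form solution (Equations (6)-(8)) could look like this; the function names are ours, and the axes are again assumed to be unit vectors:

```python
import numpy as np

def trocar_point(axes, points):
    """Least-squares trocar entry point p from I lines (x^(i), t^(i)),
    following Equations (6) and (7)."""
    A_sum = np.zeros((3, 3))
    b_sum = np.zeros(3)
    for x, t in zip(axes, points):
        A = np.eye(3) - np.outer(x, x)  # A^(i) = I - x^(i) x^(i)^T, Eq. (7)
        A_sum += A
        b_sum += A @ t
    return np.linalg.pinv(A_sum) @ b_sum  # pseudoinverse as in Eq. (6)

def trocar_transform(p):
    """Store the result as the pure translation of Equation (8)."""
    T = np.eye(4)
    T[:3, 3] = p
    return T
```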

Laparoscope position estimation

Based on the trocar entry point and the orientation, we obtain a position estimate as follows:

(9) $T_L^{W_2} = T_T^{W_2} \, T_L^T,$

with

(10) $T_T^{W_2} = \mathrm{Trans}_T^{W_2} \quad \text{and} \quad T_L^T = \mathrm{Rot}_L^{W_2} \, \mathrm{Trans}_L^T.$

Here, $\mathrm{Rot}_L^{W_2}$ is the transformed rotation obtained from the IMU sensor. $\mathrm{Trans}_L^T$ is the translation from trocar to laparoscope in coordinate system $L$, using the last known translation measured by the IR tracking system. The position estimator is hence unable to detect movements along the laparoscope axis. Therefore, in the COMPASS navigation setup we plan to include the stereoscopic view to determine movements along that axis. The camera view is then obtained by

(11) $T_C^{W_2} = T_L^{W_2} \, T_C^L,$

with the previously calibrated rigid transformation $T_C^L$, which depends on the laparoscope.
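Under the stated assumptions (the IMU rotation already expressed in $W_2$, and the trocar-to-sensor translation taken from the last IR measurement), the composition of Equations (9)-(11) reduces to a few matrix products. The following sketch uses our own variable names and is not the paper's implementation:

```python
import numpy as np

def estimate_camera_pose(trans_T_W2, rot_L_W2, trans_L_T, T_C_L):
    """Camera pose estimate T_C^{W2} following Equations (9)-(11).

    trans_T_W2: 4x4 trocar translation, Equation (8)
    rot_L_W2:   3x3 IMU orientation of the sensor, expressed in W2
    trans_L_T:  3-vector, last trocar-to-sensor offset seen by IR tracking
    T_C_L:      4x4 calibrated sensor-to-camera-tip transform
    """
    T_L_T = np.eye(4)                 # T_L^T = Rot * Trans, Equation (10)
    T_L_T[:3, :3] = rot_L_W2
    T_L_T[:3, 3] = rot_L_W2 @ trans_L_T
    T_L_W2 = trans_T_W2 @ T_L_T       # Equation (9)
    return T_L_W2 @ T_C_L             # Equation (11)
```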

Implementation

Sensors

Our prototype comprises an NDI Polaris Vega IR camera (Waterloo, Ontario, Canada) with unique-geometry tools, as well as MbientLab MetaWear IMU sensor boards (San Francisco, California, USA). Communication with the infrared camera is established via Ethernet, and we read from the IMU via Bluetooth. The receiver publishes all data on unique topics via MQTT (an IoT messaging framework). The sensor-fusion and calibration procedures also publish their data to the centralized broker. The IoT setup therefore requires a network with a running MQTT broker in the operating room (OR).
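As an illustration of the messaging layer, the following sketch publishes IMU orientation samples to the central broker using the paho-mqtt package (1.x-style client construction); the broker address, topic name, publishing rate, and read-out function are hypothetical placeholders, since the paper only specifies that each sensor publishes on a unique MQTT topic:

```python
import json
import time
import paho.mqtt.client as mqtt

BROKER_HOST = "broker.local"                   # hypothetical OR broker
IMU_TOPIC = "compass/sensors/imu/orientation"  # hypothetical topic name

def read_imu_quaternion():
    """Placeholder for the Bluetooth read-out of the MbientLab IMU."""
    return [1.0, 0.0, 0.0, 0.0]

client = mqtt.Client()  # paho-mqtt 1.x constructor
client.connect(BROKER_HOST, 1883)
client.loop_start()

while True:
    sample = {"t": time.time(), "q": read_imu_quaternion()}
    client.publish(IMU_TOPIC, json.dumps(sample))
    time.sleep(0.01)  # assumed ~100 Hz publishing rate
```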

A 3D-printed mount is attached to the Storz 3D endoscope (Tuttlingen, Baden-Württemberg, Germany), statically connecting the reflective spheres with the IMU sensor in such a way that both coordinate systems are equal in orientation.

Evaluation and discussion

We tested six moving patterns of the laparoscope. We calibrate the trocar position at the beginning and afterwards use no information from the IR tracking system to compute the position. We observe that the position trajectory shows clear correspondence to the ground-truth infrared tracking for all movements (see Figure 1b). The error of the position estimate is shown in Figure 1c. To analyze its source, we investigate the possible error contributions. The inaccurate part of the orientation measured by the IMU propagates as

(12) $\varepsilon_1 = l \sin(\Delta\theta)$

with the angle between the measured orientations $\Delta\theta = 2 \arccos\left(\langle q_1, q_2 \rangle\right)$ and the length $l$ outside the abdomen.
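Given two unit quaternions, the lever-arm error of Equation (12) can be evaluated directly. This small sketch (our naming) clips the inner product to guard against numerical overshoot:

```python
import numpy as np

def orientation_error(q1, q2, l):
    """Position error eps_1 = l * sin(delta_theta) from Equation (12),
    with delta_theta = 2 * arccos(<q1, q2>) for unit quaternions."""
    dot = np.clip(np.dot(q1, q2), -1.0, 1.0)
    delta_theta = 2.0 * np.arccos(dot)
    return l * np.sin(delta_theta)
```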

The error based on an incorrect trocar entry point

(13) $\varepsilon_2 = \left\| \Delta p \right\|$

contributes linearly, as does the error based on an incorrect length outside the abdomen

(14) $\varepsilon_3 = \Delta l.$

To find $\varepsilon_2$, we let the trocar entry point be updated during the whole testing period using the IR tracking via Equation (6). We track the movement of the updated trocar entry point and compute its error, assuming the mean to be the correct value (see Figure 1e). The measurement shows that the trocar entry point can be found within a comparatively short period of time. The movement of the entry point is due to imprecise calibration of $T_C^L$. Since we calibrated the trocar entry point only at the beginning, the error term is approximately 22 mm.

The error term $\varepsilon_1$ is shown in Figure 1d. It correlates strongly with the total error in Figure 1c, showing that it is the main error source.

The last error source is the length outside the abdomen; changes in that value cannot be detected by our estimator. For the moving pattern shown (pivoting), the laparoscope does not change its penetration length, and this error is therefore comparatively small (see Figure 1f).

Conclusion

It was shown that the approach of estimating position by IMU and trocar entry point is feasible for moving patterns without a change in penetration length. We defined the contributing errors and found that the overall error is dominated by the accuracy of the IMU's orientation estimate. We further showed that the trocar entry point can be determined quickly.

In addition, we suggested an IoT framework for extending the localization framework and announced planned studies incorporating further sensor information. For that purpose, we surveyed research that determines localization information in other ways, which can serve as a starting point for extensions.


Corresponding author: Regine Hartwig, Research Group MITI, Technical University of Munich, Munich, Germany, E-mail:

Funding source: BMBF (Bundesministerium für Bildung und Forschung)

  1. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  2. Research funding: BMBF (Bundesministerium für Bildung und Forschung) funding for project COMPASS (Comprehensive Surgical Landscape Guidance System) with identification code 16SV8018.

  3. Informed consent: Informed consent is not applicable.

  4. Ethical approval: The conducted research is not related to either human or animal use.

  5. Conflict of interest: Authors state no conflict of interest.

References

1. Bernhardt, S, Nicolau, SA, Soler, L, Doignon, C. The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal 2017;37:66–90. https://doi.org/10.1016/j.media.2017.01.007.

2. Ieiri, S, Uemura, M, Konishi, K, Souzaki, R, Nagao, Y, Tsutsumi, N, et al. Augmented reality navigation system for laparoscopic splenectomy in children based on preoperative CT image using optical tracking device. Pediatr Surg Int 2012;28:341–6. https://doi.org/10.1007/s00383-011-3034-x.

3. Langø, T, Tangen, G, Mårvik, R, Ystgaard, B, Yavuz, Y, Kaspersen, J, et al. Navigation in laparoscopy – prototype research platform for improved image-guided surgery. Minim Invasive Ther Allied Technol 2008;17:17–33. https://doi.org/10.1080/13645700701797879.

4. Konishi, K, Nakamoto, M, Kakeji, Y, Tanoue, K, Kawanaka, H, Yamaguchi, S, et al. A real-time navigation system for laparoscopic surgery based on three-dimensional ultrasound using magneto-optic hybrid tracking configuration. Int J Comput Assist Radiol Surg 2007;2:1–10. https://doi.org/10.1007/s11548-007-0078-4.

5. Mountney, P, Yang, GZ. Motion compensated SLAM for image guided surgery. In: International conference on medical image computing and computer-assisted intervention. Springer; 2010:496–504 pp.

6. Grasa, G, Bernal, E, Casado, S, Gil, I, Montiel, JMM. Visual SLAM for handheld monocular endoscope. IEEE Trans Med Imag 2014;33:135–46. https://doi.org/10.1109/tmi.2013.2282997.

7. Grasa, ÓG. Visual SLAM for measurement and augmented reality in laparoscopic surgery. PhD thesis. Universidad de Zaragoza; 2014.

8. Vasconcelos, F, Mazomenos, E, Kelly, J, Stoyanov, D. RCM-SLAM: visual localisation and mapping under remote centre of motion constraints. In: 2019 International conference on robotics and automation (ICRA); 2019:9278–84 pp.

9. Huang, CC, Hung, NM, Kumar, A. Hybrid method for 3D instrument reconstruction and tracking in laparoscopy surgery. In: 2013 International conference on control, automation and information sciences (ICCAIS); 2013:36–41 pp.

10. Gan, C. Design of inertial tracking system for laparoscopic instrument trajectory analysis; 2010.

Published Online: 2020-09-17

© 2020 Regine Hartwig et al., published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.