Open Access. Published by De Gruyter, September 17, 2020. Licensed under CC BY 4.0.

Fluoroscopy-guided robotic biopsy intervention system

Yusuf Özbek, Michael Vogele, Christian Plattner, Pedro Costa, Mario Griesser and Matthias Wieczorek

Abstract

Fluoroscopy-guided percutaneous biopsy interventions are mostly performed with the traditional free-hand technique. The practical experience of the surgeon influences the duration of the intervention and the radiation exposure for the patients and him-/herself. Especially when heavy, long instruments must be placed at double-oblique angles, manual techniques quickly reach their technical limits. The system presented herein automates needle positioning using only two 2D scans while the robotic platform guides the intervention. These two images were used to plan the needle pathway and to estimate the pose of the robot via a custom-made end-effector with embedded registration fiducials. The estimated pose was subsequently used to transfer the planned needle path to the robot’s coordinate system and finally to compute the movement parameters that align the robot with this plan. To evaluate the system, two phantoms with 11 targets in total were developed. The targets were punctured, and the application accuracy was measured quantitatively. The solution achieved sub-millimetric accuracy for needle placement (min. 0.23 mm, max. 1.04 mm). Our approach combines the advantages of fluoroscopic imaging with automatic needle alignment at greatly reduced X-ray radiation. The proposed system shows promising potential as a guidance platform that is easy to combine with available fluoroscopic imaging systems and provides valuable help to the physician in more difficult interventions.

Introduction

Fluoroscopic image guidance is the predominantly applied method in clinical workflows compared with computed tomography [1] or ultrasound [2]. It offers high-resolution real-time imaging, greater safety, low-dose radiation, and effectiveness [3]. In particular, if the needle shall be placed percutaneously with high accuracy near critical regions, real-time fluoroscopy guidance is a highly valuable tool. Image-guided biopsy interventions such as radiofrequency ablation (RFA), brachytherapy, soft/hard tissue biopsies, or drug delivery require high-accuracy positioning of the associated instruments. With free-hand techniques it is particularly challenging to follow the planned trajectory pathway, for instance when positioning heavy drills, long biopsy guns, or ablation devices in minimally invasive biopsy surgeries. A particular drawback of ultrasound guidance is that surgeons position the needle with one hand while holding the transducer with the other. These technical limitations can lead to inaccuracies that prevent a successful intervention, or several iterative fine-tunings become necessary during needle insertion. Furthermore, the continual X-ray exposure of fluoroscopy may result in long-term complications such as radiation injuries for physicians and patients. The approach presented in this work may help to reduce the X-ray exposure and shorten the intervention time while providing high-accuracy needle placement in various biopsy interventions. In addition, needle or instrument alignment is automated by a robotic platform that aligns the needle to the planned pathway using only two 2D fluoroscopic images.

Methods

Robotic platform

The iSYS1 robotic biopsy platform (iSYS Medizintechnik GmbH, Austria) is a certified automatic surgical instrument positioning system for CT-guided interventions [4]. It provides manual needle alignment with real-time fluoroscopic closed-loop feedback for RFA.

The Robot Positioning Unit (RPU) defines its world coordinate system (4-DOF) with two modules: The lower part (POS) ensures positioning of used instruments in longitudinal axis X and in transversal axis Y while the upper part (ANG) ensures angulation in A (rotation around X axis) and B (rotation around Y) in a certain working volume. The Z axis indicates the origin w.r.t. rotation of the instrument when the robot is at its default-position (home) (Figure 1).
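As an illustration of the coordinate convention above, the 4-DOF pose can be written as a homogeneous transform composed of the POS translation (X, Y) and the ANG rotations (A about X, B about Y). This is a minimal sketch of the described kinematics, not the vendor's implementation; in particular the rotation order A-then-B is our assumption:

```python
import numpy as np

def rot_x(a):
    """Rotation about the X axis (angulation A), angle in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(b):
    """Rotation about the Y axis (angulation B), angle in radians."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rpu_pose(x, y, a, b):
    """4x4 homogeneous transform for the 4-DOF pose: translation
    (x, y) from the POS module, angulation (a, b) from the ANG module."""
    T = np.eye(4)
    T[:3, :3] = rot_x(a) @ rot_y(b)
    T[:3, 3] = [x, y, 0.0]
    return T

# At the home position (all parameters zero) the instrument axis
# coincides with the Z axis:
needle_axis = rpu_pose(0, 0, 0, 0)[:3, :3] @ np.array([0.0, 0.0, 1.0])
```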

Figure 1: iSYS1 robotic platform working principle.


Custom end-effector and phantom patients

To register the robot with the patient images, a rigid-body patient-to-image registration is necessary. This was achieved with a custom-made 3D printed end-effector (Figure 2) that embeds five spherical metal markers (⊘ 5 mm), distributed so as to increase the application accuracy and to prevent marker overlap on the 2D images when they are scanned from extreme positions or orientations. The Needle Guide (NG) on the end-effector can hold biopsy needles of various diameters and lengths (min. gauge 8, length 30 mm; max. gauge 20, length 300 mm). It is attached to the RPU’s ANG and POS through joints a) and b). A marker slot c) contains a concentrically fixed marker whose mechanical coordinate relative to the robot’s home position is known. The NG is locked concentrically to the front side of the end-effector d). The needle e) is slid manually by the physician through the NG in the Z direction, and its position is secured by a plastic screw mounted on the side of the NG f).

Figure 2: End-effector design.


Two custom-built phantoms were used to validate the accuracy of the automatic needle alignment: a 3D printed model, screwed directly to the robot from the bottom side, containing six targets (⊘ 2 mm spherical metal balls fixed to the model) (Figure 3, Left), and a 3D printed spine phantom placed in a transparent acrylic glass box (20 cm³). The spine model was filled with transparent candle wax and contained five fixed targets of the same size as in the first phantom (Figure 3, Right). The wax around the spine phantom was added to provide haptic feedback and because it can deflect the needle during insertion, as commonly happens in liver, lung, prostate, breast, or other soft tissues of the human body.

Figure 3: Left: 3D printed phantom with an aligned biopsy needle (17 gauge, ⊘ 1.52 mm, L: 20 cm). Right: Spine phantom positioned under the robot. Target-to-entry distance was for both min. 4 cm max. 8 cm.


Data acquisition

Two single-plane 2D DICOM images (1024 × 1024 px), each including the patient (target markers) and the end-effector, were acquired from different scan directions with a Ziehm Vision RFD 3D mobile C-arm (Ziehm Imaging GmbH, Germany). The robot is positioned above the patient in its homed state, and both remain in a fixed position relative to the C-arm’s flat-panel detector. Figure 4 shows two sample fluoroscopy acquisitions of the 3D printed phantom, scanned from the SI (superior/inferior) plane (Left) and the lateral (sagittal) plane (Right), each including the end-effector together with the patient. The images were acquired in pulsed cervical spine mode (total scan time: 1 s each) with the parameters kV: 44, mA: 3.2, dose-area product: 0.60 cGy·cm², pulse width: 58%.

Figure 4: Trajectory planning on acquired 2D images from lateral (left) and anterior (right) positions. The tail (blue circle) of the orange arrow (pathway) marks the user-defined entry point (E), while the tip indicates the selected target marker (T). The green line through the arrow indicates the current position of the robot after alignment. The two parallel cyan lines show the epipolar lines resulting from 3D pose estimation. The green rectangle represents the 2D limits of the robot’s positional movements in X and Y, within which the entry point can be set, while the transparent red area shows the angulation limits in A and B. Both convey the robot’s reachability for a planned trajectory after registration.


Automatic marker localization and pose estimation

In order to translate the pathway planned in image space into the coordinate space of the robot, the relative position (pose) of the robot with respect to the fluoro images is required. To compute this pose, we first need the geometrical parameters of the C-arm for each individual image. These can be represented in terms of a standard camera model from computer vision, i.e., the intrinsics, which define the relation from X-ray source to flat detector, and the extrinsics, which define the pose of the system w.r.t. a common coordinate frame for each individual image. We obtain this information from the DICOM tags provided by the C-arm vendor. The pose of the robot w.r.t. the common coordinate frame of the fluoro images was determined from the five automatically localized registration fiducials: for each individual image, a Laplacian-of-Gaussian-based blob detection [5] was applied. The results are filtered for blobs whose sizes match the magnification factor of the C-arm system up to a certain tolerance. For each image and each blob detected within it, we compute the line between the X-ray source and the blob center. Next, we check each such line of one image for an intersection with all lines from all other images, up to a certain distance tolerance. The tolerance is needed because the extrinsic parameters given in the DICOM tags are only rough estimates. Each such intersection marks a candidate for a marker position on the end-effector. The next step of the algorithm uses an iterative closest-point (ICP) algorithm to match the known 3D points on the end-effector against a subset of the candidates. This procedure is repeated several times, with the subset chosen in a RANSAC fashion [6]. Afterward, we pick the matching/transformation that yields the smallest matching error in terms of the root-mean-square (RMS) error.
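The line-intersection test with a distance tolerance can be sketched as a closest-approach computation between two 3D lines, each running from an X-ray source through a detected blob center. The midpoint convention and the default `tol` value here are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def ray_intersection(p1, d1, p2, d2, tol=2.0):
    """Closest approach of two 3D lines (point p, direction d), e.g. the
    lines from each X-ray source through a detected blob center.
    Returns the midpoint of the shortest connecting segment if the lines
    pass within `tol` (mm) of each other, otherwise None."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = np.dot(n, n)
    if denom < 1e-12:                  # (near-)parallel lines never meet
        return None
    r = p2 - p1
    t1 = np.dot(np.cross(r, d2), n) / denom
    t2 = np.dot(np.cross(r, d1), n) / denom
    q1, q2 = p1 + t1 * d1, p2 + t2 * d2
    if np.linalg.norm(q1 - q2) > tol:  # lines miss: no marker candidate
        return None
    return 0.5 * (q1 + q2)             # candidate 3D marker position
```

Each surviving midpoint becomes one marker-position candidate fed into the subsequent RANSAC/ICP matching.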
Since, as pointed out above, the extrinsics from the DICOM tags are rather rough, the final step of our algorithm runs a Levenberg-Marquardt optimization [7] that refines the poses of the images. During this optimization, we minimize the reprojection error between the projected, matched 3D marker positions and the corresponding detections in the images.
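A minimal sketch of such a reprojection-error refinement can be written with SciPy's `least_squares` (which runs Levenberg-Marquardt for `method='lm'`); the pinhole intrinsics matrix `K` and the rotation-vector parameterization of the pose are our assumptions for illustration:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(params, K, pts3d, pts2d):
    """Residuals between projected 3D marker positions and their 2D
    detections in one image. params = [rx, ry, rz, tx, ty, tz] encodes
    the image pose as a rotation vector plus translation."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    cam = (R @ pts3d.T).T + params[3:]     # markers in source/camera frame
    proj = (K @ cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]      # perspective division
    return (proj - pts2d).ravel()

def refine_pose(params0, K, pts3d, pts2d):
    """Refine a rough pose estimate (e.g. from DICOM extrinsics) by
    minimizing the reprojection error with Levenberg-Marquardt."""
    res = least_squares(reprojection_residuals, params0,
                        args=(K, pts3d, pts2d), method='lm')
    return res.x
```

With five matched markers this gives 10 residuals for 6 pose parameters, enough for the `method='lm'` requirement that the residual count be at least the parameter count.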

Evaluation

The overall accuracy of the system was validated by puncturing the target markers in both phantoms and quantitatively measuring the 3D positional deviation between the needle and target marker centroids after needle insertion. The following steps were executed for each of the 11 targets and repeated five times per target: First, the robot was positioned on the phantom and two images from different scan directions were acquired (Tables 1 and 2). Secondly, the registration fiducials on the end-effector were localized, and the image-to-patient registration and pose estimation were performed automatically. Next, a trajectory pathway was planned by defining entry and target points on both images (Figure 4), and the robot was triggered to align the needle to the defined trajectory. Afterward, the needle was inserted by sliding it through the NG until its tip touched the surface of the target marker. Finally, the needle was fixed at its current position and a 3D isocentric CBCT scan of the current setup was performed, containing both the patient (target) and the needle in the acquired DICOM dataset. The 3D positional deviations between needle centroid and target marker centroid were measured manually on the workstation (Figure 5).
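The deviation measurement and the per-row statistics reported in the tables below can be sketched in a few lines; the centroid coordinates here are hypothetical values standing in for points read off the CBCT workstation:

```python
import numpy as np

def targeting_error(needle_tip, target_centroid):
    """3D positional deviation (mm) between the needle tip centroid
    and the target marker centroid measured in the CBCT volume."""
    return float(np.linalg.norm(np.asarray(needle_tip) - np.asarray(target_centroid)))

# Hypothetical centroid coordinates (mm) read off the workstation:
err = targeting_error([10.2, 5.1, 33.0], [10.5, 5.0, 32.8])

# Per-row statistics as reported in Table 1 (first row, 3D printed phantom):
errors = [0.56, 0.06, 0.31, 0.01, 0.40, 0.07]
mean, std = np.mean(errors), np.std(errors, ddof=1)  # approx. 0.23 ± 0.22
```

Note that the reported STD values match the sample standard deviation (`ddof=1`), which is why it is used here.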

Table 1:

Registration and measured system accuracy in mm for the overall 30 measurements on the 3D printed phantom. A scan direction of, e.g., 0°/−90° indicates that the first image was acquired at the SI plane and the second at the lateral plane.

| Target/scan direction | T1 | T2 | T3 | T4 | T5 | T6 | Mean accuracy ± STD | Reg. RMS |
|---|---|---|---|---|---|---|---|---|
| 0°/−90° | 0.56 | 0.06 | 0.31 | 0.01 | 0.40 | 0.07 | 0.23 ± 0.22 | 0.33 |
| 0°/−75° | 0.25 | 0.37 | 0.64 | 0.74 | 0.34 | 0.01 | 0.39 ± 0.26 | 0.29 |
| −60°/45° | 0.44 | 0.30 | 0.25 | 0.72 | 0.05 | 0.74 | 0.41 ± 0.27 | 0.29 |
| 0°/−45° | 0.39 | 0.01 | 0.01 | 0.25 | 0.01 | 1.03 | 0.28 ± 0.39 | 0.31 |
| 45°/−100° | 1.32 | 0.39 | 1.11 | 0.04 | 0.68 | 0.38 | 0.65 ± 0.48 | 0.71 |
| Mean accuracy ± STD | 0.59 ± 0.42 | 0.22 ± 0.17 | 0.46 ± 0.42 | 0.35 ± 0.35 | 0.29 ± 0.27 | 0.44 ± 0.43 | 0.39 ± 0.32 | 0.38 ± 0.18 |

  1. Tx: Target 1 to Target 6; Reg. RMS: registration root-mean-square error; STD: standard deviation.

Table 2:

Overall results in mm for the 25 measurements executed on the spine phantom.

| Target/scan direction | T1 | T2 | T3 | T4 | T5 | Mean accuracy ± STD | Reg. RMS |
|---|---|---|---|---|---|---|---|
| 0°/−90° | 0.59 | 1.77 | 0.87 | 0.57 | 0.73 | 0.90 ± 0.49 | 0.33 |
| 0°/−75° | 0.77 | 0.55 | 0.77 | 0.48 | 0.35 | 0.58 ± 0.18 | 0.29 |
| −60°/45° | 0.52 | 0.30 | 0.50 | 0.67 | 0.43 | 0.48 ± 0.13 | 0.29 |
| 0°/−45° | 0.44 | 1.44 | 1.82 | 0.71 | 0.81 | 1.04 ± 0.56 | 0.31 |
| 45°/−100° | 1.54 | 0.77 | 0.65 | 1.00 | 0.59 | 0.91 ± 0.38 | 0.71 |
| Mean accuracy ± STD | 0.77 ± 0.44 | 0.96 ± 0.61 | 0.92 ± 0.52 | 0.68 ± 0.19 | 0.58 ± 0.19 | 0.78 ± 0.34 | 0.38 ± 0.18 |

Figure 5: A CBCT volumetric scan. The crosshairs placed on the target marker represent the marker centroid. The orange line through the needle center in the coronal and sagittal views represents the aligned needle axis. The distance between these two points yields the overall targeting accuracy in 3D.


Results

The placement of the registration fiducials is an important factor with a major influence on the resulting accuracy [8]. In our case, the projected configuration of the fiducials changed with the scan direction of each 2D image, and the best accuracy results were obtained when the fiducials on the end-effector were localized far apart from each other. Tables 1 and 2 report the measured accuracies for each individual target on both phantoms. As expected, the soft tissue replicated with candle wax in the spine phantom increased the overall inaccuracy of each puncture. The 3D positional deviations between the needle and target centroids were measured in the acquired volumetric CBCT images; each dataset contained 320 slices with 0.6 mm slice thickness.

Discussion and conclusion

In our internal tests under laboratory conditions, the proposed system delivered promising sub-millimetric targeting accuracies independent of the scan direction. The best targeting accuracy, 0.23 mm, was obtained on the first phantom when the images were acquired from the SI and lateral planes (0° and −90°). On the second phantom, the −60°/45° directions delivered the best overall needle placement accuracy with 0.48 mm. Sub-millimetric registration RMS errors were observed for both phantoms independent of the selected target (min. 0.29 mm, max. 0.71 mm). Automatically aligning the instruments using only two scans enables single- or multiple-needle interventions without re-positioning the robot or patient, facilitates percutaneous biopsy or RFA interventions, and removes technical limitations in this area, while keeping the radiation exposure low for patients and physicians.


Corresponding author: Yusuf Özbek, Medical University of Innsbruck, Anichstrasse 35, 6020, Innsbruck, Austria, E-mail:

  1. Research funding: None declared.

  2. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  3. Competing interests: Authors state no conflict of interest.

  4. Informed consent: Informed consent is not applicable.

  5. Ethical approval: The conducted research is not related to either human or animals use.

References

1. Ming-De, H, Yuan-Hsiung, T. Accuracy and complications of CT-guided pulmonary core biopsy in small nodules: a single-center experience. Cancer Imaging 2019;19:51. https://doi.org/10.1186/s40644-019-0240-6.

2. Saad, WE, Waldman, DL. Safety and efficacy of fluoroscopic versus ultrasound guidance for core liver biopsies in potential living related liver transplant donors: preliminary results. J Vasc Interv Radiol 2006;17:1307–12. https://doi.org/10.1097/01.rvi.0000233497.60313.c9.

3. Kurban, LA, Gomersall, L, Weir, J, Wade, P. Fluoroscopy-guided percutaneous lung biopsy: a valuable alternative to computed tomography. Acta Radiol 2008;49:876–82. https://doi.org/10.1080/02841850802225893.

4. Kettenbach, J, Kara, L, Toporek, G, Fuerst, M, Kronreif, G. A robotic needle-positioning and guidance system for CT-guided puncture: ex vivo results. Minim Invasive Ther Allied Technol 2014;23:271–8. https://doi.org/10.3109/13645706.2014.928641.

5. Lindeberg, T. Feature detection with automatic scale selection. Int J Comput Vis 1998;30:79–116. https://doi.org/10.1023/A:1008045108935.

6. Fischler, MA, Bolles, RC. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun ACM 1981;24:381–95. https://doi.org/10.1145/358669.358692.

7. Levenberg, K. A method for the solution of certain non-linear problems in least squares. Quart Appl Math 1944;2:164–8. https://doi.org/10.1090/qam/10666.

8. West, JB, Fitzpatrick, JM, Toms, SA, Maurer, CR, Maciunas, RJ. Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery 2001;48:810–6. https://doi.org/10.1227/00006123-200104000-00023.

Published Online: 2020-09-17

© 2020 Yusuf Özbek et al., published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.