
Current Directions in Biomedical Engineering


An improved tracking framework for ultrasound probe localization in image-guided radiosurgery

Svenja Ipsen (corresponding author) / Ralf Bruder / Philipp Jauer / Floris Ernst / Oliver Blanck / Achim Schweikard

  • Svenja Ipsen, Ralf Bruder, Philipp Jauer, Floris Ernst, Achim Schweikard: Universität zu Lübeck, Institute for Robotics and Cognitive Systems, Ratzeburger Allee 160, 23562 Lübeck, Germany
  • Oliver Blanck: University Medical Center Schleswig-Holstein, Department for Radiation Oncology, Arnold-Heller-Straße 3, 24105 Kiel, Germany; Saphir Radiosurgery Center, Frankfurt am Main and Güstrow, Germany
Published Online: 2016-09-30 | DOI: https://doi.org/10.1515/cdbme-2016-0091

Abstract

Real-time target localization with ultrasound holds high potential for image guidance and motion compensation in radiosurgery due to its non-invasive image acquisition free from ionizing radiation. However, a two-step localization has to be performed when integrating ultrasound into the existing radiosurgery workflow. In addition to target localization inside the ultrasound volume, the probe itself has to be localized in order to transform the target position into treatment room coordinates. By adapting existing camera calibration tools, we have developed a method to extend the stereoscopic X-ray tracking system of a radiosurgery platform in order to locate objects such as marker geometries with six degrees of freedom. The calibration was performed with 0.1 mm reprojection error. By using the full area of the flat-panel detectors without pre-processing, the extended software increased the tracking volume and resolution by up to 80%, substantially improving patient localization and marker detectability. Furthermore, marker tracking showed sub-millimetre accuracy and rotational errors below 0.1°. This demonstrates that the developed extension framework can accurately localize marker geometries using an integrated X-ray system, establishing the link for the integration of real-time ultrasound image guidance into the existing system.

Keywords: image guidance; real-time motion compensation; robotic radiosurgery; ultrasound tracking

1 Introduction

The recent developments in image guidance, treatment planning and delivery techniques in radiotherapy have sparked a growing trend towards applying higher radiation doses in fewer treatment fractions, a method referred to as extreme hypofractionation or radiosurgery. The high doses used for radiosurgery increase the demand for exact target localization to maximize the curative effect and minimize the dose to the surrounding healthy tissue. Especially soft-tissue targets moving with respiration, e.g. in the liver [1], are difficult to treat with high precision. The first radiosurgery system that implemented real-time target tracking for motion compensation was the robotic CyberKnife (Accuray Inc., Sunnyvale, CA). By using a correlation model between a continuous external breathing curve, monitored via surface LEDs, and discrete internal positions, acquired via stereoscopic X-ray (kV) imaging, the most probable target position can be approximated [2]. Abdominal targets require the implantation of radiopaque gold markers around the tumour for X-ray visibility. This method allows for real-time motion compensation and beam guidance. Yet it does not offer any information about the actual target position, since the model yields only the most probable position based on surrogate markers.

In order to overcome these drawbacks, our aim was to include an image guidance system that is real-time capable without using ionizing radiation – ultrasound. With fast four-dimensional (4D) systems using 2D matrix array transducers [3] now becoming widely available, direct target localization is possible. Yet, the integration of ultrasound into an existing radiosurgery platform requires two localization steps. First, the target needs to be localized inside the ultrasound volume. Second, the probe has to be detected to transform the target position into treatment room coordinates.

In this study, we demonstrate that customized marker geometries can be localized with high accuracy using the CyberKnife’s integrated X-ray system. We describe a geometric calibration method for the kV imaging system that considerably increases the tracking volume by making use of the full X-ray detector area and allows ultrasound probes to be localized with six degrees of freedom.

2 Material and methods

The real-time motion data provided by an ultrasound station can be integrated into the current tracking system using a coordinate transformation to treatment room coordinates (see Figure 1). To determine the transformation matrix CKTUS from CyberKnife to ultrasound coordinates, the probe must be localized with six degrees of freedom (6DOF). This requires an additional marker geometry to be attached to the probe since the orientation of the probe itself cannot be determined directly from X-ray projections due to missing landmarks.
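As a minimal illustration of this two-step localization, the sketch below chains two homogeneous transformations – the marker pose obtained from X-ray tracking and a fixed probe-to-marker offset – to map a target found in the ultrasound volume into treatment room coordinates. All numerical values, the variable names (CK_T_M, M_T_US) and the helper make_transform are illustrative assumptions, not quantities from this work.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# CK_T_M: pose of the attached marker geometry in treatment room coordinates (from X-ray tracking).
# M_T_US: fixed offset between marker geometry and ultrasound volume (from probe calibration).
# Both transforms below are placeholders for illustration only.
CK_T_M = make_transform(np.eye(3), np.array([120.0, -40.0, 950.0]))  # mm
M_T_US = make_transform(np.eye(3), np.array([0.0, 25.0, 60.0]))      # mm

CK_T_US = CK_T_M @ M_T_US                 # combined ultrasound-to-room transform

p_US = np.array([10.0, -5.0, 80.0, 1.0])  # target localized inside the ultrasound volume (homogeneous)
p_CK = CK_T_US @ p_US                     # the same target in treatment room coordinates
print(p_CK[:3])
```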

Figure 1: Tracking of a tumour (T, red) using the conventional CyberKnife (CK) method and ultrasound (US, blue). Multiple fiducials (F, green) are localized for tumour tracking (CKTF). To transform US into CK coordinates, the probe is detected in X-ray images using an attached marker geometry (CKTM).

We use a CyberKnife G4 system with a stereoscopic kV imaging system consisting of two X-ray tubes and opposing in-floor flat-panel detectors. Since the on-board software cannot localize multiple marker geometries, we developed a software package for the detection, reconstruction and 6DOF localization of customized marker geometries using the raw X-ray images from the system. The reconstruction of 3D objects from multiple views is a common problem in computer vision, usually solved by determining the geometric setup of the imaging system followed by stereo-triangulation.

2.1 Stereo X-ray calibration

Generally, an optical imaging system is approximated by an extended pinhole camera model containing intrinsic and extrinsic camera parameters. The product of the intrinsic matrix, containing the focal lengths $(f_x, f_y)$, the offset of the image centre $(c_x, c_y)$ and the pixel scaling factors $(s_x, s_y)$, and the extrinsic matrix, describing the rotation and translation of the camera, is called the homography or projection matrix $P$,

$$w\,\tilde{p} = P\,\tilde{P}_W = \begin{bmatrix} f_x s_x & 0 & c_x \\ 0 & f_y s_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R_C \mid t_C \end{bmatrix} \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix} \tag{1}$$

and defines the camera’s imaging behaviour. Matrix multiplication projects a point $\tilde{P}_W = (X_W, Y_W, Z_W, 1)^T$ in world coordinates onto the image plane as $\tilde{p} = (x, y, 1)^T$.
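The following snippet is a small numerical sketch of Eq. (1): an intrinsic and an extrinsic matrix are combined into a 3 × 4 projection matrix and applied to a homogeneous world point. All numbers are arbitrary placeholders chosen for illustration, not the calibrated parameters reported in Section 3.

```python
import numpy as np

# Intrinsics: focal lengths scaled by the pixel size (f_x*s_x, f_y*s_y) and image centre (c_x, c_y).
# Placeholder values only.
K = np.array([[2660.0,    0.0, 512.0],
              [   0.0, 2680.0, 512.0],
              [   0.0,    0.0,   1.0]])

# Extrinsics [R_C | t_C]: rotation and translation of the camera (identity rotation for the sketch).
Rt = np.hstack([np.eye(3), np.array([[0.0], [0.0], [1000.0]])])

P = K @ Rt                                  # 3x4 projection matrix of Eq. (1)

P_W = np.array([50.0, -20.0, 1500.0, 1.0])  # homogeneous world point (X_W, Y_W, Z_W, 1)
wp = P @ P_W                                # scaled image point (w*x, w*y, w)
p = wp / wp[2]                              # divide by w to obtain pixel coordinates (x, y, 1)
print(p[:2])
```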

A well-established solution to the camera calibration problem is the pattern-based method by Zhang [4]. Several images showing a chessboard with known dimensions in different spatial orientations are acquired. By extracting point correspondences, a system of nonlinear equations can be constructed and solved for the projection matrix $P$ by minimizing the reprojection error, defined as the mean difference between the measured image coordinates and a simulated projection of the chessboard model using the current projection matrix. For stereo calibration, camera A is chosen as the reference coordinate system. The transformation $R$, $t$ from projection centre $P_{C_A}$ to $P_{C_B}$,

$$P_{C_B} = R\,P_{C_A} + t \tag{2}$$

can be estimated by stereo calibration. The given X-ray stereo setup can be described with this fundamental camera model. Yet, the X-ray “cameras” present a unique geometry which differs from that of optical cameras (see Figure 2): the image plane lies behind the projection centre, is tilted by approximately 45 degrees towards the X-ray beam, and the focal length, corresponding to the height of the X-ray tube in the room, is very long. These properties challenge standard calibration procedures.
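The sketch below outlines how such a stereo calibration can be set up with the stock OpenCV routines cv2.calibrateCamera and cv2.stereoCalibrate, assuming the 7 × 6 marker phantom described below. The grid spacing, the function name calibrate_stereo and the input format are illustrative assumptions; they do not reproduce the authors' adapted implementation.

```python
import numpy as np
import cv2

# 3D model of the 7x6 lead-marker grid (z = 0 plane); the spacing is a placeholder value.
SPACING_MM = 10.0
objp = np.zeros((7 * 6, 3), np.float32)
objp[:, :2] = np.mgrid[0:7, 0:6].T.reshape(-1, 2) * SPACING_MM

def calibrate_stereo(points_a, points_b, image_size):
    """points_a / points_b: one (42, 1, 2) float32 array of detected marker centres per
    phantom pose, for X-ray tubes A and B; image_size: (width, height) in pixels."""
    obj_pts = [objp] * len(points_a)

    # Calibrate each X-ray "camera" individually (Zhang's pattern-based method).
    _, K_a, dist_a, _, _ = cv2.calibrateCamera(obj_pts, points_a, image_size, None, None)
    _, K_b, dist_b, _, _ = cv2.calibrateCamera(obj_pts, points_b, image_size, None, None)

    # Estimate R and t between the two projection centres, cf. Eq. (2), keeping intrinsics fixed.
    err, _, _, _, _, R, t, _, _ = cv2.stereoCalibrate(
        obj_pts, points_a, points_b, K_a, dist_a, K_b, dist_b, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_a, K_b, R, t, err   # err is the RMS reprojection error of the stereo pair
```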

Figure 2: Schematic geometry of the X-ray system. Both X-ray cameras show large focal lengths f and an image centre cx located far from the actual detector image.

Algorithms from the open-source library OpenCV [5] were adapted for X-ray camera calibration. In addition, a dedicated calibration phantom was designed, containing a regular grid of 7 × 6 lead markers (2 mm diameter) fixed to a plastic board and representing the corners of a chessboard pattern (see Figure 3). The calibration procedure was evaluated based on the reprojection errors. Furthermore, the resulting image resolution and tracking volume size were compared to those of the images processed by the CyberKnife software.
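The paper does not detail how the lead-marker centres were extracted from the X-ray images; one plausible approach, sketched below, treats the 7 × 6 grid like a symmetric circle grid and detects it with OpenCV's blob and grid detectors. The normalization step, the blob parameters and the function name detect_marker_grid are assumptions for illustration.

```python
import numpy as np
import cv2

def detect_marker_grid(xray_image, pattern_size=(7, 6)):
    """Locate the 7x6 lead-marker grid in one raw X-ray image and return the
    sub-pixel marker centres as a (42, 1, 2) array, or None if the grid is not found."""
    # Scale the raw detector data to 8 bit for the OpenCV detectors.
    img = cv2.normalize(xray_image, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Lead markers strongly attenuate the beam and typically appear as dark blobs.
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 20            # placeholder, depends on marker size and detector resolution
    detector = cv2.SimpleBlobDetector_create(params)

    found, centres = cv2.findCirclesGrid(
        img, pattern_size, flags=cv2.CALIB_CB_SYMMETRIC_GRID, blobDetector=detector)
    return centres if found else None
```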

Figure 3: Top – X-ray calibration phantom with chessboard pattern. Bottom – Ultrasound transducer and attached marker geometry on the treatment couch (left: photo, right: X-ray image).

2.2 Marker geometry localization

With known camera parameters, it is possible to reconstruct a 3D world point $P_W$ from its coordinates $p_A$ and $p_B$ in the projection images. The complexity of the correspondence problem is reduced to one dimension since the camera setup leads to co-planar, rectified image pairs (see Results) [6].

Given pixel coordinates $p_A$ and $p_B$ corresponding to the same marker point in both X-ray images, its spatial coordinates $P_{M,i}$ can be reconstructed by computing the intersection point of the two projection lines. Each projection line runs between a projection centre and the measured pixel coordinate, and both lines have to be converted into a mutual coordinate system (here: X-ray tube A). Due to noise and feature extraction inaccuracies, the lines will most likely not intersect exactly. In this case, $P_{M,i}$ is computed as the point with the smallest Euclidean distance to both projection lines.
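A common closed-form way to obtain this closest point, sketched below, computes the midpoint of the common perpendicular between the two projection lines. The function name and argument conventions are ours, and all inputs are assumed to be expressed in the tube A coordinate system.

```python
import numpy as np

def triangulate_midpoint(origin_a, point_a, origin_b, point_b):
    """Reconstruct a 3D marker position from one corresponding point pair.
    origin_a/origin_b: projection centres of tubes A and B; point_a/point_b: the measured
    pixel coordinates back-projected into 3D, all in the same (tube A) coordinate system."""
    d_a = point_a - origin_a      # direction of projection line A
    d_b = point_b - origin_b      # direction of projection line B
    w = origin_a - origin_b

    A, B, C = d_a @ d_a, d_a @ d_b, d_b @ d_b
    D, E = d_a @ w, d_b @ w
    denom = A * C - B * B         # close to zero only for (nearly) parallel lines

    s = (B * E - C * D) / denom
    u = (A * E - B * D) / denom

    p_a = origin_a + s * d_a      # closest point on line A
    p_b = origin_b + u * d_b      # closest point on line B
    return 0.5 * (p_a + p_b)      # point with minimal distance to both projection lines
```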

To localize the ultrasound probe in the system’s X-ray reference frame, the 6DOF pose of the attached marker geometry is determined from the reconstructed 3D coordinates of the individual markers, which form a point cloud, using singular value decomposition [7]. Two different X-ray marker geometries with 20 mm and 50 mm base lengths were designed and attached to the ultrasound transducer (see Figure 3).
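A minimal SVD-based rigid fit in the spirit of [7] is sketched below: it aligns the known marker model with the reconstructed point cloud and returns the 6DOF pose. The function name and the assumption that model and measured points are given in matching order are ours.

```python
import numpy as np

def fit_rigid_transform(model_points, measured_points):
    """Least-squares rotation R and translation t mapping the marker model onto the
    reconstructed 3D marker coordinates. Both inputs are (N, 3) arrays in matching order."""
    mu_m = model_points.mean(axis=0)
    mu_p = measured_points.mean(axis=0)

    # Cross-covariance of the centred point sets.
    H = (model_points - mu_m).T @ (measured_points - mu_p)
    U, _, Vt = np.linalg.svd(H)

    S = np.eye(3)
    S[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against an improper (reflected) solution
    R = Vt.T @ S @ U.T
    t = mu_p - R @ mu_m
    return R, t                                    # 6DOF pose of the marker geometry
```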

In order to evaluate the developed localization method, the marker geometries were mounted on a 6-axis industrial robot (Viper s850, Adept Technology Inc., Livermore, CA) with a precision of 0.03 mm reported by the manufacturer. Image pairs at 150 random positions per marker geometry were acquired within the tracking volume. The tracking accuracy of our method and the CyberKnife’s on-board tracking software was evaluated by comparing the calculated marker locations to the known robot positions.
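For the comparison against the robot ground truth, the error metrics can be computed as sketched below: the translational error as the Euclidean distance between positions and the rotational error as the angle of the residual rotation. This is a generic formulation with names chosen by us; the paper only reports the resulting mean values.

```python
import numpy as np

def pose_errors(R_est, t_est, R_ref, t_ref):
    """Translational and rotational error of an estimated marker pose (R_est, t_est)
    with respect to the known robot reference pose (R_ref, t_ref)."""
    trans_err = np.linalg.norm(t_est - t_ref)                    # mm

    R_delta = R_est @ R_ref.T                                    # residual rotation
    cos_angle = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    rot_err = np.degrees(np.arccos(cos_angle))                   # rotation angle in degrees
    return trans_err, rot_err
```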

3 Results

The mean reprojection error of the geometric camera calibration is 0.09 mm and 0.12 mm for X-ray tubes A and B, respectively. The optimized focal lengths are 2.66 m (A) and 2.68 m (B). The rotation matrix R between the projection centres A and B nearly equals the identity matrix, i.e. the camera geometry can be considered rectified.

The developed calibration method makes use of the raw image data, while the on-board CyberKnife software pre-processes the images to mimic an image plane with 512 × 512 pixels perpendicular to the X-ray beam before tracking. By using the full detector information (1024 × 1024 pixels), the newly developed method increases the tracking resolution by 28–32% horizontally and almost 80% vertically (see Figure 4). Additionally, the tracking volume is more than doubled, from approximately 3800 cm³ with the on-board software to more than 8000 cm³ with our extended tracking framework.

Figure 4: Our method using stereo-calibration covers a larger tracking volume with higher resolution (UPA) compared with the processed X-ray images used by the CyberKnife software (PA).

Using the unprocessed data for tracking and the 20 mm marker leads to a mean translational error of 0.22 mm compared with 0.20 mm for the on-board localization (see Table 1). The translational accuracy for the larger marker is slightly lower with 0.37 mm for the CyberKnife software and 0.38 mm for the extended method. Moreover, the rotational error is reduced from 0.21° to 0.05° and from 0.27° to 0.08° using the developed technique for the 50 mm and 20 mm markers, respectively.

Table 1: Mean geometric tracking accuracy of the on-board CyberKnife software and the extended method.

4 Discussion and conclusion

This study demonstrates that customized marker geometries can be localized with high accuracy using the raw image data of the CyberKnife’s integrated X-ray system. Our extended tracking framework is based on the geometric calibration of a stereoscopic X-ray setup with modified camera calibration tools. The low reprojection error and the convergence behaviour suggest that the calibration procedure was successful. Moreover, the algorithms are highly generalizable and could likely be adapted to different camera setups.

The success of the geometric calibration is further validated by the marker localization results, which are well in accordance with the values calculated by the CyberKnife and with literature values [8]. Moreover, the rotational accuracy was improved by a factor of four, staying below 0.1°, which is especially important when the marker pose is used for the coordinate transformation of a distant target location in ultrasound.

One of the major advantages of the extended tracking framework is the ability to increase the tracking resolution and volume size substantially without any change to the existing X-ray system. The CyberKnife’s internal image processing reduces the raw images to a smaller subimage with decreased resolution before localizing the targets. This was most likely implemented to imitate earlier versions of the system (G1–G3), in which the X-ray detectors were mounted on stands above the ground, perpendicular to the X-ray beam [8]. Our calibration-based approach models the current setup and incorporates the full detector area and image content. This leads to an increased tracking resolution, which is also noticeable in the vertical translational accuracy. Simultaneously, the tracking volume was more than doubled, which increases the range of detectable marker locations and makes full use of the applied X-ray dose. A more efficient use of imaging dose is of direct benefit for the patient and is in accordance with the central ALARA principle in radiation protection (As Low As Reasonably Achievable).

While this study only covered the first localization step, we have already demonstrated the ability of the 4D ultrasound system to track 3D structures in real time with high accuracy [9]. The combination of both steps, i.e. probe localization in X-ray images and target detection in ultrasound volumes, has recently been tested in a phantom experiment [10]. In this pilot study, the ultrasound signal was used as the continuous signal component of the CyberKnife’s correlation model, substituting the superficial LED markers, and achieved results comparable to the conventional model.

In summary, we have shown that geometric calibration of a stereo X-ray system leads to an improved tracking framework with similar accuracy, a higher resolution and a larger field of view compared to the current on-board solution. The extended tracking method allows for accurate probe localization, thus paving the way for integration of real-time 4D ultrasound into existing radiosurgery systems.

Author’s Statement

Research funding: The authors state no funding involved. Conflict of interest: The authors state no conflict of interest. Material and methods: Informed consent: Informed consent is not applicable. Ethical approval: The conducted research is not related to either human or animal use.

References

[1] Shirato H, Seppenwoolde Y, Kitamura K, Onimura R, Shimizu S. Intrafractional tumor motion: lung and liver. Semin Radiat Oncol. 2004;14:10–8.

[2] Schweikard A, Glosser G, Bodduluri M, Murphy MJ, Adler JR. Robotic motion compensation for respiratory movement during radiosurgery. Comput Aided Surg. 2000;5:263–77.

[3] Bell MA, Byram BC, Harris EJ, Evans PM, Bamber JC. Liver tracking with a high volume rate 4D ultrasound scanner and a 2D matrix array probe. Phys Med Biol. 2012;57:1359–74.

[4] Zhang Z. A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell. 2000;22:1330–34.

[5] Bradski G, Kaehler A. Learning OpenCV. 1st ed. Köln: O’Reilly Media; 2008.

[6] Hartley R, Zisserman A. Multiple view geometry in computer vision. 2nd ed. Cambridge: Cambridge University Press; 2004.

[7] Horn BKP. Closed-form solution of absolute orientation using unit quaternions. J Opt Soc Am A. 1987;4:629–42.

[8] Fu D, Kuduvalli G. A fast, accurate, and automatic 2D-3D image registration for image-guided cranial radiosurgery. Med Phys. 2008;35:2180–94.

[9] Bruder R, Ernst F, Schläfer A, et al. A framework for real-time target tracking in IGRT using three-dimensional ultrasound. Proc 25th Int Congress on Computer Assisted Radiology and Surgery (CARS). 2011:306–7.

[10] Blanck O, Jauer P, Ernst F, et al. Pilot-Phantomstudie zur ultraschallgeführten robotergestützten Radiochirurgie [Pilot phantom study on ultrasound-guided robot-assisted radiosurgery]. In: 44. Jahrestagung der DGMP; 2013:122–23.

About the article

Published Online: 2016-09-30

Published in Print: 2016-09-01


Citation Information: Current Directions in Biomedical Engineering, Volume 2, Issue 1, Pages 409–413, ISSN (Online) 2364-5504, DOI: https://doi.org/10.1515/cdbme-2016-0091.


©2016 Svenja Ipsen et al., licensee De Gruyter. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).
