Target tracking accuracy and latency with different 4D ultrasound systems – a robotic phantom study

Svenja Ipsen 1, Sven Böttger 1, Holger Schwegmann 1,2, and Floris Ernst 1
  • 1 Institute for Robotics and Cognitive Systems, Universität zu Lübeck, Lübeck, Germany
  • 2 Institute for Medical Engineering, Universität zu Lübeck, Lübeck, Germany
  • Corresponding author: Svenja Ipsen, Institute for Robotics and Cognitive Systems, Universität zu Lübeck, Lübeck, Germany

Abstract

Ultrasound (US) imaging, in contrast to other image guidance techniques, offers the distinct advantage of providing volumetric image data in real-time (4D) without using ionizing radiation. The goal of this study was to perform the first quantitative comparison of three different 4D US systems with fast matrix array probes and real-time data streaming regarding their target tracking accuracy and system latency. Sinusoidal motion of varying amplitudes and frequencies was used to simulate breathing motion with a robotic arm and a static US phantom. US volumes and robot positions were acquired online and stored for retrospective analysis. A template matching approach was used for target localization in the US data. Target motion measured in US was compared to the reference trajectory performed by the robot to determine localization accuracy and system latency. Using the robotic setup, all investigated 4D US systems could detect a moving target with sub-millimeter accuracy. However, high system latency in particular increased tracking errors substantially and should be compensated with prediction algorithms when the systems are used for respiratory motion compensation.

Introduction

Ultrasound (US) imaging is a fast, low-risk imaging modality with excellent soft tissue contrast that does not rely on ionizing radiation. It is therefore an ideal candidate for image guidance of therapeutic interventions such as needle biopsies, tissue ablations or radiotherapy. Most US guidance systems to date are limited to 2D images, mainly due to the accessibility of systems and data. However, US imaging also offers the unique capability to acquire true volumetric data in real-time (4D US) – a distinct advantage compared to other imaging modalities, especially when monitoring dynamic tissue or instrument motion, which is often affected by 3D translation, rotation and even deformation [1].

In radiotherapy, the only 4D US guidance system in clinical use [2] utilizes a US probe with mechanical image acquisition and thus limited temporal resolution. While this system has been shown to be suitable for monitoring slow-moving structures such as the prostate [3] or the liver during deep inspiration breath-hold [4], the slow acquisition speed prohibits tracking of fast motion, such as breathing or pulsation, in real-time [5]. Matrix array probes, on the other hand, do not rely on mechanical parts and can therefore achieve much higher volumetric framerates [6]. With previous studies showing the important role of low latency for high-accuracy motion compensation [7], [8], matrix array systems are ideally suited for image guidance tasks. Several systems have already been investigated in isolated proof-of-concept studies, e.g., [9], [10], yet a quantitative comparison of different potentially suitable systems – especially regarding their real-time tracking capabilities – has not been conducted to date.

The aim of this study is to assess and compare three different 4D US systems which can be accessed via real-time streaming interfaces, the prerequisite for online image guidance. The quantitative comparison between these systems could potentially reveal their individual strengths as well as indicate the relevance of different parameters for real-time tracking performance.

Methods

For the quantitative comparison of different 4D US systems regarding their target localization accuracy and overall latency, a dynamic phantom study was conducted.

Experimental setup

Three 4D US systems with matrix array probes, shown in Figure 1, were included in this study: (1) Vivid 7 with 3V-D probe (GE Healthcare), (2) Vivid E95 with 4Vc-D probe (GE Healthcare) and (3) Epiq 7 with X6-1 probe (Philips Healthcare). The respective volume sizes were chosen to approximate the typical clinical setting of 58° × 45° reported in [3] as closely as possible, depending on the available system settings (see Table 1).

Figure 1:

The three matrix array probes used in this study and central volume slices acquired in a US phantom. Comparable volume sizes at high (‘R’) and low (‘S’) spatial resolution settings were used (not possible for Vivid 7 system).


Table 1:

Acquisition settings, framerates, volume & voxel sizes in lateral, elevational and beam direction for the three 4D US systems.

System      Acquisition setting (framerate)   Volume size (16 cm depth)   Voxel size, mm³
Vivid 7     n/a (17 Hz)                       60° × 25°                   1.0 × 1.0 × 1.0
Vivid E95   R (8 Hz)                          60° × 40°                   1.0 × 1.0 × 0.9
Vivid E95   S (38 Hz)                         60° × 40°                   1.0 × 1.0 × 0.9
Epiq 7      R (3 Hz)                          60° × 45°                   0.6 × 1.0 × 0.6
Epiq 7      S (9 Hz)                          60° × 45°                   1.0 × 1.4 × 0.7

Two different acquisition settings (see Figure 1) were used to investigate the impact of spatiotemporal resolution: (R) high spatial resolution, low acquisition speed, and (S) low resolution, high speed. Vivid 7 only offered a single setting. All systems featured software interfaces (Vivid 7: custom, E95 & Epiq 7: provided by manufacturer) for streaming the volumetric US data via Ethernet to an external computer.

As shown in Figure 2, the US probes were attached to a collaborative robotic arm (Panda, Franka Emika) with seven degrees of freedom. The probes were moved relative to a static US phantom (CIRS Inc., see Figures 1 and 2) using four sinusoidal motion types with amplitude A and period length T: I) A=10 mm, T=2.5 s; II) A=10 mm, T=5.0 s; III) A=35 mm, T=2.5 s; IV) A=35 mm, T=5.0 s, covering a typical liver motion range [11]. The main axis of motion was in lateral direction relative to the US probes.
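For illustration, the sinusoidal reference trajectories can be generated as in the following minimal Python sketch. This is not the control code used in the study; the 100 Hz sampling rate (matching the robot logging rate given below), the interpretation of A as peak amplitude and the purely one-dimensional formulation are assumptions made for illustration only.

```python
import numpy as np

def sinusoidal_trajectory(amplitude_mm, period_s, duration_s=60.0, rate_hz=100.0):
    """Reference motion x(t) = A * sin(2*pi*t / T) along the main motion axis."""
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    return t, amplitude_mm * np.sin(2.0 * np.pi * t / period_s)

# Motion types I-IV from the experiment: (amplitude A in mm, period T in s)
motion_types = {"I": (10, 2.5), "II": (10, 5.0), "III": (35, 2.5), "IV": (35, 5.0)}
trajectories = {name: sinusoidal_trajectory(A, T) for name, (A, T) in motion_types.items()}
```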

Figure 2:

Experimental setup with robotic arm moving a US probe over a phantom with four different trajectories (red arrows indicate main motion direction). US data and robot positions were stored in real-time and analyzed retrospectively.


Data acquisition and analysis

Robot positions and US volumes were acquired for a duration of 60 s and stored on an external computer for retrospective analysis, using a common clock to generate synchronized timestamps. Robot positions were logged at 100 Hz.
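As a rough illustration of the common-clock idea (not the acquisition software actually used in this study), both data streams can be stamped against a single monotonic clock on the receiving computer; robot_pose and us_volume in the commented lines are hypothetical placeholders.

```python
import time

class TimestampedLog:
    """Stores (timestamp, payload) pairs stamped against one shared clock."""
    def __init__(self):
        self.samples = []

    def add(self, payload):
        # A monotonic clock is unaffected by wall-clock adjustments and can
        # therefore serve as the common time base for both data streams.
        self.samples.append((time.monotonic(), payload))

robot_log, us_log = TimestampedLog(), TimestampedLog()
# robot_log.add(robot_pose)  # called at ~100 Hz by the robot interface
# us_log.add(us_volume)      # called whenever a streamed US volume arrives
```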

The position of a spherical target embedded within the US phantom, shown in Figure 1, was detected in the US data with template matching (mean processing time 100 ms/volume, unoptimized Matlab implementation). Robot and US motion traces were then spatially aligned using a rigid landmark transform to explicitly exclude the effects of calibration inaccuracies on tracking performance. The robot positions were defined as ground truth for the tracking experiments, and the mean root-mean-square error (RMSE) between corresponding data points was calculated for two scenarios: (i) excluding errors from system latency (‘best case’), i.e., with the latency between robot and US compensated by temporally shifting the US trace, and (ii) including errors from latency, i.e., the overall tracking accuracy.
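The template matching itself was implemented in Matlab; the Python sketch below merely illustrates the principle with a brute-force sum-of-squared-differences (SSD) search in a local window around the previous target position. The SSD criterion, the search window size and the function name are illustrative assumptions and are not taken from the original implementation; the subsequent rigid landmark alignment is omitted here.

```python
import numpy as np

def match_template_ssd(volume, template, prev_idx, search=5):
    """Locate a 3D template in a US volume by exhaustive SSD search.

    prev_idx : (z, y, x) corner index of the best match in the previous volume.
    search   : half-width of the search window in voxels (assumed value).
    Returns the corner index of the best match in the current volume.
    """
    tz, ty, tx = template.shape
    best_cost, best_idx = np.inf, prev_idx
    for dz in range(-search, search + 1):
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                z, y, x = prev_idx[0] + dz, prev_idx[1] + dy, prev_idx[2] + dx
                if z < 0 or y < 0 or x < 0:
                    continue  # window would start outside the volume
                patch = volume[z:z + tz, y:y + ty, x:x + tx]
                if patch.shape != template.shape:
                    continue  # window would end outside the volume
                cost = np.sum((patch.astype(np.float64) - template) ** 2)
                if cost < best_cost:
                    best_cost, best_idx = cost, (z, y, x)
    return best_idx  # multiply by the voxel size (Table 1) to obtain mm
```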

Results and discussion

Continuous monitoring of target motion resembling a respiratory pattern was feasible with all investigated 4D US systems and acquisition settings. Figure 3 shows a typical target trajectory and the corresponding RMSE over time. Compared to the relatively smooth robot trajectory, the target trajectories from US follow discrete steps, mainly due to the limited temporal resolution but also the finite voxel size of the US volume data (see Table 1), causing the saw-tooth shape of the RMSE. Using a sub-voxel matching approach could reduce this effect, albeit at the cost of higher runtime.
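One common sub-voxel refinement (shown here only as an illustration, not as part of the study) fits a parabola through the matching cost at the discrete optimum and its two neighbours along each axis and uses the parabola's vertex as a fractional offset.

```python
def parabolic_subvoxel_offset(c_minus, c_zero, c_plus):
    """Fractional offset of the cost minimum from three samples at -1, 0, +1 voxels."""
    denom = c_minus - 2.0 * c_zero + c_plus
    if denom == 0.0:
        return 0.0  # degenerate (flat) cost profile
    return 0.5 * (c_minus - c_plus) / denom

# Example: costs 4.0, 1.0, 2.0 at voxels i-1, i, i+1 yield an offset of +0.25,
# i.e., the refined position lies a quarter voxel towards the cheaper neighbour.
```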

Figure 3:

Target positions extracted from US (blue) and robotic motion trace (black) after temporal and spatial alignment (with enlarged section for clarity). The differences between both signals are shown in red (here: mean RMSE=0.7 mm).


The quantitative results, summarized in Table 2, show that target localization in 4D US is possible with sub-millimeter accuracy if system latency is compensated, confirming prior studies on US image guidance in radiation therapy (e.g., [5], [12]). With latency compensated, Vivid 7 achieved the lowest mean RMSE of 0.6 ± 0.1 mm, followed by 0.7 ± 0.3 mm (Epiq 7) and 0.9 ± 0.5 mm (E95). The following sections discuss the individual impact of system latency, motion type and image quality on the overall tracking accuracy.

Table 2:

RMSE with & without latency compensation for different acquisition settings (R – high spatial resolution, S – low).

System    Acqu. setting   Latency, ms   RMSE, mm (lat. comp.)   RMSE, mm (no lat. comp.)
Vivid 7   n/a             199 ± 8       0.6 ± 0.1               2.8 ± 1.6
E95       R               343 ± 22      0.9 ± 0.5               4.5 ± 2.7
E95       S               207 ± 35      1.1 ± 0.6               2.7 ± 1.5
Epiq 7    R               487 ± 13      1.3 ± 0.7               6.5 ± 4.1
Epiq 7    S               211 ± 26      0.7 ± 0.3               2.4 ± 1.6

Impact of latency

Overall system latency includes the acquisition and reconstruction time of the US volume as well as the data transfer over the network to the control PC and the conversion into Cartesian coordinates. The impact of latency was assessed by comparing tracking errors (i) with and (ii) without latency compensation. Using the retrospectively aligned traces, (i) was calculated after removing the temporal offset between the robot and US trajectories, while (ii) was based on the original trajectories and timestamps.
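A minimal sketch of this evaluation step is given below for illustration; it is not the analysis code used in the study. It assumes that both traces have already been spatially aligned and resampled to a common, evenly spaced 100 Hz time base, and it estimates the temporal offset via cross-correlation, which is one possible implementation rather than the original procedure. The variables us_trace and robot_trace in the commented lines are hypothetical placeholders.

```python
import numpy as np

def estimate_lag_samples(us, robot):
    """Number of samples by which the US trace lags the robot trace
    (both traces resampled to the same evenly spaced time base)."""
    us = np.asarray(us, dtype=float) - np.mean(us)
    robot = np.asarray(robot, dtype=float) - np.mean(robot)
    corr = np.correlate(us, robot, mode="full")
    return int(np.argmax(corr)) - (len(robot) - 1)

def rmse(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

# lag = estimate_lag_samples(us_trace, robot_trace)   # e.g. 20 samples = 200 ms at 100 Hz
# rmse_ii = rmse(us_trace, robot_trace)               # (ii) latency not compensated
# rmse_i = rmse(us_trace[lag:], robot_trace[:len(robot_trace) - lag])  # (i) compensated, lag > 0
```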

Latency had by far the strongest impact on US tracking accuracy in this study, with Vivid 7 achieving the lowest overall latency and the lowest RMSE. With latency left uncompensated (ii), Vivid 7’s RMSE increased by 340% to 2.8 mm. Comparable latency values of ∼200 ms (resolution setting ‘S’) caused a mean RMSE increase of 150% (2.7 mm) and 200% (2.37 mm) in E95 and Epiq 7, respectively. For slower acquisitions (‘R’), mean latencies of 340 ms/485 ms and error increases of up to 370%/395% were measured for E95 and Epiq 7, clearly highlighting the central role of latency in tracking scenarios.

It was also observed that, despite having the fastest raw volumetric framerate of the three systems (38 Hz, setting ‘S’), the E95 only reached 16 Hz via network streaming in this study. This reduction and the additional latency were likely caused by a combination of network transfer delays, the event handling in our custom control software and the integrated Cartesian conversion of the volume data in the proprietary streaming software, which requires further investigation. While a custom streaming interface (as developed for the Vivid 7 [13]) offers the advantage of being completely accessible and freely modifiable for optimization, it also has the clear disadvantage of not being an official product endorsed by the manufacturer.

Impact of motion type

Aside from system latency, motion amplitude and speed also substantially affected US tracking accuracy. The detailed results illustrated in Figure 4 show that slow target motion with a low amplitude (motion type ‘II’) led to lower RMSE values compared to fast, high-amplitude motion (type ‘III’) for all systems.

Figure 4:

Distribution of RMSE between spatially aligned trajectories for the two extreme motion types with and without compensating for the latency between robot (ground truth) and US tracking. Note the different y-scale in the right subplot.


A more detailed analysis showed that increasing A from 10 to 35 mm led to a rise in RMSE of between 25 and 215%, while decreasing T from 5.0 to 2.5 s (i.e., increasing the target speed) led to an increase in tracking errors of 20–80%. In both cases, the RMSE increase was higher when the US framerate was lower, since more target motion is missed between frames.
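For reference, a rise in RMSE of X% denotes the relative increase with respect to the slower or smaller-amplitude motion type, which can be written as

\[ \Delta_{\mathrm{rel}} = \left( \frac{\mathrm{RMSE}_{\mathrm{large}\,A\ \mathrm{or\ small}\,T}}{\mathrm{RMSE}_{\mathrm{small}\,A\ \mathrm{or\ large}\,T}} - 1 \right) \times 100\,\% , \]

so that, for example, a rise from 1.0 mm to 2.0 mm corresponds to an increase of 100%.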

Impact of image quality

Image quality did not affect tracking accuracy substantially in this phantom-based study. The highest tracking accuracy was achieved by the oldest of the three systems (Vivid 7), despite the superior image quality of Epiq 7 indicated in a prior study [6]. At the same time, the experiments conducted with the low spatial resolution setting (‘S’) generally outperformed the high-resolution results (‘R’), indicating that latency played a much bigger role than image quality. However, it should be noted that image quality might have a stronger impact in a more realistic anatomical tracking scenario with real patients.

Conclusion

This is the first study to directly compare different 4D US systems regarding their tracking performance using real-time streaming interfaces for volumetric data acquisition. In our phantom experiments, sub-millimeter tracking accuracy could be achieved with all investigated systems, even for fast target motion. Higher overall latency (i.e., lower US framerates), faster target motion and larger motion amplitudes increased the tracking error by up to almost 400%. Since system latency showed the most pronounced impact on tracking errors, especially for fast motion with high amplitudes, 4D US guidance should ideally be combined with prediction algorithms to account for latency-induced errors, especially when compensating for fast respiratory or pulsatory motion.

Research funding: The following funding sources were involved in this study: BMBF grant ‘NavEVAR’ (Epiq 7), MWVATT of the State of S-H (E95, Panda).

Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

Competing interests: The 4Vc-D probe was provided by GE Healthcare.

Informed consent: n/a.

Ethical approval: The conducted research is not related to either human or animal use.

References

  1. Fenster, A, Bax, J, Neshat, H, Kakani, N, Romagnoli, C. 3D ultrasound imaging in image-guided intervention. In: Gunarathne, G, editor. Advancements and breakthroughs in ultrasound imaging. Rijeka, Croatia: InTech; 2013:1–26.

  2. Lachaine, M, Falco, T. Intrafractional prostate motion management with the clarity autoscan system. Med Phys Int 2013;1:72–80.

  3. Grimwood, A, McNair, HA, O’Shea, TP, Gilroy, S, Thomas, K, Bamber, JC, et al. In vivo validation of Elekta’s clarity autoscan for ultrasound-based intrafraction motion estimation of the prostate during radiation therapy. Int J Radiat Oncol Biol Phys 2018;102:912–21. https://doi.org/10.1016/j.ijrobp.2018.04.008.

  4. Vogel, L, Sihono, DSK, Weiss, C, Lohr, F, Stieler, F, Wertz, H, et al. Intra-breath-hold residual motion of image-guided DIBH liver-SBRT: an estimation by ultrasound-based monitoring correlated with diaphragm position in CBCT. Radiother Oncol 2018;129:441–8. https://doi.org/10.1016/j.radonc.2018.07.007.

  5. Harris, EJ, Miller, NR, Bamber, JC, Symonds-Tayler, JRN, Evans, PM. The effect of object speed and direction on the performance of 3D speckle tracking using a 3D swept-volume ultrasound probe. Phys Med Biol 2011;56:7127–43. https://doi.org/10.1088/0031-9155/56/22/009.

  6. Ipsen, S, Bruder, R, García-Vázquez, V, Schweikard, A, Ernst, F. Assessment of 4D ultrasound systems for image-guided radiation therapy – image quality, framerates and CT artifacts. Curr Dir Biomed Eng 2019;5:245–8. https://doi.org/10.1515/cdbme-2019-0062.

  7. Poulsen, PR, Cho, B, Sawant, A, Ruan, D, Keall, PJ. Detailed analysis of latencies in image-based dynamic MLC tracking. Med Phys 2010;37:4998–5005. https://doi.org/10.1118/1.3480504.

  8. Lediju-Bell, MA, Byram, BC, Harris, EJ, Evans, PM, Bamber, JC. In-vivo liver tracking with a high volume rate 4D ultrasound scanner and a 2D matrix array probe. Phys Med Biol 2012;57:1359–74. https://doi.org/10.1088/0031-9155/57/5/1359.

  9. Schlosser, J, Gong, RH, Bruder, R, Schweikard, A, Jang, S, Henrie, J, et al. Robotic intrafractional US guidance for liver SABR: system design, beam avoidance, and clinical imaging. Med Phys 2016;43:5951–63. https://doi.org/10.1118/1.4964454.

  10. De Luca, V, Banerjee, J, Hallack, A, Kondo, S, Makhinya, M, Nouri, D, et al. Evaluation of 2D and 3D ultrasound tracking algorithms and impact on ultrasound-guided liver radiotherapy margins. Med Phys 2018;45:4986–5003. https://doi.org/10.1002/mp.13152.

  11. Korreman, SS. Motion in radiotherapy: photon therapy. Phys Med Biol 2012;57:R161–91. https://doi.org/10.1088/0031-9155/57/23/r161.

  12. Ipsen, S, Bruder, R, Worm, ES, Hansen, R, Poulsen, PR, Høyer, M, et al. Simultaneous acquisition of 4D ultrasound and wireless electromagnetic tracking for in-vivo accuracy validation. Curr Dir Biomed Eng 2017;3:75–8. https://doi.org/10.1515/cdbme-2017-0016.

  13. Bruder, R, Ernst, F, Schläfer, A, Schweikard, A. A framework for real-time target tracking in IGRT using three-dimensional ultrasound. In: Proceedings 25th international congress computer assisted radiology surgery. Berlin, Germany: Int. J. Comput. Assist. Radiol. Surg.; 2011:306–7.


