
Current Directions in Biomedical Engineering

Joint Journal of the German Society for Biomedical Engineering in VDE and the Austrian and Swiss Societies for Biomedical Engineering




Open Access. ISSN (Online) 2364-5504.

Using smart glasses for ultrasound diagnostics

Stefan Maas (corresponding author), Marvin Ingler, Heinrich Martin Overhoff

Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
Published Online: 2015-09-12 | DOI: https://doi.org/10.1515/cdbme-2015-0049

Abstract

Ultrasound has been established as a diagnostic tool in a wide range of applications. Especially for beginners, the alignment of sectional images to the patient’s spatial anatomy can be cumbersome. A direct view onto the patient’s anatomy while viewing ultrasound images may help to overcome an unergonomic examination.

To solve these issues, an affordable augmented reality system using smart glasses was created that displays a (virtual) ultrasound image beneath a (real) ultrasound transducer.

Keywords: smart glasses; ultrasound; diagnostic

1 Introduction

Ultrasound has been established as a diagnostic tool in a wide range of applications. But especially for beginners, the alignment of sectional images to the patient’s spatial anatomy can be cumbersome. The examining physician has to switch his/her view between the screen and the patient frequently, which impedes an undisturbed diagnosis. A direct view onto the patient’s anatomy while viewing ultrasound images may help to overcome an unergonomic examination.

Augmented reality in medicine has been presented since the 1990s [1, 2]. Still, existing solutions suffer from high cost, owing to immense technical requirements, and from unergonomic designs [3, 4]. To solve these issues, an affordable augmented reality system using smart glasses was created that displays a (virtual) ultrasound image beneath a (real) ultrasound transducer.

2 Methods

The hardware used consists of an ultrasound device (Esaote MyLab70 XVG) including a linear 2-D ultrasound transducer, a PC (Intel Core2Duo, 2.6 GHz CPU, GeForce GTX 560) and smart glasses (Epson Moverio BT-200) including a smartphone (TI OMAP 4460, 1.2 GHz, 1 GB RAM) and a built-in camera.

As illustrated in Figure 1, the PC collects raw data via Ethernet from the ultrasound device, recalculates the image data and holds these data available as byte arrays. Subsequently, the smartphone collects the byte arrays via WLAN and reconstructs the ultrasound image. An integrated camera enables the smart glasses to capture images of the transducer and the optical marker on it. These images are sent to the smartphone, where the position of the optical marker is tracked. The ultrasound images are scaled and, if necessary, rotated to the correct viewing angle. Finally, the smart glasses display the (virtual) image at the marker position beneath the (real) ultrasound transducer (see Figure 2).

Figure 1: Augmented reality ultrasound data flow.
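The transfer step in Figure 1 can be sketched as follows. The actual implementation is in C#; this Python sketch is only illustrative, and the length-prefixed framing, the 4-byte header and all function names are assumptions for the example, not the authors’ documented protocol.

```python
import struct

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before frame was complete")
        buf += chunk
    return buf

def receive_frame(sock):
    """Read one length-prefixed ultrasound frame (assumed 4-byte big-endian header)."""
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, length)

def bytes_to_image(data, width, height):
    """Reshape the flat grayscale byte array into rows of pixel values."""
    return [list(data[r * width:(r + 1) * width]) for r in range(height)]
```

A length prefix of this kind lets the client split the continuous WLAN stream back into individual frames before reshaping them for display.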

Figure 2: View through the smart glasses. On top, the ultrasound transducer with the attached optical marker; below, the corresponding ultrasound image.

The system needs no initial calibration on startup because the client knows the fixed marker position on the transducer. The image position and size are calculated from the position and size of the marker.
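The placement logic described above can be sketched as follows. The marker edge length, image extents and offset below the transducer are hypothetical values chosen for illustration, not the calibrated parameters of the authors’ system.

```python
def place_image(marker_cx, marker_cy, marker_px,
                marker_mm=40.0, image_mm_w=38.0, image_mm_h=76.0,
                offset_mm=10.0):
    """Derive on-screen position and size of the ultrasound image from the
    detected marker; all *_mm values are assumed example parameters."""
    # the marker's known physical size yields the pixels-per-millimetre scale
    px_per_mm = marker_px / marker_mm
    width = image_mm_w * px_per_mm
    height = image_mm_h * px_per_mm
    # centre the image horizontally on the marker, draw it below the transducer
    x = marker_cx - width / 2
    y = marker_cy + offset_mm * px_per_mm
    return x, y, width, height
```

Because the marker geometry on the transducer is fixed and known, this scale factor can be recovered from every camera frame, which is why no startup calibration is required.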

The server, running on the PC, was implemented in C#. The client on the smartphone consists of C# scripts and an app created with Unity3D and the Qualcomm Augmented Reality SDK “Vuforia”. While the scripts manage the communication between server and client, the app creates the ultrasound images, detects the optical marker and places the images with the calculated rotation and scale at the correct position.

3 Results

The ultrasound images can be visualized inside the smart glasses, as seen in Figure 2, at about 25 fps and a resolution of 128 × 256 pixels. Detecting the optical marker is possible within a viewing angle between about 70 and 110 degrees. When the transducer is moved, the image follows without noticeable delay.

4 Conclusion and outlook

Smart glasses can in principle be used for ultrasound examinations. Such systems are ergonomic and low-cost due to their minor technical requirements. The frame rate and resolution are adequate for most applications. However, the detection of the optical marker needs to be improved to realize wider viewing angles.

Current work is directed at the implementation of faster data transfer algorithms to realize higher resolutions and frame rates.

Funding

This work was funded by the Landesregierung Nordrhein-Westfalen in the Med in.NRW program, grant no. GW01-078.

References

  • [1]

    Y. Sato, M. Nakamoto, Y. Tamaki, T. Sasama, I. Sakita, Y. Nakajima, M. Monden and S. Tamura. Image Guidance of Breast Cancer Surgery Using 3-D Ultrasound Images and Augmented Reality Visualization. IEEE Transactions on Medical Imaging 1998; 17: 681-693. Google Scholar

  • [2]

    M. Bajura, H. Fuchs and R. Ohbuchi. Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient. Computer Graphics 1992; 26: 203-210. Google Scholar

  • [3]

    D. Magee, Y. Zhu, R. Ratnalingam, P. Gardner and D. Kessel. An augmented reality simulator for ultrasound guided needle placement training. Medical & Biological Engineering & Computing 2007; 45: 957-967. Google Scholar

  • [4]

    M. Nakamoto, Y. Sato, M. Miyamoto, Y. Nakamjima, K. Konishi, M. Shimada, M. Hashizume and S. Tamura, “3D Ultrasound System Using a Magneto-optic Hybrid Tracker for Augmented Reality Visualization in Laparoscopic Liver Surgery,” Medical Image Computing and Computer-Assisted Intervention MICCAI 2002: 148-155 Google Scholar

About the article

Published Online: 2015-09-12

Published in Print: 2015-09-01


Author's Statement

Conflict of interest: Authors state no conflict of interest. Material and methods: Informed consent: Informed consent has been obtained from all individuals included in this study. Ethical approval: The research related to human use complies with all relevant national regulations and institutional policies, was performed in accordance with the tenets of the Helsinki Declaration, and has been approved by the authors’ institutional review board or equivalent committee.


Citation Information: Current Directions in Biomedical Engineering, Volume 1, Issue 1, Pages 196–197, ISSN (Online) 2364-5504, DOI: https://doi.org/10.1515/cdbme-2015-0049.


© 2015 by Walter de Gruyter GmbH, Berlin/Boston.
