Abstract
Robotic devices are becoming increasingly available in the clinic; one example is the motorized surgical microscope. While there are different scenarios for using such devices for autonomous tasks, simple and reliable interaction with the device is key to acceptance by surgeons. We study how gesture tracking can be integrated into the setup of a robotic microscope. In our setup, a Leap Motion Controller is used to track hand motion and to adjust the field of view accordingly. We demonstrate in a survey that moving the field of view along a specified course is possible even for untrained subjects. Our results indicate that touch-less interaction with robots carrying small, near-field gesture sensors is feasible and can be useful in clinical scenarios where robotic devices are used in direct proximity to patient and physicians.
1 Introduction
Interaction with robotic assistance devices is particularly interesting in medical applications, where physicians typically have to control a number of different tools and instruments. Examples include robotic cameras and microscopes. Operating microscopes, for instance, are frequently used in neurosurgery, where the physician sits close to the patient’s head and looks at the surgical field through the microscope while performing precise surgery, e.g., along nerves. Given the small field of view, the clinical scenario typically involves frequent manual adjustments, i.e., grasping the handles and moving the microscope to the next region of interest. Clearly, removing the instruments from the surgical field, placing them aside, and manually repositioning the microscope interrupts the surgical workflow.
Different approaches for interaction with and control of devices in the operating room (OR) have been proposed. Conventionally, switches and pedals are widely used for control, e.g., of the patient couch and imaging systems. Limitations include the available space, the complexity of the control, the number of degrees of freedom that can be represented, and the need to maintain sterile surfaces. The latter is also an issue for touch interfaces. In contrast, voice control does not require changes to the clinical workflow [6, 8]. However, results have been mixed, particularly with respect to robust and fail-safe operation in an actual OR setting. In addition to voice control, switches, touch panels, and pointers have been studied to realize interaction with devices [3, 6]. An approach that has recently gained interest is gesture control [5, 9, 10]. Its advantages include touch-free interaction and simple integration.
While not widely used in clinical practice, a number of robotic surgical microscopes have been developed [2, 4, 7]. One possible use of these devices is to perform automated tasks like scanning resection cavities [1]. However, most of the time the microscope remains a tool in the hands of the physician, and improving the workflow has been another objective.
We consider gesture tracking for motion control of a robotic microscope. Using a Leap Motion Controller (LMC), we present a setup illustrating how gesture tracking could be integrated. We studied the feasibility and precision of moving the microscope using finger gestures. Our results indicate that tracking and motion control are feasible.
2 System setup
We propose a setup where the sensor is embedded in a surgical microscope. A clear advantage is the unobstructed view of the surgical field. However, the sensor needs to be small, which holds true for the LMC. The device primarily consists of two cameras and three infrared light-emitting diodes illuminating the scene. It is intended to track finger motion in a range of 25 to 600 mm above the device. Since all processing is done on a host computer, the device itself is small and lightweight, measuring just 75 mm by 25 mm by 6.2 mm. Another advantage is its high temporal resolution. The software interface provides information on hand gestures and can be configured to track different points, e.g., the centroid of the hand or a finger.
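To illustrate the software interface, the following minimal sketch shows how a tracked point could be read, assuming the legacy Leap Motion Python SDK (`Leap` module); the polling-based access and the choice between palm and fingertip are illustrative and do not necessarily match the configuration used in our setup.

```python
import Leap  # legacy Leap Motion SDK; the module name depends on the SDK version installed

def read_tracked_point(controller, use_finger=False):
    """Return the tracked point in the LMC frame (mm), or None if no hand is visible."""
    frame = controller.frame()
    if frame.hands.is_empty:
        return None
    hand = frame.hands[0]
    if use_finger and not hand.fingers.is_empty:
        point = hand.fingers[0].tip_position  # track a fingertip
    else:
        point = hand.palm_position            # track the centroid of the hand
    return (point.x, point.y, point.z)

if __name__ == "__main__":
    controller = Leap.Controller()
    # Poll the most recent frame; the SDK also offers a listener-based interface.
    print(read_tracked_point(controller))
```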

Figure 1: Our experimental setup with a) microscopy robot, b) USB microscope, c) Leap Motion Controller, d) hand phantom, e) hand motion robot, and f) track for manual motion experiments.
To assess the feasibility of tracking and gesture control, we realized a simplified setup consisting of a UR5 robotic arm (Universal Robots, Denmark), a USB microscope, and the Leap Motion Controller. A second robot (ABB IRB 120, Sweden) is used for some experiments to evaluate the tracking performance by moving a hand phantom with adjustable fingers. A computer with an Intel Xeon E3-1225 v3 CPU and 16 GB RAM running Windows 8.1 Professional is used to process the LMC data and to control the UR5. Figure 1 shows the setup.
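The exact interface for commanding the UR5 from the PC is not detailed here; a common option is to stream URScript commands over the controller's TCP interface. The following sketch is a minimal example under that assumption; the IP address and motion parameters are placeholders, not values from our setup.

```python
import socket

UR5_IP = "192.168.1.10"   # placeholder robot address
URSCRIPT_PORT = 30002     # UR secondary interface accepting URScript commands

def send_urscript(command: str) -> None:
    """Send a single URScript command to the UR5 controller."""
    with socket.create_connection((UR5_IP, URSCRIPT_PORT), timeout=2.0) as sock:
        sock.sendall((command + "\n").encode("ascii"))

# Example: a 10 mm relative move along the tool x-axis (URScript poses are in meters).
send_urscript(
    "movel(pose_trans(get_actual_tcp_pose(), p[0.01, 0, 0, 0, 0, 0]), a=0.2, v=0.05)"
)
```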
The LMC defines a coordinate frame with the x-axis along the long centerline, the z-axis along the short centerline, and the y-axis normal to both, forming a right-handed system (see Fig. 2). For the purpose of our experiments, the height of the microscope with respect to the base plane was not changed. The LMC and the microscope were placed in an adapter and mounted to the tool flange such that their coordinate axes were aligned. We did not need to obtain the actual transformation, in particular the translation, as all motions are relative to the last position of the microscope.
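Because the axes are aligned and all commands are relative, a tracked hand displacement can be mapped to a relative microscope motion by a simple scaling, as in the following sketch; the scale factor and dead-band threshold are illustrative assumptions, not values from our setup.

```python
import numpy as np

SCALE = 0.5        # assumed mapping from hand displacement to microscope displacement
DEADBAND_MM = 2.0  # assumed threshold to suppress tremor and tracking jitter

def relative_step(prev_point, curr_point):
    """Map a hand displacement in the LMC frame to a relative microscope motion (mm).

    Since the LMC axes are mounted parallel to the tool axes, the displacement can be
    used directly and only needs to be scaled; no full calibration is required.
    """
    delta = np.asarray(curr_point, dtype=float) - np.asarray(prev_point, dtype=float)
    if np.linalg.norm(delta) < DEADBAND_MM:
        return np.zeros(3)  # ignore very small motions
    return SCALE * delta
```

Each step could then be forwarded to the robot, e.g., with the URScript sketch above.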

Figure 2: The LMC coordinate system.
3 Experimental evaluation
In our analysis, we were primarily interested in the possibility of tracking motion for interaction with a robotic microscope. We studied two different scenarios. First, the actual microscopy setup was used, but the hand motion was mimicked with the hand phantom mounted to the second robot (compare Fig. 1). Second, a pattern denoting a corridor for a target trajectory was printed, and 14 test persons were asked to move the center of the microscope image along the pattern. The center was highlighted by a cross-hair, and the motion was controlled solely by moving the hand.
In the first experiment, the trajectories of the robot carrying the microscope and of the robot moving the hand phantom are compared. This is evaluated by calculating the root mean squared error (RMSE)

$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left\|\mathbf{p}^{\mathrm{mic}}_{i,xy}-\mathbf{p}^{\mathrm{hand}}_{i,xy}\right\|^{2}}$$

of the difference in the xy-plane between both TCP positions, where $\mathbf{p}^{\mathrm{mic}}_{i,xy}$ and $\mathbf{p}^{\mathrm{hand}}_{i,xy}$ denote the $i$-th of $N$ time-synchronized TCP positions of the microscope robot and the hand robot, respectively.
In the second experiment, the quotient

$$p_{\mathrm{quote}}=\frac{\Delta_{\mathrm{outside}}}{\Delta_{\mathrm{total}}}$$

between the distance $\Delta_{\mathrm{outside}}$ moved outside the course and the total moved distance $\Delta_{\mathrm{total}}$ is calculated for evaluation. In addition, the time the subjects need to finish the course is measured.
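For illustration, both metrics can be computed from sampled trajectories as in the following minimal sketch; the time synchronization of the two robots and the test whether a sample lies inside the course are simplified and assumed to be given.

```python
import numpy as np

def rmse_xy(microscope_tcp, hand_tcp):
    """RMSE of the xy-difference between time-synchronized TCP positions (N x 2 or N x 3 arrays)."""
    diff = np.asarray(microscope_tcp)[:, :2] - np.asarray(hand_tcp)[:, :2]
    return np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))

def p_quote(points, inside_course):
    """Fraction of the traveled path length that lies outside the course.

    points: N x 2 trajectory samples; inside_course: boolean flag per sample.
    """
    steps = np.linalg.norm(np.diff(np.asarray(points, dtype=float), axis=0), axis=1)
    outside = steps[~np.asarray(inside_course, dtype=bool)[:-1]].sum()
    total = steps.sum()
    return outside / total if total > 0 else 0.0
```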
During the experiments the room temperature was constant at approximately 22 degrees Celsius and the LMC and the UR5 robot worked under normal operating conditions.
4 Results
4.1 Motion tracking
We measured the motion tracking performance for squares of edge length 50 mm and 100 mm at speeds ranging from 5 mm/s to 20 mm/s. An example of a resulting trajectory is shown in Fig. 3. The RMSE between the TCP positions is listed in Table 1. The LMC was mounted such that its long centerline, and hence its x-axis, is parallel to the y-axis of the image, which is reflected in the larger errors in the y-direction of the example trajectory. Otherwise, the results follow the expected trend, with the RMSE increasing for higher speeds and larger squares.

Figure 3: Example of the trajectories resulting from motion tracking. The square has an edge length of 50 mm and the ABB robot moves at 5 mm/s.
Table 1: RMSE of the motion tracking.

| speed (mm/s) | edge length (mm) | RMSE (mm) |
| --- | --- | --- |
| 5 | 50 | 6.8183 |
| 10 | 50 | 6.9378 |
| 10 | 100 | 11.6893 |
| 20 | 50 | 8.9277 |
| 20 | 100 | 13.9469 |
4.2 Microscope survey
The survey was completed by 14 untrained participants, each performing one to three tries, with the majority completing two. The results are listed in Table 2, and an example of a resulting trajectory is shown in Fig. 4. Most of the subjects achieved very good results despite not being familiar with the system beforehand; only two participants could not achieve comparable results. The data indicate a steep learning curve, with most participants improving in the second run, either in the accuracy of their movement or in speed. In at least one try, 9 of 14 participants achieved p_quote < 0.1. As with motion tracking, we achieved a sampling frequency greater than 100 Hz.
5 Discussion
Our results indicate that the general tracking performance of the LMC using the standard software interface has to be carefully considered when using the device in clinical scenarios. The RMSE for the robotic hand motion was between 6.8 mm and 13.9 mm, although some delay in the control is also reflected in this measurement. Following the course, most test persons performed well, with 9 of 14 achieving a p_quote below 0.1 in at least one try. The results for both the robotic hand motion and the human trajectory following indicate that the errors are small enough to realize interactive motion control.

Figure 4: Course of the survey and an example of a trajectory with p_quote = 0.085 and a runtime of 21.2 s. The trajectory is shown in green while inside the course and in red while outside; the start is denoted by a blue dot.
Table 2: Results of the different participants in the survey.

| participant | try | p_quote | time (s) |
| --- | --- | --- | --- |
| subject 1 | 1 | 0.0207 | 61.2 |
|  | 2 | 0 | 45.5 |
|  | 3 | 0.0300 | 21.0 |
| subject 2 | 1 | 0.3891 | 37.9 |
|  | 2 | 0.1274 | 19.2 |
|  | 3 | 0.0415 | 20.5 |
| subject 3 | 1 | 0 | 31.6 |
|  | 2 | 0 | 34.8 |
| subject 4 | 1 | 0.0396 | 33.9 |
|  | 2 | 0.0526 | 45.2 |
|  | 3 | 0.0523 | 21.5 |
| subject 5 | 1 | 0.1712 | 35.6 |
|  | 2 | 0.2368 | 40.0 |
| subject 6 | 1 | 0.1666 | 39.4 |
| subject 7 | 1 | 0.0094 | 74.8 |
|  | 2 | 0.0231 | 56.9 |
| subject 8 | 1 | 0.0691 | 42.3 |
|  | 2 | 0.0862 | 38.3 |
| subject 9 | 1 | 0.2206 | 25.9 |
|  | 2 | 0.1647 | 20.9 |
| subject 10 | 1 | 0.7358 | 34.4 |
|  | 2 | 0.5645 | 61.0 |
| subject 11 | 1 | 0.2379 | 43.2 |
|  | 2 | 0.2406 | 30.0 |
| subject 12 | 1 | 0.2273 | 87.8 |
|  | 2 | 0.0961 | 56.1 |
| subject 13 | 1 | 0.1578 | 25.9 |
|  | 2 | 0.0313 | 26.3 |
| subject 14 | 1 | 0.0850 | 21.2 |
|  | 2 | 0.0091 | 21.0 |
|  | 3 | 0.2031 | 12.3 |
Currently, few motorized microscopes are commercially available, and their practical advantages remain debatable. We have shown that interactive gesture control of a robotic microscope is feasible. Embedding the sensor in the device would avoid extra setup effort and line-of-sight problems while adding no extra complexity to the workflow.
Author’s Statement
Conflict of interest: The authors state no conflict of interest. Material and Methods: Informed consent: Informed consent has been obtained from all individuals included in this study. Ethical approval: The conducted research is not related to either human or animal use.
References
[1] Finke M, Kantelhardt S, Schlaefer A, Bruder R, Lankenau E, Giese A, Schweikard A. Automatic scanning of large tissue areas in neurosurgery using optical coherence tomography. Int J Med Robot. 2012 Sep;8(3):327–336. doi:10.1002/rcs.1425
[2] Finke M, Schweikard A. Motorization of a surgical microscope for intra-operative navigation and intuitive control. Int J Med Robot. 2010 Sep;6(3):269–280. doi:10.1002/rcs.314
[3] Finke M, Stender B, Bruder R, Schlaefer A, Schweikard A. An experimental comparison of control devices for automatic movements of a surgical microscope. In: Proceedings of the 24th International Conference and Exhibition on Computer Assisted Radiology and Surgery (CARS'10); 2010. p. 311–312.
[4] Giorgi C, Eisenberg H, Costi G, Gallo E, Garibotto G, Casolino DS. Robot-assisted microscope for neurosurgery. J Image Guid Surg. 1995;1(3):158–163.
[5] Hartmann F, Schlaefer A. Feasibility of touchless control of operating room lights. Int J Comput Assist Radiol Surg. 2013 Mar;8(2):259–268.
[6] Kassell NF, Downs JH 3rd, Graves BS. Telepresence in neurosurgery: the integrated remote neurosurgical system. Stud Health Technol Inform. 1997;39:411–419.
[7] Oppenlander ME, Chowdhry SA, Merkl B, Hattendorf GM, Nakaji P, Spetzler RF. Robotic autopositioning of the operating microscope. Neurosurgery. 2014 Jun;10 Suppl 2:214–219; discussion 219. doi:10.1227/NEU.0000000000000276
[8] Punt MM, Stefels CN, Grimbergen CA, Dankelman J. Evaluation of voice control, touch panel control and assistant control during steering of an endoscope. Minim Invasive Ther Allied Technol. 2005;14(3):181–187. doi:10.1080/13645700510033967
[9] Rossol N, Cheng I, Shen R, Basu A. Touchfree medical interfaces. Conf Proc IEEE Eng Med Biol Soc. 2014:6597–6600. doi:10.1109/EMBC.2014.6945140
[10] Yoshimitsu K, Muragaki Y, Maruyama T, Yamato M, Iseki H. Development and initial clinical testing of "OPECT": an innovative device for fully intangible control of the intraoperative image-displaying monitor by the surgeon. Neurosurgery. 2014 Mar;10 Suppl 1:46–50; discussion 50. doi:10.1227/NEU.0000000000000214
© 2015 by Walter de Gruyter GmbH, Berlin/Boston
This article is distributed under the terms of the Creative Commons Attribution Non-Commercial License, which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.