Interaction with robotic assistance devices is particularly interesting in medical applications, where physicians are typically confronted with a number of different tools and instruments they need to control. Examples include robotic cameras and microscopes. Operating microscopes are frequently used in neurosurgery, where the physician sits close to the patient’s head, looking at the surgical field through a microscope while performing precise surgery, e.g., along nerves. Given the small field of view, the clinical scenario typically involves frequent manual adjustments, i.e., grasping the handles and moving the microscope to the next region of interest. Clearly, removing the instruments from the surgical field, setting them aside, and manually repositioning the microscope interrupts the surgical workflow.
Different approaches for interaction and control of devices in the operating room (OR) have been proposed. Conventionally, switches and pedals are widely used for control, e.g., of the patient couch and imaging systems. Limitations include the available space, the complexity of the control, the number of representable degrees of freedom, and the need to maintain sterile surfaces. The latter is also an issue for touch interfaces. In contrast, voice control does not require changes in the clinical workflow [6, 8]. However, the results have been mixed, particularly with respect to robust and fail-safe operation in an actual OR setting. In addition to voice control, switches, touch panels, and pointers have been studied to realize interaction with devices [3, 6]. An approach that has recently gained interest is gesture control [5, 9, 10]. Its advantages include touch-free interaction and simple integration.
While not widely used in clinical practice, a number of robotic surgical microscopes have been developed [2, 4, 7]. One possible use of these devices is to perform automated tasks like scanning resection cavities. However, most of the time the microscope remains a tool in the hands of the physician, and improving the workflow has been another objective.
We consider gesture tracking for motion control of a robotic microscope. Using a Leap Motion controller (LMC), we present a setup illustrating how gesture tracking could be integrated. We studied the feasibility and precision of moving the microscope using finger gestures. Our results indicate that tracking and motion control are feasible.
2 System setup
We propose a setup where the sensor is embedded in a surgical microscope. A clear advantage is the unobstructed view of the surgical field. However, the sensor needs to be small, which holds true for the LMC. The device primarily consists of two cameras and three infrared light-emitting diodes illuminating the scene. It is intended to track finger motion in a range of 25 to 600 mm above the device. Given that all processing is done on a computer, the device is small and lightweight, measuring just 75 mm by 25 mm by 6.2 mm. Another advantage is the high temporal resolution. The software interface provides information on hand gestures and can be configured to track different points, e.g., the centroid of the hand, or a finger.
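The paper does not detail how the raw tracking data are conditioned before use. As a minimal illustration under our own assumptions (not the authors' implementation), the stream of tracked positions reported at high rate could be smoothed with an exponential moving average before driving the robot:

```python
# Hypothetical sketch: smoothing a stream of tracked fingertip positions
# with an exponential moving average (EMA). The Leap Motion SDK is not
# used here; `samples` stands in for (x, y, z) positions in mm that the
# tracker would report at >100 Hz.

def ema_filter(samples, alpha=0.2):
    """Return EMA-smoothed positions; alpha in (0, 1] weights new samples."""
    smoothed = []
    state = None
    for p in samples:
        if state is None:
            state = list(p)  # initialize with the first sample
        else:
            state = [alpha * c + (1.0 - alpha) * s for c, s in zip(p, state)]
        smoothed.append(tuple(state))
    return smoothed

# Example: jittery samples around a held position are damped toward it.
noisy = [(1.0, -1.0, 0.5), (0.8, 1.2, -0.4), (-0.9, 0.7, 0.2)]
print(ema_filter(noisy)[-1])
```

The smoothing constant trades latency against noise suppression; the value 0.2 is purely illustrative.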
To assess the feasibility of tracking and gesture control, we have realized a simplified setup consisting of a UR5 (Universal Robots, Denmark) robotic arm, a USB microscope, and the Leap Motion controller. A second robot (ABB IRB120, Sweden) is used for some experiments evaluating the tracking performance by moving a hand phantom with adjustable fingers. A computer with a Xeon E3-1225v3 CPU and 16 GB RAM running Windows 8.1 Professional is used to process the LMC data and to control the UR5. Figure 1 shows the setup.
The LMC defines a coordinate frame with the x axis along the long centerline, the z axis along the short centerline, and the y axis normal to form a right hand system (see Fig. 2). For the purpose of our experiments, the height of the microscope with respect to the base plane was not changed. The LMC and the microscope were aligned and placed into an adapter and mounted to the tool flange such that the orientation of the coordinate axes was aligned. We did not need to obtain the actual transformation, i.e., particularly the translation, as all motions are relative to the last position of the microscope.
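Since the LMC and tool-flange axes are aligned and all motions are relative, the control reduces to mapping hand displacements to incremental microscope displacements. A minimal sketch of such a mapping follows; the gain and deadband values are illustrative assumptions, not parameters from the paper:

```python
# Hypothetical sketch of the relative-motion mapping described above:
# consecutive hand positions from the tracker become incremental
# displacement commands for the microscope robot.

def relative_command(prev_pos, curr_pos, gain=1.0, deadband_mm=0.5):
    """Map a hand displacement (mm) to a relative robot displacement (mm).

    Because the LMC and tool-flange axes are aligned, no rotation is
    needed; a small deadband suppresses tremor and tracker noise.
    """
    delta = [gain * (c - p) for c, p in zip(curr_pos, prev_pos)]
    norm = sum(d * d for d in delta) ** 0.5
    if norm < deadband_mm:
        return (0.0, 0.0, 0.0)  # ignore sub-threshold motion
    return tuple(delta)

# Example: a 10 mm hand move along x yields a 10 mm relative command.
print(relative_command((0.0, 0.0, 0.0), (10.0, 0.0, 0.0)))
```

In a real setup the resulting increments would be streamed to the robot controller; that interface is omitted here.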
3 Experimental evaluation
In our analysis we were primarily interested in the feasibility of tracking motion for interaction with a robotic microscope. We studied two different scenarios. First, the actual microscopy setup was used, but the motion was mimicked with the hand phantom mounted to a second robot (compare Fig. 1). Second, a pattern denoting a corridor for a target trajectory was printed, and 14 test persons were asked to move the center of the microscope image along the pattern. The center was highlighted by a cross-hair and the motion was solely controlled by moving the hand.
In the first experiment, the trajectories of the robot carrying the microscope and of the robot moving the hand phantom are compared. This is evaluated by calculating the root mean squared error (RMSE)

$$\mathrm{RMSE}_{xy} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left\lVert \mathbf{p}_i^{\mathrm{mic}} - \mathbf{p}_i^{\mathrm{hand}} \right\rVert^{2}} \quad (1)$$

of the difference in the xy-plane between both TCP positions, and the corresponding per-axis errors

$$\mathrm{RMSE}_{x} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left( x_i^{\mathrm{mic}} - x_i^{\mathrm{hand}} \right)^{2}}, \qquad \mathrm{RMSE}_{y}\ \text{analogously.} \quad (2)$$
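The RMSE evaluation of Eq. (1) can be sketched as follows; treating the two trajectories as already time-aligned and of equal length is a simplifying assumption:

```python
# Sketch of the RMSE evaluation: the planar (xy) differences between
# corresponding TCP positions of the two robots are reduced to a single
# root-mean-squared value in mm.

def rmse_xy(traj_a, traj_b):
    """RMSE of the xy-plane distance between two position sequences (mm)."""
    assert len(traj_a) == len(traj_b) and traj_a
    sq = [
        (ax - bx) ** 2 + (ay - by) ** 2
        for (ax, ay, *_), (bx, by, *_) in zip(traj_a, traj_b)
    ]
    return (sum(sq) / len(sq)) ** 0.5

# Example: a constant 3-4-5 offset in the plane gives an RMSE of 5 mm.
a = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
b = [(3.0, 4.0, 0.0), (4.0, 5.0, 0.0)]
print(rmse_xy(a, b))  # → 5.0
```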
In the second experiment, the quotient

$$p_{\mathrm{quote}} = \frac{\Delta_{\mathrm{outside}}}{\Delta_{\mathrm{total}}} \quad (3)$$

between the distance Δoutside moved outside the course and the total moved distance Δtotal is calculated for evaluation. In addition, the time each subject needed to finish the course is measured.
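The corridor measure in Eq. (3) can be computed from a sampled cursor path. In this sketch the corridor test is abstracted into a caller-supplied predicate, and each segment is attributed by its midpoint, which is a simplifying assumption:

```python
# Sketch of the quotient p_quote: the fraction of the total path length
# travelled outside the printed corridor (0 means the cursor never left it).

def p_quote(path, inside):
    """Fraction of path length outside the corridor for a 2D point path."""
    total = outside = 0.0
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        total += seg
        # attribute the whole segment by its midpoint (approximation)
        if not inside(((x0 + x1) / 2.0, (y0 + y1) / 2.0)):
            outside += seg
    return outside / total if total > 0 else 0.0

# Example: a 1 mm wide corridor around y = 0; the excursion to y = 2
# accounts for half of the 4 mm travelled, so p_quote is 0.5.
corridor = lambda p: abs(p[1]) <= 0.5
path = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0)]
print(p_quote(path, corridor))  # → 0.5
```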
During the experiments the room temperature was constant at approximately 22 degrees Celsius and the LMC and the UR5 robot worked under normal operating conditions.
4 Results

4.1 Motion tracking
We measured the motion tracking performance for squares of edge length 50 mm and 100 mm at speeds ranging from 5 mm/s to 20 mm/s. An example of a resulting trajectory is shown in Fig. 3. The RMSE between the TCP positions is displayed in Table 1. The LMC was mounted such that its longer centerline, and hence its x-axis, is parallel to the y-axis of the image; this is reflected in the higher errors in the y-direction of the example trajectory. Otherwise, the results follow the expected pattern, with the RMSE increasing for higher speeds and larger squares.
4.2 Microscope survey
The survey was completed by 14 untrained participants with one to three tries each, the majority completing two tries. The results are displayed in Tab. 2, and an example of a resulting trajectory is shown in Fig. 4. Most of the subjects achieved very good results despite not being familiar with the system beforehand; only two participants could not achieve comparable results. The data shows a steep learning curve, with most participants improving either the accuracy of their movement or their speed in a second run. In at least one try, 9 of 13 achieved results of p_quote < 0.1. As with motion tracking, we achieved a sampling frequency greater than 100 Hz.
Our results indicate that the general tracking performance of the LMC using the standard software interface has to be carefully considered when using the device in clinical scenarios. The RMSE for the robot hand motion was between 6.8 mm and 13.9 mm, although some delay in the control loop certainly contributes to this error as well. Following the course, most test persons performed well, with 9 of 14 having a p_quote below 0.1 in at least one try. The results for both the robotic hand motion and the human trajectory following indicate that the errors are small enough to realize interactive motion control.
Currently, few motorized microscopes are available, and their practical advantages remain questionable. We have shown that interactive gesture control of a robotic microscope is feasible. Embedding the sensor in the device would avoid extra setup effort and line-of-sight problems while adding no extra complexity to the workflow.
Finke, M., Kantelhardt, S., Schlaefer, A., Bruder, R., Lankenau, E., Giese, A., Schweikard, A.: Automatic scanning of large tissue areas in neurosurgery using optical coherence tomography. Int J Med Robot 8(3), 327–336 (2012)
Finke, M., Schweikard, A.: Motorization of a surgical microscope for intra-operative navigation and intuitive control. Int J Med Robot 6(3), 269–280 (2010)
Finke, M., Stender, B., Bruder, R., Schlaefer, A., Schweikard, A.: An experimental comparison of control devices for automatic movements of a surgical microscope. In: Proceedings of the 24th International Conference and Exhibition on Computer Assisted Radiology and Surgery (CARS’10), 311–312 (2010)
Giorgi, C., Eisenberg, H., Costi, G., Gallo, E., Garibotto, G., Casolino, D. S.: Robot-assisted microscope for neurosurgery. J Image Guid Surg 1(3), 158–163 (1995)
Kassell, N. F., Downs, J. H. 3rd, Graves, B. S.: Telepresence in neurosurgery: the integrated remote neurosurgical system. Stud Health Technol Inform 39, 411–419 (1997)
Oppenlander, M. E., Chowdhry, S. A., Merkl, B., Hattendorf, G. M., Nakaji, P., Spetzler, R. F.: Robotic autopositioning of the operating microscope. Neurosurgery 10 Suppl 2, 214–219; discussion 219 (2014)
Punt, M. M., Stefels, C. N., Grimbergen, C. A., Dankelman, J.: Evaluation of voice control, touch panel control and assistant control during steering of an endoscope. Minim Invasive Ther Allied Technol 14(3), 181–187 (2005)
Rossol, N., Cheng, I., Shen, R., Basu, A.: Touchfree medical interfaces. Conf Proc IEEE Eng Med Biol Soc 2014, 6597–6600 (2014)
Yoshimitsu, K., Muragaki, Y., Maruyama, T., Yamato, M., Iseki, H.: Development and initial clinical testing of “OPECT”: an innovative device for fully intangible control of the intraoperative image-displaying monitor by the surgeon. Neurosurgery 10 Suppl 1, 46–50; discussion 50 (2014)
Published Online: 2015-09-12
Published in Print: 2015-09-01
Conflict of interest: Authors state no conflict of interest.
Informed consent: Informed consent has been obtained from all individuals included in this study.
Ethical approval: The conducted research is not related to either human or animal use.