Automatic depth scanning system for 3D infrared thermography

Michael Unger (corresponding author; University of Leipzig, Innovation Center Computer Assisted Surgery, Semmelweisstr. 14, D-04103 Leipzig) / Adrian Franke / Claire Chalopin
Published Online: 2016-09-30 | DOI: https://doi.org/10.1515/cdbme-2016-0162

Abstract

Infrared thermography can be used as a pre-, intra- and post-operative imaging technique during the medical treatment of patients. Modern infrared thermal cameras are capable of acquiring images with a high sensitivity of 10 mK and beyond. However, they provide a planar image of the examined 3D object in which this high sensitivity is only reached within a plane perpendicular to the camera axis, defined by the focus of the lens. Out-of-focus planes are blurred and their temperature values are inaccurate. A new 3D infrared thermography system is built by combining a thermal camera with a depth camera. Multiple images at varying focal planes are acquired with the infrared camera using a motorized system. The sharp regions of the individual images are projected onto the 3D object surface obtained by the depth camera. The system evaluation showed that the deviation between measured temperature values and a ground truth is reduced with our system.

Keywords: 3D scene; depth of field; sharpness; thermal imaging

1 Introduction

Infrared thermography was introduced to the medical field as early as the 1960s [1]. It is employed for the diagnosis and monitoring of diseases involving temperature differences. Applications include the detection of tumors [2], [3] and the monitoring of cardiovascular diseases [4], [5]. Moreover, because data acquisition is contactless, thermography has been adopted in surgery for pre-operative planning, as an intra-operative assistance tool and for post-operative monitoring of the surgical outcome. Applications have mostly been reported in neurosurgery [6], [7], [8] and in plastic surgery [9], [10]. Furthermore, since body temperature is correlated with tissue perfusion, identification and monitoring of skin transplants can be performed with this imaging technique [11], [12].

Recent technical developments apply thermal imaging to 3D scenes for visualization purposes [13]. This technique has also been presented for medical applications [14], [15], [16]. However, thermal cameras possess a very small depth of field, especially at small distances between camera and object [17]. Soldan [18] extended the depth of field by combining multiple thermal images of the same scene acquired with varying focus settings. The optimal focus for an object is chosen using a focus measure function (FMF), which yields a minimum or maximum value at the optimum. However, good results require selecting a suitable FMF and its parameters for a given scene.
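
A common example of such a focus measure, not necessarily the one used in [18], is the variance of the Laplacian. The following minimal Python/OpenCV sketch illustrates the idea; the frame stack and its handling are assumptions for illustration only.

```python
import cv2
import numpy as np

def focus_measure(image: np.ndarray) -> float:
    """Variance of the Laplacian: larger values indicate a sharper image."""
    laplacian = cv2.Laplacian(image.astype(np.float32), cv2.CV_32F)
    return float(laplacian.var())

# Example: select the sharpest frame from a stack of thermal images acquired
# at different focus settings (frames would be a list of 2D numpy arrays):
# best_frame = max(frames, key=focus_measure)
```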

To overcome this disadvantage, we combined an infrared thermal camera with a depth camera. The latter records a 3D surface of the examined object. Multiple thermal images of the scene are acquired with the infrared camera at varying focus settings. The optimal measurement is selected as a function of its depth in the 3D scene. Sharp parts of each thermal image are projected onto the 3D object surface in order to reconstruct a sharp 3D thermal scene. An automatic focus control system was implemented with the future goal of using the device in the operating room.

2 3D thermographic imaging system

The imaging system consists of an infrared thermal camera (Optris PI450) and a Microsoft Kinect V2, which provides depth and visual information (Figure 1). The thermal camera has a spatial resolution of 382×288 pixels and a thermal sensitivity of 40 mK. The Kinect V2 provides a depth image with a spatial resolution of 512×424 pixels and a visual image of 1920×1080 pixels. The depth resolution is approximately 1 mm in a range of 0.5 m to 4 m.

Figure 1: The thermographic imaging system consists of a thermal camera and a depth camera. A motor focus for the thermal camera allows automatic image acquisition at different focal planes. A computer system processes the data.

The focus of the thermal camera normally has to be adjusted manually. To overcome this, we outfitted the camera with a drive for focus control. The hardware consists of a gear motor with a rotary encoder for position measurement, controlled by an Arduino board. A firmware was developed that provides basic functions for referencing the focus as well as getting and setting the focus position.
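
For illustration, a host-side wrapper for such a focus drive could look like the following Python sketch using pyserial. The command strings ("REF", "GET", "SET") and the serial parameters are hypothetical placeholders; the actual firmware protocol is not specified here.

```python
import serial  # pyserial

class FocusDrive:
    """Hypothetical host-side interface to the Arduino focus drive."""

    def __init__(self, port: str = "/dev/ttyACM0", baud: int = 115200):
        self.ser = serial.Serial(port, baud, timeout=2)

    def reference(self) -> None:
        """Drive the focus to its end position and define it as zero."""
        self.ser.write(b"REF\n")
        self.ser.readline()  # wait for acknowledgement

    def get_position(self) -> int:
        """Read the current focus position from the rotary encoder."""
        self.ser.write(b"GET\n")
        return int(self.ser.readline().decode().strip())

    def set_position(self, position: int) -> None:
        """Move the focus to the given encoder position."""
        self.ser.write(f"SET {position}\n".encode())
        self.ser.readline()  # wait for acknowledgement
```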

Software developed by us retrieves the temperature information and the depth image via the vendor software. The thermal images are acquired using Optris PI Connect, which supplies the data for inter-process communication via a dynamic link library. The Kinect for Windows SDK is used to acquire the depth and RGB images from the Kinect camera. After reconstructing the all-in-focus image, the 3D scene with the applied thermal image mapping is rendered using the Kinect Fusion API.

2.1 Calibration of the system components

The components of the 3D thermographic imaging system must be calibrated in order to combine the different imaging modalities. The calibration process is described in detail in the following sections.

2.1.1 Calibration of the focus

Changing the focus of the thermal camera determines at which distance to the camera objects are displayed sharply. To automate the image acquisition process, the dependency between the focus setting and the focal plane needs to be known. Therefore, the focus was first referenced (zero position) by driving it to its end position. Subsequently, the characteristic curve was determined by incrementally setting the focus and measuring the corresponding focal plane. The measurement was done by moving a checkerboard until it was displayed sharply and reading the corresponding object distance from the depth image.
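
A minimal sketch of this calibration step is given below, assuming pairs of focus encoder positions and measured focal-plane distances; the numbers are illustrative, not the values measured with our setup. A low-order polynomial fitted to these pairs yields the focus position required for a given object distance.

```python
import numpy as np

# Illustrative calibration data: focus encoder positions and the measured
# focal-plane distances (mm) read from the depth image (example values only).
positions = np.array([0, 200, 400, 600, 800, 1000])
distances_mm = np.array([350, 520, 760, 1050, 1400, 1900])

# Fit the characteristic curve distance -> focus position with a low-order
# polynomial, so the focus setting for any object depth can be interpolated.
coeffs = np.polyfit(distances_mm, positions, deg=3)
focus_for_depth = np.poly1d(coeffs)

# Example: focus position required for an object 1.2 m from the camera
print(int(round(focus_for_depth(1200))))
```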

2.1.2 Camera calibration

To be able to combine the spatial data of the depth camera and the temperature data of the thermal camera, the optical intrinsic (lens-specific) and extrinsic parameters of the components must be calculated. For this process, a known geometric pattern that is visible in both image modalities is needed. We used a checkerboard pattern from which alternating tiles had been removed (Figure 2). The removed tiles are clearly visible in both the depth and the thermal image.

Figure 2: Checkerboard used in the calibration process.

Firstly, the intrinsic parameters of the thermal camera were calculated. The intrinsic parameters of the depth camera are stored in its firmware. Secondly, the extrinsic parameters describing the spatial relation between the cameras were obtained. Using the resulting transformation matrix, the thermal image can be projected onto the 3D scene retrieved by the depth camera.
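
A minimal sketch of this two-step calibration, using OpenCV's standard calibration functions, is shown below. It assumes that the corner points of the modified checkerboard have already been detected in both modalities; it is not necessarily the implementation used in our system.

```python
import cv2
import numpy as np

def calibrate_cameras(obj_points, thermal_points, depth_points,
                      thermal_size, depth_size, K_depth, dist_depth):
    """Estimate thermal-camera intrinsics and the rigid transform between cameras.

    obj_points      -- list of (N, 3) float32 arrays, board coordinates per view
    thermal_points  -- list of (N, 1, 2) float32 corner positions in the thermal images
    depth_points    -- list of (N, 1, 2) float32 corner positions in the depth images
    K_depth, dist_depth -- depth-camera intrinsics, read from the Kinect firmware
    """
    # 1) Intrinsic parameters of the thermal camera
    _, K_thermal, dist_thermal, _, _ = cv2.calibrateCamera(
        obj_points, thermal_points, thermal_size, None, None)

    # 2) Extrinsic parameters: rotation R and translation t mapping
    #    depth-camera coordinates into the thermal-camera frame
    _, _, _, _, _, R, t, _, _ = cv2.stereoCalibrate(
        obj_points, depth_points, thermal_points,
        K_depth, dist_depth, K_thermal, dist_thermal,
        depth_size, flags=cv2.CALIB_FIX_INTRINSIC)
    return K_thermal, dist_thermal, R, t
```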

2.2 Depth scanning

To improve the image quality, sharp parts of multiple thermal images are combined into an overall sharp image. Therefore, images at multiple focal planes are acquired. Each point of the 3D surface acquired with the depth camera is assigned the temperature value from the corresponding thermal image, obtained using the transformation between the thermal and depth images.
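
The following Python sketch illustrates this mapping under a simple pinhole model (lens distortion is ignored); the function and variable names are placeholders and assume the calibration results from the previous sketch.

```python
import numpy as np

def map_temperatures(points_3d, thermal_stack, focal_planes_mm, K_thermal, R, t):
    """Assign each 3D surface point the temperature from the thermal image
    whose focal plane is closest to the point's depth.

    points_3d       -- (M, 3) array, surface points in depth-camera coordinates (mm)
    thermal_stack   -- list of 2D temperature arrays, one per focus setting
    focal_planes_mm -- focal-plane distance of each image in the stack
    K_thermal, R, t -- calibration results (see previous sketch)
    """
    focal_planes = np.asarray(focal_planes_mm, dtype=np.float64)
    temperatures = np.full(len(points_3d), np.nan)

    # Transform points into the thermal-camera frame and project (pinhole model)
    pts_cam = points_3d @ R.T + t.reshape(1, 3)
    uvw = pts_cam @ K_thermal.T
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)

    h, w = thermal_stack[0].shape
    for i, depth in enumerate(pts_cam[:, 2]):
        if depth <= 0 or not (0 <= u[i] < w and 0 <= v[i] < h):
            continue  # behind the camera or outside the thermal image
        # Image whose focal plane is nearest to the point's depth -> sharpest measurement
        k = int(np.argmin(np.abs(focal_planes - depth)))
        temperatures[i] = thermal_stack[k][v[i], u[i]]
    return temperatures
```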

2.3 System evaluation

To evaluate the system, an object visible in both the depth and thermal data was constructed. We mounted LED strips in grids onto planar boards. Three boards were positioned at focal-plane distances of 500, 1000 and 1500 mm. Each board contained 120 LEDs.

The current flowing through the LEDs causes heating, which was used to evaluate the image quality. If the plane containing the LEDs is out of focus, it appears blurred. This blurring reduces the LED temperature measured in the thermal image. The LEDs were heated for half an hour to reach a steady state. Then a manually focused thermal image of each board was acquired. These images were used as the ground truth.

Secondly, our system reconstructed an overall sharp image using depth scanning (Figure 3). We calculated the temperature difference between the ground truth and the output of our algorithm. If the correct focus setting is selected for a thermal pixel, the temperature difference is expected to be zero. The depth scanning algorithm was also compared to a single image with the focus set to the middle board.
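
The evaluation metric itself is straightforward; a minimal sketch, assuming the LED pixel positions in the registered images are known, is given below.

```python
import numpy as np

def led_temperature_deviation(ground_truth, reconstructed, led_coords):
    """Temperature deviation at the LED positions.

    ground_truth, reconstructed -- 2D temperature arrays of the same size
    led_coords -- iterable of (row, col) pixel positions of the LED centres
    Returns the mean and maximum absolute deviation.
    """
    diffs = np.array([abs(ground_truth[r, c] - reconstructed[r, c])
                      for r, c in led_coords])
    return float(diffs.mean()), float(diffs.max())
```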

Figure 3: Reconstructed 3D scene with applied thermal map.

3 Results

When using depth scanning, the temperature deviation due to blurring is reduced (Table 1). The biggest improvement was achieved at short distances to the object. At far distances, the negative effects of the small depth of field have less impact.

Table 1: Temperature deviation due to blurring.

4 Discussion and conclusion

We showed that reconstructing an all-in-focus thermal image improves the image quality and therefore the accuracy of the temperature measurement. The temperature deviation induced by blurring was reduced but was still up to 2.45 K.

Remaining errors may be caused by the rather simple model used for the camera calibration. Adjusting the focus causes a change in the focal length parameter, which was assumed to be fixed. This dependency needs to be added to the calibration process.

Soldan [18] improved the image quality by calculating an all-in-focus thermal image using algorithms developed for visual images. This process requires adjusting parameters depending on the scene. By using a depth camera, no parameters need to be adjusted, but more hardware is needed and the calibration process is more complex.

Further evaluation of the imaging system is needed to assess the impact on the image quality in medical use cases.

Author’s Statement

Research funding: Sponsored by the Federal Ministry of Education and Research. Conflict of interest: Authors state no conflict of interest. Material and methods: Informed consent: Informed consent is not applicable. Ethical approval: The conducted research is not related to either human or animal use.

References

[1] Ring EF, Ammer K. Infrared thermal imaging in medicine. Physiol Meas. 2012;33:R33–46.

[2] Gerasimova E, Audit B, Roux SG, Khalil A, Gileva O, Argoul F, et al. Wavelet-based multifractal analysis of dynamic infrared thermograms to assist in early breast cancer diagnosis. Front Physiol. 2014;5(Article 176):1–11.

[3] Çetingül MP, Herman C. Quantification of the thermal signature of a melanoma lesion. Int J Therm Sci. 2011;50:421–31.

[4] Szentkuti A, Kavanagh HS, Grazio S. Infrared thermography and image analysis for biomedical use. Period Biol. 2011;113:385–92.

[5] Jin C, Yang Y, Xue ZJ, Liu KM, Liu J. Automated analysis for screening knee osteoarthritis using medical infrared thermography. J Med Biol Eng. 2013;33:471–7.

[6] Kateb B, Yamamoto V, Yu C, Grundfest W, Gruen JP. Infrared thermal imaging: a review of literature and case report. NeuroImage. 2009;47:T154–62.

[7] Steiner G, Sobottka SB, Koch E, Schackert G, Kirsch M. Intraoperative imaging of cortical cerebral perfusion by time-resolved thermography and multivariate data analysis. J Biomed Opt. 2011;16:1–6.

[8] Rathmann P, Lindner D, Halama D, Chalopin C. Dynamische Infrarot-Thermographie (DIRT) zur Darstellung der Kopfhautdurchblutung bei neurochirurgischen Eingriffen. In: Tagungsband der 14. Jahrestagung der Deutschen Gesellschaft für Computer- und Roboterassistierte Chirurgie (CURAC), Bremen, 2015, pp. 17–19.

[9] de Weerd L, Weum S, Mercer JB. The value of dynamic infrared thermography (DIRT) in perforator selection and planning of free DIEP flaps. Ann Plast Surg. 2009;63:274–9.

[10] Chubb DP, Taylor GI, Ashton MW. True and "choke" anastomoses between perforator angiosomes: part II. Dynamic thermographic identification. Plast Reconstr Surg. 2013;132:1457–64.

[11] Lohman RF, Ozturk CN, Ozturk C, Jayaprakash V, Djohan R. An analysis of current techniques used for intraoperative flap evaluation. Ann Plast Surg. 2015;75:679–85.

[12] Just M, Chalopin C, Unger M, Halama D, Neumuth T, Dietz A, et al. Monitoring of microvascular free flaps following oropharyngeal reconstruction using infrared thermography: first clinical experiences. Eur Arch Otorhinolaryngol. 2015;273:1–9.

[13] Barone S, Paoli A, Razionale AV. A biomedical application combining visible and thermal 3D imaging. Presented at the XVIII Congreso Internacional de Ingeniería Gráfica, Barcelona, 2006.

[14] Spalding SJ, Kwoh CK, Boudreau R, Enama J, Lunich J, Huber D, et al. Three-dimensional and thermal surface imaging produces reliable measures of joint shape and temperature: a potential tool for quantifying arthritis. Arthritis Res Ther. 2008;10:R10.

[15] Grubisic I, Gjenero L, Lipic T, Sovic I, Skala T. Medical 3D thermography system. Period Biol. 2011;113:401–6.

[16] Chen CY, Yeh CH, Chang BR, Pan JM. 3D reconstruction from IR thermal images and reprojective evaluations. Math Probl Eng. 2015;2015:8.

[17] Schuster N, Franks J. Depth of field in modern thermal imaging. Presented at SPIE Defense + Security, International Society for Optics and Photonics, 2015, 94520J.

[18] Soldan S. On extended depth of field to improve the quality of automated thermographic measurements in unknown environments. Quant Infrared Thermogr J. 2012;9:135–50.

About the article

Published Online: 2016-09-30

Published in Print: 2016-09-01


Citation Information: Current Directions in Biomedical Engineering, Volume 2, Issue 1, Pages 369–372, ISSN (Online) 2364-5504, DOI: https://doi.org/10.1515/cdbme-2016-0162.

©2016 Michael Unger et al., licensee De Gruyter. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).
