Infrared thermography was introduced to the medical field as early as the 1960s. It is employed for the diagnosis and monitoring of diseases involving temperature differences. Applications include the detection of tumors and the monitoring of cardiovascular diseases. Moreover, because data acquisition is contactless, thermography has been brought into surgery for pre-operative planning, as an intra-operative assistance tool, and for post-operative monitoring of surgical outcomes. Applications have mostly been reported in neurosurgery and in plastic surgery. Furthermore, since body temperature is correlated with tissue perfusion, identification and monitoring of skin transplants can be performed with this imaging technique.
Recent technical developments apply thermal imaging to 3D scenes for visualization purposes. This technique has been presented for medical applications as well. However, thermal cameras possess a very small depth of field, especially at short distances between camera and object. Soldan extended the depth of field by combining multiple thermal images of the same scene acquired at varying focus settings. The optimal focus for an object is chosen by means of a focus measure function (FMF), which yields a minimum or maximum value at the optimum. Good results, however, require selecting an optimal FMF and its parameters for a given scene.
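As an illustration of the focus-stacking idea described above, the following minimal sketch implements one common FMF, grey-level variance, and uses it to pick the sharpest image from a focus stack. The specific FMF and the synthetic test images are our own illustrative choices, not the measure used by Soldan.

```python
import numpy as np

def focus_measure_variance(image):
    """Grey-level variance: a simple focus measure function (FMF).
    Sharper images show higher local contrast, hence higher variance."""
    return float(np.var(image))

def select_best_focus(image_stack):
    """Return the index of the sharpest image in a focus stack,
    i.e. the image that maximizes the FMF."""
    scores = [focus_measure_variance(img) for img in image_stack]
    return int(np.argmax(scores))

# Synthetic example: a blurred ramp has lower variance than a sharp edge.
sharp = np.zeros((8, 8)); sharp[:, 4:] = 1.0       # hard edge
blurred = np.linspace(0.0, 1.0, 64).reshape(8, 8)  # smooth ramp
print(select_best_focus([blurred, sharp]))  # → 1
```

Other FMFs (e.g. gradient- or Laplacian-based measures) follow the same pattern; the scene-dependent tuning mentioned in the text amounts to choosing among such measures and their window parameters.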
To overcome this disadvantage, we combined an infrared thermal camera with a depth camera. The latter records a 3D surface of the examined object. Multiple thermal images of the scene are acquired with the infrared camera at varying focus settings. The optimal measurement for each point is selected as a function of its depth in the 3D scene. The sharp parts of each thermal image are projected onto the 3D object surface in order to reconstruct a sharp 3D thermal scene. An automatic focus control system was implemented with the future goal of using the device in the operating room.
2 3D thermographic imaging system
The imaging system consists of an infrared thermal camera (Optris PI450) and a Microsoft Kinect V2, which provides depth and visual information (Figure 1). The thermal camera has a spatial resolution of 382×288 pixels and a thermal sensitivity of 40 mK. The Kinect V2 provides a depth image with a spatial resolution of 512×424 pixels and a visual image of 1920×1080 pixels. The depth resolution is approximately 1 mm in a range of 0.5 m to 4 m.
The sharpness of the thermal image needs to be adjusted manually. To overcome this, we outfitted the camera with a drive for focus control. The hardware consists of a gear motor with a rotary encoder for position measurement, controlled by an Arduino board. A firmware was developed that provides basic functions for referencing the focus as well as for getting and setting the focus position.
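A host-side interface to such a firmware could look like the sketch below. The command set (`REF`/`GET`/`SET`) is a hypothetical text protocol of our own invention, not the actual firmware interface of the device; any transport object with `write()`/`readline()` methods (e.g. a pyserial `Serial` connected to the Arduino) would work.

```python
class FocusDrive:
    """Sketch of a host-side driver for the focus-control firmware.
    The REF/GET/SET commands are assumed, not the real protocol."""

    def __init__(self, transport):
        # transport: any object with write()/readline(),
        # e.g. serial.Serial("COM3", 115200) from pyserial.
        self.transport = transport

    def _cmd(self, line):
        self.transport.write((line + "\n").encode("ascii"))
        return self.transport.readline().decode("ascii").strip()

    def reference(self):
        # Drive the focus to its end stop to define the zero position.
        self._cmd("REF")

    def get_position(self):
        # Read the current focus position in encoder counts.
        return int(self._cmd("GET"))

    def set_position(self, counts):
        # Move the focus to an absolute position in encoder counts.
        self._cmd("SET %d" % counts)
```

Decoupling the driver from a concrete serial port keeps the protocol logic testable without hardware.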
Software developed by us retrieves the temperature information and a depth image via the vendor-provided software. The thermal images are acquired using Optris PI Connect, which supplies the data for inter-process communication via a dynamic link library. The Kinect for Windows SDK is used to acquire the depth and RGB images from the Kinect camera. After reconstructing the all-in-focus image, the 3D scene with the applied thermal image mapping is rendered using the Kinect Fusion API.
2.1 Calibration of the system components
The components of the 3D thermographic imaging system must be calibrated in order to combine the different imaging modalities. The calibration process is described in detail in the following sections.
2.1.1 Calibration of the focus
Changing the focus of the thermal camera determines at which distance to the camera objects are displayed sharply. To automate the image acquisition process, the dependency between the focus setting and the focal plane needs to be known. Therefore, the focus was first referenced (zero position) by driving it to its end position. Subsequently, the characteristic curve was determined by incrementally setting the focus and measuring the corresponding focal plane. The measurement was performed by moving a checkerboard until it was displayed sharply and reading the corresponding object distance from the depth image.
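Once the characteristic curve is tabulated, it can be inverted to find the focus setting for a given object distance. The sketch below does this by linear interpolation over sample points; the encoder counts and distances are illustrative values, not the calibration data of the actual system.

```python
import numpy as np

# Hypothetical characteristic curve: focus-drive encoder counts vs. the
# measured focal-plane distance (mm) at which the checkerboard was sharp.
focus_counts = np.array([0, 100, 200, 300, 400])
focal_plane_mm = np.array([400.0, 620.0, 900.0, 1280.0, 1800.0])

def focus_for_distance(distance_mm):
    """Invert the characteristic curve: return the focus setting whose
    focal plane lies at the requested object distance. The curve is
    monotonic, so linear interpolation between samples suffices."""
    return int(round(np.interp(distance_mm, focal_plane_mm, focus_counts)))

print(focus_for_distance(900.0))   # → 200
print(focus_for_distance(760.0))   # between the 620 mm and 900 mm samples
```

A smooth fit (e.g. a low-order polynomial) could replace the interpolation if the measured curve warrants it.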
2.1.2 Camera calibration
To be able to combine the spatial data of the depth camera and the temperature data of the thermal camera, the optical intrinsic (lens specific) and extrinsic parameters of the components must be calculated. For this process a known geometric pattern that is visible in both image modalities is needed. We used a checkerboard pattern which had alternating tiles removed (Figure 2). The removed tiles were clearly visible in both the depth and thermal image.
Firstly, the intrinsic parameters of the thermal camera were calculated. The intrinsic parameters of the depth camera are stored in its firmware. Secondly, the extrinsic parameters describing the spatial relation between the cameras were obtained. Using the resulting transformation matrix, the thermal image can be projected onto the 3D scene retrieved by the depth camera.
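The projection step can be sketched with a standard pinhole model: a 3D point in the depth-camera frame is transformed into the thermal-camera frame by the extrinsic rotation R and translation t, then mapped to a pixel by the thermal intrinsics K. The numeric values of K, R, and t below are illustrative, not the paper's calibration result.

```python
import numpy as np

# Illustrative calibration: thermal intrinsics (pixels) and the
# extrinsic transform from depth-camera to thermal-camera coordinates.
K = np.array([[460.0,   0.0, 191.0],
              [  0.0, 460.0, 144.0],
              [  0.0,   0.0,   1.0]])  # focal lengths and principal point
R = np.eye(3)                           # rotation depth -> thermal
t = np.array([50.0, 0.0, 0.0])          # translation (mm), camera baseline

def project_to_thermal(point_mm):
    """Map a 3D point (mm, depth-camera frame) to a thermal pixel (u, v)."""
    p = R @ point_mm + t                # point in thermal-camera frame
    u = K[0, 0] * p[0] / p[2] + K[0, 2]
    v = K[1, 1] * p[1] / p[2] + K[1, 2]
    return u, v

u, v = project_to_thermal(np.array([0.0, 0.0, 1000.0]))
print(u, v)  # point on the depth camera's optical axis at 1 m
```

In practice the intrinsics of the thermal camera would come from a checkerboard calibration (e.g. with OpenCV's `calibrateCamera`), and lens distortion would be corrected before this linear projection.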
2.2 Depth scanning
To improve the image quality, the sharp parts of multiple thermal images are combined into an overall sharp image. To this end, images at multiple focal planes are acquired. Each point of the 3D surface acquired with the depth camera is assigned the temperature value from the corresponding thermal image, obtained using the transformation between the thermal and depth images.
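The per-point selection can be sketched as follows: for every pixel, the composite takes its temperature from the image in the focus stack whose focal plane lies closest to that pixel's measured depth. Image sizes, focal planes, and temperatures below are illustrative placeholders.

```python
import numpy as np

# Focal planes at which the thermal stack was acquired (mm).
focal_planes_mm = np.array([500.0, 1000.0, 1500.0])
# stack[k] is the (already registered) thermal image focused at
# focal_planes_mm[k]; here each is a constant dummy image.
stack = np.stack([np.full((4, 4), 20.0 + k) for k in range(3)])
# Depth image on the same pixel grid (mm).
depth_mm = np.array([[600.0] * 4, [950.0] * 4,
                     [1400.0] * 4, [1600.0] * 4])

# For each pixel, index of the focal plane nearest to its depth.
nearest = np.argmin(np.abs(depth_mm[..., None] - focal_planes_mm), axis=-1)
# Gather each pixel's temperature from its sharpest image.
all_in_focus = np.take_along_axis(stack, nearest[None, ...], axis=0)[0]
print(all_in_focus[:, 0])  # → [20. 21. 22. 22.]
```

The real system performs this lookup on the 3D surface points rather than on a rectified pixel grid, but the nearest-focal-plane selection is the same.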
2.3 System evaluation
To evaluate the system, an object visible in both the depth and thermal data was constructed. We mounted LED strips in grids onto plane boards. Three boards were positioned at focal-plane distances of 500, 1000, and 1500 mm. Each board contained 120 LEDs.
The current flowing through the LEDs causes heating, which was used to evaluate the image quality. If the plane containing the LEDs is out of focus, it appears blurred. This blurring leads to a reduced LED temperature measured in the thermal image. The LEDs were heated for half an hour to reach a steady state. Then a manually focused thermal image of each board was acquired. These images were used as the ground truth.
Secondly, our system reconstructed an overall sharp image using depth scanning (Figure 3). We calculated the temperature difference between the ground truth and the output of our algorithm. If the correct focus setting was selected for a thermal pixel, the temperature difference is expected to be zero. The depth scanning algorithm was also compared to a single image with the focus set to the middle board.
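The comparison described above reduces to computing temperature deviations at the LED positions. A minimal sketch, with made-up temperatures standing in for the measured data:

```python
import numpy as np

# Illustrative LED temperatures (°C); not measured data from the study.
ground_truth = np.array([35.2, 34.8, 35.0, 34.9])  # manually focused images
reconstructed = np.array([35.0, 34.7, 34.9, 34.9])  # depth-scanning result
single_focus = np.array([33.1, 32.9, 33.0, 33.2])   # focus fixed on middle board

def max_deviation(measured, reference):
    """Largest absolute temperature error over all evaluated LEDs (K)."""
    return float(np.max(np.abs(measured - reference)))

print(max_deviation(reconstructed, ground_truth))  # depth scanning
print(max_deviation(single_focus, ground_truth))   # fixed middle focus
```

With this metric, the out-of-focus boards in the fixed-focus image show the large blur-induced deviations that the depth-scanning reconstruction is meant to suppress.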
4 Discussion and conclusion
We showed that reconstructing an all-in-focus thermal image improves the image quality and therefore the accuracy of the temperature measurement. The temperature deviation induced by blurring was reduced, but was still up to 2.45 K.
Remaining errors may be caused by the rather simple model used for the camera calibration. Adjusting the focus changes the focal length parameter, which was assumed to be fixed. This dependency needs to be added to the calibration process.
Soldan improved the image quality by calculating an all-in-focus thermal image using algorithms developed for visual images. This process requires adjusting parameters depending on the scene. With a depth camera, no parameters need to be adjusted, but more hardware is required and the calibration process is more complex.
Further evaluation of the imaging system is needed to assess the impact on the image quality in medical use cases.
Research funding: Sponsored by the Federal Ministry of Education and Research. Conflict of interest: Authors state no conflict of interest. Material and methods: Informed consent: Informed consent is not applicable. Ethical approval: The conducted research is not related to either human or animal use.
Ring EF, Ammer K. Infrared thermal imaging in medicine. Physiol Meas. 2012;33:R33–46.
Gerasimova E, Audit B, Roux SG, Khalil A, Gileva O, Argoul F, et al. Wavelet-based multifractal analysis of dynamic infrared thermograms to assist in early breast cancer diagnosis. Front Physiol. 2014;5(Article 176):1–11.
Çetingül MP, Herman C. Quantification of the thermal signature of a melanoma lesion. Int J Therm Sci. 2011;50:421–31.
Szentkuti A, Kavanagh HS, Grazio S. Infrared thermography and image analysis for biomedical use. Period Biol. 2011;113:385–92.
Jin C, Yang Y, Xue ZJ, Liu KM, Liu J. Automated analysis for screening knee osteoarthritis using medical infrared thermography. J Med Biol Eng. 2013;33:471–7.
Kateb B, Yamamoto V, Yu C, Grundfest W, Gruen JP. Infrared thermal imaging: a review of literature and case report. NeuroImage. 2009;47:T154–62.
Steiner G, Sobottka SB, Koch E, Schackert G, Kirsch M. Intraoperative imaging of cortical cerebral perfusion by time-resolved thermography and multivariate data analysis. J Biomed Opt. 2011;16:1–6.
Rathmann P, Lindner D, Halama D, Chalopin C. Dynamische Infrarot-Thermographie (DIRT) zur Darstellung der Kopfhautdurchblutung bei neurochirurgischen Eingriffen. In: Tagungsband der 14. Jahrestagung der Deutschen Gesellschaft für Computer- und Roboterassistierte Chirurgie (CURAC), Bremen, 2015, pp. 17–19.
de Weerd L, Weum S, Mercer JB. The value of dynamic infrared thermography (DIRT) in perforator selection and planning of free DIEP flaps. Ann Plast Surg. 2009;63:274–9.
Chubb DP, Taylor GI, Ashton MW. True and “choke” anastomoses between perforator angiosomes: part II. Dynamic thermographic identification. Plast Reconstr Surg. 2013;132:1457–64.
Lohman RF, Ozturk CN, Ozturk C, Jayaprakash V, Djohan R. An analysis of current techniques used for intraoperative flap evaluation. Ann Plast Surg. 2015;75:679–85.
Just M, Chalopin C, Unger M, Halama D, Neumuth T, Dietz A, et al. Monitoring of microvascular free flaps following oropharyngeal reconstruction using infrared thermography: first clinical experiences. Eur Arch Otorhinolaryngol. 2015;273:1–9.
Barone S, Paoli A, Razionale AV. A biomedical application combining visible and thermal 3D imaging. Presented at the XVIII Congreso Internacional de Ingeniería Gráfica, Barcelona, 2006.
Spalding SJ, Kwoh CK, Boudreau R, Enama J, Lunich J, Huber D, et al. Three-dimensional and thermal surface imaging produces reliable measures of joint shape and temperature: a potential tool for quantifying arthritis. Arthritis Res Ther. 2008;10:R10.
Grubisic I, Gjenero L, Lipic T, Sovic I, Skala T. Medical 3D thermography system. Period Biol. 2011;113:401–6.
Chen CY, Yeh CH, Chang BR, Pan JM. 3D reconstruction from IR thermal images and reprojective evaluations. Math Probl Eng. 2015;2015:8.
Schuster N, Franks J. Depth of field in modern thermal imaging. Presented at SPIE Defense + Security, International Society for Optics and Photonics, 2015, paper 94520J.
Soldan S. On extended depth of field to improve the quality of automated thermographic measurements in unknown environments. Quant Infrared Thermogr J. 2012;9:135–50.
About the article
Published Online: 2016-09-30
Published in Print: 2016-09-01
Citation Information: Current Directions in Biomedical Engineering, Volume 2, Issue 1, Pages 369–372, ISSN (Online) 2364-5504, DOI: https://doi.org/10.1515/cdbme-2016-0162.
©2016 Michael Unger et al., licensee De Gruyter. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).