
Biomedical Engineering / Biomedizinische Technik

Joint Journal of the German Society for Biomedical Engineering in VDE and the Austrian and Swiss Societies for Biomedical Engineering and the German Society of Biomaterials

Editor-in-Chief: Dössel, Olaf




Iris recognition under the influence of diabetes

Mohammadreza Azimi (Institute of Electronics and Information Technology, Warsaw University of Technology, ul. Nowowiejska 15/19, 00-665 Warsaw, Poland), Seyed Ahmad Rasoulinejad, and Andrzej Pacut (Institute of Electronics and Information Technology, Warsaw University of Technology, ul. Nowowiejska 15/19, 00-665 Warsaw, Poland)
Published Online: 2019-07-19 | DOI: https://doi.org/10.1515/bmt-2018-0190

Abstract

In this study, iris recognition under the influence of diabetes was investigated. A new database containing 1318 pictures from 343 irides was collected: 546 images from 162 healthy irides (62% female users, 38% male users; 21% under 20 years old, 61% 20–40 years old, 12% 40–60 years old and 6% over 60 years old) and 772 iris images from 181 diabetic eyes with a clearly visible iris pattern (80% female users, 20% male users; 1% under 20 years old, 17.5% 20–40 years old, 46.5% 40–60 years old and 35% over 60 years old). All of the diabetes-affected eyes had clearly visible iris patterns without any visible impairments, and only type II diabetic patients who had been diabetic for at least 2 years were considered for the investigation. Three different open-source iris recognition codes and one commercial software development kit were used to evaluate the iris recognition system's performance under the influence of diabetes. For statistical analysis, the t-test and the Kolmogorov-Smirnov test were used.

Keywords: biometrics; diabetes mellitus; iris; iris recognition; performance evaluation

Introduction

Biometric techniques such as face recognition, iris recognition and electrocardiogram (ECG) signal-based recognition methods have been used extensively in recent years owing to the increasing popularity of biometric systems. The reliability of biometric recognition systems can be affected by different social issues [1], [2]. Iris recognition is one of the most reliable modalities for identification purposes. The iris is a thin, circular structure in the eye that controls the diameter of the pupil and thereby the amount of light that reaches the retina. Several studies conducted during the last two decades [3] have shown that the iris pattern is unique and that the detailed structure of its front layer can be used to identify individuals. Even in identical twins the iris patterns are completely different, and the irides of an individual's two eyes are likewise completely discriminable from each other.

Among the factors that can challenge an iris recognition system’s reliability and degrade the accuracy of the mentioned systems, the most important are the abnormalities in the iris tissue pattern due to the ocular conditions.

Previous research on predicting the ethnicity and gender of a person from features of the iris texture, specifically the work of Howard and Etter [4], has led to the hypothesis that factors such as ethnicity, gender and even eye color can play a significant role in the expected false rejection rate across a population. They concluded that Asian and African-American individuals with brown eyes are the groups most likely to be mistakenly rejected as true users by biometric recognition systems.

In this work, we investigated the degradation of biometric system performance caused by the non-obvious distortions in iris texture due to type II diabetes.

Related works

Diabetes is a risk factor for many well-known diseases, and it is a growing epidemic, especially among elderly people [5]. It is a main cause of eye diseases including diabetic retinopathy [6], diabetic macular edema (DME) [7], cataract and glaucoma. Previous studies provide strong evidence that diabetes can be detected by examining the iris texture (in classification-based studies). Hence, we hypothesize that an iris recognition system's accuracy must differ between healthy and diabetic eyes, and that the reliability of a biometric recognition system can be influenced by the health condition of the participants [8], [9], [10].

Samant and Agarwal [8] provided an automated tool based on machine learning techniques to assess the correlation between distortion of iris tissue and diabetes mellitus. They also proposed a diagnostic tool to complement mainstream diagnosis methods for discerning healthy patients from those suffering from diabetes. Their best classification accuracy, 89.63%, was obtained with a random forest (RF) classifier.

Identification of irises affected by ophthalmic disease is expectedly harder than identification of healthy ones because of the problems that arise during segmentation and encoding. Segmentation and feature extraction of iris patterns that are occluded by surrounding tissue pushing on the pupillary boundary are harder than the segmentation of healthy iris images [11], [12], [13], [14].

To answer the question "How do iris recognition methods perform in the presence of ophthalmic disorders?", Trokielewicz et al. [15] studied the effect of eye diseases on the reliability of an iris recognition system using irises from 92 participants affected by illness (184 eyes), covering four different types of disorder. The collected database consists of nearly 3000 near-infrared (NIR) and visible-wavelength pictures from routine ophthalmological practice. According to their results, obstructions and geometrical deformation of the iris can significantly decrease the matching score between genuine samples, as disease can change the structure of the tissue and the geometry of the eye. It has also been reported [16] that, because of these problems in diseased eyes, the iris segmentation phase is the most sensitive part of the whole recognition process. Dhir et al. showed that cataract surgery can also affect the performance of iris recognition systems [17].

New database

For this study, a new database was collected to investigate whether there is any relation between an iris recognition system's accuracy and the health condition of its users. Volunteers were therefore recruited to acquire datasets for investigating iris recognition accuracy under the influence of diabetes. Before the experiment, all participants were given full, detailed information on the study and signed consent agreements. The participants were also asked to provide non-biometric data, including their names, gender, age and, where applicable, the duration of their diabetes. The personal data were kept separately to guarantee additional security. The experiment protocol was approved by the Ethics Committee of Warsaw University of Technology. The collected database consists of NIR iris images taken from volunteers, all from the same ethnic group (Iranian) but from different age groups. The samples were captured using a commercial iris capture device, the Iri Shield USB MK 2120U [18], connected to a Galaxy A5 smartphone for storage of the iris samples. The Iri Shield USB MK 2120U is a mono-ocular NIR capture device widely used for capturing iris texture pictures, and the captured images comply with the ISO/IEC 19794-6 standard. For each user, the whole acquisition took less than 1 min. All samples were obtained in a single session, and the data collection took approximately 1 month (25 Aug–15 Sept).
As shown in Table 1, the newly collected database contains 546 pictures from 162 healthy irides (62% female users, 38% male users; 21% under 20 years old, 61% 20–40 years old, 12% 40–60 years old and 6% over 60 years old) and 772 iris images from 181 diabetic eyes with a clearly visible iris pattern (80% female users, 20% male users; 1% under 20 years old, 17.5% 20–40 years old, 46.5% 40–60 years old and 35% over 60 years old). For healthy users (not suffering from diabetes mellitus) no medical examination was performed; for the non-healthy iris samples, we asked individuals with known diabetes to participate in our experiment.

Table 1:

Demographics of collected database.

Regarding the process of preparing the diabetic iris database, we hereby confirm that:

  1. The process of taking pictures with the NIR camera is harmless and caused no damage to the volunteers.

  2. Volunteers reported how long they had been taking diabetes pills/insulin; no tests were performed to confirm their diabetes status.

  3. No drugs were used for testing.

  4. No medical examination was needed to record and save the photos.

In Figure 1, four samples from each group are presented; the samples were captured with the NIR camera. The pictures are gray-scale with a resolution of 640×480 pixels. It is worth mentioning that all of the volunteers are from the same ethnic group (Iranian), were living in Iran, and share the same dietary culture.

Figure 1:

First row: diabetic eyes, second row: healthy eyes – captured by mono-ocular Iri Shield USB MK 2120U [18].

Materials and methods – methodology description

Unlike diabetes diagnostic techniques, which need only a small region of the iris texture, iris recognition requires the overall structure of the iris to be segmented.

Iris segmentation (weighted adaptive Hough and ellipsopolar transform) [19]

Here, the weighted adaptive Hough and ellipsopolar transforms (WAHET) methodology was used for iris segmentation.

WAHET is a two-stage iris segmentation technique:

  1. Finding the center point: the center of multiple approximately concentric rings is determined at iteratively refined resolutions by removing the detected reflection mask, detecting edges and finally applying the weighted adaptive Hough transform.

  2. Extracting the region of interest: starting from the center point, an initial boundary is detected; after the first iteration, an ellipsopolar transform is applied for inner and outer boundary detection. The extracted iris texture is then normalized using Daugman's rubber sheet model.
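The normalization step above (Daugman's rubber sheet model) maps the annular iris region onto a fixed-size rectangular texture. The following is a minimal sketch that assumes concentric circular boundaries, a simplification of the ellipsopolar boundaries WAHET actually fits; the function name and parameters are illustrative, not USIT's API.

```python
import numpy as np

def rubber_sheet(image, center, pupil_r, iris_r, radial_res=64, angular_res=256):
    """Daugman's rubber sheet model: sample the annulus between the pupillary
    and limbic boundaries onto a fixed (radial_res x angular_res) grid."""
    cx, cy = center
    thetas = np.linspace(0.0, 2.0 * np.pi, angular_res, endpoint=False)
    rho = np.linspace(0.0, 1.0, radial_res)[:, None]  # 0 = pupil, 1 = limbus
    r = pupil_r + rho * (iris_r - pupil_r)            # linear radial interpolation
    xs = np.clip(np.round(cx + r * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
    ys = np.clip(np.round(cy + r * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
    return image[ys, xs]
```

In a real pipeline the inner and outer boundaries are non-concentric ellipses, so the interpolation runs between the two fitted contours rather than two circles.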

Feature extraction

For feature extraction, three descriptors were selected and used: the iris coding method based on differences of discrete cosine transform (DCT) coefficients [20], one-dimensional (1D) log-Gabor feature extraction (the Masek and Kovesi algorithm) [21] and the Rathgeb and Uhl (cr) algorithm [22].

  1. DCT: Due to its much lower complexity, the DCT can be considered a computationally efficient replacement for the Karhunen-Loève transform (KLT). The DCT is a real-valued transform that computes a truncated Chebyshev series possessing minimax properties.

  2. 1D log-Gabor feature extraction: The algorithm proposed by Masek and Kovesi [21] examines 1D intensity signals by applying a dyadic wavelet transform and a log-Gabor filter, respectively.

    The frequency response of a log-Gabor filter is given as (Eq. 1):

    G(f) = exp[−(log(f/f0))^2 / (2(log(σ/f0))^2)]     (1)

    where f0 is the center frequency and σ is the bandwidth of the filter.

  3. Rathgeb and Uhl algorithm: This feature extraction method is based on comparisons between gray-scale values. Features are extracted by examining local intensity variations in the iris texture. The technique also includes a post-processing stage on the iris texture image to eliminate small peaks of pixel paths by thresholding. The Rathgeb and Uhl descriptor requires no complex calculations.
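Eq. 1 can be evaluated directly. Below is a minimal sketch of the log-Gabor frequency response; the function name and the handling of f = 0 (where the logarithm is undefined) are our own choices, not part of the Masek and Kovesi code.

```python
import numpy as np

def log_gabor_response(f, f0, sigma):
    """Frequency response of a 1D log-Gabor filter, Eq. 1:
    G(f) = exp(-(log(f/f0))^2 / (2 (log(sigma/f0))^2)),
    where f0 is the center frequency and sigma/f0 fixes the bandwidth."""
    f = np.asarray(f, dtype=float)
    out = np.zeros_like(f)  # take G(0) = 0, since log(0) is undefined
    nz = f > 0
    out[nz] = np.exp(-np.log(f[nz] / f0) ** 2 / (2.0 * np.log(sigma / f0) ** 2))
    return out
```

Note that the response is Gaussian on a logarithmic frequency axis: it peaks at unity at f = f0 and falls off symmetrically for f0/k and k·f0.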

To segment the iris and calculate the comparison scores between samples, the University of Salzburg Iris Toolkit (USIT), available at http://www.wavelab.at/sources/Rathgeb12e/ [23], was used.

Matching

Using USIT, the comparison scores are obtained from the Hamming distance, which measures the fraction of bits in which two binary codes disagree. The formula is given as (Eq. 2):

HD = (1/N) Σ_{j=1}^{N} X_j ⊕ Y_j     (2)

where N is the number of code bits and ⊕ denotes the exclusive-OR of corresponding bits of the codes X and Y.

An all-vs.-all comparison scenario was used to obtain the maximum possible number of matching scores.
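The fractional Hamming distance of Eq. 2 can be sketched as follows; the optional occlusion mask (counting only bits valid in both codes) is a common extension in USIT-style matchers and is our addition, not part of Eq. 2 itself.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask=None):
    """Fractional Hamming distance, Eq. 2: the share of code bits in which
    two binary iris codes disagree (XOR), optionally counting only bits
    marked valid (unoccluded) in `mask`."""
    a = np.asarray(code_a, dtype=bool)
    b = np.asarray(code_b, dtype=bool)
    valid = np.ones(a.shape, dtype=bool) if mask is None else np.asarray(mask, dtype=bool)
    return np.logical_xor(a, b)[valid].sum() / valid.sum()
```

Identical codes give HD = 0, complementary codes give HD = 1, and unrelated codes cluster near HD = 0.5.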

For validation of the obtained results, the commercial product Verieye [24] was also used. Each selected image was compared to the entire database, and a score was output for each comparison. The maximum score, 1557, appears as the comparison score between two identical images; Verieye outputs a score of 0 when it strongly rejects two irises as a match. For segmentation, Verieye uses active shape models that accurately detect the contours of irises that are not perfect circles. The enrollment and matching routines are fast and yield high matching accuracy.

Results and discussion

In this section, the obtained comparison scores between samples are presented and discussed. Given the number of possible comparisons, up to 900,000 scores were obtained.

As illustrated in Figure 2, the empirical cumulative distribution functions of the genuine scores obtained by (A) Verieye, (B) DCT, (C) 1D log-Gabor and (D) cr codes show statistically significant differences between the comparison scores obtained from diabetic iris samples and those obtained from healthy irides. Based on these results, user identification tends to be more difficult under the influence of diabetes, and the accuracy of the iris recognition system is higher for healthy irides.
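The empirical cumulative distribution functions in Figure 2 can be computed directly from the raw genuine-score lists; a minimal sketch:

```python
import numpy as np

def ecdf(scores):
    """Empirical cumulative distribution function of a set of comparison
    scores: returns the sorted scores and, for each, the fraction of all
    scores less than or equal to it."""
    x = np.sort(np.asarray(scores, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y
```

Plotting the two (x, y) step curves for the healthy and diabetic genuine scores on the same axes reproduces the kind of comparison shown in Figure 2.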

Figure 2:

Empirical cumulative distribution function.

(A) Verieye; (B) DCT (similarity score: 1-distance); (C) 1D-log Gabor (similarity score: 1-Hamming distance); (D) CR (similarity score: 1-distance).

The results show that every recognition system yields its best performance when the gallery and probe images are taken from healthy eyes, whereas a reduction in performance is observed when identifying users under the influence of diabetes.

This is because there are non-obvious disorders in diabetic eyes that appear to make the identification task more difficult. To test the first null hypothesis, that the mean values of the distributions are the same, and the second null hypothesis, that the samples are drawn from the same distribution, we used the t-test and the Kolmogorov-Smirnov test, respectively. Given the chosen confidence threshold (0.01) and the obtained p-values, which are near zero, both null hypotheses can be rejected, and we can conclude that the performance of the iris recognition system is better for healthy eyes than for diabetic ones (Table 2). In other words, it is harder to recognize people who are suffering from diabetes, owing to non-obvious disorders in their iris texture. The receiver operating characteristic (ROC) curves for the healthy and diabetic groups are presented in Figure 3; according to these curves, healthy eyes are easier to recognize than non-healthy eyes. It is also worth mentioning that, based on the obtained results, Verieye had the best performance, while 1D log-Gabor was the worst descriptor for feature extraction.
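The two hypothesis tests can be run with `scipy.stats`. This is a hedged sketch on synthetic placeholder scores (the study's real genuine-score distributions are not reproduced here); the sample sizes, means and spreads below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical genuine-score samples standing in for the two groups
healthy = rng.normal(loc=0.70, scale=0.05, size=1000)
diabetic = rng.normal(loc=0.65, scale=0.06, size=1000)

# Null hypothesis 1: the mean values of the distributions are the same (t-test)
t_stat, p_t = stats.ttest_ind(healthy, diabetic, equal_var=False)
# Null hypothesis 2: the samples come from the same distribution (two-sample K-S)
ks_stat, p_ks = stats.ks_2samp(healthy, diabetic)

alpha = 0.01  # threshold used in the paper
reject_equal_means = p_t < alpha
reject_same_distribution = p_ks < alpha
```

With score distributions as separated as in Figure 2, both p-values fall far below the 0.01 threshold and both null hypotheses are rejected.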

Table 2:

Hypothesis test results.

Figure 3:

ROC curve.

(A) Verieye; (B) DCT; (C) 1D-log Gabor; (D) CR.

Obtaining high-quality iris image samples from the older population was less easy than from the younger age group; younger users can provide iris biometric samples of higher quality. Age progression can thus be considered a source of error, and increasing age may affect the quality of the acquired biometric data. Hence, part of the observed difference between the comparison results of the two groups may be a consequence of the difference in mean age between the healthy and diabetic groups. The area under the curve (AUC) can be defined as:

AUC = (area under the empirical ROC curve / area under the ideal ROC curve) × 100     (3)

According to Eq. 3, the AUC increases as the accuracy of the recognition system increases. In the ideal case of a 100% accurate system (true acceptance rate equal to one with a false rejection rate of 0), the AUC is 100%. As a result, a higher AUC means better performance. Table 3 presents the AUC for the different ROC curves shown in Figure 3.
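Eq. 3 can be implemented with a trapezoidal area under the empirical ROC curve; this is a minimal sketch in which the function name and the ideal-area convention of 1.0 (a perfect recognizer over the unit square) are our assumptions.

```python
import numpy as np

def normalized_auc_percent(far, tar):
    """Eq. 3: trapezoidal area under an empirical ROC curve (TAR vs. FAR),
    divided by the ideal area (1.0, a perfect recognizer), in percent.
    `far` must be sorted in increasing order."""
    far = np.asarray(far, dtype=float)
    tar = np.asarray(tar, dtype=float)
    empirical = np.sum(np.diff(far) * (tar[1:] + tar[:-1]) / 2.0)
    ideal = 1.0
    return 100.0 * empirical / ideal
```

A perfect system (TAR jumps to 1 at FAR = 0) scores 100%, while chance-level matching along the diagonal scores 50%.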

Table 3:

AUC for different matchers.

The biological age of individuals can obviously cause changes in biometric data. For instance, the human iris's ability to adapt and accommodate to different incident lighting levels decreases sharply with age, which in turn can affect the availability of the features used to recognize a particular individual's iris pattern.

Hence, as the users are from different age groups, this factor may also be considered one of the influential parameters.

Conclusion

This paper is concerned with testing the reliability of an iris recognition system for participants under the influence of diabetes. A new database has been collected and is offered here. We used four different matchers to obtain similarity scores between the captured samples. Although there is no obvious impairment to the non-healthy irides, the results achieved by all four matchers (three open-source codes and one commercial closed one) show that the accuracy of the system is higher when recognizing healthy people from their iris texture images. The best performance was observed using the Verieye system and the DCT-based methodology proposed by Monro et al. (USIT v.2). We used the codes as provided and did not modify any of them. It must also be noted that, because pupil dilation responsiveness decreases with age, the difference in the mean age of the chosen groups must be considered an additional source of error.

References

[1] Azimi M, Pacut A. The effect of gender-specific facial expressions on face recognition systems reliability. 2018 IEEE International Conference on Automation, Quality and Testing, Robotics (AQTR), 2018:1–4. DOI: 10.1109/AQTR.2018.8402705.

[2] Hollingsworth K, Bowyer KW, Lagree S, Fenker SP, Flynn PJ. Genetically identical irises have texture similarity that is not detected by iris biometrics. Comput Vis Image Underst 2011;115:1493–502.

[3] Daugman JG. High confidence visual recognition of persons by a test of statistical independence. IEEE Trans Pattern Anal Mach Intell 1993;15:1148–61.

[4] Howard JJ, Etter D. The effect of ethnicity, gender, eye color and wavelength on the biometric menagerie. IEEE International Conference on Technologies for Homeland Security (HST), 2013:627–32.

[5] Rasoulinejad SA, Zarghami A, Hosseini SR, Rajaee N, Rasoulinejad SE, Mikaniki E. Prevalence of age-related macular degeneration among the elderly. Caspian J Intern Med 2015;6:141–7.

[6] Rasoulinejad SA, Hajian-Tilaki K, Mehdipour E. Associated factors of diabetic retinopathy in patients that referred to teaching hospitals in Babol. Caspian J Intern Med 2015;6:224–8.

[7] Noor-ul-huda M, Tehsin S, Ahmed S, Niazi FAK, Murtaza Z. Retinal images benchmark for the detection of diabetic retinopathy and clinically significant macular edema (CSME). Biomed Eng Biomed Tech 2019;64:297–307.

[8] Samant P, Agarwal R. Machine learning techniques for medical diagnosis of diabetes using iris image. Comput Methods Programs Biomed 2018;157:121–8.

[9] Heydari M, Teimouri M, Heshmati Z. Comparison of various classification algorithms in the diagnosis of type 2 diabetes in Iran. Int J Diabetes Dev Ctries 2016;36:167–73.

[10] Chaskar UM, Sutaone MS. On a methodology for detecting diabetic presence from iris image analysis. 2012 International Conference on Power, Signals, Controls and Computation, 2012:1–6. DOI: 10.1109/EPSCICON.2012.6175268.

[11] Aslam TM, Tan SZ, Dhillon B. Iris recognition in the presence of ocular disease. J R Soc Interface 2009;6:489–93.

[12] Seyeddain O, Kraker H, Redlbeger A, Dexl AK, Grabner G, Emesz M. Reliability of automatic biometric iris recognition after phacoemulsification or drug-induced pupil dilation. Eur J Ophthalmol 2014;24:58–62.

[13] Borgen H, Bours P, Wolthusen SD. Simulating the influences of aging and ocular disease on biometric recognition performance. International Conference on Biometrics 2009, LNCS 5558; 2009:857–67.

[14] Nigam I, Vatsa M, Singh R. Ophthalmic disorder menagerie and iris recognition. In: Bowyer K, Burge M, editors. Handbook of Iris Recognition. Advances in Computer Vision and Pattern Recognition. London: Springer; 2016:519–39.

[15] Trokielewicz M, Czajka A, Maciejewicz P. Database of iris images acquired in the presence of ocular pathologies and assessment of iris recognition reliability for disease-affected eyes. IEEE 2nd International Conference on Cybernetics (CYBCONF), 2015:495–500.

[16] Trokielewicz M, Czajka A, Maciejewicz P. Implications of ocular pathologies for iris recognition reliability. Image Vis Comput 2017;58:158–67.

[17] Dhir L, Habib NE, Monro DM, Rakshit S. Effect of cataract surgery and pupil dilation on iris pattern recognition for personal authentication. Eye 2010;24:1006–10.

[18] IriTech IriShield series. Available from: http://www.iritech.com/products/hardware/irishield%E2%84%A2-series.

[19] Uhl A, Wild P. Weighted adaptive Hough and ellipsopolar transforms for real-time iris segmentation. 5th International Conference on Biometrics (ICB'12), 2012:283–90.

[20] Monro DM, Rakshit S, Zhang D. DCT-based iris recognition. IEEE Trans Pattern Anal Mach Intell 2007;29:586–95.

[21] Masek L, Kovesi P. MATLAB source code for a biometric identification system based on iris patterns. The School of Computer Science and Software Engineering, The University of Western Australia, Perth, 2003.

[22] Rathgeb C, Uhl A. Secure iris recognition based on local intensity variations. In: Proceedings of the International Conference on Image Analysis and Recognition (ICIAR'10), Springer, LNCS 6112; 2010:266–75.

[23] Rathgeb C, Uhl A, Wild P, Hofbauer H. Design decisions for an iris recognition SDK. In: Bowyer K, Burge M, editors. Handbook of Iris Recognition. Advances in Computer Vision and Pattern Recognition. London: Springer; 2016.

[24] Neurotechnology VeriEye. Available from: www.neurotechnology.com/verieye.html.

About the article

Received: 2018-09-27

Accepted: 2018-12-21

Published Online: 2019-07-19


Funding Source: AMBER with sponsorship from the Marie Sklodowska-Curie EU Framework for Research and Innovation Horizon 2020

Award identifier / Grant number: 675087

This work was supported by AMBER with sponsorship from the Marie Sklodowska-Curie EU Framework for Research and Innovation Horizon 2020, under Grant Agreement No. 675087, Funder Id: http://dx.doi.org/10.13039/100010661. The authors thank all of the participants who took part in this study.


Author statement

Research funding: AMBER with sponsorship from the Marie Sklodowska-Curie EU Framework for Research and Innovation Horizon 2020, under Grant Agreement No. 675087.

Conflict of interest: The authors have no conflict of interest to declare.

Informed consent: Informed consent is not applicable.

Ethical approval: The experiment was specifically designed to investigate whether there is any relation between an iris recognition system's accuracy and the health condition of its users. Before the experiment, consent agreements were signed by the study participants. The participants were also asked to provide non-biometric data, including their names, gender, age and, where applicable, the duration of their diabetes. The personal data are kept separately to guarantee additional security. All participants were fully aware of the experiment, as detailed information on the study was provided.


Citation Information: Biomedical Engineering / Biomedizinische Technik, 20180190, ISSN (Online) 1862-278X, ISSN (Print) 0013-5585, DOI: https://doi.org/10.1515/bmt-2018-0190.


©2019 Walter de Gruyter GmbH, Berlin/Boston.
