Current Directions in Biomedical Engineering

Joint Journal of the German Society for Biomedical Engineering in VDE and the Austrian and Swiss Societies for Biomedical Engineering

Next-generation vision testing: the quick CSF

Michael Dorr / Manuel Wille / Tiberiu Viulet / Edward Sanchez / Peter J Bex / Zhong-Lin Lu / Luis Lesmes
Published Online: 2015-09-12 | DOI: https://doi.org/10.1515/cdbme-2015-0034

Abstract

The Contrast Sensitivity Function relates the spatial frequency and contrast of a spatial pattern to its visibility and thus provides a fundamental description of visual function. However, the current clinical standard of care typically restricts assessment to visual acuity, i.e. the smallest stimulus size that can be resolved at full contrast; alternatively, tests of contrast sensitivity are typically restricted to assessment of the lowest visible contrast for a fixed letter size. This restriction to one-dimensional subspaces of a two-dimensional space was necessary when stimuli were printed on paper charts and simple scoring rules were applied manually. More recently, however, computerized testing and electronic screens have enabled more flexible stimulus displays and more complex test algorithms. For example, the quick CSF method uses a Bayesian adaptive procedure and an information maximization criterion to select only informative stimuli; testing times to precisely estimate the whole contrast sensitivity function are reduced to 2-5 minutes. Here, we describe the implementation of the quick CSF method in a medical device and introduce several usability enhancements that make it suitable for use in clinical settings. A first usability study shows excellent results, with a mean System Usability Scale score of 86.5.

Keywords: vision; contrast sensitivity function; adaptive methods

1 Introduction

The current gold standard for the assessment of visual function goes back to 1862, when Herman Snellen developed a standard letter chart that has remained in use with only small modifications ever since. By showing progressively smaller letters at full contrast, the Snellen chart assesses visual acuity, i.e. the visual system’s resolving power. However, the visual system’s ability to detect a pattern depends not only on the size of the pattern, but also on its contrast; the Contrast Sensitivity Function (CSF) [4], which relates spatial frequency (size) to threshold contrast, therefore is a more comprehensive descriptor of visual function. Importantly, neurologic and ophthalmic pathologies may affect contrast sensitivity before they affect acuity [8, 13].

Despite its recognized clinical utility, routine assessment of contrast sensitivity in clinical trials and regular care is hampered by practical limitations. Compared to one-dimensional acuity, the two-dimensional CSF necessitates testing at many combinations of spatial frequency and contrast, for which paper charts lack resolution and flexibility. Simple scoring heuristics such as “stop at the last line with three out of five correct letter identifications” can be evaluated manually, but lack the statistical efficiency to precisely capture the probabilistic nature of psychometric responses. By randomizing the sequence of tested optotypes, computerized vision testing alleviates some of the shortcomings of paper charts, but test scoring typically follows the same simple rules as in non-computerized testing.

The quick CSF method [9] was originally developed for behavioural testing in psychophysical laboratories and uses a Bayesian adaptive procedure and an information maximization rule to test only informative combinations of spatial frequency and contrast, based on the full trial history. In this paper, we describe the implementation of the quick CSF method in a medical device that may be used to rapidly and precisely estimate visual function in clinical settings.

2 The quick CSF method

The quick CSF method exploits the observation that the human CSF can be accurately described using only four parameters [10, 12]: peak frequency and gain, bandwidth, and a low-frequency truncation parameter. After each trial of a test, the probability distribution p(Θ) over a set of possible CSFs described by four-parameter tuples Θ (in our implementation, about 12 million CSFs) can be computed, given the full trial history. Before each trial, the expected information gain is calculated for all possible stimuli (here, 19 spatial frequencies at 128 contrast levels each; 2432 stimuli overall) over a sampled subset of Θ, and only informative stimuli are chosen for presentation. By avoiding uninformative regions of the search space, the quick CSF method in [9] already estimated the whole CSF in 50-100 trials. For these results, the authors in [9] used sinewave gratings of two possible orientations as stimuli. Gratings are very frequency-selective and correspond well to neuronal receptive fields in primary visual cortex [6], but the small number of alternatives leads to a high guessing rate (50%) and thus lower statistical efficiency. Therefore, we implemented the use of ten Sloan letters, bandpass-filtered with a raised cosine window with peak frequency 4 cycles per letter, as proposed in [5]; as a consequence, the number of trials can be further reduced to 25 for rapid testing times, or 50 for very high precision.
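As a concrete illustration of these computations, the following C++ sketch implements the truncated log-parabola CSF model and a single Bayesian posterior update over a grid of candidate parameter tuples. The Weibull psychometric link with its fixed slope and lapse rate, the 10% guess rate of a ten-letter task, and all names are our own illustrative assumptions rather than the production code; the information-maximizing stimulus selection step is omitted for brevity.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Four CSF parameters: peak gain, peak frequency (cpd), bandwidth (octaves),
// low-frequency truncation (names follow the description above).
struct CSFParams { double gammaMax, fMax, beta, delta; };

// Truncated log-parabola CSF [10, 12]: log10 contrast sensitivity at frequency f.
double logSensitivity(const CSFParams& p, double f)
{
    const double kappa = std::log10(2.0);
    const double betaP = std::log10(2.0 * p.beta);
    double logS = std::log10(p.gammaMax)
                - kappa * std::pow((std::log10(f) - std::log10(p.fMax)) / (betaP / 2.0), 2.0);
    if (f < p.fMax)  // below the peak, sensitivity drops at most delta below the peak gain
        logS = std::max(logS, std::log10(p.gammaMax) - p.delta);
    return logS;
}

// Probability of a correct letter report at frequency f and contrast c, given
// CSF parameters p (assumed Weibull link; 10% guess rate for ten letters).
double pCorrect(const CSFParams& p, double f, double c)
{
    const double guess = 0.1, lapse = 0.04, slope = 3.5;               // assumed constants
    const double strength = c * std::pow(10.0, logSensitivity(p, f));  // contrast / threshold
    return guess + (1.0 - guess - lapse) * (1.0 - std::exp(-std::pow(strength, slope)));
}

// One Bayesian update after a trial: reweight every candidate CSF by the
// likelihood of the observed response, then renormalize the posterior.
void updatePosterior(std::vector<double>& posterior, const std::vector<CSFParams>& grid,
                     double f, double c, bool correct)
{
    double norm = 0.0;
    for (std::size_t i = 0; i < grid.size(); ++i) {
        const double pc = pCorrect(grid[i], f, c);
        posterior[i] *= correct ? pc : (1.0 - pc);
        norm += posterior[i];
    }
    for (double& w : posterior)
        w /= norm;
}
```

In the device described above, the parameter grid comprises about 12 million tuples and, before each trial, the expected information gain of every candidate stimulus is evaluated against a sampled subset of this posterior.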

2.1 Usability enhancements

We implemented two additional usability enhancements of the quick CSF. First, in classical forced-choice tasks, observers are asked to guess a response even if they do not perceive the stimulus. Even highly experienced psychophysical observers can show strong biases when guessing [7], and guessing is particularly difficult for inexperienced observers such as patients; we therefore added an “I don’t know” option to the set of possible responses. Second, the lack of a (visible) stimulus on the screen can be confusing to observers; therefore, three letters are always presented simultaneously in a horizontal line, with the middle and left letters displayed at two and four times the contrast of the right letter, respectively. Because the contrast of the right letter is chosen by the quick CSF method and is usually near threshold, at least one of the letters is very likely to be easily recognizable, which helps the observer determine the location and size of the test letters.
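The three-letter rule itself is simple enough to state in code; the helper below is a sketch in our own naming, and the clamping to 100% contrast is our assumption about what happens when doubling or quadrupling the test contrast would exceed the displayable range.

```cpp
#include <algorithm>
#include <array>

// Contrasts for the left, middle and right letter of a trial: the quick CSF
// chooses the (near-threshold) test contrast of the right letter; the middle
// and left letters are shown at two and four times that contrast, clamped to
// the maximum displayable contrast of 1.0 (the clamping is our assumption).
std::array<double, 3> letterContrasts(double testContrast)
{
    return {{ std::min(1.0, 4.0 * testContrast),   // left: most visible
              std::min(1.0, 2.0 * testContrast),   // middle
              testContrast }};                     // right: contrast under test
}
```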

3 Hardware

The hardware setup of our quick CSF implementation mainly comprises a small form-factor PC (Intel i5 CPU, 4 GB RAM) and a large-format screen for stimulus display. At 46” diagonal with a resolution of 1920 by 1080 pixels and a viewing distance of 400 cm, the screen allows the display of stimuli in a spatial frequency range from 1.4 to 36.2 cycles per degree, which includes the whole set of frequencies mandated by the FDA (1.5 to 18 cpd). Also in accordance with FDA standards, mean screen luminance is calibrated to 85 cd/m². All device functionality including power control is accessed through a handheld tablet; an NFC reader provides an authentication mechanism through smartcards. External interfacing for data export is provided by an Ethernet and a USB port.
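As a back-of-the-envelope check of this geometry (our own calculation, not part of the device software), the sketch below derives the pixel density per degree of visual angle and the letter heights corresponding to a few spatial frequencies, assuming the 4 cycles-per-letter bandpass stimuli described in Section 2.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
    const double diagonalCm = 46.0 * 2.54;                    // 46" diagonal
    const double aspect     = 1920.0 / 1080.0;
    const double widthCm    = diagonalCm * aspect / std::sqrt(aspect * aspect + 1.0);
    const double pxPerCm    = 1920.0 / widthCm;
    const double viewDistCm = 400.0;
    const double degToRad   = 3.14159265358979323846 / 180.0;
    const double cmPerDeg   = viewDistCm * std::tan(1.0 * degToRad);  // extent of 1 deg on screen
    const double pxPerDeg   = pxPerCm * cmPerDeg;                     // roughly 130 px per degree

    // letter height required for a few spatial frequencies, at 4 cycles per letter
    for (double cpd : {1.5, 18.0, 36.2}) {
        const double letterDeg = 4.0 / cpd;
        std::printf("%.1f cpd -> letter height %.2f deg (about %.0f px)\n",
                    cpd, letterDeg, letterDeg * pxPerDeg);
    }
    return 0;
}
```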

Figure 1: Picture of a device prototype in demo mode, where particularly visible (medium spatial frequency, high contrast) stimuli are shown.

4 Software

4.1 Main computer

The small form-factor PC runs a regular Linux operating system. However, the OS is transparent to the user because all user interaction takes place on the tablet remote control, and the vision test software permanently runs in fullscreen mode. The PC-side software is written in C++; the graphics display further uses OpenGL shaders and implements spatio-temporal dithering to increase the effective bit depth of the screen.
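As a conceptual sketch of spatio-temporal dithering (not the production OpenGL shader), the function below quantizes a high-precision grey level to the 8-bit panel using an ordered-dither threshold that varies with pixel position and frame number; averaged over neighbouring pixels and four successive frames, this yields roughly two extra bits of effective contrast resolution.

```cpp
#include <cmath>
#include <cstdint>

// Quantize a grey level in [0, 1] to 8 bits with a 2x2 ordered-dither pattern
// that is rotated every frame, so the spatio-temporal average approximates the
// requested sub-8-bit value.
std::uint8_t ditheredPixel(double grey01, int x, int y, int frame)
{
    static const double bayer2x2[4] = { 0.125, 0.625, 0.875, 0.375 };
    const double value     = grey01 * 255.0;
    const double base      = std::floor(value);
    const double frac      = value - base;                // sub-8-bit remainder
    const int    idx       = ((y & 1) << 1) | (x & 1);    // position in 2x2 pattern
    const double threshold = bayer2x2[(idx + frame) & 3]; // rotate pattern over frames
    return static_cast<std::uint8_t>(base + (frac > threshold ? 1.0 : 0.0));
}
```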

For convenience, the software automatically calculates several features of the CSF: threshold sensitivities at five individual spatial frequencies mandated by the FDA [14]; CSF acuity, the intersection of the CSF with the x-axis (i.e. spatial frequency where contrast threshold is 100%); and a summary statistic, the area under the log CSF in the range from 1.5 to 18 cpd [1]. For an example screenshot of the results display, see Fig. 3.
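As an example of one of these summary statistics, the helper below (our own illustration, not the device code) computes the area under the log CSF [1] by trapezoidal integration of log10 sensitivity over log10 spatial frequency, restricted to the 1.5 to 18 cpd range mentioned above; segments extending outside that range are simply skipped in this simplified version.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Area under the log CSF (AULCSF): integrate log10 sensitivity over log10
// spatial frequency with the trapezoid rule, given sensitivity samples at the
// tested frequencies (both vectors sorted by increasing frequency).
double aulcsf(const std::vector<double>& freqsCpd,
              const std::vector<double>& sensitivities,
              double fLow = 1.5, double fHigh = 18.0)
{
    double area = 0.0;
    for (std::size_t i = 0; i + 1 < freqsCpd.size(); ++i) {
        if (freqsCpd[i] < fLow || freqsCpd[i + 1] > fHigh)
            continue;  // only segments fully inside the mandated range
        const double dx = std::log10(freqsCpd[i + 1]) - std::log10(freqsCpd[i]);
        const double y0 = std::max(0.0, std::log10(sensitivities[i]));
        const double y1 = std::max(0.0, std::log10(sensitivities[i + 1]));
        area += 0.5 * (y0 + y1) * dx;
    }
    return area;
}
```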

Figure 2: Screenshot of remote control tablet during a test session. The trial history (correct answers, triangles; mistakes, crosses; ’no answer’ responses, slashes) and the current best estimate of the CSF are shown in the top panel, with spatial frequency on the x-axis and contrast sensitivity on the y-axis. The current trial is highlighted in blue.

Raw data, such as the trial history and the full posterior distribution over the CSF parameters (peak frequency fmax, peak gain γmax, bandwidth β, and truncation δ), are stored in a database and can be exported via a web-based database interface or the USB port.

4.2 Tablet remote control

The tablet remote control is based on a customized Android image that always runs the remote control app in the foreground. During the test, the patient reads out the letters on the screen. The examiner, who is presented with the ground truth, codes these responses as correct or incorrect; an “I don’t know” response can also be coded; see Fig. 2 for a screenshot of the user interface. In order to reduce spatial uncertainty, each stimulus presentation is preceded by markers indicating the spatial position and scale of the upcoming stimulus; the examiner can repeat this marker presentation using the “Prompt” button. After response entry, the examiner can initiate the next trial; if necessary, previous responses can be undone as well.
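The bookkeeping implied by this workflow can be captured with a few small structures (our own naming, not the device code): each coded response is appended to the trial history, and an undo removes the most recent entry, after which the posterior can be recomputed from the remaining trials, since the quick CSF posterior depends only on the trial history (Section 2).

```cpp
#include <vector>

enum class Response { Correct, Incorrect, DontKnow };

struct Trial {
    double   freqCpd;    // spatial frequency of the test letter
    double   contrast;   // contrast of the test letter
    Response response;   // as coded by the examiner
};

struct Session {
    std::vector<Trial> history;
    void record(const Trial& t) { history.push_back(t); }
    void undoLast()             { if (!history.empty()) history.pop_back(); }
};
```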

Figure 3: Screenshot of remote control tablet, showing test results after a session. The solid line denotes the median sensitivity estimate for the posterior distribution of possible CSFs, with the shaded region denoting the 66% confidence interval.

5 Usability evaluation

Prototype systems are currently in use in several clinics in the United States and Germany. After extended use, we asked the technicians who operate the device on a regular basis to anonymously fill out System Usability Scale (SUS) questionnaires [3]. The SUS comprises ten questions (five positive, five negative statements) that require a rating on a scale from 1 (“strongly disagree”) to 5 (“strongly agree”).

Fig. 4 shows the response distributions for each of the 10 SUS questions obtained from nine participants. For visualization purposes, we converted scores to a range from ’- -’ (score 1 on odd-numbered questions, score 5 on even-numbered questions) to ’++’ (scores 5 and 1 for odd- and even-numbered questions, respectively), so that darker colours in Fig. 4 are better. Numerically, the mean SUS score was 86.5 with an s.d. of 10. The strongest agreement was found with item 3, “I thought the system was easy to use” (mean 4.78, s.d. 0.44).
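For reference, the standard SUS scoring rule [3] behind these numbers maps each odd-numbered (positive) item to its rating minus 1 and each even-numbered (negative) item to 5 minus its rating, then scales the sum of the ten contributions by 2.5, yielding a score between 0 and 100:

```cpp
#include <array>

// Standard SUS score from ten ratings (1..5), given in questionnaire order.
double susScore(const std::array<int, 10>& ratings)
{
    int sum = 0;
    for (int i = 0; i < 10; ++i)
        sum += (i % 2 == 0) ? (ratings[i] - 1)    // items 1, 3, 5, 7, 9 (positive)
                            : (5 - ratings[i]);   // items 2, 4, 6, 8, 10 (negative)
    return 2.5 * sum;
}
```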

6 Discussion

Moore’s law and the increase in available computational power have made it possible to apply complex algorithms even during brief inter-trial intervals in behavioural testing. Adaptive testing that rests on the exact computation of probability distributions in large multi-dimensional spaces goes beyond manually scored heuristics and provides greater precision with reduced testing times. Here, we have described how an algorithm that was developed for psychophysics laboratories has been translated into an easy-to-use medical device for clinical settings. We implemented usability enhancements for the patient and integrated all device interaction into a tablet-based remote control for the technician’s convenience. A usability study demonstrated that users rated the device highly, with a mean score that corresponds to an “excellent” rating in a large-scale study of SUS scores [2].

Figure 4: Results for System Usability Scale questionnaires. For visualization, (dis-)agreement with positive (negative) questions on a scale from 1 to 5 was converted to a scale from ’- -’ to ’++’.

A first clinical study has already shown that quick CSF test results correlate better with patient-reported outcomes than established far and near visual acuity measures [11]. Further studies are currently ongoing to demonstrate the quick CSF’s sensitivity to track subtle changes in visual function due to disease progression or treatment effects.

References

[1] R. Applegate, G. Hilmantel, and H. Howland. Area under the log contrast sensitivity function: A concise method of following changes in visual performance. OSA Technical Digest Series, 1:98–101, 1997.
[2] Aaron Bangor, Philip Kortum, and James Miller. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3):114–123, 2009.
[3] John Brooke. SUS – a ’quick and dirty’ usability scale. Usability Evaluation in Industry, 189(194):4–7, 1996.
[4] F. W. Campbell and D. G. Green. Optical and retinal factors affecting visual resolution. J Physiol, 181(3):576–593, Dec 1965.
[5] Fang Hou, Luis Lesmes, Peter Bex, Michael Dorr, and Zhong-Lin Lu. Using 10AFC to further improve the efficiency of the quick CSF method. Journal of Vision, 15(2):1–18, 2015.
[6] D. H. Hubel and T. N. Wiesel. Functional architecture of macaque monkey visual cortex. Proc R Soc Lond, 198:1–59, 1977.
[7] Frank Jäkel and Felix A. Wichmann. Spatial four-alternative forced-choice method is the preferred psychophysical method for naive observers. Journal of Vision, 6:1307–1322, 2006.
[8] L. F. Jindra and V. Zemon. Contrast sensitivity testing: a more complete assessment of vision. J Cataract Refract Surg, 15(2):141–148, Mar 1989.
[9] Luis Andres Lesmes, Zhong-Lin Lu, Jongsoo Baek, and Thomas D. Albright. Bayesian adaptive estimation of the contrast sensitivity function: The quick CSF method. Journal of Vision, 10(3), 2010.
[10] A. M. Rohaly and C. Owsley. Modeling the contrast-sensitivity functions of older adults. J Opt Soc Am A, 10(7):1591–1599, Jul 1993.
[11] J. P. Stellmann, K. L. Young, J. Pöttgen, M. Dorr, and C. Heesen. Introducing a new method to assess vision: Computer-adaptive contrast-sensitivity testing predicts visual functioning better than charts in multiple sclerosis patients. Multiple Sclerosis Journal: Experimental, Translational and Clinical, 1(2055217315596184):1–8, 2015.
[12] Andrew B. Watson and Albert J. Ahumada. A standard model for foveal detection of spatial contrast. Journal of Vision, 5(9):717–740, 2005.
[13] R. L. Woods and J. M. Wood. The role of contrast sensitivity charts and contrast letter charts in clinical practice. Clinical & Experimental Optometry, 78(2):43–57, 1995.
[14] American National Standards Institute Committee Z80. American National Standard for Ophthalmics: Multifocal Intraocular Lenses. Optical Laboratories Association, 11096 Lee Highway, Fairfax, VA, 2007.

About the article

Published Online: 2015-09-12

Published in Print: 2015-09-01


Author’s Statement

Conflict of interest: The authors declare personal financial interests in and/or employment with Adaptive Sensory Technology, a company commercializing the technology presented here.

Material and Methods: Informed consent: Informed consent is not applicable. Ethical approval: The conducted research is not related to either human or animal use.


Citation Information: Current Directions in Biomedical Engineering, Volume 1, Issue 1, Pages 131–134, ISSN (Online) 2364-5504, DOI: https://doi.org/10.1515/cdbme-2015-0034.


© 2015 by Walter de Gruyter GmbH, Berlin/Boston.
