Eye movement monitoring (or eye tracking) reveals where in space the eyes linger, when and for how long. Here, we demonstrate how eye tracking can be used to investigate the integrity of memory in multiple participant populations, without requiring verbal, or otherwise explicit, reports.
Ryan, J. D., Riggs, L., McQuiggan, D. A. Eye Movement Monitoring of Memory. J. Vis. Exp. (42), e2108, doi:10.3791/2108 (2010).
Abstract

Explicit (often verbal) reports are typically used to investigate memory (e.g., "Tell me what you remember about the person you saw at the bank yesterday."). However, such reports can often be unreliable or sensitive to response bias 1, and may be unobtainable in some participant populations. Furthermore, explicit reports only reveal when information has reached consciousness and cannot comment on when memories were accessed during processing, regardless of whether the information is subsequently accessed in a conscious manner. Eye movement monitoring (eye tracking) provides a tool by which memory can be probed without asking participants to comment on the contents of their memories, and access of such memories can be revealed on-line 2,3. Video-based eye trackers (either head-mounted or remote) use a system of cameras and infrared markers to examine the pupil and corneal reflection in each eye as the participant views a display monitor. For head-mounted eye trackers, infrared markers are also used to determine head position to allow for head movement and more precise localization of eye position. Here, we demonstrate the use of a head-mounted eye tracking system to investigate memory performance in neurologically-intact and neurologically-impaired adults. Eye movement monitoring procedures begin with the placement of the eye tracker on the participant, and setup of the head and eye cameras. Calibration and validation procedures are conducted to ensure accuracy of eye position recording. Real-time recordings of X, Y-coordinate positions on the display monitor are then converted and used to describe periods of time in which the eye is static (i.e., fixations) versus in motion (i.e., saccades). Fixations and saccades are time-locked with respect to the onset/offset of a visual display or another external event (e.g., a button press).
Experimental manipulations are constructed to examine how and when patterns of fixations and saccades are altered through different types of prior experience. The influence of memory is revealed in the extent to which scanning patterns to new images differ from scanning patterns to images that have been previously studied 2, 4-5. Memory can also be interrogated for its specificity; for instance, eye movement patterns that differ between an identical and an altered version of a previously studied image reveal the storage of the altered detail in memory 2-3, 6-8. These indices of memory can be compared across participant populations, thereby providing a powerful tool by which to examine the organization of memory in healthy individuals, and the specific changes that occur to memory with neurological insult or decline 2-3, 8-10.
Equipment used during data acquisition
The eye tracker used in the current protocol is an EyeLink II system (SR Research Ltd.; Mississauga, Ontario, Canada). This head-mounted, video-based eye tracker records eye position in an X, Y-coordinate frame at a sampling rate of either 500 or 250 Hz, with a spatial resolution of < 0.1°. One camera monitors head position by tracking infrared markers placed on the four corners of the display monitor that is viewed by the participants. Two additional cameras are mounted on the headband and situated below each of the eyes, and infrared illuminators are used to detect the pupil and corneal reflections. Eye position may be based on pupil and corneal reflections, or on the pupil only. The padded headband of the eye tracker can be adjusted in two planes to comfortably fit the head size of an adult participant. Most eyeglasses and contact lenses can be accommodated by the eye tracker.
Two PCs are used to support eye movement recording. One computer serves as the display computer, which presents the calibration screens, necessary task instructions and the images used in the experimental paradigm to the participants. The display computer details the data collection parameters that are then governed by the second, host computer. The host computer calculates real-time gaze position and records the eye movement data for later analysis, as well as any button press or keyboard responses made by the participant. Participant setup and operation of the eye tracker are performed via the host PC.
In this protocol, the timing and order by which experimental stimuli are to be presented to the participants, and the manner in which eye position is to be collected by the host PC, are programmed through Experiment Builder, a software program specifically developed by SR Research Ltd. to interface with the eye tracker host computer. However, stimulus presentation can also be conducted through other software programs (e.g., Presentation, Neurobehavioral Systems; Albany, CA). Conversion of the eye movement data to a series of fixation and saccade events that are time-locked to stimulus presentation (or another external event) is achieved through the host computer and can be interrogated with Data Viewer, a software program developed by SR Research Ltd.; however, again, other programs can be used to derive the required eye movement measures. Here, the detection of fixations and saccades is dependent on an online parser, which separates raw eye movement samples into meaningful states (saccades, blinks and fixations). If the velocity of two successive eye movement samples exceeds 22 degrees per second over a distance of at least 0.1°, the samples are labeled as a saccade. If the pupil is missing for 3 or more samples, the eye activity is marked as a blink within the data stream. Non-saccade and non-blink activity is classified as fixation.
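The parsing rules above can be illustrated with a short sketch. This is not SR Research's actual parser; the function name, the sample format (a list of (x, y) gaze positions in degrees, with None for a missing pupil), and the sample-by-sample labeling are assumptions made for illustration:

```python
# Illustrative sketch of the online parsing rules -- not SR Research's
# actual parser. Sample format and thresholds as described in the text.

def parse_samples(samples, rate_hz=500, vel_thresh=22.0, motion_thresh=0.1,
                  blink_min=3):
    """Label each raw sample as 'saccade', 'blink', or 'fixation'."""
    dt = 1.0 / rate_hz
    labels = []
    for i, s in enumerate(samples):
        if s is None:
            labels.append('missing')  # pupil not detected in this sample
        elif i > 0 and samples[i - 1] is not None:
            dx = s[0] - samples[i - 1][0]
            dy = s[1] - samples[i - 1][1]
            dist = (dx * dx + dy * dy) ** 0.5
            # Saccade: successive samples moving faster than 22 deg/s
            # over at least 0.1 deg; everything else is fixation.
            is_sacc = (dist / dt > vel_thresh) and (dist >= motion_thresh)
            labels.append('saccade' if is_sacc else 'fixation')
        else:
            labels.append('fixation')
    # Runs of 3+ missing-pupil samples are blinks; shorter dropouts are
    # folded into the surrounding fixation.
    out, i = [], 0
    while i < len(labels):
        if labels[i] == 'missing':
            j = i
            while j < len(labels) and labels[j] == 'missing':
                j += 1
            out.extend(['blink' if j - i >= blink_min else 'fixation'] * (j - i))
            i = j
        else:
            out.append(labels[i])
            i += 1
    return out
```

In practice, consecutive samples with the same label would then be merged into discrete fixation, saccade, and blink events with onset times and durations.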
Eye tracking procedures
Below, we detail the procedures for obtaining eye movement recordings for each participant.
- Consent. Prior to participant setup, participants are shown the eye tracker headband. The experimenter explains to the participants that head and eye positions are monitored through the cameras that are contained on the headband.
- For a given experimental paradigm, participants are seated a fixed distance from the monitor to maintain the same visual angle across participants.
- Eye tracker camera setup. The eye tracker helmet is adjusted so that the helmet is snug and unlikely to move, but not uncomfortable, around the head. The helmet is further adjusted so that the head camera can detect the infrared markers on the display monitor. Each of the eye cameras is situated on an individual rod that extends from the helmet and allows for adjustment in all axes. The eye cameras are positioned just under and slightly away from each eye without obstructing the participant's view of the display monitor (see Figure 1). The cameras are focused to get a clear and stable image of the pupil and corneal reflections. Status panels on the host computer indicate whether pupil and corneal reflections are being acquired. The illumination threshold can be adjusted to obtain the most stable recording of the pupil and corneal reflections. The experimenter then selects the appropriate experimental paradigm, and designates a filename for the ensuing eye movement recordings.
Figure 1. Example of a head-mounted video-based eye tracker (left) and the display and host PCs (right). Note that the display monitor in front of the participant shows the location of the pupil (blue dot with crosshair) and corneal reflection (yellow dot with crosshair) to aid in camera setup.
- Calibration. To obtain accurate recordings of the eye position on the display monitor, a calibration procedure is initiated following camera setup. During the calibration procedure, the participants are instructed to look at a series of targets that appear at various locations on the display. Typically, nine target locations are used, but calibration can be performed with as few as three target locations. From these recorded locations, eye position at any point on the screen can be interpolated.
- Validation. Following calibration, a validation procedure is used to check the accuracy of the eye movement recording. The same nine target locations are provided to the participant to fixate, and the difference is computed between the current fixation position and the previously recorded fixation position. If the recording accuracy for one (or both) eyes falls outside acceptable levels (average error < 0.5° and maximum error of any point < 1.0°), calibration and validation procedures are repeated. Following acceptance of the accuracy levels, the experimenter can determine whether eye movement recording should be monocular or binocular. In the case of monocular recording, the experimenter can determine from which eye data are to be recorded, or the host PC can automatically select the eye with the lower average error and the lower maximum error. While calibration and validation can occur with slight head movement, it is advantageous to have participants sit as still as possible. Participants are also instructed not to anticipate the upcoming locations of targets, but rather to move to, and fixate deliberately on, the targets only when they appear, and to stay fixated on each target until it disappears.
- Experimental Paradigm. Instructions particular to a given paradigm are presented to the participant (e.g., "Please freely view each of the following images."), and data recording is initiated for the session.
- Drift correction. Prior to the onset of each experimental stimulus, a drift correction can be performed by having the participants look at a central fixation target while pressing a button on a joypad or keyboard to accept the fixation position. When drift correction is employed, a trial will not begin until the participant successfully fixates within 2.0° of the center target. If the participant consistently fixates more than 2.0° from the center target, the participant must again undergo calibration and validation procedures. Any difference less than 2.0° between the recorded eye position and the initial fixation position obtained during the calibration/validation recordings is then noted as error and accounted for in the data.
- Stimulus presentation. Eye position, including fixation and saccade events, is recorded following drift correction procedures for each stimulus presentation. The experimenter can monitor the accuracy of the recordings online via inspection of the host computer.
- Following the eye movement recording session, measures that characterize the eye movement scanning patterns for different types of images (e.g., novel, repeated) can be derived with a variety of software packages, such as SR Research Ltd.'s Data Viewer. In addition, viewing to specific regions of a stimulus (e.g., unaltered, manipulated) can be characterized by analyzing eye movements with respect to experimenter-drawn regions of interest that are created for each image either at the paradigm programming phase or after data collection.
- Participants are debriefed regarding the purpose of the experiment at the conclusion of the session.
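The numeric criteria in the validation and drift-correction steps above can be sketched as follows. This is a minimal illustration under assumed data shapes; the function names are hypothetical and not part of the EyeLink software:

```python
import math

def validation_ok(targets, fixations, avg_limit=0.5, max_limit=1.0):
    """Accept the validation if average error < 0.5 deg and the maximum
    error of any point < 1.0 deg. Positions are (x, y) in degrees."""
    errors = [math.hypot(tx - fx, ty - fy)
              for (tx, ty), (fx, fy) in zip(targets, fixations)]
    return (sum(errors) / len(errors) < avg_limit
            and max(errors) < max_limit)

def drift_check(target, fixation, limit=2.0):
    """A trial starts only if the fixation falls within 2.0 deg of the
    central target; any smaller offset is recorded as drift error."""
    drift = math.hypot(target[0] - fixation[0], target[1] - fixation[1])
    return drift < limit, drift
```

If `validation_ok` fails for the eye(s) being recorded, calibration and validation are simply repeated; if `drift_check` repeatedly fails, the participant returns to calibration.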
Multiple measures can be derived from the eye movement recordings, including measures that describe overall viewing to the image (including the characteristics of each fixation/saccade), and measures that describe the pattern of viewing that has been directed to a particular region of interest within an image 3. Measures of overall viewing to an image may include (but are not limited to): the number of fixations and the number of saccades made to the image, the average duration of each fixation, and the total amount of viewing time that was spent fixating on the image. Measures that describe the pattern of viewing to a particular region of interest may include (but are not limited to): the number of fixations made to the region of interest, the amount of time spent within a region of interest, and the number of transitions made into/out of a region of interest. Further, measures may be derived from the eye movement recordings that outline the timing by which (i.e., how early) a particular eye movement event has occurred, such as when, following stimulus onset, the eyes fixate on a specific region of interest, when the first saccade is made on an image, and the entropy (constraint/randomness) inherent in the sequence of eye movement patterns.
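A minimal sketch of how such measures might be computed from a list of fixation events and a rectangular region of interest. The data format and function are assumptions for illustration; packages such as Data Viewer provide these measures directly:

```python
# Hypothetical derivation of overall and region-of-interest (ROI)
# viewing measures from parsed fixation events.

def viewing_measures(fixations, roi):
    """fixations: ordered list of dicts with 'x', 'y' (screen position)
    and 'dur' (duration, ms). roi: (left, top, right, bottom) rectangle."""
    def in_roi(f):
        left, top, right, bottom = roi
        return left <= f['x'] <= right and top <= f['y'] <= bottom

    total_dur = sum(f['dur'] for f in fixations)
    roi_fix = [f for f in fixations if in_roi(f)]
    # Transitions: each change between inside-ROI and outside-ROI across
    # successive fixations counts as one crossing of the ROI boundary.
    transitions = sum(1 for prev, nxt in zip(fixations, fixations[1:])
                      if in_roi(prev) != in_roi(nxt))
    return {
        'n_fixations': len(fixations),
        'mean_fix_dur': total_dur / len(fixations) if fixations else 0.0,
        'total_viewing_ms': total_dur,
        'n_roi_fixations': len(roi_fix),
        'roi_viewing_ms': sum(f['dur'] for f in roi_fix),
        'roi_transitions': transitions,
    }
```

Timing measures (e.g., latency of the first fixation into the ROI after stimulus onset) would be derived analogously from the fixation onset times.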
For any given image, eye movement measures can detail where the eyes were fixated, when and for how long. To obtain an index of memory, we can give viewers different types or amounts of exposure to distinct sets of images, and then compare viewing patterns across those sets and across participants. For instance, to probe memory for repetition of an image, scanning patterns can be contrasted between novel images and images that have been viewed multiple times throughout a testing session. Viewing images repeatedly throughout a testing session results in a decrease in overall viewing of the image 2, 4-5, 8. This can be seen in Figure 2. In this representative result from Riggs et al. 11, participants viewed novel pictures of faces once in each of five testing blocks; with increasing exposure, there was a decrease in the number of fixations that viewers made to the faces. To probe memory for particular details of an image, scanning patterns can be contrasted between images that have been repeatedly viewed in their original, unaltered form (repeated images) and images that have similarly been viewed repeatedly throughout a testing session, but in which a change was introduced to some element within the scene during the final exposure (manipulated images). In such cases, scanning patterns are attracted differentially to the region that has been altered within a manipulated image compared to the same, unaltered region of repeated images 2-3, 7-8. However, such eye movement indices of memory are not present in certain populations, such as when healthy older adults and patients with amnesia due to medial temporal lobe damage are assessed for their memory of the spatial relations among objects in scenes, as outlined in Figure 3 2, 8. Therefore, findings from eye movement monitoring can be used to contrast memory among groups of participants with differing neuropsychological status 2-3, 8-10.
Figure 2. Eye movements reveal memory for repetition. In this representative result from our laboratory, participants viewed faces across 5 study blocks; as the number of viewings increased (from 1 to 5), the number of fixations to the faces decreased.
Figure 3. Eye movements reveal memory for changed details. Younger adults [2 (Experiments 1, 2), 8 (Free Viewing Condition)] directed a greater proportion of their total eye fixations to a critical region in a manipulated image that had undergone a change from a prior viewing, compared to when the region had not undergone a change, as in novel and repeated images. Such effects of memory were absent in healthy older adults [8 (Free Viewing Condition)] and in amnesic patients.
Eye movement monitoring is an efficient, useful tool with which to assess memory function in a variety of populations. This protocol describes the use of a head-mounted video-based eye tracker, but the protocol can be easily adapted to the use of remote eye tracking devices, as remote eye trackers remove the need for helmet adjustment and simplify the camera adjustments. However, with a remote eye tracker, head movement must be constrained to maintain accuracy of eye recordings. Accurate calibration of the eye movements is paramount for obtaining useful, and interpretable, data.
Indices of memory obtained through eye movement monitoring obviate the need for acquiring explicit (i.e., verbal) reports of memory, which may be advantageous for rapid investigation of memory in populations with compromised communication skills. Eye tracking may also be used in concert with explicit reports to determine whether there is information that is maintained in memory but is not available for conscious introspection. Additionally, eye movement patterns can be probed to determine when the influence of memory induces a change in those patterns. Altogether, when compared to explicit reports, measures derived from eye movement monitoring provide more comprehensive detail regarding what is maintained in memory, and when it is accessed 2-3.
Comparing eye movement patterns across population groups provides insight into how the integrity of memory function may change with age, and/or altered neuropsychological status. Interrogating eye movement indices of memory in individuals with lesions to particular areas of the brain can reveal those neural regions that are critical for forming and maintaining particular kinds of information 2-3, 9-10. With further research that examines the reliability of obtaining indices of memory for individual participants with minimal trials outside the laboratory environment, eye tracking may become a useful methodology to monitor and validate memory in training environments, clinical settings and/or law-enforcement situations, such as in eyewitness identification procedures 12.
No conflicts of interest declared.
Acknowledgements

The authors wish to acknowledge their collaborators on studies that employ eye movement monitoring of memory, with particular thanks to Eyal Reingold, Jiye Shen, Neal J. Cohen, Robert R. Althoff, Deborah Hannula, and Dave Warren. This work has been supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Canadian Institutes of Health Research (CIHR), the Canada Research Chairs Program and the Canadian Foundation for Innovation (CFI).
| Equipment | Company | Model |
|---|---|---|
| Eye Tracker | SR Research Ltd. | EyeLink II |
| Experimental Control Software | SR Research Ltd. | Experiment Builder |
| Eye Movement Analysis Program | SR Research Ltd. | Data Viewer |
- Bradfield, A. L., Wells, G. L., Olsen, E. A. The damaging effect of confirming feedback on the relation between eyewitness certainty and identification accuracy. J. Appl. Psychol. 87, (1), 112-120 (2002).
- Ryan, J. D., Althoff, R. R., Whitlow, S., Cohen, N. J. Amnesia is a deficit in relational memory. Psych. Sci. 11, (6), 454-461 (2000).
- Ryan, J. D., Cohen, N. J. The nature of change detection and online representations of scenes. J. Exp. Psych. Hum. Percept. Perfor. 30, 988-1015 (2004).
- Althoff, R. R., Cohen, N. J. Eye-movement-based memory effect: a re-processing effect in face perception. J. Exp. Psychol. Learn Mem. Cog. 25, (4), 997-1010 (1999).
- Heisz, J. J., Shore, D. More efficient scanning for familiar faces. J. Vis. 8, (1), 1-10 (2008).
- Parker, R. E. Picture processing during recognition. J. Exp. Psych. Hum. Percept. Perfor. 4, 284-293 (1978).
- Hollingworth, A., Williams, C. C., Henderson, J. M. To see and remember: Visually specific information is retained in memory from previously attended objects in natural scenes. Psychon Bull. Rev. 8, 761-768 (2001).
- Ryan, J. D., Leung, G., Turk-Browne, N. B., Hasher, L. Assessment of age-related changes in inhibition and binding using eye movement monitoring. Psychol. Aging. 22, (2), 239-250 (2007).
- Hannula, D. E., Ryan, J. D., Tranel, D., Cohen, N. J. Rapid onset relational memory effects are evident in eye movement behavior, but not in hippocampal amnesia. J. Cog. Neuro. 19, (10), 1690-1705 (2007).
- Bate, S., Haslam, C., Tree, J. J., Hodgson, T. L. Evidence of an eye movement-based memory effect in congenital prosopagnosia. Cortex. 44, (7), 806-819 (2008).
- Riggs, L., McQuiggan, D., Chan, J., Anderson, A. K., Ryan, J. D. Emotion-modulated viewing of faces by association. Forthcoming.
- Kramer, A. F., McCarley, J. S. Oculomotor behaviour as a reflection of attention and memory processes: Neural mechanisms and applications to human factors. Theor. Issues Ergon. Sci. 4, (1), 21-55 (2003).
Formal Correction: Erratum: Eye Movement Monitoring of Memory
Posted by JoVE Editors on 09/16/2010.
A correction was made to Eye Movement Monitoring of Memory. There was an error in the author, Douglas A. McQuiggan's, name. The author's name has been corrected to:
Douglas A. McQuiggan