Gaze in Action: Head-mounted Eye Tracking of Children's Dynamic Visual Attention During Naturalistic Behavior
Summary (November 14th, 2018)
Young children do not passively observe the world, but rather actively explore and engage with their environment. This protocol provides guiding principles and practical recommendations for using head-mounted eye trackers to record infants' and toddlers' dynamic visual environments and visual attention in the context of natural behavior.
Transcript
This method can help researchers understand what the world looks like from a young child's perspective, as well as how children allocate their visual attention within that view. Compared with screen-based eye tracking, which is widely used in behavioral science, head-mounted eye tracking allows us to monitor where children look during everyday activities like toy play and picture book reading. Demonstrating the procedure will be graduate students Catalina Suarez-Rivera and Yayun Zhang, and lab manager Daniel Pearcy.
Before beginning an experiment, modify a system to work with a custom-made infant cap. Select a scene camera that is adjustable in position and has a wide enough angle to capture a field of view appropriate for addressing the research questions of interest. Select an eye camera that is adjustable in position and has an infrared LED placed so that the cornea of the child's eye will reflect its light.
The eye-tracking system should be as unobtrusive and lightweight as possible to provide the greatest chance that young children will tolerate wearing the equipment. To embed the system into the cap, attach the scene and eye cameras to a hook-and-loop strap that mates with a strip of hook-and-loop material sewn onto a child-sized cap, and position the cameras so that they stay out of the center of the child's view. For eye-tracking data collection, have two researchers present: one to interact with and occupy the child, and one to place and position the eye-tracking system.
Fully engage the child in an activity that occupies the child's hands, so that the child does not reach up to move or grab the eye-tracking system, and place the eye-tracking system onto the child's head. Position the scene camera low on the forehead to best approximate the child's field of view, and center the scene camera view on what the child will be looking at during the study. To obtain high-quality gaze data, position the eye camera to detect both the pupil and corneal reflection with no cheek or eyelash occlusion throughout the eye's full range of motion.
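Because the exact hardware differs across labs, it can be useful to log the setup used for each session alongside the video files. The following is a minimal sketch of such a session record; every field name and example value is an illustrative assumption, not a specification of any particular eye-tracking system.

```python
# Minimal, hypothetical record of the recording setup for one session.
# All fields and values are illustrative assumptions, not specifications
# of any particular head-mounted eye-tracking system.
from dataclasses import dataclass

@dataclass
class SessionSetup:
    scene_camera_fov_deg: float  # wide-angle lens approximating the child's field of view
    scene_camera_fps: int
    eye_camera_fps: int
    ir_led_present: bool         # infrared LED producing the corneal reflection
    cap_size: str                # child-sized cap the hook-and-loop strap is sewn onto
    notes: str = ""              # e.g., when the system was bumped or repositioned

setup = SessionSetup(
    scene_camera_fov_deg=90.0,
    scene_camera_fps=30,
    eye_camera_fps=30,
    ir_led_present=True,
    cap_size="12-24 months",
    notes="scene camera centered on the tabletop toys",
)
print(setup)
```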
The trickiest part of this protocol is placing the equipment on the child's head and adjusting the cameras without upsetting the child. Speed, confidence, and practice are essential. Once the scene and eye images are as high-quality as can be obtained, draw the child's attention to different locations in their field of view to collect calibration data.
Take care that the child's body position during the calibration matches the position that will be used during the study. When all the calibration points have been obtained, begin collecting the eye-tracking data, taking note of any points at which the eye-tracking system gets bumped or misaligned so that it can be recalibrated as necessary and the data before and after the misalignment can be coded separately.
To calibrate the eye-tracking data at the end of the study, open an appropriate calibration software program and adjust the thresholds of the various detection parameters within the software to obtain a good eye image. During the first round of calibration, identify calibration points at moments when the child is clearly looking at a distinct point in the scene image. These can be points intentionally created by the researcher during data collection, or points from within the study in which the point of gaze is easily identifiable, as long as the pupil is accurately detected for those frames. Create a series of calibration points to establish the mapping between the eye image and the scene image.
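The transcript does not name a specific calibration package, but the underlying computation is a mapping from pupil position in the eye image to gaze position in the scene image, estimated from the calibration points. The sketch below shows one common approach, a second-order polynomial fit by least squares; the function names, example coordinates, and the choice of polynomial are assumptions for illustration, not the procedure of any particular software.

```python
# A minimal sketch of eye-to-scene calibration: fit a second-order polynomial
# mapping pupil coordinates (eye camera) to gaze coordinates (scene camera)
# from the calibration points. Coordinates below are made up for illustration.
import numpy as np

def design_matrix(pupil_xy):
    """Constant, linear, interaction, and quadratic terms of pupil position."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_gaze_mapping(pupil_xy, scene_xy):
    """Least-squares fit of the mapping; needs at least six calibration points."""
    A = design_matrix(pupil_xy)
    wx, *_ = np.linalg.lstsq(A, scene_xy[:, 0], rcond=None)
    wy, *_ = np.linalg.lstsq(A, scene_xy[:, 1], rcond=None)
    return wx, wy

def map_gaze(pupil_xy, wx, wy):
    """Apply the fitted mapping to new pupil positions."""
    A = design_matrix(pupil_xy)
    return np.column_stack([A @ wx, A @ wy])

# Six calibration points is the minimum for this model; real calibrations
# typically use more, spread across the child's field of view.
pupil = np.array([[310, 240], [420, 250], [200, 235],
                  [315, 150], [305, 330], [430, 160]], dtype=float)
scene = np.array([[320, 240], [560, 250], [ 90, 235],
                  [330, 100], [310, 390], [580, 110]], dtype=float)
wx, wy = fit_gaze_mapping(pupil, scene)
print(map_gaze(pupil, wx, wy))  # should closely reproduce the scene points
```

Fitting separate mappings before and after any change in camera position, as recommended below, amounts to running a fit like this once per segment of the session.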
If the eye-tracking system changed position at any time during the study, create separate calibrations for the portions before and after the change in position. To code the regions of interest, compile a list of all the regions of interest that should be coded based on the research questions, and use the child's eye image, scene image, and point-of-gaze track to determine which region of interest is being visually attended. Scroll through frames one by one, watching for movements of the pupil within the eye image as the primary cue that the region of interest may have changed.
When a visible movement of the eye occurs, check whether the child was shifting the point of gaze to a new region of interest or to no defined region of interest. Although the regions of interest are coded separately for each frame, use frames before and after the frame being analyzed to gain contextual information that may aid in determining the correct region of interest.
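Once every frame carries a region-of-interest code, the per-frame stream is usually collapsed into "looks": runs of consecutive frames coded with the same region. The sketch below assumes the codes are stored as a simple list with one label per frame; the labels and the 30 fps frame rate are illustrative assumptions.

```python
# A minimal sketch that collapses per-frame ROI codes into looks,
# i.e. (onset_s, offset_s, roi) for each run of identically coded frames.
# Labels and frame rate are illustrative; None marks uncoded frames.
from itertools import groupby

def frames_to_looks(roi_per_frame, fps=30):
    looks, frame = [], 0
    for roi, run in groupby(roi_per_frame):
        n_frames = len(list(run))
        if roi is not None:
            looks.append((frame / fps, (frame + n_frames) / fps, roi))
        frame += n_frames
    return looks

coded = ["toy_a"] * 45 + [None] * 5 + ["parent_face"] * 12 + ["toy_b"] * 90
print(frames_to_looks(coded))
# three looks: toy_a for 1.5 s, parent_face for 0.4 s, toy_b for 3.0 s
```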
Here, sample region-of-interest streams for two 18-month-old children are shown. Each colored block represents continuous frames during which the child looked at a particular region of interest. Children showed individual differences in their selectivity for different subsets of toys, as evidenced by differences in the proportion of the interaction that each child spent looking at each of the toy regions of interest. Although the total proportion of time both children spent looking at all of the toys was somewhat similar, the proportions of time spent on individual toys varied greatly, both within and between subjects.
Moreover, how these proportions of looking time were achieved also differed, with Child Two's mean look duration almost double that of Child One. Another property demonstrated by these data is that both children rarely looked to the faces of their parents during the session, and that when they did, each look typically lasted less than one second.
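Summary measures like the ones reported here, the proportion of the session spent on each region of interest and the mean look duration, follow directly from the look segments. The sketch below assumes looks are stored as (onset, offset, label) tuples in seconds, as in the previous sketch; the numbers are made up for illustration.

```python
# A minimal sketch of per-ROI summary measures from look segments:
# proportion of the session, mean look duration, and number of looks.
from collections import defaultdict

def roi_summary(looks, session_duration_s):
    total_s, n_looks = defaultdict(float), defaultdict(int)
    for onset, offset, roi in looks:
        total_s[roi] += offset - onset
        n_looks[roi] += 1
    return {roi: {"proportion": total_s[roi] / session_duration_s,
                  "mean_look_s": total_s[roi] / n_looks[roi],
                  "n_looks": n_looks[roi]}
            for roi in total_s}

looks = [(0.0, 1.5, "toy_a"), (1.7, 2.1, "parent_face"), (2.1, 5.1, "toy_b")]
print(roi_summary(looks, session_duration_s=5.1))
```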
Researchers can place head-mounted eye trackers on children and their social partner simultaneously, as well as integrate this procedure with techniques such as motion tracking and heart rate monitoring, to provide high-density, multi-modal datasets for answering a variety of questions. The use of these techniques has transformed our understanding of many topics in the developmental literature, including joint and sustained attention, changing visual experiences with age and motor development, and the role of visual experiences in word learning. This protocol has been successfully employed with clinical populations, including children with cochlear implants and children diagnosed with autism spectrum disorders.