
Assessing the Multiple Dimensions of Engagement to Characterize Learning: A Neurophysiological Perspective

Published: July 1, 2015 doi: 10.3791/52627

Summary

This paper aims to describe the techniques involved in the collection and synchronization of the multiple dimensions (behavioral, affective and cognitive) of learners’ engagement during a task.

Abstract

In a recent theoretical synthesis on the concept of engagement, Fredricks, Blumenfeld and Paris1 defined engagement by its multiple dimensions: behavioral, emotional and cognitive. They observed that the individual types of engagement had not been studied in conjunction, and that little information was available about interactions or synergy between the dimensions; consequently, more studies would contribute to creating finely tuned teaching interventions. Benefiting from recent technological advances in the neurosciences, this paper presents a recently developed methodology to gather and synchronize data on multidimensional engagement during learning tasks. The technique involves the collection of (a) electroencephalography, (b) electrodermal, (c) eye-tracking, and (d) facial emotion recognition data on four different computers, which raises synchronization issues for data collected from multiple sources. Post-collection synchronization in specialized integration software gives researchers a better understanding of the dynamics between the multiple dimensions of engagement. For curriculum developers, these data could provide informed guidelines for achieving better instruction/learning efficiency. This technique also opens up possibilities in the field of brain-computer interactions, where adaptive learning or assessment environments could be developed.

Introduction

Engagement plays a crucial role in learning. For Clark and Mayer2, “all learning requires engagement,” regardless of delivery media. Zhang et al.3 also suggested that increased student engagement can improve learning outcomes, such as problem solving and critical thinking skills. Defining engagement remains a challenge. In their literature review, Fredricks, Blumenfeld and Paris1 defined engagement by its multifaceted nature: “Behavioural engagement draws on the idea of participation; it includes involvement in academic and social or extracurricular activities. (…) Emotional engagement encompasses positive and negative reactions to teachers, classmates, academics, and school and is presumed to create ties to an object and influence willingness to do the work. Finally, cognitive engagement draws on the idea of mental investment; it incorporates thoughtfulness and willingness to exert the effort necessary to comprehend complex ideas and master difficult skills.”

Fredricks, Blumenfeld and Paris1 also claimed that a focus on behavior, emotion, and cognition, within the concept of engagement, may provide a richer characterization of learning. These authors pointed out that a robust body of research addresses each component of engagement separately, but these components had not been studied in conjunction. They also observed that little information is available about interactions between the dimensions and that more studies could contribute to planning finely tuned teaching interventions. As a step in that direction, this paper describes a research methodology that was developed to gather and analyze quantitative and qualitative data, synchronously, on behavioral, emotional and cognitive engagement during learning tasks.

Bringing the Neurosciences into Education

Behavior, and consequently behavioral engagement, has long been the central focus of studies in education: research designs have focused mainly on changes in knowledge and behavior occurring over long periods of time, between pre- and post-tests separated by hours, weeks, months or years. Discriminating between behavioral, emotional, and cognitive engagement remains a challenge because the last two dimensions are not systematically observable externally. Cognition and emotions must either be inferred from observations or assessed with self-report measures. From an external point of view, it remains difficult to determine whether students are trying to get their work done as quickly as possible or using deep-level learning strategies to master a specific content. Indeed, Fredricks, Blumenfeld and Paris1 were unable to find any published studies using direct, objective measures of cognitive engagement.

Recent technological developments in the field of neurosciences have created new possibilities for research in education. New data collection methods and analysis algorithms developed in the field of neuroergonomics seem very promising for qualitative and quantitative studies during learning tasks. Other disciplines, such as economics, psychology, marketing, and ergonomics, have been using neurophysiological measurements to assess cognitive engagement for some time4-8. Neurophysiological measures, coupled with efficient analysis algorithms, allow one to study a phenomenon without disturbing it. By their nature, self-report questionnaires disengage students from learning. Neurophysiological measures allow research designs to be carried out in more authentic learning environments. These tools include equipment to monitor heart rate, breathing rate, blood pressure, body temperature, pupil diameter, electrodermal activity, electroencephalography (EEG), etc.

Gathering Synchronized Data on Behavioral, Emotional, and Cognitive Engagement

As representative outcomes following the use of this protocol, this paper will present partial results of a study in which learners had to solve, on a computer screen, ten problems in mechanical physics. These problems were developed in previous work9. Neurophysiological data were collected while the learners were solving the problems and relaxing during a 45 s break, with their eyes closed, after each problem.

As mentioned above, behavioral engagement data consist of software interactions (mouse movements and clicks), eye gaze, performance, and answers to questions produced by a learner interacting with the system while accomplishing the task1. An eye-tracking system was used to collect software interactions and eye gaze data. Performance data (time to solve a problem, correctness of answers) were collected on a survey website that was used to present the task. This website was also used to gather self-report data collected with a questionnaire adapted from Bradley and Lang10. Emotional engagement involves the characterization of emotions. According to Lang11, emotions are characterized in terms of valence (pleasant/unpleasant) and arousal (calm/aroused). Emotional engagement data were accordingly collected using automatic facial emotion recognition software, which quantifies emotional valence, and an electrodermal activity encoder/sensor for arousal12,13. Electrodermal activity (EDA) refers to the electrical resistance recorded between two electrodes when a very weak electrical current is steadily passed between them. Cacioppo, Tassinary and Berntson14 showed that the recorded resistance varies with the subject’s arousal. Thus, psychophysiological data such as valence and arousal are considered correlates of emotional engagement.

Finally, cognitive engagement data are collected through electroencephalography (EEG). EEG measures, on the scalp, the synchronized electrical activity of groups of neurons in the brain. Electrical signals recorded from the scalp are often oscillatory and composed of frequency components. By convention, these frequencies are grouped in ranges, known as bands; the alpha, beta and theta bands are the focus of this study. According to neuroscientific studies14, these bands reflect different cognitive processing abilities in specific areas of the brain. Thus, the analysis of the power spectral density (PSD) of specific frequencies, combined with numerous studies7,15 on alertness and attention, allows researchers to quantify cognitive engagement during a task. As Mikulka et al.16 noted, research has shown a direct relationship between beta activity and cognitive alertness, and an inverse relationship between alpha and theta activity and alertness. Thus, Pope, Bogart and Bartolome7 developed an engagement index based on the PSD of three bands: beta / (alpha + theta). This ratio was validated in other studies on engagement16,17,18. To characterize cognitive engagement over time, a fast Fourier transform (FFT) converts the EEG signal from each active site (F3, F4, O1, O2) into a power spectrum. The EEG engagement index at time T is computed as the average of the engagement ratios within a 20 sec sliding window preceding time T. This procedure is repeated every second, with a new sliding window used to update the index.
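This computation can be summarized in a few lines of MATLAB. The sketch below is illustrative rather than the authors' actual script16: it assumes a preprocessed signal matrix eeg (channels x samples) from the four active sites, the 250 Hz sampling rate given in the Materials list, and conventional band limits (theta 4-8 Hz, alpha 8-13 Hz, beta 13-30 Hz), which are not specified in the text.

    % Per-second beta/(alpha + theta) ratio on 1 sec FFT epochs, then a
    % trailing 20 sec average, per the description above. 'eeg' is assumed
    % to hold preprocessed data from F3, F4, O1, O2 (channels x samples).
    fs = 250;                                    % sampling rate (Hz)
    epoch = fs;                                  % 1 sec epochs -> 1 Hz resolution
    nEpochs = floor(size(eeg, 2) / epoch);
    f = (0:epoch-1) * fs / epoch;                % frequency axis of the FFT
    inBand = @(lo, hi) f >= lo & f < hi;         % assumed band limits
    ratio = zeros(1, nEpochs);
    for k = 1:nEpochs
        seg = eeg(:, (k-1)*epoch + (1:epoch));
        psd = abs(fft(seg, [], 2)).^2 / epoch;   % power spectrum per channel
        beta  = mean(sum(psd(:, inBand(13, 30)), 2));
        alpha = mean(sum(psd(:, inBand(8, 13)), 2));
        theta = mean(sum(psd(:, inBand(4, 8)), 2));
        ratio(k) = beta / (alpha + theta);       % engagement ratio for this epoch
    end
    % Index at second T = mean of the ratios over the 20 preceding epochs.
    index = movmean(ratio, [19 0]);              % trailing 20 sec sliding window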

Since the aim of this methodology is to provide a rich analysis of the multiple dimensions of engagement, data synchronization is crucial. As Léger et al.19 remind readers, equipment manufacturers strongly recommend using only one computer per measurement tool to guarantee their specified precision level. Thus, when multiple computers are employed, synchronization between recording computers becomes a critical step. The recordings cannot all be started at the exact same time, and each data stream has its specific time frame (e.g., sec 0 of eye tracking ≠ sec 0 of EEG or physiological data). This is extremely important: desynchronization between data streams means errors in the quantification of each dimension of engagement. There are different ways of synchronizing concurrent physiological and behavioral recordings. These methods may be divided into two main approaches: direct and indirect20. The protocol presented in the next section is based on an indirect approach in which an external device, a syncbox, sends transistor-transistor logic (TTL) signals to all the recording equipment (as shown in Figure 1). As each piece of equipment has a different start time, the TTL markers are recorded in the log files with a relative delay. The markers are then used to realign the signals and thus ensure proper synchronization after each recording. A behavioral analysis software program that allows external file integration is used to re-synchronize the timeline of each data stream and to perform quantitative and qualitative analysis of each dimension of engagement.

Figure 1. Architecture of the Data Collection System. The lab environment in which behavioral (eye-tracking), emotional (EDA and facial emotion) and cognitive (EEG) engagement data are collected contains many computers. This raises a synchronization challenge for data that are referenced on their respective computer clocks. To be able to analyze all data in the same reference time, the lab setup involves a syncbox that sends TTL signals to all data streams.
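As an illustration of this indirect approach, the sketch below (with hypothetical marker times and variable names, not the authors' code) shows how TTL markers recorded in two log files are used to realign their timelines; since the syncbox emits a marker every 60 sec (see step 1.2.2), later markers can also verify that no clock drift has accumulated.

    % Hypothetical TTL marker times (sec), as logged on each stream's own clock.
    eegMarkers  = [12.40 72.40 132.40];   % markers recorded in the EEG log
    gazeMarkers = [3.15 63.15 123.15];    % the same markers in the eye-tracking log
    % The relative delay between the first common markers realigns the streams.
    offset = eegMarkers(1) - gazeMarkers(1);
    toGazeTime = @(tEEG) tEEG - offset;   % re-express EEG times on the gaze timeline
    exampleEvent = 40.00;                 % an EEG event time (sec), invented
    onGazeClock = toGazeTime(exampleEvent);
    % With markers every 60 sec, residuals near zero confirm the alignment.
    drift = (eegMarkers - offset) - gazeMarkers;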

To evaluate the precision of the methodology in terms of synchronization, 45 sec pauses were introduced before each of the mechanical physics problems. During these pauses, subjects had to relax and close their eyes. As seen in other studies4,9,16,17,18, these pauses should induce marked variations in the collected signals: the pupil dots in the eye-tracking recording immediately disappear (behavioral engagement) and an immediate drop in cognitive engagement (EEG signal) is observed. These specific components of the signal are used to evaluate the general validity of the synchronization. The recent publication of papers that fully or partially rely on this synchronization procedure, in the fields of information systems19, human-machine interactions21 and education9,22, provides evidence of its effectiveness.
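In the same spirit, a simple check (sketched below with hypothetical pause labels, reusing the per-second index from the earlier sketch) is to compare the mean engagement index during task seconds against pause seconds; a clear drop during the eyes-closed pauses supports the validity of the alignment.

    % 'index' is the per-second engagement index computed above; the pause
    % window (seconds 100-145 here) is invented and would come from the task log.
    pauseMask = false(size(index));
    pauseMask(100:145) = true;            % one 45 sec eyes-closed pause
    fprintf('mean index: task %.2f, pause %.2f\n', ...
        mean(index(~pauseMask)), mean(index(pauseMask)));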

Protocol

This protocol received an ethical certificate from the Comité institutionnel de la recherche avec des êtres humains (CIER) de l’Université du Québec à Montréal (UQAM), endorsed by HEC-Montreal for the Tech3Lab research facility. The protocol describes each of the specific steps performed with our lab environment and equipment. Although precise software paths are provided to clarify the methodology, the technique is transferable and can be replicated with other proprietary eye-tracking, automatic facial emotion recognition, electrodermal activity and electroencephalography equipment and software.

1. Setup of the Lab Environment

  1. Turn on the eye-tracker, the EEG amplifier, the four recording computers and the speakers.
  2. Prepare the setup of the recording equipment:
    1. Prepare the EEG setup with required material according to the manufacturer’s recommended procedures. Prepare the EEG software for the upcoming participant. Start the eye-tracking software and create a new participant profile in the software. Start the video recording software and the cameras.
    2. Start the synchronization software with the specific subroutine created for the project with markers at 60 sec. Start the physiological measurement software (to record electrodermal activity) and open the specific layout created for the project. Adjust the participant’s chair to the highest level.

2. Participant Preparation

  1. Ask the participant to read and sign the ethical consent form.
  2. Carry out skull measurements for EEG:
    1. Find the Cz location on the participant’s head (according to the international 10-20 reference system). Immerse the EEG net in the saline solution (potassium chloride) (see step 1.2.1) and start a timer (10 min) in accordance with the manufacturer’s standards.
  3. Read the objective of the study and the steps in the experiment to the participant, “The objective of this study is to observe your brain activity while you answer physics problems. First we will install the sensors, then you will be asked to solve 10 Newtonian physics problems on the computer. We will ask you to take a 45-sec break after each problem with your eyes closed. After each problem, you will be asked to rate your assessment of the problem.”
  4. Tell the subject that the total duration of the experiment will be 90 min.
  5. Install the physiological sensors, according to the manufacturer’s recommendations: two pre-gelled sensors on the top of the left hand.
  6. Install the EEG cap, according to the manufacturer’s recommendations and perform an impedance check with a threshold at 40 kΩ (according to the manufacturer’s specifications).

3. Data Collection

  1. Make sure that all the recording software is ready to be started in synchrony:
    1. Physiology (EDA data): Click the “start” button.
    2. Video recording: Click the “open” button.
    3. Eye-tracking: Click the “on hold” button.
    4. EEG: Click the “record” button.
    5. Synchronization software: Click the “green circle” button.
  2. Eye gaze calibration:
    1. Perform a five-point on-screen calibration and observe the participant while he/she follows the red dots (click “Tools/Settings/Calibration …”). Repeat this procedure until sufficient accuracy is achieved, according to the manufacturer’s standards.
  3. Project task instructions onto the participant’s screen: ask if he/she has any questions after reading them, and if he/she is ready to start the experiment.
  4. Ask the participant to solve 10 Newtonian physics problems.
  5. If needed, perform an impedance check during one of the 45 s breaks (not before problem 5).
  6. Make sure the participant takes the full 45 s break before each problem (to determine the baseline).

4. End of Data Collection

  1. Stop data acquisition on all computers and remove the sensors from the participant.

5. After the Participant Has Left

  1. Clean the EEG cap with germicide and tidy up the equipment, according to the manufacturer’s recommendations. Save all the data files collected and create a backup on the FTP server.
  2. Fill in the participant spreadsheet: note any particular event or problem during data collection. Erase all cookies from the web browser.

6. Data Pre-processing and Export to the Integration Software

  1. EEG
    1. Import EEG data into EEG data analysis software:
      1. Create three empty folders on the computer, named “Raw data”, “History” and “Export”, and paste the raw EEG data into the newly created “Raw data” folder.
      2. In the EEG data analysis software, click “File/New Project …” and choose the raw data location by clicking Browse, then selecting the newly created “Raw data” folder. Choose the location of the “History” and “Export” folders in the same way.
      3. Click “OK”. (The window should contain all the participant’s EEG data).
    2. Pre-process the brain signal:
      1. Apply a filter and a notch (click “Transformations/IIR filters…”). In the window, enable the low cutoff at 1.5 Hz with a slope of 12 dB/octave and the high cutoff at 50 Hz with a slope of 12 dB/octave. Also enable a notch at the 60 Hz frequency.
      2. Because a DC amplifier is used, DC detrend the signal (click “Transformations/DC Detrend…” and enable “based on time” at 100 msec before marker and 100 msec before DC correction).
      3. Perform a raw data inspection (click “Transformation/Raw data inspection…” and select semi-automatic artifact removal). Select the following criteria: maximal voltage step: 60 µV/msec; max-min: 200 µV in a 200 msec interval; amplitude: -400 to +400 µV.
      4. Perform an automatic ICA with classic sphering for eye blink removal (myographic artifacts do not need to be removed because their range lies outside the frequencies of interest). (Click “Transformations/ICA…”. At the end of the ICA, apply the inverse ICA.)
      5. Re-reference (“Transformations/Re-reference…”) the signal and select “common average”.
      6. Export (click “Export/Generic Data Export…”) the signal and markers in text format (Select the “.vhdr” box) for an eventual Matlab construction of the engagement index. Also select the “Write header file” and “Write marker file” boxes.
    3. Import the signal into Matlab.
      1. Start Matlab and type “eeglab” so that the EEGLab GUI appears, and import the data for one participant at a time. In the GUI, select the menu item “File/Import Data/Using EEGLab functions and plugins/From Brain Vis. Rec. .vhdr file”.
      2. In the command window, paste a script16 that generates an engagement index.  
        NOTE: The cognitive engagement index is computed as the average of the beta/(alpha + theta) ratios within a 20 sec sliding window preceding time T. This procedure is repeated every second, and a new sliding window is used to update the index.
    4. In MS Excel, open the text file of the engagement index generated at the end of the script by Matlab and apply a z-score normalization to the EEG data to allow intersubject comparison. (For each value, compute this formula in Excel: Z = (value – overall mean)/overall standard deviation; see also the sketch at the end of this section.)
    5. Save the z-score engagement index signal in a CSV file in MS Excel. (Click File/Save as … and select CSV in the format type.)
    6. Repeat the procedure (from step 6.1.2.2.) for each participant.
  2. Physiology:
    1. Import EDA data in physiological data analysis software.
    2. Apply these parameters to pre-process the physiological signal:
      1. Apply a logarithmic transformation to normalize the distribution of the conductance as per Venables and Christie’s23 method.
      2. Smooth the signal with a 10 sec sliding window24 (see the sketch at the end of this section).
    3. Within the physiological software, compute a z-score normalization on the EDA data to allow intersubject comparison. (Z=(value – overall mean)/overall standard deviation).
      1. Highlight all the data with the cursor from the EDA channel.
      2. In the top menu, select the EDA channel, and select “mean” to obtain the mean value of the overall channel. Also select the EDA channel and “stddev” to obtain the standard deviation value of the overall channel.
      3. To compute the z-score equation, click “Transformation/Waveform Math …” and select the EDA channel in Source 1. Select “–” (minus) in the mathematical operation window and select K in Source 2. Select “New destination” in the destination menu and enter the mean value of the EDA channel (see step 6.2.3.2). Select “Transform entire wave” and click OK. Then click “Transformation/Waveform Math …” again: select the EDA-K channel in Source 1, select “/” (divide) in the mathematical operation window, select K in Source 2, select “New destination” in the destination menu and enter the standard deviation value of the EDA channel (step 6.2.3.2). Select “Transform entire wave” and click OK.
    4. Export the signal (arousal) in a CSV file. (Click File/Save as … and select CSV in the format type.)
  3. Automatic facial emotion recognition:
    1. Import video data from the media recorder into the automatic facial emotion recognition software. (Click “File/New…/Participant…”. After selecting the new participant in the project menu by clicking on it, click “File/New/Analysis/Video…”. Click the magnifying glass next to Analysis 1 and choose the desired video file.)
      1. Select an offline analysis for “every third frame” and activate “continuous calibration”.
      2. Export valence data in a CSV file. (Click “Options/Settings/Logging…”, check the “Write valence value to the log file” box. Click “File/Export…”, choose the location where the log files will be exported, and check the “Save detailed log” box.)
      3. Open the CSV file in MS Excel. Copy the valence data column into a single column in the SPSS software. Click “Analyze/Descriptive Statistics/Descriptives…” and select the just-pasted variable name. Check the “Save standardized values as variables” box. A column of z-scores will appear. Copy-paste these z-scores over the old data in the Excel file.
    2. Save the Excel file with the z-scores of the signal (valence) in CSV format.
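For reference, the normalization steps above (6.1.4, 6.2.2 and 6.2.3) can also be scripted. The following is a minimal MATLAB alternative to the Excel/GUI route, under stated assumptions: eda is a raw skin-conductance vector sampled at an assumed 1,000 Hz, index is the per-second engagement index from step 6.1.3, and the log(x + 1) variant of the Venables and Christie23 transformation is used.

    % Assumed inputs: 'eda' (raw conductance vector) and 'index' (per-second
    % EEG engagement index). Sampling rate and transform variant are assumptions.
    edaFs = 1000;                              % assumed EDA sampling rate (Hz)
    edaLog = log(eda + 1);                     % normalize the conductance distribution
    edaSmooth = movmean(edaLog, 10 * edaFs);   % 10 sec sliding-window smoothing
    zEda = (edaSmooth - mean(edaSmooth)) / std(edaSmooth);   % z-score (step 6.2.3)
    zIndex = (index - mean(index)) / std(index);             % z-score (step 6.1.4)
    % Save one z-scored value per row for import into the integration software.
    csvwrite('eda_z.csv', zEda(:));
    csvwrite('eeg_index_z.csv', zIndex(:));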

7. Data Integration and Synchronization

  1. In behavioral analysis software:
    1. Import eye-tracking videos (behavioral engagement). (Click “File/Import/Video in a New Observation…”. Name the new observation and choose the desired video file.)
    2. Code each video with pertinent behaviors and contextual events (time markers, right/wrong answers).
    3. Import all external data with the appropriate header: z-score of EEG signal (cognitive engagement), z-score of EDA signal (emotional engagement), z-score of valence data (emotional engagement). (Click “File/Import/External Data…”. Select the appropriate file type and select the correct CSV file.)
  2. Synchronize time between computers according to these formulas (a worked example follows this section):
    1. Time in eye gaze from time in EEG = Time in eye gaze + second marker in EEG – first marker in eye gaze.
    2. Time in eye gaze from time in facial emotion recognition = Time in eye gaze + first marker in facial emotion – first marker in eye gaze.
    3. Time in eye gaze from time in electrodermal activity = Time in eye gaze + first marker in electrodermal activity – first marker in eye gaze.
  3. Enter the offset data by pressing “Ctrl + Shift + =” to open the Offset menu. Select “Numerical offset” and enter the time in seconds between each pair of data sources, according to the calculations above.
  4. Generate a report according to the variables of interest in the study.
    1. Select the variables of interest that will be generated in the report (click “Analyze/Select Data/New Profile Data…”). From the left, slide the desired variables between the “Start” box and the “Results” box, on the right.
    2. Generate the report. (Click “Analyze/Numerical Analysis/New…”, click “Statistics” and check the mean box in the external data menu. Finish by clicking “Calculate”.)
  5. Export the data into statistical analysis software and perform analysis according to the study objectives.
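As a worked example of step 7.2, the sketch below computes the three offsets from hypothetical marker times (all values in seconds are invented for illustration; the eye-gaze video serves as the reference timeline in the behavioral analysis software):

    % Hypothetical TTL marker times (sec) read from each recording's log file.
    firstMarkerGaze = 3.15;    % first marker in the eye-tracking log
    secondMarkerEEG = 72.40;   % second marker in the EEG log (step 7.2.1)
    firstMarkerFace = 12.40;   % first marker in the facial-emotion log
    firstMarkerEDA  = 12.38;   % first marker in the electrodermal log
    % Offsets to enter as "Numerical offset" in step 7.3.
    offsetEEG  = secondMarkerEEG - firstMarkerGaze;
    offsetFace = firstMarkerFace - firstMarkerGaze;
    offsetEDA  = firstMarkerEDA  - firstMarkerGaze;
    fprintf('EEG: %.2f  Face: %.2f  EDA: %.2f (sec)\n', offsetEEG, offsetFace, offsetEDA);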

Representative Results

Figures 2 and 3 show screenshots of the results of the integration and synchronization of behavioral, emotional and cognitive engagement data in a behavioral analysis software application. In both figures, the left-hand section organizes the research subjects and the coding scheme. In the middle section, a video (with red dots) shows the subject’s eye gaze during the task. The subject’s behavioral engagement can be inferred from what he/she is looking at during the task and what actions are taken. In the lower section, a time marker scrolls synchronously through three tracks of data: EDA (arousal) and facial-emotion valence for emotional engagement, and the EEG engagement index for cognitive engagement. When data have been collected from all the subjects, the software also provides basic descriptive statistics that can be used to perform intersubject analysis in other statistical analysis software.

Figure 2. Multidimensional Engagement Data at the Beginning of a Problem-solving Task. A screenshot of a subject at the beginning of a problem-solving phase. The learner is reading the introduction to the problem: the eye gaze is on the third line. At this time (the red line represents a time cursor), the subject’s arousal has just passed a peak of anticipation of the problem to be solved but is still high compared to baseline, emotional valence seems neutral, and EEG cognitive engagement seems at its maximum.

Figure 3. Multidimensional Engagement Data During a Pause in the Task. Data from a pause before a problem-solving task. This pause is useful to establish the subject’s baseline just before the task. Here, because the subject’s eyes are closed, the valence data are not available. Cognitive engagement (EEG signal) is rising slightly from its minimum: the subject is slowly re-engaging with the task, anticipating the end of the pause. Arousal (EDA signal) is steadily declining.

Discussion

In terms of critical steps within the protocol, it should first be pointed out that data quality is always the main focus for neurophysiological collection techniques. In this methodology, research assistants must pay special attention to instructing the subjects to minimize head movements that will interfere with valence monitoring (losing correct face angle for the camera) or generate myographic artifacts in the EEG. On the other hand, a balance must be maintained between the authenticity of real problem solving and interventions made for more ergonomic data collection. It is also important to note that EEG data collection is subject to electromagnetic fluctuations in the environment. Traditional EEG facilities try to isolate their apparatus from electromagnetic fluctuations with Faraday cages. However, because some of the equipment used in this methodology would generate electromagnetic fluctuations (mainly the eye-tracking device) inside the Faraday cage, this approach would be ineffective. We overcome the electromagnetic issues by paying particular attention to grounding and shielding all electrical devices.

As for modifications and troubleshooting with the technique, the initial synchronization strategy relied on the synchronization software’s capacity to precisely “start” data collection on multiple computers and programs together. Because critical and inconsistent delays between computers and programs were observed, post-collection resynchronization became necessary. Consequently, a syncbox device was added to the architecture. The syncbox sends a TTL marker to all the computers and programs that collect data. Synchronization becomes a matter of calculating the delay between the first syncbox markers.

One limitation of the technique that should be mentioned is the temporal precision of the signal analysis, which is bounded by the cognitive engagement index. Because of the basic assumptions of the FFT, this index is generated on a 1 sec epoch basis: the cognitive engagement script produces one value per second. In this paradigm, which focuses on authentic problem solving, this timeframe is acceptable, but finer-grained studies of engagement might find it limiting.

With respect to existing/alternative methods, it must be noted that emotional valence can also be derived from blood volume pressure18,25 sensors. This technique could also be integrated into future research to evaluate its accuracy compared to the valence signal from facial emotion recognition software. We should also mention that the cognitive engagement index used in this study is a well-known one that has been used in previously published research. Some manufacturers of lightweight EEG devices claim to provide a similar measure, but it is difficult to assess the quality of the raw and processed data since their algorithms are unpublished.

Finally, this technique presents many possible applications in different fields. Of course, it will be of value in the field of education. Among other possibilities, this engagement assessment technique could be a powerful tool to inform course designers. For example, as Martens, Gulikers and Bastiaens26 observed, “quite often, developers tend to add multimedia add-ons, simulations, and so on, mainly because technology makes it possible, even though they are not based on careful educational analysis and design.” Thus, neurophysiological data could inform designers whether a specific add-on is valuable, whether the content is too complex, whether the proposed learning strategies are efficient, etc. In addition, real-time assessment of learner engagement opens up possibilities for adaptive e-learning or e-assessment environments. We can foresee a learner, wearing a lightweight EEG helmet, being warned by the system when his/her engagement level is declining and, for example, prompted to pause or react accordingly. It would also be possible to develop adaptive assessment tasks based on engagement indexes. A fair amount of research and development is currently being conducted in the innovative field of brain-computer interfaces (BCI).

Disclosures

The authors have nothing to disclose.

Acknowledgments

The authors acknowledge the financial support of the Social Sciences and Humanities Research Council of Canada (SSHRC), the Natural Sciences and Engineering Research Council of Canada (NSERC), the Fonds de Recherche Nature et Technologies du Québec (FQRNT) and the Fonds de Recherche sur la Société et Culture du Québec (FQRSC).

Materials

Name (Company, Catalog Number): Comments

EGI GSN-32 (EGI): Dense array EEG sensor net.
Netstation v.5.0 (EGI): EEG data collection software. EEG is collected with a 32-electrode dense array electroencephalography (dEEG) geodesic sensor net using Netstation acquisition software and EGI amplifiers (Electrical Geodesics, Inc.). The vertex (recording site Cz) is the reference electrode for recording. Impedance is kept below 50 kΩ with a sampling rate of 250 Hz.
Facereader v.4 (Noldus): Facial emotion recognition software.
Syncbox (Noldus): Starts the co-registration of EEG and gaze data by sending a transistor-transistor logic (TTL) signal to the EGI amplifier and a keystroke signal to Tobii Studio v.3.2.
Logitech C600 (Logitech, 960-000396): Webcam used to gather video data sent to MediaRecorder and analyzed in Facereader.
The Observer XT (Noldus): Integration and synchronization software. The Observer XT (Noldus Information Technology) is used to synchronize all behavioral, emotional and cognitive engagement data.
On-screen LED illumination (Noldus): Neon light positioned on the computer screen to correctly light the subjects' faces.
MediaRecorder (Noldus): Video data collection software.
Tobii 60X (Tobii): Eye tracker used to record subjects' eye-movement patterns at 60 Hz during the experiment.
Tobii Studio v.3.2 (Tobii): Eye-tracking data collection and analysis software.
Analyzer 2 (Brainvision): EEG signal processing software.
Acqknowledge v.4.0 (Biopac, ACK100M): Physiological signal acquisition and processing software.
Control III germicide solution (Maril Products, 10002REVA-20002-1): Disinfectant solution used with EEG nets, recommended by EGI.
Unipark (QuestBack AG): Online survey environment.

References

  1. Fredricks, J. A., Blumenfeld, P. C., Paris, A. H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 74 (1), 59-109 (2004).
  2. Clark, R. C., Mayer, R. E. E-learning and the Science of Instruction. Pfeiffer. San Francisco. (2011).
  3. Zhang, D., Zhou, L., Briggs, R. O., Nunamaker, J. F. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Inform. Manage. 43 (1), 15-27 (2006).
  4. Freeman, F. G., Mikulka, P. J., Prinzel, L. J., Scerbo, M. W. Evaluation of an adaptive automation system using three EEG indices with a visual tracking task. Biol. Psychol. 50, 61-76 (1999).
  5. Glimcher, P., Rustichini, A. Neuroeconomics: The consilience of brain and decision. Science. 306 (5695), 447-452 (2004).
  6. Lieberman, M. D. Social Cognitive neuroscience: A review of core processes. Annu. Rev. Physiol. 58, 259-289 (2007).
  7. Pope, A. T., Bogart, E. H., Bartolome, D. S. Biocybernetic system evaluates indices of operator engagement in automated task. Biol. Psychol. 40 (1-2), 187-195 (1995).
  8. Mandryk, R., Inkpen, K. Physiological indicators for the evaluation of co-located collaborative play. CSCW '04: Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, 2004 Nov 6-10, Chicago, IL, USA. (2004).
  9. Allaire-Duquette, G., Charland, P., Riopel, M. At the very root of the development of interest: Using human body contexts to improve women’s emotional engagement in introductory physics. Eur. J. Phy. Ed. 5 (2), 31-48 (2014).
  10. Bradley, M. M., Lang, P. J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psy. 25 (1), 49-59 (1994).
  11. Lang, P. J. The emotion probe: Studies of motivation and attention. Am. Psychol. 50 (5), 372-385 (1995).
  12. Ekman, P., Friesen, W. V. Felt, false, and miserable smiles. J. Nonverbal Behav. 6 (4), 238-252 (1982).
  13. Van Kuilenburg, H., Den Uyl, M. J., Israël, M. L., Ivan, P. Advances in face and gesture analysis. Proceedings of Measuring Behavior 2008, 2008 Aug 26-29, Maastricht, The Netherlands. 371-372 (2008).
  14. Cacioppo, J., Tassinary, L. G., Berntson, G. G. Handbook of Psychophysiology. , Cambridge University Press. Cambridge, UK. (2007).
  15. Lubar, J. F., Swartwood, M. O., Swartwood, J. N., O’Donnell, P. H. Evaluation of the effectiveness of EEG neurofeedback training for ADHD in a clinical setting as measured by changes in T.O.V.A. scores, behavioral ratings, and WISC-R performance. Biofeedback Self-reg. 20 (1), (1995).
  16. Mikulka, P. J., Freeman, F. G., Scerbo, M. W. Effects of a biocybernetic system on the vigilance decrement. Hum. factors. 44 (4), 654-664 (2002).
  17. Freeman, F. G., Mikulka, P. J., Scerbo, M. W., Scott, L. An evaluation of an adaptive automation system using a cognitive vigilance task. Biol. Psychol. 67 (3), 283-297 (2004).
  18. Chaouachi, M., Chalfoun, P., Jraidi, I., Frasson, C. Affect and mental engagement: Toward adaptability for intelligent systems. Proceedings of the Twenty-Third International Florida Artificial Intelligence Research Society Conference (FLAIRS 2010), Association for the Advancement of Artificial Intelligence. , 355-360 (2010).
  19. Léger, P. M., Sénécal, S., Courtemanche, F., Ortiz de Guinea, A., Titah, R., Fredette, M., Labonté-LeMoyne, É. Precision is in the eye of the beholder: Application of eye fixation-related potentials to information systems research. J. Assoc. Inf. Syst. 15 (10), 651-678 (2014).
  20. Courtemanche, F., Ortiz de Guinea, A., Titah, R., Fredette, M., Labonté-Lemoyne, E. Applying eye fixation-related potentials to information systems research: Demonstration of the method during natural IS use and guidelines for research. J. Assoc. Inf. Syst. 15 (10), (2014).
  21. Courtemanche, F. Un outil d’évaluation neurocognitive des interactions humain-machine. Doctoral thesis, Université de Montréal. Montreal, QC. (2014).
  22. Charland, P., Léger, P. M., Mercier, J., Skelling-Desmeules, Y. Assessing multiple dimensions of learner engagement during science problem solving using psychophysiological and behavioral measures. Fourth Scientific International Symposium of the Association for Research in Neuroeducation, Caen, France. (2014).
  23. Venables, P. H., Christie, M. J. Electrodermal activity. Martin, I., Venables, P. (eds.) Techniques in Psychophysiology. Wiley. Chichester, UK. 3-67 (1980).
  24. Boucsein, W. Electrodermal Activity. 2nd ed. Springer. New York. (2012).
  25. Sarlo, M., Palomba, D., Buodo, G. M., Minghetti, R., Stegagno, L. Blood pressure changes highlight gender differences in emotional reactivity to arousing pictures. Biol. Psychol. 70 (3), 188-196 (2005).
  26. Martens, R. L., Gulikers, J., Bastiaens, T. The impact of intrinsic motivation on e-learning in authentic computer tasks. J. Comput. Assist. Lear. 20 (5), 368-376 (2004).

Tags

Engagement, Multiple Dimensions, Behavioral Engagement, Emotional Engagement, Cognitive Engagement, Neurophysiological Perspective, Theoretical Synthesis, Teaching Interventions, Technological Advances, Neurosciences, Methodology, Data Collection, Electroencephalography, Electrodermal Data, Eye-tracking Data, Facial Emotion Recognition Data, Synchronization Issues, Integration Software, Curriculum Developers, Instruction/learning Efficiency, Brain-computer Interactions, Adaptive Learning, Assessment Environments

Cite this Article

Charland, P., Léger, P. M., Sénécal, S., Courtemanche, F., Mercier, J., Skelling, Y., Labonté-Lemoyne, E. Assessing the Multiple Dimensions of Engagement to Characterize Learning: A Neurophysiological Perspective. J. Vis. Exp. (101), e52627, doi:10.3791/52627 (2015).
