Articles by Jesus G. Cruz-Garza in JoVE
A Novel Experimental and Analytical Approach to the Multimodal Neural Decoding of Intent During Social Interaction in Freely-behaving Human Infants

Jesus G. Cruz-Garza (1), Zachery R. Hernandez (1), Teresa Tse (1,2,3), Eunice Caducoy (1,3), Berdakh Abibullaev (1), Jose L. Contreras-Vidal (1,2)

(1) Laboratory for Noninvasive Brain-Machine Interface Systems, Department of Electrical and Computer Engineering, University of Houston; (2) Department of Biomedical Engineering, University of Houston; (3) Department of Biology and Biochemistry, University of Houston

This protocol presents a novel methodology for the neural decoding of intent from freely behaving infants during unscripted social interaction with an actor. Neural activity is acquired using non-invasive, high-density active scalp electroencephalography (EEG). Kinematic data are collected with inertial measurement units and supplemented with synchronized video recording.
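A central practical step in a multimodal protocol like the one above is bringing the EEG, inertial, and video streams onto a common time base before decoding. The protocol itself does not publish code, so the following is only a minimal sketch of one common approach (resampling a slower stream onto the EEG clock by linear interpolation); the sampling rates and signal are illustrative assumptions, not values from the protocol.

```python
import numpy as np

# Illustrative (assumed) sampling rates: EEG at 1000 Hz, IMU at 128 Hz.
t_eeg = np.arange(0.0, 2.0, 1 / 1000)  # EEG sample times, seconds
t_imu = np.arange(0.0, 2.0, 1 / 128)   # IMU sample times, seconds

# One synthetic IMU channel (e.g. an accelerometer axis) standing in
# for real kinematic data.
imu_signal = np.sin(2 * np.pi * 1.0 * t_imu)

# Resample the IMU channel onto the EEG clock by linear interpolation,
# so both modalities share one time base for joint analysis.
imu_on_eeg_clock = np.interp(t_eeg, t_imu, imu_signal)

print(imu_on_eeg_clock.shape)  # one kinematic value per EEG sample
```

In practice, synchronization of this kind is usually anchored by shared hardware triggers or event markers recorded in every stream; the interpolation step above only handles the differing sampling rates once a common clock is established.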
Other articles by Jesus G. Cruz-Garza on PubMed
Neural Decoding of Expressive Human Movement from Scalp Electroencephalography (EEG)

Frontiers in Human Neuroscience, 2014 | PubMed ID: 24782734

Although efforts to characterize human movement through electroencephalography (EEG) have revealed neural activities unique to limb control that can be used to infer movement kinematics, the extent to which EEG can be used to discern the expressive qualities that influence such movements remains unknown. In this study we used EEG and inertial sensors to record the brain activity and movement of five skilled and certified Laban Movement Analysis (LMA) dancers. Each dancer performed whole-body movements of three Action types: movements devoid of expressive qualities ("Neutral"), non-expressive movements while thinking about specific expressive qualities ("Think"), and enacted expressive movements ("Do"). The expressive movement qualities used in the "Think" and "Do" actions consisted of a sequence of eight Laban Effort qualities as defined by LMA, a notation system and language for describing, visualizing, interpreting, and documenting all varieties of human movement. We used delta-band (0.2-4 Hz) EEG as input to a machine learning algorithm that computed locality-preserving Fisher's discriminant analysis (LFDA) for dimensionality reduction, followed by Gaussian mixture models (GMMs) to decode the type of Action. We also trained our LFDA-GMM models to classify all possible combinations of Action type and Laban Effort quality (a total of 17 classes). Classification accuracy rates were 59.4 ± 0.6% for Action type and 88.2 ± 0.7% for Laban Effort quality. Ancillary analyses of potential relations between the EEG and the movement kinematics of the dancers' bodies indicated that motion-related artifacts did not significantly influence our classification results. In summary, this research demonstrates that EEG carries valuable information about the expressive qualities of movement.
These results may have applications for advancing the understanding of the neural basis of expressive movements and for the development of neuroprosthetics to restore movements.
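The decoding pipeline described in the abstract (band-limited EEG features, supervised dimensionality reduction, then Gaussian-mixture classification) can be sketched in outline. This is a hedged illustration, not the authors' code: it substitutes scikit-learn's LinearDiscriminantAnalysis for the locality-preserving Fisher's discriminant analysis (LFDA) used in the paper, fits one GMM per class, and runs on synthetic data standing in for real delta-band EEG features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for delta-band EEG feature vectors:
# 300 trials x 64 channels, 3 action classes ("Neutral", "Think", "Do").
n_trials, n_channels, n_classes = 300, 64, 3
y = rng.integers(0, n_classes, n_trials)
class_means = rng.normal(0.0, 1.0, (n_classes, n_channels))
X = class_means[y] + rng.normal(0.0, 1.0, (n_trials, n_channels))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: supervised dimensionality reduction. LDA is used here as a
# simplified stand-in for the paper's LFDA; both project features onto
# a low-dimensional, class-discriminative subspace.
lda = LinearDiscriminantAnalysis(n_components=n_classes - 1).fit(X_tr, y_tr)
Z_tr, Z_te = lda.transform(X_tr), lda.transform(X_te)

# Step 2: fit one Gaussian mixture model per class, then classify each
# held-out trial by the class whose GMM gives the highest log-likelihood.
gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(Z_tr[y_tr == c])
        for c in range(n_classes)}
log_liks = np.column_stack(
    [gmms[c].score_samples(Z_te) for c in range(n_classes)])
y_pred = log_liks.argmax(axis=1)

accuracy = (y_pred == y_te).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the synthetic classes here are well separated, the toy accuracy is far higher than the figures reported for real EEG; the sketch is meant only to show the shape of the reduce-then-mixture-model pipeline.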