JoVE Journal | Bioengineering

Motor Imagery Performance Through Embodied Digital Twins in a Virtual Reality-Enabled Brain-Computer Interface Environment

1,570 Views
10:14 min
May 10, 2024

DOI: 10.3791/66859-v

Kishor Lakshminarayanan1, Rakshit Shah2, Vadivelan Ramu1, Deepa Madathil3, Yifei Yao4, Inga Wang5, Brahim Brahmi6, Mohammad Habibur Rahman7

1Department of Sensors and Biomedical Tech, School of Electronics Engineering, Vellore Institute of Technology; 2Department of Orthopaedic Surgery, University of Arizona; 3Jindal Institute of Behavioural Sciences, O. P. Jindal Global University; 4Soft Tissue Biomechanics Laboratory, Med-X Research Institute, School of Biomedical Engineering, Shanghai Jiao Tong University; 5Department of Occupational Science & Technology, University of Wisconsin-Milwaukee; 6Electrical Engineering, Collège Ahuntsic; 7Department of Mechanical Engineering, BioRobotics Lab, University of Wisconsin-Milwaukee

Motor imagery in a virtual reality environment has wide applications in brain-computer interface systems. This manuscript outlines the use of personalized digital avatars that resemble the participants and perform the movements the participants imagine in a virtual reality environment, enhancing immersion and the sense of body ownership.

My research in neurorehabilitation centers around improving upper limb function for individuals with neurological injuries or disorders using a combination of EEG, motor imagery, and virtual reality technologies. The primary goal is to understand how these technologies can be integrated to enhance the effectiveness of motor skills rehabilitation. Virtual reality environments have become more sophisticated, enabling realistic simulations tailored to individual rehabilitation needs, which help refine motor skills in a nuanced manner.

Hybrid systems that combine VR with other technologies, like robotic arms and haptic feedback devices, are also on the rise. One of the current experimental challenges with the use of brain-computer interfaces revolves around intersubject variability and the need for extensive training. Each individual's brain signals can be vastly different, which means that BCIs often need to be extensively customized or calibrated for each user.

Current treatments often lack the immersion and interactivity necessary for maximum efficacy. To fill this gap, our protocol utilizes motor imagery with an innovative twist, integrating digital twins represented by personalized 3D avatars in a virtual reality setting. This integration enhances immersion, making the rehabilitation process not just a mental exercise, but also an engaging experience.

Our research protocol offers significant advantages over other techniques, particularly in terms of ease of setup and cost effectiveness. A standout feature is the creation and utilization of 3D avatars that closely resemble the subjects. This is achieved using simple, readily available tools and software that can generate personalized avatars from basic input data such as photographs.

Begin by assembling the 16-channel EEG data acquisition system. Attach the DAISY module, which provides eight EEG channels, to the main board, which provides the other eight. Using a Y-splitter cable, connect the reference electrode to the bottom reference pin on the DAISY board and on the main board, both labeled SRB.

Connect the ground electrode to the bias pin on the bottom board. Next, connect the 16 EEG electrodes to the bottom board pins and the DAISY board pins labeled N1P to N8P. Insert the electrodes into the gel-free cap at the labeled locations per the international 10-20 system.

Soak 18 sponges provided for the EEG electrodes in a saline solution for 15 minutes. Insert the soaked sponges on the underside of each electrode to establish contact between the scalp and the electrode. Then let the participants sit comfortably in a quiet room.

Place the gel-free EEG cap on the participant's scalp and ensure the cap is aligned to fit over the participant's ears. Connect the USB dongle to the laptop. Open the EEG GUI and click on EEG System.

Under the data source option, select serial from dongle, 16 channels, and auto connect. Inside the data acquisition screen, select the signal widget to check the signal quality of the connected electrodes. At each electrode site, verify an optimal impedance level of less than 10 kΩ.

If the impedance is higher than 10 kΩ, add a few drops of saline solution to the sponge under the electrode. Then close the GUI. Next, open the acquisition server software and select the appropriate EEG board under Driver.
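The impedance screening described above amounts to flagging any electrode above the 10 kΩ target so its sponge can be re-wetted. A minimal sketch of that check; the channel names and readings are illustrative, not taken from the actual device API:

```python
# Flag electrodes whose impedance exceeds the 10 kOhm threshold so their
# sponges can be re-wetted with saline. Values below are illustrative.
IMPEDANCE_THRESHOLD_OHMS = 10_000

def channels_needing_saline(impedances_ohms):
    """Return the channels whose impedance is above the 10 kOhm target."""
    return [ch for ch, z in impedances_ohms.items() if z > IMPEDANCE_THRESHOLD_OHMS]

readings = {"C3": 8_200, "C4": 14_500, "Cz": 9_900, "FC1": 25_000}
print(channels_needing_saline(readings))  # ['C4', 'FC1']
```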

Click connect and then play to establish a connection with the EEG system. For game design, open the game engine software and select motor imagery training project. Enable VR support by clicking on edit, then project settings, followed by XR plugin management.

Check the box for the VR headset listed under virtual reality SDKs. Delete the default camera and drag the VR camera from the VR integration package into the scene. Also, place the imported animation file in the scene and adjust the scale and orientation as needed.

For motor imagery training, set the OSC listener game object with pre-written scripts to trigger model animations for left and right hand movements based on OSC messages. Next, in the game engine software, open file and click on build settings. Select PC, Mac and Linux standalone, then target Windows, followed by clicking build and run.

For the motor imagery testing project, use the OSC listener game object configured with scripts to receive OSC signals indicative of the participant's imagined hand movements and make the avatar perform the imagined movement. To begin, open the software tool to design and run motor imagery scenarios. Navigate to file and load the six motor imagery BCI scenarios labeled signal verification, acquisition, CSP training, classifier training, testing, and confusion matrix.

Navigate to the signal verification scenario and apply a band-pass filter between 1 and 40 Hz with a filter order of 4 to the raw signals using designer boxes. Guide the participants to undergo motor imagery tasks, imagining hand movements in response to visual cues. Open the file for motor imagery training and display the prepared 3D avatar standing over a set of bongos through the VR headset.
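An offline equivalent of that 1-40 Hz, order-4 band-pass can be sketched with SciPy. The 125 Hz sampling rate is an assumption (typical for a 16-channel board-plus-DAISY setup), not a value stated in the protocol:

```python
# Zero-phase 1-40 Hz band-pass, order 4, mirroring the signal verification
# step. The 125 Hz sampling rate is an assumption.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 125.0  # assumed sampling rate, Hz

def bandpass_1_40(eeg, fs=FS, order=4):
    """Apply a zero-phase 1-40 Hz band-pass along the last (time) axis."""
    b, a = butter(order, [1.0, 40.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

# Example: a 50 Hz mains artifact is strongly attenuated while a 10 Hz
# mu-band component passes through.
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)
clean = bandpass_1_40(raw)
```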

Navigate to the acquisition scenario and double-click the Graz motor imagery stimulator to configure the box. Configure 50 trials of five seconds each for both left and right hand movements. Incorporate a 20-second baseline period followed by intervals of 10 seconds rest after every 10 trials to avoid mental fatigue.
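The stimulator configures this timeline internally; the same schedule (50 five-second trials per hand, a 20 s baseline first, a 10 s rest after every 10 trials, randomized order) can be sketched as a plain event list:

```python
# Build the randomized trial schedule described in the acquisition step.
import random

def build_schedule(trials_per_hand=50, trial_s=5, baseline_s=20,
                   rest_s=10, rest_every=10, seed=0):
    """Return a list of (event, duration_s) tuples for the whole session."""
    rng = random.Random(seed)
    trials = ["left"] * trials_per_hand + ["right"] * trials_per_hand
    rng.shuffle(trials)  # randomize left/right order
    events = [("baseline", baseline_s)]
    for i, hand in enumerate(trials, start=1):
        events.append((hand, trial_s))
        if i % rest_every == 0 and i < len(trials):
            events.append(("rest", rest_s))  # rest after every 10 trials
    return events

schedule = build_schedule()
```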

Configure the left and right hand trials to be randomized and add a cue before the trial, indicating the hand to be imagined. Connect an OSC box with the IP address and port to transmit the cue for the hand to be imagined to the motor imagery training game engine program. Then sanitize the VR headset with wipes and place it on the participant's head to facilitate an immersive interaction while capturing EEG data.
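The cue travels to the game engine as an OSC message. A minimal stdlib sketch of the OSC 1.0 wire format for a single string argument; the "/cue" address and the 127.0.0.1:9000 endpoint are illustrative assumptions, not values from the protocol:

```python
# Encode and send a left/right cue as an OSC 1.0 message over UDP.
# Address "/cue" and endpoint 127.0.0.1:9000 are hypothetical.
import socket

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per OSC 1.0."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_string_message(address: str, value: str) -> bytes:
    """OSC message: padded address, type tag ',s', padded string argument."""
    return osc_pad(address.encode()) + osc_pad(b",s") + osc_pad(value.encode())

def send_cue(hand: str, host="127.0.0.1", port=9000):
    packet = osc_string_message("/cue", hand)  # e.g. "left" or "right"
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))

msg = osc_string_message("/cue", "left")
```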

Direct the participants to imagine executing the movement of their hand along with the 3D avatar, following the same pace as the avatar when it hits the bongo with the corresponding hand, with a text cue displaying which hand is to be imagined. Following the acquisition, run the CSP training scenario to analyze the EEG data from the acquisition stage. Create filters to distinguish between left and right hand imagery and compute CSP.

After the CSP training, navigate to the classifier training scenario and run it to prepare the system for real-time avatar control. Then navigate to the testing scenario and allow the participants to control their 3D avatars in real-time using brain-computer interface technology. To interpret the imagined actions in real-time, load the classifiers trained on the EEG data during the classifier training scenario into the appropriate boxes.
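A common choice for this classification stage, sketched here with pure NumPy, is linear discriminant analysis on log-variance features of the CSP-filtered signals; the feature extraction and data are illustrative, not the scenario's exact internals:

```python
# LDA on log-variance CSP features, a typical left/right motor imagery
# classifier. Feature choices here are illustrative.
import numpy as np

def log_var_features(trials, W, n_pairs=2):
    """Project each (n_channels, n_samples) trial onto the first and last
    n_pairs CSP columns and take log-variance as features."""
    cols = list(range(n_pairs)) + list(range(-n_pairs, 0))
    return np.array([np.log(np.var(W[:, cols].T @ t, axis=1)) for t in trials])

class TwoClassLDA:
    def fit(self, X, y):
        X0, X1 = X[y == 0], X[y == 1]
        m0, m1 = X0.mean(0), X1.mean(0)
        Sw = np.cov(X0.T) + np.cov(X1.T)       # pooled within-class scatter
        self.w = np.linalg.solve(Sw, m1 - m0)  # discriminant direction
        self.b = -0.5 * self.w @ (m0 + m1)     # threshold at class midpoint
        return self

    def predict(self, X):
        return (X @ self.w + self.b > 0).astype(int)
```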

Brief participants on the testing procedure, emphasizing the need to clearly imagine hand movements as prompted by text cues. Conduct 20 trials for each participant, divided equally between imagining movements of the left and right hand and randomized. Connect and configure an OSC box to transmit the cue information, which will be displayed as text and indicate the hand to be imagined in the game engine program.

Connect another OSC box to transmit the predicted value for the left and right hand movements to the game engine program. Run the testing scenario and the motor imagery testing game engine program. Observe that the program plays the corresponding animation based on the predicted hand movement.

Five healthy adults, aged 21 to 38, participated in the study under both motor imagery training and testing conditions. An average confusion matrix for all subjects was used to evaluate the classifier's accuracy in distinguishing between left and right motor imagery signals during both sessions. Topographical patterns of CSP weights for motor imagery training were visualized for both motor imagery directions.
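The per-subject evaluation reduces to a 2x2 confusion matrix over left/right predictions. A sketch of that computation; the 20-trial session shown is illustrative, not the paper's data:

```python
# Build a left/right confusion matrix and accuracy from a test session.
import numpy as np

def confusion_matrix_2class(y_true, y_pred, labels=("left", "right")):
    """Rows: true label; columns: predicted label."""
    cm = np.zeros((2, 2), dtype=int)
    idx = {lab: i for i, lab in enumerate(labels)}
    for t, p in zip(y_true, y_pred):
        cm[idx[t], idx[p]] += 1
    return cm

def accuracy(cm):
    return np.trace(cm) / cm.sum()

# Illustrative 20-trial session: 10 left, 10 right, 3 misclassifications.
truth = ["left"] * 10 + ["right"] * 10
pred = ["left"] * 8 + ["right"] * 2 + ["right"] * 9 + ["left"] * 1
cm = confusion_matrix_2class(truth, pred)
print(cm, accuracy(cm))  # accuracy = 17/20 = 0.85
```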

A time-frequency analysis was conducted on EEG data from contralateral sensorimotor areas to identify event-related spectral perturbations during the motor imagery tasks.
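One common way to compute such an event-related spectral perturbation is spectrogram power normalized to the pre-cue baseline, in dB. A sketch under assumed parameters (125 Hz sampling, 1 s baseline, 64-sample windows), none of which are taken from the paper:

```python
# ERSP sketch: time-frequency power relative to mean baseline power, in dB.
# Sampling rate, baseline length, and window size are assumptions.
import numpy as np
from scipy.signal import spectrogram

def ersp_db(signal, fs=125.0, baseline_s=1.0, nperseg=64):
    """Return (freqs, times, power in dB relative to the baseline mean)."""
    f, t, Sxx = spectrogram(signal, fs=fs, nperseg=nperseg,
                            noverlap=nperseg // 2)
    baseline = Sxx[:, t < baseline_s].mean(axis=1, keepdims=True)
    return f, t, 10 * np.log10(Sxx / baseline)
```

For example, a 10 Hz oscillation that weakens after the cue (event-related desynchronization) shows up as negative dB values in the mu band after the baseline window.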


Keywords: Motor Imagery, Neuro Rehabilitation, Upper Limb Function, Neurological Disorders, Brain-Computer Interface, Virtual Reality, Digital Twins, 3D Avatars, Intersubject Variability, EEG, Hybrid Systems, Haptic Feedback, Rehabilitation Techniques, Immersive Experience


Copyright © 2025 MyJoVE Corporation. All rights reserved