Real-Time Proxy-Control of Re-Parameterized Peripheral Signals using a Close-Loop Interface
Summary, May 8th, 2021
We present protocols and methods of analysis to build co-adaptive interfaces that stream, parameterize, analyze, and modify human body and heart signals in a closed loop. This setup interfaces signals derived from the peripheral and central nervous systems of the person with external sensory inputs, to help track biophysical change.
Transcript
The overall purpose of this interface is to study the effect of body-signal augmentation on gaining bodily awareness. This is accomplished by designing a generic closed-loop interface that can be used with different wearable technologies, in different experimental setups, and with different populations. In this work we present the generic interface, two sample interfaces, their applications, and their effects on the human system.
The first step of the design is the use of various wearable technologies that allow the recording of signals coming from different levels of the nervous system. The second step is the use of Lab Streaming Layer (LSL) to achieve synchronized recording and real-time streaming of the signals. In the third step, the streamed data are pulled by code developed in the language of our preference, where real-time analysis and feature extraction of a selected signal take place.
Hence, in the fourth step, the participants experience the sensory augmentation of the extracted features, such as the auditory mapping of their heart rate to the tempo of a song, or the visual representation of their motions. Finally, by continuously augmenting the body information in real time, we close the loop of the unfolding interaction between the interface and the participant. Investigations of how the brain may control our bodies have generated the design of brain-machine interfaces, which harness nervous system signals to control an external device such as an exoskeleton or a robotic arm. Here we present closed-loop interfaces that harness the signals of the nervous system and augment them using a sensory module, to assist participants in gaining control over their bodies. Two essential features of our design are, one, the synchronous recording of data coming from different technologies to investigate various levels of the nervous system, and two, the streaming and analysis of the data for real-time augmentation. In both sample studies we used audio feedback, but in the audio closed-loop interface the salsa dancers were responding to the tempo of the music, which was controlled in real time based on the rhythm of the heart.
To accomplish this, we use the musical programming language Max to control the playback speed of audio files. Sensors capture the dancers' heartbeats, and filtered heart R-peaks are generated using a Python script. These peak values are then transmitted in real time to Max via Open Sound Control (OSC).
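As a rough illustration of this handoff, the following minimal sketch forwards each detected R-peak from Python to Max using the python-osc package; the port number and the "/rpeak" address are illustrative assumptions that must match the receiving objects in the Max patch.

```python
# Minimal sketch: forward each detected R-peak from Python to Max over OSC.
# Assumes the python-osc package (pip install python-osc); the port and the
# OSC address are placeholders that must match the Max patch (e.g. udpreceive).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # Max listening on localhost:7400

def on_r_peak(timestamp_s):
    # Called by the peak detector each time a filtered R-peak is found;
    # Max can use successive timestamps to compute the inter-peak interval.
    client.send_message("/rpeak", timestamp_s)
```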
First, we helped the participant put on the LED-based motion capture costume and attached the wireless LED controller to it. After turning on the server, open a web browser, enter the IP address of the server machine, and sign in. If this step is successful, the configuration manager will open.
Then open the interface of the motion capture system and click Connect to start streaming the data from the LED markers. Once the connection is established, the positions of the markers will be displayed in the virtual world of the interface.
Right-click on the skeleton on the right side of the window and select New Skeleton to choose a marker mapping. Then right-click on the skeleton again and select Generate Skeleton, making sure that the participant is holding a T-pose.
If all the steps are performed correctly, the skeleton will be generated. To stream the skeleton data to LSL, select Settings and then Options from the main menu, open the OWL Emulator, and make sure that you have clicked Start Live Streaming.
Next, help the same participant put on the EEG head cap. Fill the electrodes with highly conductive gel and place the electrode cables. Then plug them into the wireless monitor and turn it on.
Open the interface of the EEG system and select Use WiFi Device. Select the device, click Use This Device, and then click on the head icon. Select the protocol that allows the recording of all 32 sensors and click Load.
Make sure that the streamed data fit and that all channels are displayed on the interface. To collect heart activity data, use one of the EEG channels to connect the EEG extension cable. Use a sticky electrode to attach the other end of the extension right below the left rib cage of the participant.
Locate the LSL application of the motion capture system in the LSL folder and run it by double-clicking on the corresponding icon. On the interface, set the proper server address and click Link. For the EEG and ECG data streaming, no extra steps are required.
Next, locate the LabRecorder application, which is also in the LSL folder, and run it by double-clicking on it. If not all data types of the motion capture and EEG systems are displayed in the Record from Streams panel, click Update.
Select a directory and file name in the Storage Location panel, and then click Start to begin the data collection. Execute the MATLAB, Python, or other code that receives, processes, and augments the streamed data. LSL enables the streaming of the data to numerous programming platforms.
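For instance, a minimal receiver in Python could look like the sketch below, assuming the pylsl package; the stream type 'EEG' and the process function are illustrative assumptions that depend on the streams your devices advertise.

```python
# Minimal sketch of the receiving side, assuming the pylsl package
# (pip install pylsl). The stream type below must match what the EEG
# or motion-capture LSL apps advertise on the network.
from pylsl import StreamInlet, resolve_byprop

streams = resolve_byprop('type', 'EEG', timeout=10.0)  # find the EEG stream
inlet = StreamInlet(streams[0])

while True:
    # pull_sample returns one multichannel sample plus its LSL timestamp
    sample, timestamp = inlet.pull_sample()
    process(sample, timestamp)  # hypothetical real-time analysis step
```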
To use the codes corresponding to the representative examples described in the manuscript, visit our GitHub link. On the market there are various technologies that create sensory outputs.
Some common examples are speakers, lights, and monitors; less common ones include haptic devices such as vibrators, and gustatory and olfactory interfaces. In the audio closed-loop interface, we manage to digitally augment the heart rate. Using the musical programming language Max, we are able to control the playback speed of audio files.
The filtered heart R-peaks, which are printed to the console, are transmitted from the Python script to Max. There, the inter-peak interval time is measured and converted to beats per minute. The data are scaled to create a range between zero, the slowest possible playback speed, and the maximum playback speed. On this continuum, one equals normal playback speed, 0.5 equals half speed, and two equals double playback speed.
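Expressed in code, this mapping might look like the following sketch, written in Python for clarity (the study implements it inside the Max patch); the resting-rate reference of 80 bpm is an illustrative assumption.

```python
# Sketch of the heart-rate-to-playback-speed mapping. 1.0 = normal speed,
# 0.5 = half speed, 2.0 = double speed; output is clamped to [0, max_speed].
def playback_speed(ipi_seconds, resting_bpm=80.0, max_speed=2.0):
    bpm = 60.0 / ipi_seconds           # inter-peak interval -> beats per minute
    speed = bpm / resting_bpm          # equals 1.0 at the reference heart rate
    return max(0.0, min(max_speed, speed))
```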
The design of the generic interface can be used with various populations; its protocol and the examples used here to provide proof of concept are not limited to a specific group. Moreover, the closed-loop interfaces are designed to be intuitively explored and learned, so there should be no need for instructions as part of the experimental procedure. In the study of the audio closed-loop interface with a real dyadic interaction, two salsa dancers interacted with an interface that uses the female dancer's heart rate to alter the speed of the song.
The dancers performed a well-rehearsed routine and an improvised dance. In each condition they performed to the original version of the song once and then to its heart-altered version twice. From the data collected, we estimate the Gamma stochastic signatures of the extracted micro-movement spikes.
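As a hedged sketch of this analysis step, the code below derives micro-movement spikes from a non-negative waveform (e.g., speed) and fits a two-parameter Gamma distribution to them; the local-averaging window and the exact normalization are assumptions made for illustration, with numpy and scipy assumed available.

```python
# Sketch: derive micro-movement spikes (MMS) from a waveform and estimate
# their Gamma signature (shape, scale). Window length is illustrative, and
# the waveform is assumed non-negative (e.g., a speed time series).
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import gamma

def gamma_signature(waveform, half_window=50):
    peaks, _ = find_peaks(waveform)
    mms = []
    for p in peaks:
        local = waveform[max(0, p - half_window):p + half_window]
        # normalize each peak by itself plus the local average amplitude
        mms.append(waveform[p] / (waveform[p] + local.mean()))
    shape, _, scale = gamma.fit(mms, floc=0)  # location fixed at zero
    return shape, scale  # one point on the Gamma parameter plane
```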
Here we observe the estimated probability density functions of the heart and the music data. The top figures demonstrate the shifts of the heart, an autonomic organ, from the first condition, dancing to the song in its original form, to the second and third conditions, dancing to the heart-altered song.
The left figure corresponds to the three recordings of the spontaneous dancing, whereas the right figure corresponds to the three recordings of the deliberate dancing. In the bottom figures we can observe the corresponding shifts of the audio played; here the shifts have an opposing direction.
In the study of the audiovisual closed-loop interface with an artificial dyadic interaction, six participants interacted with an interface that creates their live mirrored avatar and, in addition, embeds hip-position-dependent sounds. The participants were naive as to the purpose of the study; they had to walk around the room and figure out how to control the sound that would, surprisingly, emerge as they passed through the region of interest.
The figures demonstrate the probability density functions and the corresponding Gamma signatures of the hip speed data of the six control participants, C1 to C6, when they were inside and outside the region of interest. The outcomes highlight the personalized differences in the behavior of the participants when inside or outside the region. Empirically, we have found that signatures located in the right lower corner of the Gamma plane are those of athletes and dancers performing highly skilled movements, whereas signatures that lie in the left upper region come from datasets of nervous systems with pathologies.
Therefore, we can notice that the signatures of the hip speed of controls three and four reveal healthier motor patterns when inside the volume, whereas the rest of the participants display the opposite pattern. This approach can be used both to gain a greater understanding of the relationship between a dancer's movements and sound, and to explore novel approaches to music composition led by real-time bodily information, converting music-making into an experience of body-signal augmentation. The development of, and research on, closed-loop interfaces could benefit several disorders of the nervous system. They could also be used in gaming, movement training, and sports.