We designed a virtual reality test to assess instrumental activities of daily living (IADL) with a motion capture system. We propose a detailed kinematic analysis of the participant's movements, including trajectory, moving distance, and time to completion, to evaluate IADL capabilities.
The inability to complete instrumental activities of daily living (IADL) is a precursor to various neuropsychological diseases. Questionnaire-based assessments of IADL are easy to use but prone to subjective bias. Here, we describe a novel virtual reality (VR) test to assess two complex IADL tasks: handling financial transactions and using public transportation. While a participant performs the tasks in a VR setting, a motion capture system traces the position and orientation of the dominant hand and head in a three-dimensional Cartesian coordinate system. Kinematic raw data are collected and converted into 'kinematic performance measures,' i.e., motion trajectory, moving distance, and time to completion. Motion trajectory is the path of a particular body part (e.g., dominant hand or head) in space. Moving distance refers to the total distance of the trajectory, and time to completion is the time taken to complete an IADL task. These kinematic measures could discriminate patients with cognitive impairment from healthy controls. The development of this kinematic measuring protocol allows detection of early IADL-related cognitive impairments.
Instrumental activities of daily living (IADL), such as handling financial transactions, using public transportation, and cooking, are clinically useful markers since they require multiple neuropsychological functions1. Impaired IADL capabilities are thus considered precursors to neurological diseases such as mild cognitive impairment (MCI) and dementia2. Gold's comprehensive review of IADL tasks3 indicated that more cognitively demanding tasks, such as managing finances and using public transportation, were the earliest predictors of MCI and dementia.
To date, the most commonly used assessments of IADL are self-reported questionnaires, informant-based questionnaires, and performance-based assessments4. Questionnaire-based assessments of IADL are cost-effective and easy to use, but are prone to subjective bias. For instance, when self-reporting, patients tend to over- or under-estimate their IADL capabilities5. Similarly, informants may misjudge IADL capabilities due to misperceptions or knowledge gaps4. Thus, performance-based assessments that ask patients to carry out specific IADL tasks have been preferred, although many of the tasks are inappropriate for a general clinical setting6.
Recently, virtual reality (VR) studies have shown that this technology has significant applications in medicine and healthcare, ranging from training to rehabilitation to medical assessment7. All participants can be tested under the same VR conditions, which mimic the real world. For instance, Allain et al.8 developed a virtual coffee-making task and showed that patients with cognitive impairment performed the task poorly. Klinger et al.9 developed another VR environment for mailing and shopping tasks and found a meaningful relationship between task completion time in VR and neuropsychological test results. Previous VR studies of IADL assessment have mostly focused on simple performance measures, such as reaction time or accuracy, obtained with conventional input devices such as a mouse and keyboard8,9. More detailed performance data about IADL are thus needed to efficiently screen for patients with MCI4.
Kinematic analysis of real-time motion capture data is a powerful approach to quantitatively document detailed performance data associated with IADL tasks. For example, White et al.10 developed a virtual kitchen that captures the participant's joint angle data during daily living tasks and used captured data to quantitatively assess the effectiveness of physical therapy. Dimbwadyo-Terrer et al.11 developed an immersive VR environment to assess upper limb performance when conducting basic daily living tasks and showed that kinematic data recorded in a VR environment highly correlated with functional scales of the upper limb. These kinematic analyses with motion capture systems could provide further opportunity to quickly assess a patient's cognitive impairment12. Inclusion of the detailed kinematic data in screening for patients with MCI significantly improved the classification of patients compared to healthy controls13.
Here, we describe a protocol to assess the kinematics of daily living movements with motion capture systems in an immersive VR environment. The protocol comprises two complex IADL tasks: "Task 1: Withdraw money" (handling financial transactions) and "Task 2: Take a bus" (using public transportation). While the tasks are performed, a motion capture system traces the position and orientation of the dominant hand and head. After Task 1, the dominant hand trajectory, moving distance, and time to completion are collected; after Task 2, the head trajectory, moving distance, and time to completion are collected. The Representative Results section of this article details a preliminary test of patients with MCI (i.e., impaired IADL capabilities) compared to healthy controls (i.e., intact IADL capabilities).
All experimental procedures described here were approved by the Institutional Review Board of Hanyang University, in accordance with the Declaration of Helsinki (HYI-15-029-2). Six healthy controls (4 males and 2 females) and six MCI patients (3 males and 3 females) were recruited from a tertiary medical center, Hanyang University Hospital.
1. Recruit Participants
- Recruit MCI patients (i.e., impaired IADL capabilities) and healthy controls (i.e., normal IADL capabilities) aged between 70 and 80 years.
- With the help of a neurologist with more than 10 years of clinical experience, review the patients' medical histories, and exclude patients with a history of neurological/psychiatric diseases or brain surgery.
NOTE: Use the following neuropsychological tests: Mini Mental State Examination-Dementia Screening, Korean Instrumental Activities of Daily Living, Free and Cued Selective Reminding Test, Digit Span Test-Forward/Backward, Trail Making Test-A/B13, and the criteria of Albert et al.14 to diagnose MCI.
2. Install VR Software and Connect Computers
- Set up the hardware in the dedicated room as shown in Figure 1. Perform this protocol in a room-sized immersive virtual environment (4 m x 2.5 m x 2.5 m) containing 4 computers, 4 stereoscopic three-dimensional (3D) projectors, and 8 motion tracking cameras that track the position and orientation of the dominant hand and head during the two IADL tasks.
NOTE: The VR technologies used in this article are computer hardware and software that offer immersive and interactive 3D experiences, by which realistic objects and events can be presented in a virtual environment. The details of the hardware and software are described in the Materials Table.
- Ensure all the computers are equipped with the required software (Visual Studio 2012 redistributable package (x86), DirectX, and MiddleVR, or equivalent). For MiddleVR, i.e., middleware software, check the website15 to obtain the latest versions of the libraries for the input devices, stereoscopy, clustering, and interactions.
- Connect the computers to the stereoscopic 3D projectors. Set the graphical resolution to 1,920 x 1,080 pixels.
- Create a Windows 10 HomeGroup to connect the 4 computers to a home network. On the primary computer, create a folder and share it with other HomeGroup computers.
- On the primary computer, initiate the middleware software. Click the "Cluster" button. Set the primary computer as the server and the other computers as clients; this synchronizes the state of all devices. Click the "3D Nodes" button. Specify the position, orientation, and size of the virtual environment screen.
- Complete the settings based on the website15 and save the configuration file.
3. Set Up Motion Capture Systems in a Virtual Environment
- Mount 8 motion tracking cameras in the virtual environment to fully cover the capture volume. Fix the cameras securely so that they remain stationary during capture. Ensure that objects in the virtual environment will be visible to at least 2 cameras at all times.
- Install the OptiTrack Motive software, i.e., the motion capture software, on the primary computer following the installation manual16. Connect the primary computer to the motion capture system with Category 6 Ethernet cables.
- Calibrate the motion capture systems with the following steps, as detailed in the software manual16.
- Remove all extraneous reflections or unnecessary markers from the capture volume.
- Click the "Mask Visible" button to mask unwanted reflections or ambient interference.
- Click the "Start Wanding" button. Wave the calibration wand throughout the capture volume so that the cameras collect the sample frames used to compute their respective positions and orientations in 3D space.
- Click the "Calculate" button to calibrate the system using collected samples.
- Check the calibration results (in order from worst to best): Poor, Fair, Good, Great, Excellent, and Exceptional. If the result is better than Great, click the "Apply" button. If not, click the "Cancel" button and repeat the wanding process.
- Place the calibration square inside the 3D space where you want the origin to be located. Click the "Set Ground Plane" button to establish the origin of the tracked 3D coordinate system.
- Select associated reflective markers for the dominant hand and head. Click the "Rigid Body" button and then click the "Create From Selected Markers" button.
- On the motion capture software, open the "Streaming" menu. Verify that the port number listed is 3883, and select the "Broadcast frame data" box in the "VRPN Streaming Engine" category. Press "Ctrl" + "S" to save the calibration file.
- On the primary computer, initiate the middleware software. Click the "Devices" button. Add a VRPN Tracker to obtain tracking data from the motion capture system, and then save the configuration file.
4. Prepare a Virtual Environment for Use
- Remove all reflective objects (i.e., watches, rings, earrings, metals, etc.) from the virtual environment.
- Turn on the computers, stereoscopic 3D projectors, and motion capture systems (capture rate: 360 frames per second).
- Once all 4 computers are running, launch the VRDaemon software; for example, double-click "VRDaemon.exe", which is located in "C:\Program Files (x86)\MiddleVR\bin."
- On the primary computer, initiate the motion capture software. Click the button near the top menu labeled "Open Existing Project." Load the camera calibration file.
- On the primary computer, initiate the middleware software. Click the "Simulations" button. Load the appropriate simulation and configuration files from a shared folder.
- On the middleware software, press the "Run" button to execute an immersive virtual application with the selected simulation and configuration files.
5. Familiarize the Participant with the Virtual Environment
- Provide the participant with stereoscopic glasses (weight: approximately 50 g; display frequency: 192 Hz). Ensure that the stereoscopic glasses sit comfortably over the eyes and ears; see Figure 2A.
- Attach reflective markers weighing less than 1 g to the participant's dominant hand and head. Be careful to attach the reflective markers tightly; see Figure 2B. Inform the participant that they can freely move or rotate in the virtual environment using head movement and can click virtual objects with the dominant hand. A virtual hand appears in the virtual environment to mimic the position of the participant's index finger; see Figure 3.
- Ask the participant to freely move (i.e., stand up, sit down, go left, and go right) in the virtual environment for 5 min to familiarize themselves with the VR environment. Then ask the participant to click virtual buttons for 5 min in order to become familiar with how to interact with virtual objects with the dominant hand. Provide another 10 min training session if the participant asks for one.
- Check whether the participant experiences VR sickness using a simulator sickness questionnaire17.
CAUTION: The synchronized motion tracking on the stereoscopic display can cause VR sickness, which can result in discomfort, headache, stomach awareness, nausea, vomiting, pallor, sweating, fatigue, drowsiness, disorientation, and apathy. If the participant complains of fatigue or the simulator sickness score is too high, stop the protocol.
6. Perform "Task 1: Withdraw money"
CAUTION: Counterbalance the order of Task 1 and Task 2 across participants to remove carry-over effects.
- Explain to the participant the details of the task and provide the 8 action steps to complete the task in the virtual environment. The steps are (1) insert the card into the ATM, (2) select the 'withdraw' menu, (3) select the amount to withdraw, (4) select the bill type, (5) enter the PIN (personal identification number), (6) select the receipt option, (7) remove the card, and (8) take the money from the ATM (see Figure 4).
- On the primary computer, initiate the middleware software. On the "Simulations" tab, select a simulation file for Task 1 and a configuration file. Press the "Run" button; "Task 1: Withdraw money" will run in the virtual environment.
NOTE: For the Task 1 file, see the attached "Task 1 Withdraw Money.zip" file in Supplemental File 1. Note that the virtual task was developed with the Unity 3D engine.
- Once "Task 1: Withdraw money" is running in the virtual environment, instruct the participant as follows: "Please withdraw 70,000 KRW (equivalent to around 60 USD) from the ATM for shopping. Select two different types of notes: one 50,000 KRW note for 50,000 KRW and two 10,000 KRW notes for 20,000 KRW. The password for your transaction is today's date. For instance, if the experiment is carried out on the 11th of November, then the PIN is 1111. Please keep the receipt for further reference."
- Once the task is finished, check the saved kinematic data in CSV files (comma-separated values) for further analysis from a shared folder.
NOTE: Using the motion capture systems, record the position and orientation of the dominant hand throughout "Task 1: Withdraw money" at a sampling interval of 1 ms.
- Give the participant a break of about 5 min before starting "Task 2: Take a bus."
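The eight-step ATM sequence in "Task 1: Withdraw money" can be thought of as an ordered state machine: the task is complete only when every step occurs in order, and out-of-order actions can be logged for later analysis. The sketch below is purely illustrative, with hypothetical names; the published task itself was implemented with the Unity 3D engine, not this code.

```python
# Hypothetical sketch of the 8-step ATM sequence as an ordered state machine.
# Step names are illustrative, not identifiers from the published task.
ATM_STEPS = [
    "insert_card", "select_withdraw", "select_amount", "select_bill_type",
    "enter_pin", "select_receipt", "remove_card", "take_money",
]

class Task1StateMachine:
    def __init__(self):
        self.next_index = 0   # index of the step expected next
        self.errors = []      # out-of-order actions, kept for analysis

    def perform(self, action):
        """Advance if `action` is the expected step; otherwise record an error."""
        if self.next_index < len(ATM_STEPS) and action == ATM_STEPS[self.next_index]:
            self.next_index += 1
        else:
            self.errors.append(action)

    @property
    def completed(self):
        return self.next_index == len(ATM_STEPS)

sm = Task1StateMachine()
for a in ATM_STEPS:
    sm.perform(a)
print(sm.completed)  # prints True: all 8 steps occurred in order
```

A structure like this would also make it easy to derive per-step completion times from the logged kinematic data.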
7. Perform "Task 2: Take a bus"
- Explain to the participant the details of the task and provide instructions on how to complete "Task 2: Take a bus" as follows: "Please wait at the bus stop and take the target bus. The target bus information will be given on the VR screen by a specific line number, color, and destination. When the target bus arrives, be sure to walk out of the bus stop and to the front door of the target bus. 8 different target buses will be randomly generated and presented." See Figure 5.
- On the primary computer, initiate the middleware software. On the "Simulations" tab, select a simulation file for Task 2 and a configuration file. Press the "Run" button, then "Task 2: Take a bus" will run in the virtual environment.
NOTE: For the Task 2 file, see the attached "Task 2 Take a Bus.zip" file in Supplemental File 2. Note that the virtual task was developed with the Unity 3D engine.
- Once "Task 2: Take a bus" is running in the virtual environment, instruct the participant to wait at the bus stop. Press the "Spacebar" key on the keyboard to make buses arrive at the bus stop.
- Once the task is finished, check the saved kinematic data in CSV files for further analysis from a shared folder.
NOTE: Using the motion capture systems, record the position and orientation of the head throughout "Task 2: Take a bus" at a sampling interval of 1 ms.
- The protocol is complete. Help the participant remove the stereoscopic glasses and detach the reflective markers from the dominant hand and head.
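The boarding decision in "Task 2: Take a bus" reduces to matching three cues of an arriving bus against the target: line number, color, and destination. A minimal sketch of this matching logic, using hypothetical names (the published task was developed with the Unity 3D engine, not this code):

```python
# Illustrative sketch: a bus counts as the target only if all three
# cues shown on the VR screen (line, color, destination) agree.
from dataclasses import dataclass

@dataclass(frozen=True)
class Bus:
    line: str
    color: str
    destination: str

def is_target(arriving: Bus, target: Bus) -> bool:
    """The participant should board only a bus matching all three cues."""
    return arriving == target

target = Bus(line="143", color="blue", destination="City Hall")
print(is_target(Bus("143", "blue", "City Hall"), target))   # prints True
print(is_target(Bus("143", "green", "City Hall"), target))  # prints False
```

In the experiment, eight such targets are generated randomly, so the participant must re-evaluate all three cues on every trial.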
CSV files from "Task 1: Withdraw money" are analyzed using the R statistical software to calculate the dominant hand trajectory, moving distance, and time to completion. The trajectory of the dominant hand movement is visualized (Figure 6). The moving distance of the dominant hand is calculated by summing the distances between sequential hand positions while performing Task 1, where the distance between two positions is the Euclidean distance. Time to completion is the time taken to finish the whole task (i.e., from step 1, "insert the card into the ATM," to step 8, "take the money from the ATM"). For the R code for the statistical analysis, see the attached "Task 1 R Code.docx" file in Supplemental File 3.
CSV files from "Task 2: Take a bus" are analyzed to calculate the head trajectory, moving distance, and time to completion using the R statistical software. The trajectory of the head movement is visualized (Figure 7). The moving distance of the head is calculated by summing the distances between sequential head positions while performing Task 2, where the distance between two positions is the Euclidean distance. Time to completion is the time taken from the start to the end of the whole task with eight target buses. For the R code for the statistical analysis, see the attached "Task 2 R Code.docx" file in Supplemental File 4.
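The moving-distance and time-to-completion computations described above are straightforward to reproduce. The article's analysis uses R (Supplemental Files 3 and 4); the sketch below is an equivalent illustration in Python, assuming hypothetical CSV columns `t` (timestamp in seconds) and `x`, `y`, `z` (position in meters) — the actual column layout of the recorded files may differ.

```python
# Equivalent Python sketch of the kinematic measures described above.
# Assumed (hypothetical) CSV columns: t (seconds), x, y, z (meters).
import csv
import math

def kinematic_measures(csv_path):
    """Return (moving_distance_m, time_to_completion_s) for one recording."""
    with open(csv_path, newline="") as f:
        rows = [(float(r["t"]), float(r["x"]), float(r["y"]), float(r["z"]))
                for r in csv.DictReader(f)]
    # Moving distance: sum of Euclidean distances between sequential positions.
    distance = sum(math.dist(p[1:], q[1:]) for p, q in zip(rows, rows[1:]))
    # Time to completion: elapsed time from the first to the last sample.
    duration = rows[-1][0] - rows[0][0]
    return distance, duration
```

For example, three samples at (0, 0, 0), (3, 4, 0), and (3, 4, 12) meters yield a moving distance of 5 + 12 = 17 m, and the duration is simply the last timestamp minus the first.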
Anthropometric characteristics and the kinematic measures from patients with MCI and healthy controls are shown in Table 1. This VR test with motion capture systems presents new opportunities for measuring the kinematics of complex IADL tasks. By following the protocol presented here, researchers can obtain kinematic performance data for "Task 1: Withdraw money" (handling financial transactions) and "Task 2: Take a bus" (using public transportation).
Indeed, a case-control study with this protocol was performed with several statistical analyses (i.e., multivariate analysis of variance, a Pearson correlation analysis, and a forward stepwise linear discriminant analysis), which can be found in our empirical study13.
Figure 1: A room-sized immersive virtual environment.
Figure 2: Preparation before the assessment. (A) The subject wears stereoscopic glasses. (B) Reflective markers are attached to the dominant hand and head.
Figure 3: Virtual hand representation in the virtual environment. (A) A white sphere represents the position of the index finger. The participant clicks the virtual number "2" button. (B) The participant clicks the virtual number "4" button.
Figure 4: Task 1: Withdraw money from the ATM. (A) The participant enters a PIN code into the ATM. (B) The participant withdraws money from the ATM.
Figure 5: Task 2: Take a bus. (A) The participant waits at the bus stop. (B) The participant walks out of the bus stop and into the target bus.
Figure 6: Task 1: Hand movement trajectory in 3D Cartesian space. (A) Healthy controls. (B) MCI patients.
Figure 7: Task 2: Head movement trajectory in 3D Cartesian space. (A) Healthy controls. (B) MCI patients.
| | MCI patients | Healthy controls |
| --- | --- | --- |
| Number (male) | 6 (3) | 6 (4) |
| Age (year) | 72.4 ± 1.9 | 72.6 ± 1.7 |
| **Task 1: Withdraw money** | | |
| Moving distance (m) | 34.7 ± 9.1 | 52.5 ± 10.5 |
| Time to completion (min) | 1.8 ± 0.3 | 1.3 ± 0.2 |
| **Task 2: Take a bus** | | |
| Moving distance (m) | 100.3 ± 11.4 | 128.5 ± 14.2 |
| Time to completion (min) | 13.5 ± 0.2 | 13.5 ± 0.2 |
Table 1: Anthropometric characteristics and kinematic measures. Values are means ± SD.
Supplemental File 1: Task 1 Withdraw Money.zip.
Supplemental File 2: Task 2 Take a Bus.zip.
Supplemental File 3: Task 1 R Code.docx.
Supplemental File 4: Task 2 R Code.docx.
We detailed a kinematic measuring protocol for daily living movements with motion capture systems in an immersive VR environment. First, the protocol described how to set up the immersive VR environment, prepare it for use, and familiarize participants with it. Second, we developed two standardized IADL tasks in VR. Third, Step 3 and Step 5 of the Protocol section are the most critical steps for minimizing VR sickness. When setting up the motion capture systems in the virtual environment (Step 3), it is important to mount the tracking cameras high enough to fully cover the capture volume, fix the cameras securely to prevent movement during capture, ensure that at least two cameras can capture each object simultaneously, and remove any extraneous reflections or unnecessary markers from the virtual environment. While familiarizing the participants with VR (Step 5), it is crucial to provide enough training for them to become accustomed to the virtual experience. If a participant experiences any VR sickness symptoms (e.g., discomfort, headache, nausea, vomiting, pallor, sweating, fatigue, drowsiness, disorientation, or apathy), the experiment should be stopped. Finally, the kinematic raw data were processed with the R statistical software.
A limitation and challenge of our protocol is that the virtual IADL tasks should be validated against real IADL tasks. Although previous studies demonstrated that virtual and real tasks are highly correlated in terms of reaction time and accuracy8, as well as clinical and functional measures11, the current kinematic measuring protocol should still be validated against conventional neuropsychological assessments. Building upon such validation, we need to scale up this protocol with different IADL tasks. Another limitation is that this protocol analyzes only typical kinematic measures; more sophisticated kinematic performance measures in a virtual environment, such as acceleration, movement accuracy, and efficiency, should be included.
The significance of the current kinematic measurement protocol is that it is fast, safe, easy to perform, and non-invasive for detection of early IADL deficits. A former study using this protocol confirmed that kinematic measures in conjunction with a neuropsychological test result best discriminated MCI patients from healthy controls13. Quantification of specific functional deficits could well provide a basis for locating the source and extent of neurological damage and therefore aid in clinical decision-making for individualizing therapies18. In this context, the protocol proposed in this article could be used for evidence-based clinical decision-making.
Considering future applications, this protocol could be used for other neuropsychological diseases such as traumatic brain injury19. Also, it might be interesting to analyze specific subtasks in the current protocol to identify which types are more challenging. Moreover, recent VR studies to train stroke patients showed improvements in memory and attention functions following a VR-based game intervention20. It would be of great interest to apply this protocol to additional neuropsychological rehabilitation contexts.
The authors declare no conflicts of interest.
K.S. and A.L. contributed equally. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2016R1D1A1B03931389).
| Name | Company | Model | Comments |
| --- | --- | --- | --- |
| Computer | N/A | N/A | Requirements: single socket H3 (LGA 1150); supports Intel Xeon E3-1200 v3 and 4th gen. Core i7/i5/i3 processors; Intel C226 Express PCH; up to 32 GB DDR3 ECC/non-ECC 1,600 MHz UDIMM in 4 sockets; dual Gigabit Ethernet LAN ports; 8x SATA3 (6 Gbps); 2x PCI-E 3.0 x16, 3x PCI-E 2.0 x1, and 2x PCI 5V 32-bit slots; 6x USB 3.0 (2 rear + 4 via headers); 10x USB 2.0 (4 rear + 6 via headers); HD Audio 7.1-channel connector by Realtek ALC1150; 1x DOM power connector and 1x SPDIF out header; 800 W high-efficiency power supply. Configuration used: Intel Xeon E3-1230v3; DDR3 PC12800 8 GB ECC; WD 1 TB Blue WD10EZEX 3.5"; NVIDIA Quadro K5000 & Sync |
| Stereoscopic 3D projector | Barco | F35 AS3D WUXGA | Resolution: WQXGA (2,560 x 1,600); Panorama (2,560 x 1,080); WUXGA (1,920 x 1,200); 1080p (1,920 x 1,080) |
| Stereoscopic glasses | Volfoni | Edge 1.2 | For further information, visit http://volfoni.com/en/edge-1-2/ |
| Motion capture systems | NaturalPoint OptiTrack | 17W | For further information, visit http://optitrack.com/products/prime-17w/ |
| OptiTrack (motion capture software) | NaturalPoint OptiTrack | Motive 2.0 | For further information, visit https://optitrack.com/downloads/motive.html |
| MiddleVR (middleware software) | MiddleVR | MiddleVR For Unity | For further information, visit http://www.middlevr.com/middlevr-for-unity/ |
| VRDaemon (middleware software) | MiddleVR | MiddleVR For Unity | For further information, visit http://www.middlevr.com/middlevr-for-unity/ |
| Unity3D (game engine) | Unity Technologies | Personal | For further information, visit https://unity3d.com/unity |
- Reppermund, S., et al. Impairment in instrumental activities of daily living with high cognitive demand is an early marker of mild cognitive impairment: the Sydney Memory and Ageing Study. Psychol. Med. 43, (11), 2437-2445 (2013).
- Graf, C. The Lawton instrumental activities of daily living scale. Am. J. Nurs. 108, (4), 52-62 (2008).
- Gold, D. A. An examination of instrumental activities of daily living assessment in older adults and mild cognitive impairment. J. Clin. Exp. Neuropsychol. 34, (1), 11-34 (2012).
- Jekel, K., et al. Mild cognitive impairment and deficits in instrumental activities of daily living: a systematic review. Alzheimers. Res. Ther. 7, (1), 17 (2015).
- Suchy, Y., Kraybill, M. L., Franchow, E. Instrumental activities of daily living among community-dwelling older adults: discrepancies between self-report and performance are mediated by cognitive reserve. J. Clin. Exp. Neuropsychol. 33, (1), 92-100 (2011).
- Desai, A. K., Grossberg, G. T., Sheth, D. N. Activities of Daily Living in patients with Dementia. CNS drugs. 18, (13), 853-875 (2004).
- Ma, M., Jain, L. C., Anderson, P. Virtual, Augmented Reality and Serious Games for Healthcare 1. 68, Springer, Berlin (2014).
- Allain, P., et al. Detecting everyday action deficits in Alzheimer's disease using a nonimmersive virtual reality kitchen. J. Int. Neuropsychol. Soc. 20, (5), 468-477 (2014).
- Klinger, E., et al. AGATHE: A tool for personalized rehabilitation of cognitive functions based on simulated activities of daily living. IRBM. 34, (2), 113-118 (2013).
- White, D., Burdick, K., Fulk, G., Searleman, J., Carroll, J. A virtual reality application for stroke patient rehabilitation. ICMA. 2, 1081-1086 (2005).
- Dimbwadyo-Terrer, I., et al. Activities of daily living assessment in spinal cord injury using the virtual reality system Toyra: functional and kinematic correlations. Virtual Real. 20, (1), 17-26 (2016).
- Preische, O., Heymann, P., Elbing, U., Laske, C. Diagnostic value of a tablet-based drawing task for discrimination of patients in the early course of Alzheimer's disease from healthy individuals. J. Alzheimers. Dis. 55, (4), 1463-1469 (2017).
- Seo, K., Kim, J. K., Oh, D. H., Ryu, H., Choi, H. Virtual daily living test to screen for mild cognitive impairment using kinematic movement analysis. PLOS ONE. 12, (7), e0181883 (2017).
- Albert, M. S., et al. The diagnosis of mild cognitive impairment due to Alzheimer's disease: Recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers. Dement. 7, (3), 270-279 (2011).
- MiddleVR. User Guide. Available from: http://www.middlevr.com/doc/current/ (2017).
- OptiTrack. Motive Quick Start Guide. Available from: https://optitrack.com/public/documents/motive-quick-start-guide-v1.10.0.pdf (2017).
- Kennedy, R. S., Lane, N. E., Berbaum, K. S., Lilienthal, M. G. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 3, (3), 203-220 (1993).
- Singh, N. B., Baumann, C. R., Taylor, W. R. Can Gait Signatures Provide Quantitative Measures for Aiding Clinical Decision-Making? A Systematic Meta-Analysis of Gait Variability Behavior in Patients with Parkinson's Disease. Front. Hum. Neurosci. 10, 319 (2016).
- Hernandez, F., et al. Six degree-of-freedom measurements of human mild traumatic brain injury. Ann. Biomed. Eng. 43, (8), 1918-1934 (2015).
- Gamito, P., et al. Cognitive training on stroke patients via virtual reality-based serious games. Disabil. Rehabil. 39, (4), 385-388 (2017).