The goal was to design, build, and pilot a novel virtual reality task to detect and characterize unilateral spatial neglect, a syndrome affecting 23-46% of acute stroke survivors, expanding the role of virtual reality in the study and management of neurologic disease.
Unilateral spatial neglect (USN) is a syndrome characterized by inattention to or inaction in one side of space and affects 23-46% of acute stroke survivors. The diagnosis and characterization of these symptoms in individual patients can be challenging and often requires skilled clinical staff. Virtual reality (VR) presents an opportunity to develop novel assessment tools for patients with USN.
We aimed to design and build a VR tool to detect and characterize subtle USN symptoms, and to test the tool on subjects treated with inhibitory repetitive transcranial magnetic stimulation (TMS) of cortical regions associated with USN.
We created three experimental conditions by applying TMS to two distinct regions of cortex associated with visuospatial processing, the superior temporal gyrus (STG) and the supramarginal gyrus (SMG), and by applying sham TMS as a control. We then placed subjects in a virtual reality environment in which they were asked to identify flowers with lateral asymmetries among flowers distributed across bushes in both hemispaces, with dynamic difficulty adjustment based on each subject's performance.
We found significant differences in average head yaw between subjects stimulated at the STG and those stimulated at the SMG and marginally significant effects in the average visual axis.
VR technology is becoming more accessible, affordable, and robust, presenting an exciting opportunity to create useful and novel game-like tools. In conjunction with TMS, these tools could be used to study specific, isolated, artificial neurological deficits in healthy subjects, informing the creation of VR-based diagnostic tools for patients with deficits due to acquired brain injury. This study is the first to our knowledge in which artificially generated USN symptoms have been evaluated with a VR task.
Unilateral spatial neglect (USN) is a syndrome characterized by inattention to or inaction in one side of space that affects 23-46% of acute stroke survivors, most commonly involving injury to the right cerebral hemisphere and resulting in a tendency to ignore the left side of space and/or the survivor's body1,2. Although the majority of patients with USN experience significant recovery in the short term, subtle USN symptoms often persist3. USN can increase patient risk for falls and impede activities of daily living2,4. It has also been shown to negatively impact both motor and global functional outcomes5,6.
Deficits in USN can be conceptualized as existing across multiple dimensions, such as whether a person ignores one side of space with respect to their own body (egocentric) or with respect to an external stimulus (allocentric)7,8,9, or whether a person is unable to direct their attention (attentional) or actions (intentional) toward one side of space10. Patients often exhibit a complex constellation of symptoms that can be characterized along more than one of these dimensions. This variability of USN syndromes is thought to result from varying degrees of injury to specific neuroanatomical structures and neuronal networks, which are complex11. Allocentric neglect has been associated with lesions of the angular gyrus (AG) and superior temporal gyrus (STG), while the posterior parietal cortex (PPC), including the supramarginal gyrus (SMG), has been implicated in egocentric processing12,13,14,15. Attentional neglect is thought to involve lesions in the right inferior parietal lobule (IPL)16, while intentional neglect is thought to be secondary to damage of the right frontal lobe17 or basal ganglia18.
Clinical assessment of USN currently relies on pen-and-paper neuropsychological instruments. These conventional assessment tools may be less sensitive than more technologically sophisticated tools, resulting in the misdiagnosis or under-diagnosis of some patients with USN19. Better characterization of residual deficits could facilitate the delivery of therapy to patients with milder USN and potentially improve their overall recovery, but such characterization would require very sensitive diagnostic tools. USN poses similar challenges in the laboratory setting, where it can be difficult to isolate from the motor and visual impairments that commonly accompany it among stroke patients.
Virtual reality (VR) presents a unique opportunity to develop new tools for the diagnosis and characterization of USN. VR is a multisensory 3D environment presented in the first person with real-time interactions in which individuals are able to perform tasks involving ecologically valid objects20. It is a promising tool for assessing USN; the ability to precisely control what the user sees and hears allows developers to present a wide variety of virtual tasks to the user. In addition, the sophisticated hardware and software packages currently available allow for real-time collection of a wealth of data about the user's actions, including eye, head, and limb movements, far exceeding the metrics offered by traditional diagnostic tests21. These data streams are instantaneously available, opening up the possibility of real-time adjustment of diagnostic tasks based on user performance (e.g., targeting the ideal difficulty level for a given task). This feature can facilitate task adaptation to the wide range of severity seen in USN, which is regarded as a priority in the development of new diagnostic tools for USN22. In addition, immersive VR tasks may impose an increased burden on patients' attentional resources23,24, resulting in increased errors that can facilitate the detection of neglect symptoms; indeed, some VR tasks have been shown to have increased sensitivity when compared to conventional paper-and-pencil measures of USN24,25.
In this study, the goal was to create an assessment tool that requires no expertise in neurology to operate and that can reliably detect and characterize even subtle cases of USN. We built a virtual reality-based, game-like task. We then induced a USN-like syndrome in healthy subjects with transcranial magnetic stimulation (TMS), a noninvasive brain stimulation technique that utilizes electromagnetic pulses emitted from a handheld stimulation coil, which pass through the scalp and skull of the subject and induce electric currents in the subject's brain that stimulate neurons26,27. This technique has been utilized in the study of USN by others13,17,28,29,30, though to our knowledge never in conjunction with a VR-based assessment tool.
Many researchers are already working on diagnostic and therapeutic applications of VR systems. Recent reviews31,32 explored a number of projects aimed at the assessment of USN with VR-based techniques, and a number of other studies with this aim have been published33,34,35,36,37,38,39,40,41. The majority of these studies do not utilize the full complement of VR technology currently available on the consumer market (e.g., a head-mounted display (HMD) and eye-tracking inserts), limiting their datasets to a smaller number of easily quantifiable metrics. In addition, all of these studies were performed on patients with acquired brain injury leading to USN, requiring screening methods to ensure that patients could at least participate in the assessment tasks (e.g., excluding patients with large visual field deficits or cognitive impairment). It is possible that more subtle cognitive, motor, or visual deficits passed under the threshold of these screening methods, possibly confounding the results of these studies. It is also possible that such screening biased the samples of participants in these studies toward a particular subtype of USN.
To avoid the screening biases of prior studies, we recruited healthy subjects and artificially simulated USN symptoms with a standard repetitive TMS (rTMS) protocol that is well described in a recent manuscript15, with the goal of inducing allocentric USN-like symptoms by targeting the STG and egocentric USN-like symptoms by targeting the SMG. We designed the task to actively adjust its difficulty from trial to trial and to differentiate between different subtypes of USN, specifically allocentric vs. egocentric symptoms. We also used standard paper-and-pencil assessments of USN to formally demonstrate that the deficits we induced with rTMS are USN-like. We believe the method will be useful to other researchers who want to test novel VR tools for the assessment and rehabilitation of USN.
This study was approved by the local Institutional Review Board and meets all criteria set forth by Good Clinical Practice Guidelines. All participants provided informed consent before any study procedures began. Each participant was scheduled for three separate sessions (outlined in Table 1). The elements of the experiment are described in stepwise fashion below. Session order was randomized.
Session A | Pre-rTMS VR task (15 min) | Resting motor threshold* (60 min) | rTMS at STG or SMG (20 min) | Post-rTMS VR behavioral task (15 min) |
Session B | Pre-rTMS VR task (15 min) | Resting motor threshold* (60 min) | rTMS at vertex (20 min) | Post-rTMS VR behavioral task (15 min) |
Session C | Pre-rTMS P&P behavioral task: Bell's test, Ota circle cancellation, star cancellation, line bisection (10 min) | Resting motor threshold* (60 min) | rTMS at STG or SMG (20 min) | Post-rTMS P&P behavioral task: Bell's test, Ota circle cancellation, star cancellation, line bisection (10 min) |
*Resting motor threshold criterion: 5/10 pulses elicit an MEP or a visible finger twitch (determined during the first session only). rTMS parameters: 110% of RMT at 1 Hz for 20 min (1,200 pulses total).
Table 1. Structure for each study session. Session order was randomized. Estimated time for each item is given in parentheses. MEP = motor evoked potential; rTMS = repetitive transcranial magnetic stimulation; P&P = paper-and-pencil stroke diagnostic tests; RMT = resting motor threshold.
1. Paper & pencil behavioral tasks
2. TMS procedures
3. VR behavioral task
Data were collected from healthy individuals using the protocol outlined above to demonstrate how the variables extracted from the virtual reality task can be analyzed to detect subtle differences between groups.
In this study, 7 individuals (2 male) with an average age of 25.6 years and an average of 16.8 years of education each underwent three separate sessions of TMS. These subjects were divided into two groups: four participants received repetitive TMS at the supramarginal gyrus (SMG), while the other three received rTMS at the superior temporal gyrus (STG). All participants received sham TMS during a separate session, which was used as a covariate in analyses to account for individual variability in response to TMS. During each session, participants performed the virtual reality task before and after TMS to examine changes in performance.
First, the average head angle (Figure 8) was examined to determine whether the virtual reality task was sensitive enough to identify a difference between the SMG and STG groups. Head angle change scores were calculated by subtracting pre-TMS scores from post-TMS scores. An ANCOVA was run to determine whether the groups differed in head angle following TMS, with sham TMS head angle change scores entered as a covariate to account for individual differences. Keeping in mind that the analyses were conducted on a small pilot sample, we found a significant difference in head angle change scores between the two groups, F(1,4) = 10.25, p = 0.03, with the SMG group showing an average change score directed more towards the right side of space than the STG group (Figure 9).
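For researchers adapting this analysis, a minimal sketch of one way to compute the change scores and run the ANCOVA is shown below in Python (pandas and statsmodels). The file name, column names, and condition labels (head_angle_summary.csv, pre_yaw, post_yaw, "active", "sham") are hypothetical placeholders, not the study's actual data files or analysis software.

```python
# Sketch: head-angle change scores and an ANCOVA with the sham change score
# as covariate. All file/column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("head_angle_summary.csv")  # one row per subject per condition

# Change score: post-TMS minus pre-TMS mean head angle (positive = rightward shift)
df["change"] = df["post_yaw"] - df["pre_yaw"]

# One column per condition ("active" = STG or SMG stimulation, "sham" = sham session)
wide = df.pivot(index="subject", columns="condition", values="change").reset_index()
wide = wide.merge(df[["subject", "group"]].drop_duplicates(), on="subject")

# ANCOVA: does group (SMG vs. STG) predict the active-TMS change score,
# controlling for each subject's sham change score?
model = smf.ols("active ~ C(group) + sham", data=wide).fit()
print(model.summary())
```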
A similar pattern was found using the line bisection test, in which the SMG group placed their bisection marks significantly more towards the right in the post-TMS administration compared to pre-TMS, t(4) = 2.78, p = 0.04; no such shift was found in the STG group, t(3) = 3.18, p = 0.56. Although head angle did not change significantly from pre- to post-TMS within either the SMG or STG group in the virtual reality task, the SMG group's average head angle change score was directed significantly more to the right than that of the STG group, reflecting a similar pattern. This finding from the virtual reality task is consistent with the results of the traditional paper-and-pencil task: both suggest that the SMG group may have had a subtle neglect and looked more towards the right compared to the STG group. Data gathered from the virtual reality task can also be visualized for individual participants to examine performance before and after TMS stimulation, as can be seen in Figure 9.
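The corresponding pre- vs. post-TMS comparison for a paper-and-pencil measure can be run as a paired t-test. The sketch below uses SciPy; the file and column names (line_bisection.csv, pre_mm_right_of_center, post_mm_right_of_center) are hypothetical.

```python
# Sketch: paired t-test of line-bisection deviation before vs. after rTMS.
# Deviations are assumed to be logged in millimeters rightward of true center.
import pandas as pd
from scipy import stats

lb = pd.read_csv("line_bisection.csv")       # hypothetical file: one row per subject
smg = lb[lb["group"] == "SMG"]

t_stat, p_val = stats.ttest_rel(smg["post_mm_right_of_center"],
                                smg["pre_mm_right_of_center"])
print(f"SMG group: t = {t_stat:.2f}, p = {p_val:.3f}")
```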
Next, flowers were separated by which side of the flower contained the defective petal (i.e., right petal vs. left petal, see Figure 10) to specifically assess for signs of allocentric neglect at the level of individual targets. While there was no difference in head angle change scores between the two groups for flowers with a smaller petal on the left side, F(1,4) = 0.09, p = 0.78, there was a significant difference in head angle change scores between the two groups for flowers with a smaller petal on the right side, F(1,4) = 9.52, p = 0.04. Specifically, participants in the SMG group tended to look further to the right (higher flower-to-head angle, see Figure 11) when searching for the smaller petal on the right side of the flower. The angle of the subject's head with respect to the bush (bush angle, see Figure 12) is also available for analysis, allowing for the detection of allocentric neglect with respect to the bush. These analyses demonstrate how variables can be made more specific to capture subtle, specific aspects of neglect.
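The two angle measures referenced here (flower-to-head angle, Figure 11; bush angle, Figure 12) can be computed from logged 3D positions and the head's forward vector. The sketch below is one possible implementation in Python with NumPy; it assumes a Unity-style left-handed, y-up coordinate frame and a sign convention in which positive angles fall to the right, neither of which is specified in the protocol.

```python
# Sketch: signed horizontal (yaw-plane) angles used as allocentric measures.
# Coordinate frame and sign convention are assumptions (Unity-style, y-up,
# positive = rightward of the reference direction).
import numpy as np

def signed_yaw(reference_dir, target_dir):
    """Signed angle in degrees from reference_dir to target_dir in the ground plane."""
    a = np.array([reference_dir[0], reference_dir[2]], dtype=float)  # x-z components
    b = np.array([target_dir[0], target_dir[2]], dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    angle = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
    cross = a[0] * b[1] - a[1] * b[0]
    return angle if cross < 0 else -angle    # negative cross => target lies to the right

def flower_to_head_angle(head_pos, head_forward, flower_pos):
    """Angle between the head's anterior axis and the flower, measured at the head."""
    return signed_yaw(head_forward, np.asarray(flower_pos) - np.asarray(head_pos))

def bush_angle(head_pos, bush_center, flower_pos):
    """Angle subtended at the head by the bush center and the picked flower."""
    return signed_yaw(np.asarray(bush_center) - np.asarray(head_pos),
                      np.asarray(flower_pos) - np.asarray(head_pos))
```

Once these per-event angles are computed, pre/post change scores can be derived and compared between groups in the same way as for head angle.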
There are a number of other ways the data may be analyzed. We examined the average number of seconds that participants looked at each flower to determine whether one group had more difficulty identifying defective flowers (as reflected by more seconds spent looking at each flower). In this example, data were extracted from flowers whose defective petal was 95% the size of the other petals, as this scale was hypothesized to be the most sensitive. A mixed ANCOVA was run to compare group (SMG vs. STG) and flower visual field (right vs. left). Pre- to post-TMS change scores were calculated and used as the outcome variable to examine whether either group showed an increase in time spent looking at flowers following TMS. The sham TMS change scores for both left and right flowers were once again used as covariates to account for individual variability. While there was no significant difference between groups, F(1,3) = 0.12, p = 0.76, there was a marginally significant difference by flower visual field, F(1,3) = 5.62, p = 0.098 (Figure 13). This effect did not reach statistical significance, and more subjects should be assessed in future work. Nevertheless, these data serve as an example of how analyses can be restricted to specific flower types and visual fields within the virtual reality environment. As these analyses demonstrate, comparing participants' performance can provide researchers with a sensitive and dynamic way to measure the effects of TMS, or of neglect more generally, depending on the examiner's specific research question.
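The looking-time measure itself can be derived from a frame-level log of which flower the participant is currently fixating or facing. The sketch below shows one way to do this in Python with pandas; the file name and columns (frame_log.csv, time_s, fixated_flower_id, petal_scale, visual_field) are hypothetical, as the task's actual logging format is not reproduced here.

```python
# Sketch: mean seconds spent looking at each flower, restricted to the subtlest
# targets (defective petal at 95% scale) and split by the flower's visual field.
# All file/column names are illustrative assumptions.
import pandas as pd

frames = pd.read_csv("frame_log.csv").sort_values("time_s")
frames["dt"] = frames["time_s"].diff().fillna(0)     # approx. seconds since previous frame

subtle = frames[(frames["petal_scale"] == 0.95) & frames["fixated_flower_id"].notna()]

# Total looking time per flower, then average per flower within each visual field
dwell = (subtle.groupby(["visual_field", "fixated_flower_id"])["dt"]
               .sum()
               .groupby(level="visual_field")
               .mean())
print(dwell)   # mean seconds per flower, left vs. right visual field
```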
Figure 1: Line bisection task stimulus sheet.
Figure 2: Bell's test stimulus sheet.
Figure 3: Star cancellation test stimulus sheet.
Figure 4: Ota circle cancellation stimulus sheet.
Figure 5: Repetitive TMS stimulation; neuronavigational software (left), magnetic stimulation unit (center), and air-cooled coil in position over author CH (right).
Figure 6: Virtual forest environment seen by the subject during the VR task.
Figure 7: Layout of three curved box hedges with target and decoy flowers distributed across them.
Figure 8: Head angle – the angle between the anterior axis of the head and the torso.
Figure 9: Two analyses using head angle during task performance.
(Left) SMG vs. STG group head angle change scores. On this scale, a score of 0 indicates that participants looked at the center of each flower, positive scores indicate that they looked towards the right, and negative scores indicate that they looked towards the left. The SMG group had positive scores, indicating that they looked more to the right on average following stimulation, whereas the STG group had negative scores, indicating that they looked more to the left following stimulation. The SMG and STG groups had significantly different head angle change scores. (Right) Mean head angle plotted for each participant pre-TMS and post-TMS. The STG group did not show strong differences before and after TMS stimulation, unlike the SMG participants, who appeared to look more towards the right visual field following stimulation (as represented by positive numbers).
Figure 10: Asymmetric target flowers, with smaller petals on the left (left) and smaller petals on the right (right).
Figure 11: Flower-to-head angle – the angle subtended by the head's anterior axis and the flower, measured from the head at the instant the flower was picked/identified.
Figure 12: Bush angle – the angle subtended by the flower and the center of the flower's bush, measured from the head at the instant the flower was picked/identified.
Figure 13: Mean change score for seconds spent looking at each flower before and after TMS. Negative scores indicate that participants spent less time looking at flowers in the post-TMS administration compared to the pre-TMS administration, whereas positive numbers indicate more time spent looking at flowers post-TMS. Data are separated by whether flowers were located in the left vs. right visual field within the virtual environment. Data were also separated by group (SMG vs. STG). Flowers were restricted to those with a defective petal at a scale of 0.95. Though not statistically significant, there was a marginal effect of flower visual field. Qualitatively, there appears to be greater variability for flowers in the left visual field compared to the right.
We successfully induced and measured USN symptoms with TMS and VR, respectively. While we did not find significant results when comparing to sham trials, we were able to compare multiple metrics of egocentric neglect (average head angle, time spent looking at flowers in either hemispace) and allocentric neglect (performance in selecting flowers with asymmetric petals on the left vs. the right side) between the different experimental groups, and found significant differences in average head angle between subjects stimulated at the STG and those stimulated at the SMG, as well as marginally significant effects in the average visual axis. Of interest, there is still debate concerning the relative contributions of temporal (STG) and parietal (PPC) regions to USN-relevant spatial processing12,43, and the increased rightward head angle we detected in the SMG-stimulated group may provide some support for the implication of the PPC in the egocentric variety of USN.
There were multiple critical steps in this protocol. The method is limited by the subtle clinical effects achieved with rTMS, so proper stimulation parameters and cortical targeting are critical: stimulation intensity should always be based on the RMT, and TMS coil targeting should always be precisely determined with high-resolution MRI images and appropriate targeting software such as Brainsight. The method is also limited by the relatively short duration of the inhibitory effect created by rTMS (~20 minutes, or roughly the duration of stimulation26), so a rapid transition from rTMS back to the VR or paper-and-pencil tasks is of paramount importance for detecting this effect. Ensuring that the VR equipment is set up and the software is properly calibrated during the pre-TMS VR sessions helps maximize the proportion of post-stimulation time spent collecting data.
As enumerated in the introduction, a number of groups have developed novel VR-based tools for the assessment of USN. Many of these systems also utilize the distinct measurement advantages of computerized tasks, and some groups have attempted to differentiate the various subtypes of USN, including extrapersonal vs. peripersonal and egocentric vs. allocentric symptoms37,40. We believe that the method adds two novel contributions to this existing work. First, we provide a broader array of datasets (head position, eye tracking, etc.) that can be analyzed to detect and characterize even subtle cases of USN. Second, we induced USN symptoms in healthy volunteers using TMS, helping ensure that the VR-based diagnostic tool was isolating induced USN symptoms and avoiding the possible confounding effects of visual, motor, and cognitive comorbidities seen in patients with acquired brain injury. In addition, the task contrasts with a trend in recent studies toward navigation tasks. We contend that a task requiring interaction with a number of objects distributed across both the left and right hemispaces places greater demands on the user and may increase the sensitivity of the VR task as a diagnostic tool. This format also allows for a more game-like task with multiple trials, which in turn allows the task's difficulty level to be titrated from round to round. This type of titration helps the task avoid ceiling and floor effects (i.e., the task being too hard for those with significant deficits or too easy for those with subtle deficits).
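To make the round-to-round titration concrete, the sketch below illustrates one simple way such an adjustment could be implemented: a one-up/one-down staircase on the relative size of the defective petal. The function name, step size, and bounds are hypothetical; the actual adjustment rule used by the task is not reproduced here.

```python
# Sketch: a one-up/one-down difficulty staircase for the flower task.
# petal_scale is the defective petal's size relative to the other petals;
# values closer to 1.0 make the asymmetry subtler and the trial harder.
# Step size and bounds are illustrative assumptions.

def update_petal_scale(petal_scale, was_correct, step=0.01,
                       easiest=0.80, hardest=0.99):
    if was_correct:
        return min(petal_scale + step, hardest)   # make the next trial harder
    return max(petal_scale - step, easiest)       # make the next trial easier

# Example: a correct response at scale 0.95 pushes the next target to 0.96,
# while an error drops it back to 0.94.
print(update_petal_scale(0.95, was_correct=True))
print(update_petal_scale(0.95, was_correct=False))
```

A staircase of this kind converges on the asymmetry level each participant can just barely detect, which is one way to avoid the ceiling and floor effects described above.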
There are many possible future applications of the method. With regard to the study of USN, we believe that the addition of eye-tracking data will enable VR tasks to differentiate between attentional and intentional symptoms by separating data measuring asymmetry of search pattern from data measuring asymmetry of motor action. Furthermore, TMS can be used to isolate specific neurologic deficits beyond USN, creating a means by which investigators can design and validate a wide variety of novel VR tools to help diagnose and characterize these deficits in patients who suffer from acquired brain injury. Although the technique involves healthy participants and artificial neurologic deficits in an effort to reliably isolate and characterize USN specifically, we believe that VR tools that are validated by the method can then be applied in populations of patients with mixed neurologic deficits (motor, visual, etc.) by way of user interface innovations such as EEG- or EMG-based brain-computer interfaces44,45. In addition, VR-based tasks like the one we present here can also be modified to serve as cognitive rehabilitation tools, a growing area of research and development31,46.
We faced a number of issues in testing. The eye tracking became uncalibrated with small shifts in the HMD's position, and the software sometimes failed. The application needed further development and suffered from correctable issues such as the subject's starting position and the range of flower placement (some flowers were placed outside the subject's field of view, invalidating some trials). The sample size was small. Nevertheless, we were still able to detect the subtle perturbations of two neural networks associated with USN with the novel VR tool. While this ambitious experiment yielded marginal results, we believe many of the challenges it faced will be ameliorated as the technology continues to improve. We argue that the promise of these results, in combination with other encouraging trends within the field, supports the idea that VR systems are an excellent substrate for the development of novel diagnostic tools for USN.
The authors have nothing to disclose.
This work was supported by the University Research Fund (URF) from the University of Pennsylvania, and the American Heart Association's Student Scholarships in Cerebrovascular Disease & Stroke. Special thanks to the researchers, clinicians and staff of the Laboratory for Cognition and Neural Stimulation for their ongoing support.
Name | Company | Catalog Number | Comments |
AirFilm Coil (AFC) Rapid Version | Magstim | N/A | Air-cooled TMS coil |
Alienware 17 R4 Laptop | Dell | N/A | NVIDIA GeForce GTX 1060 (full specs at https://topics-cdn.dell.com/pdf/alienware-17-laptop_users-guide_en-us.pdf) |
BrainSight 2.0 TMS Neuronavigation Software | Rogue Research Inc | N/A | TMS neural targeting software |
CED 1902 Isolated pre-amplifier | Cambridge Electronic Design Limited | N/A | EMG pre-amplifier |
CED Micro 401 mkII | Cambridge Electronic Design Limited | N/A | Multi-channel waveform data acquisition unit |
CED Signal 5 | Cambridge Electronic Design Limited | N/A | Sweep-based data acquisition and analysis software. Used to measure TMS evoked motor responses. |
HTC Vive Binocular Add-on | Pupil Labs | N/A | HTC Vive, Vive Pro, or Vive Cosmos eye tracking add-on with 2 x 200Hz eye cameras. |
Magstim D70 Remote Coil | Magstim | N/A | Hand-held TMS coil |
Magstim Super Rapid 2 plus 1 | Magstim | N/A | Transcranial Magnetic Stimulation Unit |
Unity 2018 | Unity | N/A | Cross-platform VR game engine |
Vive Pro | HTC Vive | N/A | VR hardware system with external motion sensors; 1440×1600 pixels per eye, 90 Hz refresh rate, 110° FoV |