Virtual Reality Tools for Assessing Unilateral Spatial Neglect: A Novel Opportunity for Data Collection

Published: March 10, 2021
doi: 10.3791/61951

Summary

The goal was to design, build, and pilot a novel virtual reality task to detect and characterize unilateral spatial neglect, a syndrome affecting 23-46% of acute stroke survivors, expanding the role of virtual reality in the study and management of neurologic disease.

Abstract

Unilateral spatial neglect (USN) is a syndrome characterized by inattention to or inaction in one side of space and affects 23-46% of acute stroke survivors. The diagnosis and characterization of these symptoms in individual patients can be challenging and often requires skilled clinical staff. Virtual reality (VR) presents an opportunity to develop novel assessment tools for patients with USN.

We aimed to design and build a VR tool to detect and characterize subtle USN symptoms, and to test the tool on subjects treated with inhibitory repetitive transcranial magnetic stimulation (TMS) of cortical regions associated with USN.

We created three experimental conditions by applying TMS to two distinct regions of cortex associated with visuospatial processing (the superior temporal gyrus, STG, and the supramarginal gyrus, SMG) and by applying sham TMS as a control. We then placed subjects in a virtual reality environment in which they were asked to identify flowers with lateral asymmetries among flowers distributed across bushes in both hemispaces, with dynamic difficulty adjustment based on each subject's performance.

We found significant differences in average head yaw between subjects stimulated at the STG and those stimulated at the SMG and marginally significant effects in the average visual axis.

VR technology is becoming more accessible, affordable, and robust, presenting an exciting opportunity to create useful and novel game-like tools. In conjunction with TMS, these tools could be used to study specific, isolated, artificial neurological deficits in healthy subjects, informing the creation of VR-based diagnostic tools for patients with deficits due to acquired brain injury. This study is the first to our knowledge in which artificially generated USN symptoms have been evaluated with a VR task.

Introduction

Unilateral spatial neglect (USN) is a syndrome characterized by inattention to or inaction in one side of space that affects 23-46% of acute stroke survivors, most commonly involving injury to the right cerebral hemisphere and resulting in a tendency to ignore the left side of space and/or the survivor's body1,2. Although the majority of patients with USN experience significant recovery in the short term, subtle USN symptoms often persist3. USN can increase patient risk for falls and impede activities of daily living2,4. It has also been shown to negatively impact both motor and global functional outcomes5,6.

Deficits in USN can be conceptualized as existing across multiple dimensions, such as whether a person ignores one side of space with respect to their own body (egocentric) or with respect to an external stimulus (allocentric)7,8,9, or whether a person is unable to direct their attention (attentional) or actions (intentional) toward one side of space10. Patients often exhibit a complex constellation of symptoms that can be characterized along more than one of these dimensions. This variability of USN syndromes is thought to result from varying degrees of injury to specific neuroanatomical structures and complex neuronal networks11. Allocentric neglect has been associated with lesions of the angular gyrus (AG) and superior temporal gyrus (STG), while the posterior parietal cortex (PPC), including the supramarginal gyrus (SMG), has been implicated in egocentric processing12,13,14,15. Attentional neglect is thought to involve lesions in the right inferior parietal lobule (IPL)16, while intentional neglect is thought to be secondary to damage of the right frontal lobe17 or basal ganglia18.

Clinical assessment of USN currently relies on pen-and-paper neuropsychological instruments. These conventional assessment tools may be less sensitive than more technologically sophisticated tools, resulting in some patients with USN being misdiagnosed or under-diagnosed19. Better characterization of residual deficits could facilitate the delivery of therapy to patients with milder USN and potentially improve their overall recovery, but such characterization would require very sensitive diagnostic tools. USN poses similar challenges in the laboratory setting, where it can be difficult to isolate from the motor and visual impairments that commonly accompany it among stroke patients.

Virtual reality (VR) presents a unique opportunity to develop new tools for the diagnosis and characterization of USN. VR is a multisensory 3D environment presented in the first person with real-time interactions in which individuals are able to perform tasks involving ecologically valid objects20. It is a promising tool for assessing USN; the ability to precisely control what the user sees and hears allows developers to present a wide variety of virtual tasks to the user. In addition, the sophisticated hardware and software packages currently available allow for real-time collection of a wealth of data about the user's actions, including eye, head, and limb movements, far exceeding the metrics offered by traditional diagnostic tests21. These data streams are instantaneously available, opening up the possibility for real-time adjustment of diagnostic tasks based on user performance (e.g., targeting the ideal difficulty level for a given task). This feature can facilitate task adaptation to the wide range of severity seen in USN, which is regarded as a priority in the development of new diagnostic tools for USN22. In addition, immersive VR tasks may impose an increased burden on the patients' attentional resources23,24, resulting in increased errors which can facilitate the detection of neglect symptoms; indeed, some VR tasks have been shown to have increased sensitivity when compared to conventional paper-and-pencil measures of USN24,25.
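
As an illustration of the kind of real-time difficulty adjustment described above, the following minimal Python sketch titrates a petal-asymmetry scale (the ratio of the shortened petal to a normal petal) from trial to trial based on the subject's hit rate. The function name, thresholds, and step size are illustrative assumptions, not the parameters used in the task described in this article.

```python
# Hypothetical sketch of trial-to-trial difficulty titration, not the authors' implementation.
# The idea: shrink the petal asymmetry (making targets harder to spot) after accurate trials
# and enlarge it after inaccurate ones, keeping each subject near their performance threshold.

def next_asymmetry_scale(current_scale: float, hit_rate: float,
                         step: float = 0.01, floor: float = 0.80, ceiling: float = 0.99) -> float:
    """Return the petal-size ratio (target petal / normal petal) for the next trial.

    A ratio closer to 1.0 means a more symmetric, harder-to-detect target flower.
    """
    if hit_rate >= 0.9:          # subject found targets easily: make the next trial harder
        current_scale = min(ceiling, current_scale + step)
    elif hit_rate <= 0.6:        # subject missed many targets: make the next trial easier
        current_scale = max(floor, current_scale - step)
    return current_scale         # otherwise keep the difficulty unchanged

# Example: a subject who picked 8 of 10 targets keeps the same 0.95 scale on the next trial.
print(next_asymmetry_scale(0.95, hit_rate=0.8))  # -> 0.95
```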

In this study, the goal was to create an assessment tool that requires no expertise in neurology to operate and that can reliably detect and characterize even subtle cases of USN. We built a virtual reality-based, game-like task. We then induced a USN-like syndrome in healthy subjects with transcranial magnetic stimulation (TMS), a noninvasive brain stimulation technique that utilizes electromagnetic pulses emitted from a handheld stimulation coil, which pass through the scalp and skull of the subject and induce electric currents in the subject's brain that stimulate neurons26,27. This technique has been utilized in the study of USN by others13,17,28,29,30, though to our knowledge never in conjunction with a VR-based assessment tool.

Many researchers are already working on diagnostic and therapeutic applications of VR systems. Recent reviews31,32 explored a number of projects aimed at the assessment of USN with VR-based techniques, and a number of other studies with this aim have been published33,34,35,36,37,38,39,40,41. The majority of these studies do not utilize the full complement of VR technology that is currently available to the consumer market (e.g., a head-mounted display (HMD) and eye-tracking inserts), limiting their datasets to a smaller number of easily-quantifiable metrics. In addition, all of these studies were performed on patients with acquired brain injury leading to USN, requiring screening methods to assure that patients could at least participate in the assessment tasks (e.g., excluding patients with large visual field deficits or cognitive impairment). It is possible that more subtle cognitive, motor, or visual deficits passed under the threshold of these screening methods, possibly confounding the results of these studies. It is also possible that such screening biased the samples of participants in these studies toward a particular subtype of USN.

To avoid the screening biases of prior studies, we recruited healthy subjects and artificially simulated USN symptoms with a standard repetitive TMS (rTMS) protocol that is well described in a recent manuscript15, with the goal of inducing allocentric USN-like symptoms by targeting the STG and egocentric USN-like symptoms by targeting the SMG. We designed the task to actively adjust its difficulty from trial to trial and to differentiate between different subtypes of USN, specifically allocentric vs. egocentric symptoms. We also used standard paper-and-pencil assessments of USN to formally demonstrate that the deficits we induced with rTMS are USN-like. We believe the method will be useful to other researchers who want to test novel VR tools for the assessment and rehabilitation of USN.

Protocol

This study was approved by the local Institutional Review Board and meets all criteria set forth by Good Clinical Practice Guidelines. All participants provided informed consent before any study procedures began. Study participants were expected to participate in three separate sessions (outlined in Table 1). The elements of the experiment are described in stepwise fashion below. Session order was randomized.

Session A: Pre-rTMS VR task (15 min); resting motor threshold* (60 min); rTMS at STG or SMG (20 min); post-rTMS VR behavioral task (15 min).
Session B: Pre-rTMS VR task (15 min); resting motor threshold* (60 min); sham rTMS at vertex (20 min); post-rTMS VR behavioral task (15 min).
Session C: Pre-rTMS paper-and-pencil behavioral tasks (10 min); resting motor threshold* (60 min); rTMS at STG or SMG (20 min); post-rTMS paper-and-pencil behavioral tasks (10 min).

Paper-and-pencil behavioral tasks: Bell's test, Ota's circle cancellation, star cancellation, and line bisection task.
*Resting motor threshold (determined during the first session only): the stimulation intensity at which 5/10 pulses elicit an MEP or finger twitch.
rTMS parameters: 110% of RMT for 20 min at 1 Hz (1200 pulses total).

Table 1. Structure for each study session. Session order was randomized. Estimated time for each item is given in parentheses. MEP=motor evoked potential; rTMS=Repetitive Transcranial Magnetic Stimulation; RMT=Resting Motor Threshold

1. Paper & pencil behavioral tasks

  1. Have the subject complete the line bisection task (LBT).
    1. Have the subject sit at a table directly across from the tester. Provide the subject with a writing utensil. Provide the subject with the stimulus sheet (Figure 1), ensuring it is placed directly in front of the subject.
      NOTE: Although not performed in this experiment, it would be ideal to present each line to be bisected individually on separate sheets of paper to avoid biasing the subject with additional context (see Ricci and Chatterjee, 200142).
    2. Instruct the subject to bisect (divide into halves) each line printed on the stimulus sheet and get as close to the middle as possible.
    3. Tell the subject to keep their head and shoulders centered as best as possible, to complete the task as quickly and accurately as possible, and to notify the tester when they are finished. Monitor the subject to ensure they are not leaning or tilting their head excessively.
    4. Collect the sheet from the subject when the subject says they are done.
  2. Have the subject complete the Bell's Test.
    1. Provide the subject with the Bell's test stimuli sheet (Figure 2).
    2. Instruct the subject to circle or cross out all of the bells on the stimulus sheet, to do so as quickly and accurately as possible, to keep their head and shoulders as centered as possible, and to notify the tester when they are finished.
    3. Monitor the subject to ensure they are not leaning or tilting their head excessively. When the subject says they are finished, ask the subject if they are sure, and allow them to double check their work.
    4. Collect the sheet from the subject when the subject says they are done a second time.
  3. Have the subject complete the star cancellation task.
    1. Present the subject with the stimulus sheet (Figure 3), ensuring it is directly in front of them.
    2. Instruct the subject to circle or cross out all of the stars on the stimulus sheet, to do so as quickly and accurately as possible, to keep their head and shoulders as centered as possible, and to notify the tester when they are finished.
    3. Monitor the subject to ensure they are not leaning or tilting their head excessively.
    4. Collect the sheet from the subject when the subject says they are done.
  4. Have the subject complete the Ota's circle cancellation task.
    1. Provide the subject with the Ota's circle cancellation stimulus sheet (Figure 4), ensuring it is placed directly in front of the subject.
    2. Instruct the subject to cross out or circle all of the open/incomplete circles, to do so as quickly and accurately as possible, to keep their shoulders as centered as possible, and to notify the tester when they are finished.
    3. Monitor the subject to ensure they are not leaning or tilting their head excessively.
    4. Collect the sheet from the subject when the subject says they are done.
    5. Repeat this task (steps 1.4.1 through 1.4.4) with another copy of the stimulus sheet, but this time rotate the stimulus sheet 180 degrees from its original orientation. (A scoring sketch for these paper-and-pencil measures is given after this section.)
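
The scoring of these paper-and-pencil measures is not described in the steps above; the following Python sketch shows one common way such data can be reduced to lateralized scores (a signed bisection deviation and a left-right difference in cancellation hit rate). The function names, sign conventions, and example counts are illustrative assumptions rather than the published scoring rules for each instrument.

```python
# Hedged scoring sketch for the paper-and-pencil measures above; metric names and
# conventions are illustrative, not the official scoring rules for each instrument.

def bisection_deviation_mm(mark_x_mm: float, line_start_mm: float, line_end_mm: float) -> float:
    """Signed deviation of the subject's mark from the true midpoint (positive = rightward)."""
    midpoint = (line_start_mm + line_end_mm) / 2.0
    return mark_x_mm - midpoint

def cancellation_asymmetry(left_hits: int, left_targets: int,
                           right_hits: int, right_targets: int) -> float:
    """Difference in hit rate between the right and left halves of the sheet.

    Positive values mean relatively more omissions on the left, the pattern expected
    in left-sided neglect.
    """
    return right_hits / right_targets - left_hits / left_targets

# Example (illustrative counts): 14 of 15 targets found on the right vs. 10 of 15 on the left.
print(cancellation_asymmetry(10, 15, 14, 15))  # -> ~0.27, a rightward bias
```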

2. TMS procedures

  1. Create a model for neuronavigation prior to the first session.
    1. Obtain the subject's 3T T1 MRI scan in a NIFTI or dicom file type.
    2. Upload that MRI scan into the neuronavigational software to create a 3D representation of the subject's brain.
      1. Select New Empty Project within the software. Drag the subject's MRI scan onto the field labeled "File:".
      2. Go to the Reconstructions tab.
      3. Select New Skin and, on the next screen, drag the green boundary lines to encompass the entire image of the brain. Select Compute Skin. Adjust the Skin/Air Threshold as needed to obtain an optimal reconstruction.
      4. Go back to the Reconstructions tab, select New Full Brain Curvilinear, and drag the green boundary lines to encompass the entire image of the brain. Set slice spacing to 1 mm and set end depth to 18 mm. Select Compute Curvilinear.
      5. Go to the Landmarks tab and select Configure Landmarks. Select New to create a landmark on the reconstruction. Place landmarks on the tip of the nose, bridge of the nose, left tragus, and right tragus.
      6. Go to the Targets tab and select Configure Targets. Select the Curvilinear Brain & Targets view. Using the inspector, peel to a depth of 5-7 mm.
      7. Follow the guidelines of Shah-Basak et al. (2018)15, Neggers et al. (2006)9, and Oliveri and Vallar (2009)30 to locate the superior temporal gyrus and the supramarginal gyrus, and place a marker at each site.
      8. Place a marker where the two central sulci meet along the median longitudinal fissure for sham stimulation at the vertex.
  2. During the first session, find the subject's Resting Motor Threshold (may be completed before or after behavioral task).
    1. Have the subject seated in front of an optical tracking camera and place a tracker on the subject using a headband or glasses.
    2. Attach three disposable electrodes on the subject's right hand and wrist.
      1. Attach one disk electrode to the subject's first dorsal interosseous (FDI) muscle. Attach a second disk electrode to the second knuckle of the subject's right pointer finger. Attach a ground electrode to the subject's right wrist.
    3. Plug these electrodes into an electrode adaptor, which feeds into the MEP tracking software.
    4. Open the subject's project within the neuronavigational software by selecting New Online Session.
    5. Select the targets to be stimulated in this session (Vertex, SMG, STG).
    6. Go to the Polaris tab and ensure the subject tracker is within view of the camera.
    7. Go to Registration tab.
    8. Using a pointer registered to the neuronavigational software, touch the subject's face in the same locations where the landmarks were placed in step 2.1.2.5.
      1. Click Sample and go to Next Landmark when the pointer is positioned properly on the subject's head for each landmark.
    9. Go to Validation tab.
    10. Using the pointer, touch the subject in various spots on their head and ensure the crosshairs on the screen line up with the spot being pointed to on the subject.
      1. If they do not line up, redo step 2.2.8 and make sure the pointer is as precisely placed on the landmarks as possible.
    11. Go to Perform tab and ensure the Full Brain Curvilinear View is selected so the experimenter can precisely locate brain regions to target.
    12. Set driver to be the TMS coil that will be used.
    13. Plug handheld TMS coil into TMS machine.
    14. Turn on the TMS Machine and set to single pulse. Set stimulation intensity appropriately; in this experiment, 65% of machine output was used as a starting point.
    15. Place the handheld TMS coil on the left side of the subject's head and stimulate within the motor cortex using single pulses of TMS to identify the location that activates the FDI. It may be helpful to have an assistant watch the subject's finger to identify when the FDI muscle twitches due to stimulation.
    16. Adjust the stimulation intensity until stimulation elicits an MEP of at least 50 µV in 5 of 10 pulses; this intensity is the resting motor threshold (rMT). (A sketch of this titration logic is given after this section.)
  3. Stimulation in between tasks
    1. Repeat steps 2.2.1 through 2.2.13, substituting an air-cooled TMS coil for the handheld coil.
    2. Set stimulation parameters to repetitive TMS at a rate of 1 Hz for 20 minutes (1200 pulses total) with an intensity of 110% of rMT in accordance with parameters set by Shah-Basak et al. (2018)15.
    3. Place an air-cooled TMS coil with a built-in cooling system on the subject's head targeting the SMG or STG for active sessions or the Vertex for sham sessions (Figure 5).
    4. Proceed with stimulation.
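
As referenced in step 2.2.16, the rMT search can be thought of as a simple titration loop. The Python sketch below illustrates that logic under assumed parameters (a 2% step size and a 65% starting intensity, matching the starting point used above); `deliver_pulse` is a hypothetical stand-in for the EMG readout and is not part of any TMS or EMG software API.

```python
# Illustrative sketch of the resting-motor-threshold (rMT) search in step 2.2.16; the
# fixed 2% step size is an assumption, not a parameter prescribed by the protocol.

def count_meps(intensity_pct: int, deliver_pulse) -> int:
    """Deliver 10 single pulses at the given % of machine output and count MEPs >= 50 microvolts."""
    return sum(deliver_pulse(intensity_pct) >= 50.0 for _ in range(10))

def find_rmt(deliver_pulse, start_pct: int = 65, step_pct: int = 2) -> int:
    """Lower the intensity while >= 5/10 pulses evoke an MEP; raise it while < 5/10 do.

    Returns the lowest intensity at which at least 5 of 10 pulses still evoke an MEP.
    `deliver_pulse` is a hypothetical callable returning the peak-to-peak MEP in microvolts.
    """
    intensity = start_pct
    while count_meps(intensity, deliver_pulse) >= 5:
        intensity -= step_pct                      # still above threshold: step down
    while count_meps(intensity, deliver_pulse) < 5:
        intensity += step_pct                      # below threshold: step back up
    return intensity
```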

3. VR behavioral task

  1. Install supporting software.
    1. Download and install Pupil core software from the Pupil Labs website.
    2. Download and install Unity 3D 2018.3 from the Unity website.
    3. Download and install OpenVR tool through Unity Asset Store or through Steam.
  2. Set up the VR hardware (e.g., HTC Vive Pro).
    1. Place base stations on opposite sides of the room, ensuring a clear line of sight, and plug them in.
    2. Press the Channel/Mode button on the back of each sensor to cycle through channels until one of them is set to channel "b" and one is set to channel "c". Both status LEDs should be white.
    3. Install Pupil Labs Binocular insert into HTC Vive Pro. Connect the Link Box to the computer (Power, USB-A, and HDMI or Mini DisplayPort).
    4. Connect the headset to the Link Box. Adjust top and side straps on headset. Adjust the lens distance.
  3. Launch SteamVR.
    1. Launch SteamVR by clicking on the VR icon in the top right corner of Steam.
      1. Turn on controllers with the power button.
      2. On SteamVR, click Settings | Pair New Device to pair each controller by following on-screen instructions.
      3. Click Room Setup from the SteamVR menu and follow on-screen instructions.
  4. Launch Pupil Core Software.
  5. Place headset on the seated subject's head and give them both controllers. Ensure the straps are tight but comfortable. Ensure both eyes are visible by visually confirming they are centered in the Pupil Core Software's camera feeds.
  6. Open the VR task in the Unity Editor and hit the Play button.
  7. Run the experiment.
    1. Ask the subject to look straight ahead and click the Tare Camera button on the screen.
    2. Click the Begin Tutorial button and wait for the subject to complete the tutorial. The tutorial consists of audio instruction about the operation of the VR system controller, descriptions and examples of symmetrical (decoy) and asymmetrical (target) flowers, and a 1-minute practice session with a small number of decoy and target flowers. The tutorial lasts 75-100 seconds, and tutorial performance data are not collected.
    3. When subject is finished, click the Calibrate Eye Tracking button.
      1. If the calibration is successful, the subject will automatically begin the task. Otherwise, repeat step 3.7.3.
    4. Begin the first trial by clicking the Next Trial button.
      NOTE: During the VR task, subjects are placed in a virtual forest (Figure 6). Three curved box hedges form a semicircle within reaching distance in front of the subject. Each trial consists of a varying number of flowers, each with 16 petals, distributed among the hedges within a direct line of sight (Figure 7). Subjects are instructed to "pick" (hold the controller over a flower so that it highlights, then depress the trigger button with the index finger) all asymmetrical "target" flowers and to leave alone all symmetrical "decoy" flowers. Each trial ends when the subject has picked all of the asymmetrical target flowers, but it also ends if the subject runs out of time (2-minute time limit) or inadvertently picks all of the symmetrical decoy flowers. In all of these cases, the remaining flowers on the bushes are cleared, and the experimenter is prompted to begin the next trial. (A sketch of this trial-termination logic is given after this section.)
    5. Wait until the subject is no longer actively completing a trial and then repeat step 3.7.4 until at least 12 trials have been completed.
    6. Click the Play button again to end the task.
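
The trial-termination rules referenced in the NOTE under step 3.7.4 can be summarized in a few lines of code. The following Python sketch is a hypothetical illustration of that logic only; the actual task was built in Unity, and the class and field names below are not taken from the authors' project.

```python
# Minimal sketch of the trial-termination rules in the NOTE above; names are hypothetical.
import time

class Trial:
    TIME_LIMIT_S = 120  # two-minute limit per trial

    def __init__(self, n_targets: int, n_decoys: int):
        self.targets_remaining = n_targets
        self.decoys_remaining = n_decoys
        self.start = time.monotonic()

    def pick(self, is_target: bool) -> None:
        """Register a picked flower as either a target or a decoy."""
        if is_target:
            self.targets_remaining -= 1
        else:
            self.decoys_remaining -= 1

    def is_over(self) -> bool:
        # A trial ends when every target is picked, when every decoy has been
        # (inadvertently) picked, or when the two-minute timer runs out.
        return (self.targets_remaining == 0
                or self.decoys_remaining == 0
                or time.monotonic() - self.start > self.TIME_LIMIT_S)
```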

Representative Results

Data were collected from healthy individuals using the protocol outlined above to demonstrate how the different variables that can be extracted from the virtual reality task can be analyzed to detect subtle differences between groups.

In this study, 7 individuals (2 male) with an average age of 25.6 years and an average of 16.8 years of education each underwent three separate sessions of TMS. These subjects were divided into two groups: four participants received repetitive TMS at the supramarginal gyrus (SMG), while the other three participants received repetitive TMS at the superior temporal gyrus (STG). All participants received sham TMS during a separate session, which was used as a covariate in analyses to account for individual variability in response to TMS. During each session, participants were administered the virtual reality task before and after TMS stimulation to examine change in performance.

First, the average head angle (Figure 8) was examined to determine whether the virtual reality task was sensitive enough to identify a difference between the SMG and STG groups. Head angle change scores were calculated by subtracting pre-TMS scores from the post-TMS scores. An ANCOVA was run to determine whether there was a difference between groups in head angle following TMS stimulation. Sham TMS head angle change scores were used as a covariate to account for individual differences. While keeping in mind that the analyses were conducted using a small pilot sample, a significant difference was found in head angle change scores between the two groups, F(1,4) = 10.25, p = 0.03, where the SMG group had an average change score directed more towards the right side of space compared to the STG group (Figure 9).
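
For readers who wish to reproduce this style of analysis, the Python sketch below shows how a between-group ANCOVA on head-angle change scores, with the sham-session change score as a covariate, could be run using statsmodels. The column names are assumptions about how exported data might be organized; the statistics reported above were not generated with this exact code.

```python
# Illustrative ANCOVA sketch: post-minus-pre head-angle change scores compared between
# stimulation groups (SMG vs. STG) with the sham-session change score as a covariate.
# Column names ("group", "active_pre", ...) are assumptions about the data layout.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def head_angle_ancova(df: pd.DataFrame) -> pd.DataFrame:
    """df: one row per subject with columns group ('SMG'/'STG'), active_pre, active_post,
    sham_pre, sham_post (mean head yaw in degrees, rightward positive)."""
    df = df.copy()
    df["active_change"] = df["active_post"] - df["active_pre"]  # change after active rTMS
    df["sham_change"] = df["sham_post"] - df["sham_pre"]        # covariate: change after sham rTMS
    model = smf.ols("active_change ~ C(group) + sham_change", data=df).fit()
    return sm.stats.anova_lm(model, typ=2)  # Type-II ANCOVA table (F and p for the group effect)
```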

A similar pattern was found using the line bisection task, in which the SMG group placed their bisection marks significantly more towards the right in the post-TMS administration compared to pre-TMS, t(4) = 2.78, p = 0.04. No such effect was found in the STG group, t(3) = 3.18, p = 0.56. While neither group showed a significant difference in head angle before and after TMS in the virtual reality task, the finding that the SMG group's average head angle change score was directed significantly more to the right than that of the STG group demonstrates a similar pattern. This result from the virtual reality task is consistent with the results of the traditional paper-and-pencil task, as both suggest that the SMG group may have had a subtle neglect and looked more towards the right compared to the STG group. Data gathered from the virtual reality task can also be visualized at the individual participant level to examine performance before and after TMS stimulation, as shown in Figure 9.

Next, flowers were separated by which side of the flower contained the defective flower petal (i.e., right petal vs. left petal, see Figure 10) to specifically assess for signs of allocentric neglect on an individual target level. While there was no difference in head angle change scores between the two groups for flowers with shorter petals on the left side, F(1,4) = 0.09, p = 0.78, there was a significant difference in head angle change scores between the two groups for flowers with a smaller petal on the right side, F(1,4) = 9.52, p = 0.04. Specifically, participants in the SMG group had a tendency to look further to the right (higher flower-to-head angle, see Figure 11) when searching for the short petal on the right side of the flower. The angle of the subject's head with respect to the bush (bush angle, see Figure 12) is also available for analysis, allowing for the detection of allocentric neglect with respect to the bush. These analyses demonstrate how variables can be made more specific to capture subtle, specific aspects of neglect.
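
The flower-to-head angle in Figure 11 (and, analogously, the bush angle in Figure 12) can be computed as a signed horizontal angle between the head's forward axis and the line from the head to the target. The numpy sketch below is an assumed implementation for illustration, with rightward angles treated as positive; it is not the code used in the task.

```python
# Geometry sketch (an assumption about the implementation, not the authors' code) for the
# flower-to-head angle in Figure 11: the signed horizontal angle between the head's forward
# axis and the line from the head to the flower, with rightward angles positive.
import numpy as np

def flower_to_head_angle_deg(head_pos, head_forward, flower_pos) -> float:
    """All inputs are 3D vectors in the world frame (x right, y up, z forward, as in Unity)."""
    to_flower = np.array(flower_pos, dtype=float) - np.array(head_pos, dtype=float)
    fwd = np.array(head_forward, dtype=float)
    to_flower[1] = 0.0          # project both vectors onto the horizontal (x-z) plane
    fwd[1] = 0.0
    to_flower /= np.linalg.norm(to_flower)
    fwd /= np.linalg.norm(fwd)
    unsigned = np.degrees(np.arccos(np.clip(np.dot(fwd, to_flower), -1.0, 1.0)))
    sign = np.sign(np.cross(fwd, to_flower)[1])   # y-component of the cross product gives left/right
    return sign * unsigned

# A flower one meter ahead and one meter to the right of a forward-facing head sits at about +45 deg.
print(flower_to_head_angle_deg([0, 1.2, 0], [0, 0, 1], [1, 1.0, 1]))
```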

There are a number of other ways the data may be analyzed. We examined the average number of seconds that participants looked at each flower to determine whether one group had more difficulty identifying defective flowers (as indicated by more seconds spent looking at each flower). In this example, data were extracted from flowers that had a defective petal that was 95% the size of the rest of the petals, as this scale was hypothesized to be the most sensitive. A mixed ANCOVA was run to compare group (SMG vs. STG) and flower visual field (right vs. left). Pre- and post-TMS change scores were calculated and used as the outcome variable to examine whether either group showed an increase in time spent looking at flowers following TMS. The sham TMS change scores for both left and right flowers were once again used as covariates to account for individual variability. While there was no significant difference between groups, F(1,3) = 0.12, p = 0.76, there was a marginally significant difference by flower visual field, F(1,3) = 5.62, p = 0.098 (Figure 13). This effect did not reach statistical significance, and more subjects should be assessed moving forward. Nevertheless, these data serve as an example of how analyses can be limited to specific flower types and visual fields within the virtual reality environment. As these analyses demonstrate, comparing participants' performance can provide researchers with a sensitive and dynamic way to measure the effects of TMS or of neglect more generally, depending on the examiner's specific research question.
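
The pandas sketch below illustrates how per-flower gaze data could be restricted to the 0.95 petal scale and aggregated into pre/post dwell-time change scores by group and visual field, as in the analysis above. The long-format column names are assumptions about the exported dataset rather than the actual data schema.

```python
# Sketch of restricting and aggregating per-flower gaze data for the dwell-time analysis;
# the long-format column names are assumptions about how the task exports its data.
import pandas as pd

def dwell_time_change(df: pd.DataFrame) -> pd.DataFrame:
    """df: one row per flower presentation, with columns subject, group ('SMG'/'STG'),
    session ('pre'/'post'), petal_scale, visual_field ('left'/'right'), seconds_viewed."""
    subset = df[df["petal_scale"] == 0.95]                       # keep only the subtlest targets
    means = (subset
             .groupby(["subject", "group", "visual_field", "session"])["seconds_viewed"]
             .mean()
             .unstack("session"))                                # columns: pre, post
    means["change"] = means["post"] - means["pre"]               # positive = longer looks after rTMS
    return means.reset_index()
```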

Figure 1: Line bisection task stimulus sheet.

Figure 2: Bell's test stimulus sheet.

Figure 3: Star cancellation test stimulus sheet.

Figure 4: Ota circle cancellation stimulus sheet.

Figure 5: Repetitive TMS stimulation; neuronavigational software (left), magnetic stimulation unit (center), and air-cooled coil in position over author CH (right).

Figure 6: Virtual forest environment seen by the subject during the VR task.

Figure 7: Layout of the three curved box hedges with target and decoy flowers distributed across them.

Figure 8: Head angle: the angle between the anterior axis of the head and the torso.

Figure 9: Two analyses using head angle during task performance. (Left) SMG vs. STG group head angle change scores. On this scale, a score of 0 indicates that participants looked at the center of each flower, positive scores indicate that they looked towards the right, and negative scores indicate that they looked towards the left. The SMG group had positive scores, indicating that they looked more to the right on average following stimulation, whereas the STG group had negative scores, indicating that they looked more to the left following stimulation. The SMG and STG groups had significantly different head angle change scores. (Right) Mean head angle plotted for each participant pre-TMS and post-TMS. The STG group did not show strong differences before and after TMS stimulation, unlike the SMG participants, who appeared to look more towards the right visual field following stimulation (as represented by positive numbers).

Figure 10: Asymmetric target flowers, with smaller petals on the left (left) and smaller petals on the right (right).

Figure 11: Flower-to-head angle: the angle subtended by the head's anterior axis and the flower from the head at the instant the flower was picked/identified.

Figure 12: Bush angle: the angle subtended by the flower and the center of the flower's bush from the head at the instant the flower was picked/identified.

Figure 13: Mean change score for seconds spent looking at each flower before and after TMS. Negative scores indicate that participants spent less time looking at flowers in the post-TMS administration compared to the pre-TMS administration, whereas positive numbers indicate more time spent looking at flowers post-TMS. Data are separated by whether flowers were located in the left vs. right visual field within the virtual environment and by group (SMG vs. STG). Flowers were restricted to those with a defective petal at a scale of 0.95. Though not statistically significant, there was a marginal effect of flower visual field. Qualitatively, there appears to be greater variability for flowers in the left visual field compared to the right.

Discussion

We successfully induced and measured USN symptoms with TMS and VR, respectively. While we did not have significant results when compared to sham trials, we were able to compare multiple metrics of egocentric neglect (average head angle, time spent looking at flowers in either hemispace) and allocentric neglect (performance in selecting flowers with asymmetric petals on the left vs. the right side) between the different experimental groups, and we found significant differences in average head angle between subjects stimulated at the STG and those stimulated at the SMG, as well as marginally significant effects in the average visual axis. Of interest, there is still debate concerning the proportional contributions of temporal (STG) and parietal (PPC) regions to USN-relevant spatial processing12,43, and the increased rightward head angle we detected in the SMG-stimulated group may provide some support for the implication of the PPC in the egocentric variety of USN.

There were multiple critical steps in this protocol. This method is limited by the subtle clinical effects achieved with rTMS, so proper stimulation parameters and cortical region targeting are critical: TMS stimulation intensity should always be based on the rMT, and TMS coil targeting should always be precisely determined with high-resolution MRI images and proper targeting software such as Brainsight. The method is also limited by the relatively short duration of the inhibitory effect created by rTMS stimulation (~20 minutes, or roughly the duration of stimulation26), so a rapid transition from rTMS stimulation back to the VR or paper-and-pencil tasks is of paramount importance for detecting this effect. Ensuring that the VR equipment is set up and the software is properly calibrated during the pre-TMS VR sessions helps maximize the proportion of post-stimulation time spent collecting data.

As enumerated in the introduction, a number of groups have developed novel VR-based tools for the assessment of USN. Many of these systems also utilize the distinct measurement advantages of computerized tasks, and some groups have attempted to differentiate the various subtypes of USN including extrapersonal vs. peripersonal neglect symptoms and egocentric vs. allocentric symptoms37,40. We believe that the method adds two novel contributions to this existing work. First, we provide a broader array of datasets (head position, eye tracking, etc.) that can be analyzed to detect and characterize even subtle cases of USN. Second, we induced USN symptoms in healthy volunteers using TMS, helping assure that the VR-based diagnostic tool was isolating induced USN symptoms and avoiding the possible confounding effects of visual, motor, and cognitive comorbidities seen in acquired brain injury patients. In addition, the task contrasts with a trend in recent studies that focuses on navigation tasks. We contend that a task that demands interaction with a number of distributed objects across both left and right hemispaces is potentially more demanding and may increase the sensitivity of the VR task as a diagnostic tool. In addition, this format allows for more of a game-like task with multiple trials, which in turn allows for titration of the task's difficulty level from round to round. This type of titration helps the task avoid ceiling and floor effects (i.e., the task being too hard for those with significant deficits or too easy for those with subtle deficits).

There are many possible future applications of the method. With regard to the study of USN, we believe that the addition of eye-tracking data will enable VR tasks to differentiate between attentional and intentional symptoms by separating data measuring asymmetry of search pattern from data measuring asymmetry of motor action. Furthermore, TMS can be used to isolate specific neurologic deficits beyond USN, creating a means by which investigators can design and validate a wide variety of novel VR tools to help diagnose and characterize these deficits in patients who suffer from acquired brain injury. Although the technique involves healthy participants and artificial neurologic deficits in an effort to reliably isolate and characterize USN specifically, we believe that VR tools that are validated by the method can then be applied in populations of patients with mixed neurologic deficits (motor, visual, etc.) by way of user interface innovations such as EEG- or EMG-based brain-computer interfaces44,45. In addition, VR-based tasks like the one we present here can also be modified to serve as cognitive rehabilitation tools, a growing area of research and development31,46.

We faced a number of issues in testing. The eye tracking became uncalibrated with small shifts in the HMD's position, and the software sometimes failed. The application needed further development and suffered from correctable issues with the subject's starting position and the range of flower placement (some flowers were placed outside the subject's field of view, invalidating some trials). Our sample size was also small. Nevertheless, we were still able to detect the subtle perturbations of two neural networks associated with USN with the novel VR tool. While this ambitious experiment yielded marginal results, we believe many of the challenges it faced will be ameliorated as the technology continues to improve. We argue that the promise of these results, in combination with other encouraging trends within the field, supports the idea that VR systems are an excellent substrate for the development of novel diagnostic tools for USN.

Disclosures

The authors have nothing to disclose.

Acknowledgements

This work was supported by the University Research Fund (URF) from the University of Pennsylvania, and the American Heart Association's Student Scholarships in Cerebrovascular Disease & Stroke. Special thanks to the researchers, clinicians and staff of the Laboratory for Cognition and Neural Stimulation for their ongoing support.

Materials

AirFilm Coil (AFC) Rapid Version Magstim N/A Air-cooled TMS coil
Alienware 17 R4 Laptop Dell N/A NVIDIA GeForce GTX 1060 (full specs at https://topics-cdn.dell.com/pdf/alienware-17-laptop_users-guide_en-us.pdf)
BrainSight 2.0 TMS Neuronavigation Software Rogue Research Inc N/A TMS neural targeting software
CED 1902 Isolated pre-amplifier Cambridge Electronic Design Limited N/A EMG pre-amplifier
CED Micro 401 mkII Cambridge Electronic Design Limited N/A Multi-channel waveform data acquisition unit
CED Signal 5 Cambridge Electronic Design Limited N/A Sweep-based data acquisition and analysis software. Used to measure TMS evoked motor responses.
HTC Vive Binocular Add-on Pupil Labs N/A HTC Vive, Vive Pro, or Vive Cosmos eye tracking add-on with 2 x 200Hz eye cameras.
Magstim D70 Remote Coil Magstim N/A Hand-held TMS coil
Magstim Super Rapid 2 plus 1 Magstim N/A Transcranial Magnetic Stimulation Unit
Unity 2018 Unity N/A cross-platform VR game engine
Vive Pro HTC Vive N/A VR hardware system with external motion sensors; 1440×1600 pixels per eye, 90 Hz refresh rate, 110° FoV

References

  1. Heilman, K. M., Bowers, D., Coslett, H. B., Whelan, H., Watson, R. T. Directional Hypokinesia: Prolonged Reaction Times for Leftward Movements in Patients with Right Hemisphere Lesions and Neglect. Neurology. 35 (6), 855-859 (1985).
  2. Paolucci, S., Antonucci, G., Grasso, M. G., Pizzamiglio, L. The Role of Unilateral Spatial Neglect in Rehabilitation of Right Brain-Damaged Ischemic Stroke Patients: A Matched Comparison. Archives of Physical Medicine and Rehabilitation. 82 (6), 743-749 (2001).
  3. Ringman, J. M., Saver, J. L., Woolson, R. F., Clarke, W. R., Adams, H. P. Frequency, Risk Factors, Anatomy, and Course of Unilateral Neglect in an Acute Stroke Cohort. Neurology. 63 (3), 468-474 (2004).
  4. Jutai, J. W., et al. Treatment of visual perceptual disorders post stroke. Topics in Stroke Rehabilitation. 10 (2), 77-106 (2003).
  5. Buxbaum, L. J., et al. Hemispatial Neglect: Subtypes, Neuroanatomy, and Disability. Neurology. 62 (5), 749-756 (2004).
  6. Numminen, S., et al. Factors Influencing Quality of Life Six Months after a First-Ever Ischemic Stroke: Focus on Thrombolyzed Patients. Folia Phoniatrica et Logopaedica: Official Organ of the International Association of Logopedics and Phoniatrics (IALP). 68 (2), 86-91 (2016).
  7. Ladavas, E. Is the Hemispatial Deficit Produced by Right Parietal Lobe Damage Associated with Retinal or Gravitational Coordinates. Brain: A Journal of Neurology. 110 (1), 167-180 (1987).
  8. Ota, H., Fujii, T., Suzuki, K., Fukatsu, R., Yamadori, A. Dissociation of Body-Centered and Stimulus-Centered Representations in Unilateral Neglect. Neurology. 57 (11), 2064-2069 (2001).
  9. Neggers, S. F., Vander Lubbe, R. H., Ramsey, N. F., Postma, A. Interactions between ego- and allocentric neuronal representations of space. Neuroimage. 31 (1), 320-331 (2006).
  10. Adair, J. C., Barrett, A. M. Spatial Neglect: Clinical and Neuroscience Review: A Wealth of Information on the Poverty of Spatial Attention. Annals of the New York Academy of Sciences. 1142, 21-43 (2008).
  11. Corbetta, M., Shulman, G. L. Spatial neglect and attention networks. Annual Review of Neuroscience. 34, 569-599 (2011).
  12. Marshall, J. C., Fink, G. R., Halligan, P. W., Vallar, G. Spatial awareness: a function of the posterior parietal lobe. Cortex. 38 (2), 253-260 (2002).
  13. Ellison, A., Schindler, I., Pattison, L. L., Milner, A. D. An exploration of the role of the superior temporal gyrus in visual search and spatial perception using TMS. Brain. (10), 2307-2315 (2004).
  14. Vallar, G., Calzolari, E. Unilateral spatial neglect after posterior parietal damage. In: Vallar, G., Coslett, H. B. (eds.). Handbook of Clinical Neurology: The Parietal Lobe. 287-312 (2018).
  15. Shah-Basak, P. P., Chen, P., Caulfield, K., Medina, J., Hamilton, R. H. The Role of the Right Superior Temporal Gyrus in Stimulus-Centered Spatial Processing. Neuropsychologia. 113, 6-13 (2018).
  16. Verdon, V., Schwartz, S., Lovblad, K. O., Hauert, C. A., Vuilleumier, P. Neuroanatomy of hemispatial neglect and its functional components: a study using voxel-based lesion-symptom mapping. Brain. 133 (3), 880-894 (2010).
  17. Ghacibeh, G. A., Shenker, J. I., Winter, K. H., Triggs, W. J., Heilman, K. M. Dissociation of Neglect Subtypes with Transcranial Magnetic Stimulation. Neurology. 69 (11), 1122-1127 (2007).
  18. Chaudhari, A., Pigott, K., Barrett, A. M. Midline Body Actions and Leftward Spatial ‘Aiming’ in Patients with Spatial Neglect. Frontiers in Human Neuroscience. 9, 393 (2015).
  19. Rizzo, A. A., et al. Design and Development of Virtual Reality Based Perceptual-Motor Rehabilitation Scenarios. The 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. (2004).
  20. Steinicke, F. Being Really Virtual: Immersive Natives and the Future of Virtual Reality. (2018).
  21. Tsirlin, I., Dupierrix, E., Chokron, S., Coquillart, S., Ohlmann, T. Uses of Virtual Reality for Diagnosis, Rehabilitation and Study of Unilateral Spatial Neglect: Review and Analysis. CyberPsychology & Behavior. 12 (2), 175-181 (2009).
  22. Barrett, A. M., et al. Cognitive Rehabilitation Interventions for Neglect and Related Disorders: Moving from Bench to Bedside in Stroke Patients. Journal of Cognitive Neuroscience. 18 (7), 1223-1236 (2006).
  23. Ricci, R., et al. Effects of attentional and cognitive variables on unilateral spatial neglect. Neuropsychologia. 92, 158-166 (2016).
  24. Bonato, M. Neglect and Extinction Depend Greatly on Task Demands: A Review. Frontiers in Human Neuroscience. 6, 195 (2012).
  25. Grattan, E. S., Woodbury, M. L. Do Neglect Assessments Detect Neglect Differently. American Journal of Occupational Therapy. 71, 3 (2017).
  26. Rossi, S., Hallett, M., Rossini, P. M., Pascual-Leone, A. Safety of TMS Consensus Group. Safety, ethical considerations, and application guidelines for the use of transcranial magnetic stimulation in clinical practice and research. Clinical Neurophysiology. 120 (12), 2008-2039 (2009).
  27. Pascual-Leone, A., Walsh, V., Rothwell, J. Transcranial Magnetic Stimulation in Cognitive Neuroscience – Lesion, Chronometry, and Functional Connectivity. Current Opinion in Neurobiology. 10 (2), 232-237 (2000).
  28. Oliveri, M., et al. Interhemispheric Asymmetries in the Perception of Unimanual and Bimanual Cutaneous Stimuli. Brain. 122 (9), 1721-1729 (1999).
  29. Salatino, A., et al. Transcranial Magnetic Stimulation of Posterior Parietal Cortex Modulates Line-Length Estimation but Not Illusory Depth Perception. Frontiers in Psychology. 10, (2019).
  30. Oliveri, M., Vallar, G. Parietal versus temporal lobe components in spatial cognition: Setting the mid-point of a horizontal line. Journal of Neuropsychology. 3, 201-211 (2009).
  31. Ogourtsova, T., Souza Silva, W., Archambault, P. S., Lamontagne, A. Virtual Reality Treatment and Assessments for Post-Stroke Unilateral Spatial Neglect: A Systematic Literature Review. Neuropsychological Rehabilitation. 27 (3), 409-454 (2017).
  32. Pedroli, E., Serino, S., Cipresso, P., Pallavicini, F., Riva, G. Assessment and rehabilitation of neglect using virtual reality: a systematic review. Frontiers in Behavioral Neuroscience. 9, 226 (2015).
  33. Peskine, A., et al. Virtual reality assessment for visuospatial neglect: importance of a dynamic task. Journal of Neurology, Neurosurgery, and Psychiatry. 82 (12), 1407-1409 (2011).
  34. Mesa-Gresa, P., et al. Clinical Validation of a Virtual Environment Test for Safe Street Crossing in the Assessment of Acquired Brain Injury Patients with and without Neglect. Human-Computer Interaction – INTERACT 2011 Lecture Notes in Computer Science. , 44-51 (2011).
  35. Aravind, G., Lamontagne, A. Perceptual and Locomotor Factors Affect Obstacle Avoidance in Persons with Visuospatial Neglect. Journal of NeuroEngineering and Rehabilitation. 11 (1), 8 (2014).
  36. Pallavicini, F., et al. Assessing Unilateral Spatial Neglect Using Advanced Technologies: The Potentiality of Mobile Virtual Reality. Technology and Health Care. 23 (6), 795-807 (2015).
  37. Glize, B., et al. Improvement of Navigation and Representation in Virtual Reality after Prism Adaptation in Neglect Patients. Frontiers in Psychology. 8, (2017).
  38. Yasuda, K., Muroi, D., Ohira, M., Iwata, H. Validation of an Immersive Virtual Reality System for Training near and Far Space Neglect in Individuals with Stroke: a Pilot Study. Topics in Stroke Rehabilitation. 24 (7), 533-538 (2017).
  39. Spreij, L. A., Ten Brink, A. F., Visser-Meily, J. M. A., Nijboer, T. C. W. Simulated Driving: The Added Value of Dynamic Testing in the Assessment of Visuo-Spatial Neglect after Stroke. Journal of Neuropsychology. 31, (2018).
  40. Ogourtsova, T., Archambault, P. S., Lamontagne, A. Post-Stroke Unilateral Spatial Neglect: Virtual Reality-Based Navigation and Detection Tasks Reveal Lateralized and Non-Lateralized Deficits in Tasks of Varying Perceptual and Cognitive Demands. Journal of NeuroEngineering and Rehabilitation. 15, 1 (2018).
  41. Ogourtsova, T., Archambault, P., Sangani, S., Lamontagne, A. Ecological Virtual Reality Evaluation of Neglect Symptoms (EVENS), Effects of Virtual Scene Complexity in the Assessment of Poststroke Unilateral Spatial Neglect. Neurorehabilitation and Neural Repair. 32 (1), 46-61 (2018).
  42. Ricci, R., Chatterjee, A. Context and crossover in unilateral neglect. Neuropsychologia. 39 (11), 1138-1143 (2001).
  43. Karnath, H. O., Ferber, S., Himmelbach, M. Spatial awareness is a function of the temporal not the posterior parietal lobe. Nature. 411, 950-953 (2001).
  44. Spicer, R., Anglin, J., Krum, D. M., Liew, S. REINVENT: A low-cost, virtual reality brain-computer interface for severe stroke upper limb motor recovery. 2017 IEEE Virtual Reality (VR). , 385-386 (2017).
  45. Vourvopoulos, A., et al. Effects of a Brain-Computer Interface With Virtual Reality (VR) Neurofeedback: A Pilot Study in Chronic Stroke Patients. Frontiers in Human Neuroscience. 13, 210 (2019).
  46. Gammeri, R., Iacono, C., Ricci, R., Salatino, A. Unilateral Spatial Neglect After Stroke: Current Insights. Neuropsychiatric Disease and Treatment. 16, 131-152 (2020).

Cite This Article
Schwab, P. J., Miller, A., Raphail, A., Levine, A., Haslam, C., Coslett, H. B., Hamilton, R. H. Virtual Reality Tools for Assessing Unilateral Spatial Neglect: A Novel Opportunity for Data Collection. J. Vis. Exp. (169), e61951, doi:10.3791/61951 (2021).