JoVE Visualize
Related JoVE Video
Pubmed Article
Applying the risk of bias tool in a systematic review of combination long-acting beta-agonists and inhaled corticosteroids for persistent asthma.
PUBLISHED: 01-22-2011
The Risk of Bias (RoB) tool is used to assess internal validity of randomized controlled trials (RCTs). Our objectives were to: 1) evaluate inter-rater agreement of the RoB tool; 2) determine the time to access supplemental study information; 3) compare the RoB tool with the Jadad scale and Schulz allocation concealment (AC); and 4) examine the relationship between RoB and effect estimates.
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Published: 04-23-2014
Cognitive impairments affect the majority of patients with schizophrenia and these impairments predict poor long term psychosocial outcomes.  Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements.  Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real world functioning.  Current data do not support the recommendation of any single instrument for measurement of functional capacity.  The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
25 Related JoVE Articles!
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities of the different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place, and thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
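To make the parametric estimation concrete, here is a minimal Python sketch of one commonly used subjective-value model for risky and ambiguous lotteries, paired with a logistic choice rule. The functional form and parameter names (alpha for risk attitude, beta for ambiguity attitude) are illustrative assumptions, not necessarily the exact model fitted in the article.

```python
import numpy as np

def subjective_value(amount, p, ambiguity, alpha, beta):
    """One common parametric form for valuing risky/ambiguous lotteries:
    SV = (p - beta * ambiguity / 2) * amount**alpha
    alpha < 1 -> risk averse; beta > 0 -> ambiguity averse."""
    return (p - beta * ambiguity / 2.0) * amount ** alpha

def choice_probability(sv_lottery, sv_reference, slope=1.0):
    """Logistic (softmax) choice rule linking subjective values to the
    probability of choosing the lottery over a fixed reference option."""
    return 1.0 / (1.0 + np.exp(-slope * (sv_lottery - sv_reference)))

# Example: $20 lottery with a 50% winning probability and no ambiguity,
# valued by a mildly risk-averse subject (alpha = 0.8, beta = 0.6).
sv = subjective_value(amount=20.0, p=0.5, ambiguity=0.0, alpha=0.8, beta=0.6)
p_choose = choice_probability(sv, sv_reference=5.0)
print(f"SV = {sv:.2f}, P(choose lottery) = {p_choose:.2f}")
```

In a full analysis, alpha, beta, and the choice slope would be fitted per subject by maximum likelihood over all trials, and the resulting trial-wise subjective values would then serve as parametric regressors in the fMRI model.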
The NeuroStar TMS Device: Conducting the FDA Approved Protocol for Treatment of Depression
Authors: Jared C. Horvath, John Mathews, Mark A. Demitrack, Alvaro Pascual-Leone.
Institutions: Beth Israel Deaconess Medical Center, Inc.
The Neuronetics NeuroStar Transcranial Magnetic Stimulation (TMS) System is a class II medical device that produces brief duration, pulsed magnetic fields. These rapidly alternating fields induce electrical currents within localized, targeted regions of the cortex which are associated with various physiological and functional brain changes.1,2,3 In 2007, O'Reardon et al., utilizing the NeuroStar device, published the results of an industry-sponsored, multisite, randomized, sham-stimulation controlled clinical trial in which 301 patients with major depression, who had previously failed to respond to at least one adequate antidepressant treatment trial, underwent either active or sham TMS over the left dorsolateral prefrontal cortex (DLPFC). The patients, who were medication-free at the time of the study, received TMS five times per week over 4-6 weeks.4 The results demonstrated that a sub-population of patients (those who were relatively less resistant to medication, having failed not more than two good pharmacologic trials) showed a statistically significant improvement on the Montgomery-Asberg Depression Scale (MADRS), the Hamilton Depression Rating Scale (HAMD), and various other outcome measures. In October 2008, supported by these and other similar results5,6,7, Neuronetics obtained the first and only Food and Drug Administration (FDA) approval for the clinical treatment of a specific form of medication-refractory depression using a TMS Therapy device (FDA approval K061053). In this paper, we will explore the specified FDA approved NeuroStar depression treatment protocol (to be administered only under prescription and by a licensed medical professional in either an in- or outpatient setting).
Neuroscience, Issue 45, Transcranial Magnetic Stimulation, Depression, Neuronetics, NeuroStar, FDA Approved
Osteopathic Manipulative Treatment as a Useful Adjunctive Tool for Pneumonia
Authors: Sheldon Yao, John Hassani, Martin Gagne, Gebe George, Wolfgang Gilliar.
Institutions: New York Institute of Technology College of Osteopathic Medicine.
Pneumonia, the inflammatory state of lung tissue primarily due to microbial infection, claimed 52,306 lives in the United States in 20071 and resulted in the hospitalization of 1.1 million patients2. With an average length of in-patient hospital stay of five days2, pneumonia and influenza comprise a significant financial burden, costing the United States $40.2 billion in 20053. Under the current Infectious Disease Society of America/American Thoracic Society guidelines, standard-of-care recommendations include the rapid administration of an appropriate antibiotic regimen, fluid replacement, and ventilation (if necessary). Non-standard therapies include the use of corticosteroids and statins; however, these therapies lack conclusive supporting evidence4. (Figure 1) Osteopathic Manipulative Treatment (OMT) is a cost-effective adjunctive treatment of pneumonia that has been shown to reduce patients’ length of hospital stay, duration of intravenous antibiotics, and incidence of respiratory failure or death when compared to subjects who received conventional care alone5. The use of manual manipulation techniques for pneumonia was first recorded as early as the Spanish influenza pandemic of 1918, when patients treated with standard medical care had an estimated mortality rate of 33%, compared to a 10% mortality rate in patients treated by osteopathic physicians6. When applied to the management of pneumonia, manual manipulation techniques bolster lymphatic flow, respiratory function, and immunological defense by targeting anatomical structures involved in these systems7,8,9,10. The objective of this review video-article is three-fold: a) summarize the findings of randomized controlled studies on the efficacy of OMT in adult patients with diagnosed pneumonia, b) demonstrate established protocols utilized by osteopathic physicians treating pneumonia, c) elucidate the physiological mechanisms behind manual manipulation of the respiratory and lymphatic systems. Specifically, we will discuss and demonstrate four routine techniques that address autonomics, lymph drainage, and rib cage mobility: 1) Rib Raising, 2) Thoracic Pump, 3) Doming of the Thoracic Diaphragm, and 4) Muscle Energy for Rib 1.5,11
Medicine, Issue 87, Pneumonia, osteopathic manipulative medicine (OMM) and techniques (OMT), lymphatic, rib raising, thoracic pump, muscle energy, doming diaphragm, alternative treatment
Bronchial Thermoplasty: A Novel Therapeutic Approach to Severe Asthma
Authors: David R. Duhamel, Jeff B. Hales.
Institutions: Virginia Hospital Center.
Bronchial thermoplasty is a non-drug procedure for severe persistent asthma that delivers thermal energy to the airway wall in a precisely controlled manner to reduce excessive airway smooth muscle. Reducing airway smooth muscle decreases the ability of the airways to constrict, thereby reducing the frequency of asthma attacks. Bronchial thermoplasty is delivered by the Alair System and is performed in three outpatient procedure visits, each scheduled approximately three weeks apart. The first procedure treats the airways of the right lower lobe, the second treats the airways of the left lower lobe and the third and final procedure treats the airways in both upper lobes. After all three procedures are performed the bronchial thermoplasty treatment is complete. Bronchial thermoplasty is performed during bronchoscopy with the patient under moderate sedation. All accessible airways distal to the mainstem bronchi between 3 and 10 mm in diameter, with the exception of the right middle lobe, are treated under bronchoscopic visualization. Contiguous and non-overlapping activations of the device are used, moving from distal to proximal along the length of the airway, and systematically from airway to airway as described previously. Although conceptually straightforward, the actual execution of bronchial thermoplasty is quite intricate and procedural duration for the treatment of a single lobe is often substantially longer than encountered during routine bronchoscopy. As such, bronchial thermoplasty should be considered a complex interventional bronchoscopy and is intended for the experienced bronchoscopist. Optimal patient management is critical in any such complex and longer duration bronchoscopic procedure. This article discusses the importance of careful patient selection, patient preparation, patient management, procedure duration, postoperative care and follow-up to ensure that bronchial thermoplasty is performed safely. Bronchial thermoplasty is expected to complement asthma maintenance medications by providing long-lasting asthma control and improving asthma-related quality of life of patients with severe asthma. In addition, bronchial thermoplasty has been demonstrated to reduce severe exacerbations (asthma attacks), emergency room visits for respiratory symptoms, and time lost from work, school and other daily activities due to asthma.
Medicine, Issue 45, bronchial thermoplasty, severe asthma, airway smooth muscle, bronchoscopy, radiofrequency energy, patient management, moderate sedation
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Evaluation of Respiratory System Mechanics in Mice using the Forced Oscillation Technique
Authors: Toby K. McGovern, Annette Robichaud, Liah Fereydoonzad, Thomas F. Schuessler, James G. Martin.
Institutions: McGill University, SCIREQ Scientific Respiratory Equipment Inc.
The forced oscillation technique (FOT) is a powerful, integrative and translational tool permitting the experimental assessment of lung function in mice in a comprehensive, detailed, precise and reproducible manner. It provides measurements of respiratory system mechanics through the analysis of pressure and volume signals acquired in reaction to predefined, small amplitude, oscillatory airflow waveforms, which are typically applied at the subject's airway opening. The present protocol details the steps required to adequately execute forced oscillation measurements in mice using a computer-controlled piston ventilator (flexiVent; SCIREQ Inc, Montreal, Qc, Canada). The description is divided into four parts: preparatory steps, mechanical ventilation, lung function measurements, and data analysis. It also includes details of how to assess airway responsiveness to inhaled methacholine in anesthetized mice, a common application of this technique which also extends to other outcomes and various lung pathologies. Measurements obtained in naïve mice as well as from an oxidative-stress driven model of airway damage are presented to illustrate how this tool can contribute to a better characterization and understanding of studied physiological changes or disease models as well as to applications in new research areas.
Medicine, Issue 75, Biomedical Engineering, Anatomy, Physiology, Biophysics, Pathology, lung diseases, asthma, respiratory function tests, respiratory system, forced oscillation technique, respiratory system mechanics, airway hyperresponsiveness, flexiVent, lung physiology, lung, oxidative stress, ventilator, cannula, mice, animal model, clinical techniques
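As an aside on the underlying computation (the flexiVent software performs this internally; the sketch below is only an illustration under simplifying assumptions, not the device's algorithm), respiratory input impedance can be estimated as the ratio of the Fourier transforms of pressure and flow at the forcing frequencies.

```python
import numpy as np

def input_impedance(pressure, flow, fs, stim_freqs):
    """Estimate respiratory input impedance Zrs(f) = P(f)/Q(f) at the
    frequencies contained in the forcing waveform.
    pressure, flow : 1-D arrays sampled at fs (Hz)
    stim_freqs     : iterable of stimulation frequencies (Hz)"""
    n = len(pressure)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    P = np.fft.rfft(pressure)
    Q = np.fft.rfft(flow)
    z = {}
    for f in stim_freqs:
        k = np.argmin(np.abs(freqs - f))   # nearest FFT bin to the stimulus frequency
        z[f] = P[k] / Q[k]                 # complex impedance: resistance + j * reactance
    return z

# Synthetic single-frequency example: a purely resistive "lung" of 0.5 cmH2O.s/ml
fs, f0, t = 1000, 2.5, np.arange(0, 4, 1 / 1000)
flow = np.sin(2 * np.pi * f0 * t)
pressure = 0.5 * flow                      # P = R * Q for a pure resistance
Z = input_impedance(pressure, flow, fs, [f0])
print(f"Rrs ~ {Z[f0].real:.2f}, Xrs ~ {Z[f0].imag:.2f}")
```

From the real and imaginary parts of Zrs, resistance and reactance (and, with multi-frequency forcing, constant-phase model parameters such as tissue damping and elastance) can be derived.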
Assessment of Morphine-induced Hyperalgesia and Analgesic Tolerance in Mice Using Thermal and Mechanical Nociceptive Modalities
Authors: Khadija Elhabazi, Safia Ayachi, Brigitte Ilien, Frédéric Simonin.
Institutions: Université de Strasbourg.
Opioid-induced hyperalgesia and tolerance severely impact the clinical efficacy of opiates as pain relievers in animals and humans. The molecular mechanisms underlying both phenomena are not well understood and their elucidation should benefit from the study of animal models and from the design of appropriate experimental protocols. We describe here a methodological approach for inducing, recording and quantifying morphine-induced hyperalgesia as well as for evidencing analgesic tolerance, using the tail-immersion and tail pressure tests in wild-type mice. As shown in the video, the protocol is divided into five sequential steps. Handling and habituation phases allow a safe determination of the basal nociceptive response of the animals. Chronic morphine administration induces significant hyperalgesia as shown by an increase in both thermal and mechanical sensitivity, whereas the comparison of analgesia time-courses after acute or repeated morphine treatment clearly indicates the development of tolerance manifested by a decline in analgesic response amplitude. This protocol may be similarly adapted to genetically modified mice in order to evaluate the role of individual genes in the modulation of nociception and morphine analgesia. It also provides a model system to investigate the effectiveness of potential therapeutic agents to improve opiate analgesic efficacy.
Neuroscience, Issue 89, mice, nociception, tail immersion test, tail pressure test, morphine, analgesia, opioid-induced hyperalgesia, tolerance
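The abstract does not state its exact scoring formula, but analgesia time-course data of this kind are often normalized as a percent of the maximum possible effect (%MPE). The sketch below shows that standard calculation with hypothetical tail-immersion latencies.

```python
def percent_mpe(baseline_latency, test_latency, cutoff_latency):
    """Percent maximum possible effect, a common normalization for
    tail-immersion/tail-pressure analgesia scores:
    %MPE = 100 * (test - baseline) / (cutoff - baseline)"""
    return 100.0 * (test_latency - baseline_latency) / (cutoff_latency - baseline_latency)

# Acute morphine: withdrawal latency rises from 3 s to 9 s with a 10 s cutoff -> ~86% MPE
print(percent_mpe(3.0, 9.0, 10.0))
# After chronic morphine (tolerance): the same dose only reaches 5 s -> ~29% MPE
print(percent_mpe(3.0, 5.0, 10.0))
```

Hyperalgesia, by contrast, shows up as a progressive drop in the baseline latency itself across days of chronic treatment.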
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resulting concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation, and more broadly, their allergic potential.
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
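The concentration check mentioned above follows directly from the Beer-Lambert law with the extinction coefficient quoted in the abstract (ε280 = 4,200 L/M/cm); the 1 cm path length in this sketch is an assumption.

```python
def concentration_from_absorbance(absorbance, epsilon=4200.0, path_cm=1.0):
    """Beer-Lambert law: A = epsilon * c * l, so c = A / (epsilon * l).
    epsilon is in L/(mol*cm), path length in cm, result in mol/L."""
    return absorbance / (epsilon * path_cm)

# An absorbance of 0.042 at 280 nm in a 1 cm cuvette corresponds to 10 uM triclosan
a280 = 0.042
c_molar = concentration_from_absorbance(a280)
print(f"{c_molar * 1e6:.1f} uM triclosan")
```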
The Goeckerman Regimen for the Treatment of Moderate to Severe Psoriasis
Authors: Rishu Gupta, Maya Debbaneh, Daniel Butler, Monica Huynh, Ethan Levin, Argentina Leon, John Koo, Wilson Liao.
Institutions: University of Southern California, University of California, San Francisco, University of California Irvine School of Medicine, University of Arizona College of Medicine, Chicago College of Osteopathic Medicine.
Psoriasis is a chronic, immune-mediated inflammatory skin disease affecting approximately 2-3% of the population. The Goeckerman regimen consists of exposure to ultraviolet B (UVB) light and application of crude coal tar (CCT). Goeckerman therapy is extremely effective and relatively safe for the treatment of psoriasis and for improving a patient's quality of life. In the following article, we present our protocol for the Goeckerman therapy that is utilized specifically at the University of California, San Francisco. This protocol details the preparation of supplies, administration of phototherapy and application of topical tar. This protocol also describes how to assess the patient daily, monitor for adverse effects (including pruritus and burning), and adjust the treatment based on the patient's response. Though it is one of the oldest therapies available for psoriasis, there is an absence of any published videos demonstrating the process in detail. The video is beneficial for healthcare providers who want to administer the therapy, for trainees who want to learn more about the process, and for prospective patients who want to undergo treatment for their cutaneous disease.
Medicine, Issue 77, Infection, Biomedical Engineering, Anatomy, Physiology, Immunology, Dermatology, Skin, Dermis, Epidermis, Skin Diseases, Skin Diseases, Eczematous, Goeckerman, Crude Coal Tar, phototherapy, psoriasis, Eczema, Goeckerman regimen, clinical techniques
Murine Model of Allergen Induced Asthma
Authors: Aravind T. Reddy, Sowmya P. Lakshmi, Raju C. Reddy.
Institutions: Emory University and Atlanta VA Medical Center.
Asthma is a major cause of morbidity and mortality, affecting some 300 million people throughout the world.1 More than 8% of the US population has asthma, with the prevalence increasing.2 As with other diseases, animal models of allergic airway disease greatly facilitate understanding of the underlying pathophysiology, help identify potential therapeutic targets, and allow preclinical testing of possible new therapies. Models of allergic airway disease have been developed in several animal species, but murine models are particularly attractive due to the low cost, ready availability, and well-characterized immune systems of these animals.3 Availability of a variety of transgenic strains further increases the attractiveness of these models.4 Here we describe two murine models of allergic airway disease, both employing ovalbumin as the antigen. Following initial sensitization by intraperitoneal injection, one model delivers the antigen challenge by nebulization, the other by intratracheal delivery. These two models offer complementary advantages, with each mimicking the major features of human asthma.5 The major features of acute asthma include an exaggerated airway response to stimuli such as methacholine (airway hyperresponsiveness; AHR) and eosinophil-rich airway inflammation. These are also prominent effects of allergen challenge in our murine models,5,6 and we describe techniques for measuring them and thus evaluating the effects of experimental manipulation. Specifically, we describe both invasive7 and non-invasive8 techniques for measuring airway hyperresponsiveness as well as methods for assessing infiltration of inflammatory cells into the airways and the lung. Airway inflammatory cells are collected by bronchoalveolar lavage while lung histopathology is used to assess markers of inflammation throughout the organ. These techniques provide powerful tools for studying asthma in ways that would not be possible in humans.
Immunology, Issue 63, Allergy, airway hyperresponsiveness, pulmonary function, eosinophil, ovalbumin, methacholine, airway resistance, plethysmography, flexiVent, bronchoalveolar lavage, physiology
Drug-induced Sensitization of Adenylyl Cyclase: Assay Streamlining and Miniaturization for Small Molecule and siRNA Screening Applications
Authors: Jason M. Conley, Tarsis F. Brust, Ruqiang Xu, Kevin D. Burris, Val J. Watts.
Institutions: Purdue University, Eli Lilly and Company.
Sensitization of adenylyl cyclase (AC) signaling has been implicated in a variety of neuropsychiatric and neurologic disorders including substance abuse and Parkinson's disease. Acute activation of Gαi/o-linked receptors inhibits AC activity, whereas persistent activation of these receptors results in heterologous sensitization of AC and increased levels of intracellular cAMP. Previous studies have demonstrated that this enhancement of AC responsiveness is observed both in vitro and in vivo following the chronic activation of several types of Gαi/o-linked receptors including D2 dopamine and μ opioid receptors. Although heterologous sensitization of AC was first reported four decades ago, the mechanism(s) that underlie this phenomenon remain largely unknown. The lack of mechanistic data presumably reflects the complexity involved with this adaptive response, suggesting that nonbiased approaches could aid in identifying the molecular pathways involved in heterologous sensitization of AC. Previous studies have implicated kinase and Gβγ signaling as overlapping components that regulate the heterologous sensitization of AC. To identify unique and additional overlapping targets associated with sensitization of AC, the development and validation of a scalable cAMP sensitization assay is required for greater throughput. Previous approaches to study sensitization are generally cumbersome, involving continuous cell culture maintenance as well as a complex methodology for measuring cAMP accumulation that involves multiple wash steps. Thus, the development of a robust cell-based assay that can be used for high throughput screening (HTS) in a 384 well format would facilitate future studies. Using two D2 dopamine receptor cellular models (i.e. CHO-D2L and HEK-AC6/D2L), we have converted our 48-well sensitization assay (>20 steps over 4-5 days) to a five-step, single day assay in 384-well format. This new format is amenable to small molecule screening, and we demonstrate that this assay design can also be readily used for reverse transfection of siRNA in anticipation of targeted siRNA library screening.
Bioengineering, Issue 83, adenylyl cyclase, cAMP, heterologous sensitization, superactivation, D2 dopamine, μ opioid, siRNA
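As a simple illustration of how such plate data might be summarized (the fold-change metric and the well values below are illustrative assumptions, not the authors' reported analysis), heterologous sensitization can be expressed as the fold increase in stimulated cAMP signal after chronic agonist pretreatment relative to vehicle.

```python
import numpy as np

def fold_sensitization(camp_pretreated, camp_vehicle):
    """Fold increase in stimulated cAMP accumulation after chronic agonist
    pretreatment relative to vehicle-pretreated wells on the same plate."""
    return np.mean(camp_pretreated) / np.mean(camp_vehicle)

# Hypothetical 384-well readouts (arbitrary luminescence units)
vehicle_wells    = np.array([105.0, 98.0, 110.0, 102.0])
quinpirole_wells = np.array([310.0, 295.0, 330.0, 305.0])   # chronic D2 agonist pretreatment
print(f"{fold_sensitization(quinpirole_wells, vehicle_wells):.1f}-fold sensitization")
```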
Nerve Excitability Assessment in Chemotherapy-induced Neurotoxicity
Authors: Susanna B. Park, Cindy S-Y. Lin, Matthew C. Kiernan.
Institutions: University of New South Wales.
Chemotherapy-induced neurotoxicity is a serious consequence of cancer treatment, which occurs with some of the most commonly used chemotherapies1,2. Chemotherapy-induced peripheral neuropathy produces symptoms of numbness and paraesthesia in the limbs and may progress to difficulties with fine motor skills and walking, leading to functional impairment. In addition to producing troubling symptoms, chemotherapy-induced neuropathy may limit treatment success, leading to dose reduction or early cessation of treatment. Neuropathic symptoms may persist long-term, leaving permanent nerve damage in patients with an otherwise good prognosis3. As chemotherapy is utilised more often as a preventative measure, and survival rates increase, the importance of long-lasting and significant neurotoxicity will increase. There are no established neuroprotective or treatment options, and sensitive assessment methods are lacking. Appropriate assessment of neurotoxicity will be critical as a prognostic factor and as a suitable endpoint for future trials of neuroprotective agents. Current methods to assess the severity of chemotherapy-induced neuropathy utilise clinician-based grading scales which have been demonstrated to lack sensitivity to change and inter-observer objectivity4. Conventional nerve conduction studies provide information about compound action potential amplitude and conduction velocity, which are relatively non-specific measures and do not provide insight into ion channel function or resting membrane potential. Accordingly, prior studies have demonstrated that conventional nerve conduction studies are not sensitive to early change in chemotherapy-induced neurotoxicity4-6. In comparison, nerve excitability studies utilize threshold tracking techniques which have been developed to enable assessment of ion channels, pumps and exchangers in vivo in large myelinated human axons7-9. Nerve excitability techniques have been established as a tool to examine the development and severity of chemotherapy-induced neurotoxicity10-13. Comprising a number of excitability parameters, nerve excitability studies can be used to assess acute neurotoxicity arising immediately following infusion and the development of chronic, cumulative neurotoxicity. Nerve excitability techniques are feasible in the clinical setting, with each test requiring only 5-10 minutes to complete. Nerve excitability equipment is readily commercially available, and a portable system has been devised so that patients can be tested in situ in the infusion centre setting. In addition, these techniques can be adapted for use in multiple chemotherapies. In patients treated with the chemotherapy oxaliplatin, primarily utilised for colorectal cancer, nerve excitability techniques provide a method to identify patients at risk of neurotoxicity prior to the onset of chronic neuropathy. Nerve excitability studies have revealed the development of an acute Na+ channelopathy in motor and sensory axons10-13. Importantly, patients who demonstrated changes in excitability in early treatment were subsequently more likely to develop moderate to severe neurotoxicity11. However, across treatment, striking longitudinal changes were identified only in sensory axons which were able to predict clinical neurological outcome in 80% of patients10.
These changes demonstrated a different pattern to those seen acutely following oxaliplatin infusion, and most likely reflect the development of significant axonal damage and membrane potential change in sensory nerves which develops longitudinally during oxaliplatin treatment10. Significant abnormalities developed during early treatment, prior to any reduction in conventional measures of nerve function, suggesting that excitability parameters may provide a sensitive biomarker.
Neuroscience, Issue 62, Chemotherapy, Neurotoxicity, Neuropathy, Nerve excitability, Ion channel function, Oxaliplatin, oncology, medicine
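To illustrate the threshold tracking idea described above (a generic sketch only; this is not the authors' acquisition software, and the stimulus-response curve and tracking gain are made-up values), a proportional tracking loop adjusts the stimulus current until the evoked response settles at a target fraction of the maximal compound action potential.

```python
import math

def track_threshold(response_fn, target=0.4, current=5.0, gain=2.0, steps=30, tol=0.01):
    """Generic proportional threshold-tracking loop: nudge the stimulus current
    (mA) until the evoked response settles at the target fraction (e.g. 40%)
    of the maximal compound action potential.
    response_fn(current) -> normalized response amplitude in [0, 1]."""
    for _ in range(steps):
        error = target - response_fn(current)
        if abs(error) < tol:
            break
        current += gain * error    # proportional step toward the target response
    return current

# Toy sigmoid stimulus-response curve centered near 6 mA
sr_curve = lambda i: 1.0 / (1.0 + math.exp(-(i - 6.0) / 0.5))
print(f"Tracked threshold current ~ {track_threshold(sr_curve):.2f} mA")
```

Excitability parameters are then derived from how this tracked threshold shifts when conditioning stimuli or polarizing currents are applied.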
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Coordinate Mapping of Hyolaryngeal Mechanics in Swallowing
Authors: Thomas Z. Thompson, Farres Obeidin, Alisa A. Davidoff, Cody L. Hightower, Christopher Z. Johnson, Sonya L. Rice, Rebecca-Lyn Sokolove, Brandon K. Taylor, John M. Tuck, William G. Pearson, Jr.
Institutions: Georgia Regents University, New York University.
Characterizing hyolaryngeal movement is important to dysphagia research. Prior methods require multiple measurements to obtain one kinematic measurement whereas coordinate mapping of hyolaryngeal mechanics using Modified Barium Swallow (MBS) uses one set of coordinates to calculate multiple variables of interest. For demonstration purposes, ten kinematic measurements were generated from one set of coordinates to determine differences in swallowing two different bolus types. Calculations of hyoid excursion against the vertebrae and mandible are correlated to determine the importance of axes of reference. To demonstrate coordinate mapping methodology, 40 MBS studies were randomly selected from a dataset of healthy normal subjects with no known swallowing impairment. A 5 ml thin-liquid swallow and a 5 ml pudding swallow were measured for each subject. Nine coordinates, mapping the cranial base, mandible, vertebrae and elements of the hyolaryngeal complex, were recorded at the frames of minimum and maximum hyolaryngeal excursion. Coordinates were mathematically converted into ten variables of hyolaryngeal mechanics. Inter-rater reliability was evaluated by intraclass correlation coefficients (ICC). Two-tailed t-tests were used to evaluate differences in kinematics by bolus viscosity. Hyoid excursion measurements against different axes of reference were correlated. Inter-rater reliability among six raters for the 18 coordinates ranged from ICC = 0.90 - 0.97. A slate of ten kinematic measurements was compared by subject between the six raters. One outlier was rejected, and the mean of the remaining reliability scores was ICC = 0.91 (0.84 - 0.96, 95% CI). Two-tailed t-tests with Bonferroni corrections comparing ten kinematic variables (5 ml thin-liquid vs. 5 ml pudding swallows) showed statistically significant differences in hyoid excursion, superior laryngeal movement, and pharyngeal shortening (p < 0.005). Pearson correlations of hyoid excursion measurements from two different axes of reference were: r = 0.62, r2 = 0.38 (thin-liquid); r = 0.52, r2 = 0.27 (pudding). Obtaining landmark coordinates is a reliable method to generate multiple kinematic variables from video fluoroscopic images useful in dysphagia research.
Medicine, Issue 87, videofluoroscopy, modified barium swallow studies, hyolaryngeal kinematics, deglutition, dysphagia, dysphagia research, hyolaryngeal complex
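To illustrate how one set of landmark coordinates yields a kinematic variable (a minimal sketch; the landmark choices, the vertebral reference axis, and the pixel values below are illustrative assumptions rather than the published measurement definitions), hyoid excursion can be resolved into components along an anatomical frame built from two vertebral points.

```python
import numpy as np

def anatomical_axes(c2, c4):
    """Build a 2-D anatomical reference frame from two vertebral landmarks:
    the y-axis runs along the C2-C4 line, the x-axis is its perpendicular.
    Sign conventions depend on image orientation."""
    y_axis = (c2 - c4) / np.linalg.norm(c2 - c4)
    x_axis = np.array([-y_axis[1], y_axis[0]])      # 90-degree rotation of y_axis
    return x_axis, y_axis

def hyoid_excursion(hyoid_min, hyoid_max, c2, c4):
    """Resolve hyoid displacement (maximum-excursion frame minus minimum frame)
    into components along the vertebral reference frame."""
    x_axis, y_axis = anatomical_axes(np.asarray(c2, float), np.asarray(c4, float))
    d = np.asarray(hyoid_max, float) - np.asarray(hyoid_min, float)
    return float(d @ x_axis), float(d @ y_axis)     # (anterior, superior) in image units

# Hypothetical pixel coordinates from two video frames
anterior, superior = hyoid_excursion(hyoid_min=(120, 200), hyoid_max=(135, 182),
                                     c2=(80, 100), c4=(80, 160))
print(f"anterior = {anterior:.1f}, superior = {superior:.1f}")
```

In practice the displacement would also be calibrated (for example, scaled to a known anatomical distance) so that excursions are comparable across subjects.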
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests1,2. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
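The keyword "minimum-norm estimation" refers to the inverse step of the source analysis. Purely to make that step concrete, here is a generic L2 minimum-norm sketch in numpy; the random leadfield stands in for one computed from an individual or age-appropriate head model, and the regularization scheme shown is a common convention, not necessarily the one used at the London Baby Lab.

```python
import numpy as np

def minimum_norm_estimate(leadfield, data, snr=3.0):
    """Classic L2 minimum-norm inverse:
    s_hat = L^T (L L^T + lambda^2 * C)^(-1) y
    with lambda^2 = 1/SNR^2 scaled to the mean sensor-space power of the
    leadfield (data assumed whitened, so C = I)."""
    n_channels = leadfield.shape[0]
    lam2 = 1.0 / snr ** 2
    gram = leadfield @ leadfield.T
    reg = lam2 * np.trace(gram) / n_channels * np.eye(n_channels)
    kernel = leadfield.T @ np.linalg.inv(gram + reg)    # linear inverse operator
    return kernel @ data                                # source amplitudes x time

# Toy example: 128 channels, 500 cortical sources, one active source
rng = np.random.default_rng(0)
L = rng.standard_normal((128, 500))
true_src = np.zeros((500, 1)); true_src[123] = 1.0
y = L @ true_src + 0.05 * rng.standard_normal((128, 1))
s_hat = minimum_norm_estimate(L, y)
print("strongest estimated source:", int(np.argmax(np.abs(s_hat))))  # expected: 123
```

In practice the leadfield comes from a boundary-element or finite-element forward model, and the data are whitened with a noise covariance estimated from a baseline period.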
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
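As a minimal illustration of the DoE workflow (the factors, coded levels, and response values below are hypothetical, and the article's actual design is software-guided and augmented step-wise rather than a simple full factorial), a two-level factorial design can be generated and fitted with main effects and two-way interactions by least squares.

```python
import itertools
import numpy as np

# Three hypothetical two-level factors affecting transient expression yield,
# coded as -1/+1: incubation temperature, plant age at infiltration, promoter variant.
factors = ["temperature", "plant_age", "promoter"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))   # 2^3 = 8 runs

# Simulated responses for the 8 runs (e.g. mg antibody per kg leaf mass)
response = np.array([12.0, 15.5, 11.0, 14.8, 18.2, 23.9, 17.5, 24.5])

def model_matrix(d):
    """Intercept, main effects, and all two-way interaction columns."""
    cols = [np.ones(len(d))] + [d[:, i] for i in range(d.shape[1])]
    cols += [d[:, i] * d[:, j] for i, j in itertools.combinations(range(d.shape[1]), 2)]
    return np.column_stack(cols)

X = model_matrix(design)
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
labels = ["intercept"] + factors + [f"{a}x{b}" for a, b in itertools.combinations(factors, 2)]
for name, c in zip(labels, coef):
    print(f"{name:>22s}: {c:+.2f}")
```

The fitted coefficients indicate which factors (and interactions) drive batch-to-batch variability and therefore deserve tighter control or further design augmentation.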
Using the Threat Probability Task to Assess Anxiety and Fear During Uncertain and Certain Threat
Authors: Daniel E. Bradford, Katherine P. Magruder, Rachel A. Korhumel, John J. Curtin.
Institutions: University of Wisconsin-Madison.
Fear of certain threat and anxiety about uncertain threat are distinct emotions with unique behavioral, cognitive-attentional, and neuroanatomical components. Both anxiety and fear can be studied in the laboratory by measuring the potentiation of the startle reflex. The startle reflex is a defensive reflex that is potentiated when an organism is threatened and the need for defense is high. The startle reflex is assessed via electromyography (EMG) in the orbicularis oculi muscle elicited by brief, intense, bursts of acoustic white noise (i.e., “startle probes”). Startle potentiation is calculated as the increase in startle response magnitude during presentation of sets of visual threat cues that signal delivery of mild electric shock relative to sets of matched cues that signal the absence of shock (no-threat cues). In the Threat Probability Task, fear is measured via startle potentiation to high probability (100% cue-contingent shock; certain) threat cues whereas anxiety is measured via startle potentiation to low probability (20% cue-contingent shock; uncertain) threat cues. Measurement of startle potentiation during the Threat Probability Task provides an objective and easily implemented alternative to assessment of negative affect via self-report or other methods (e.g., neuroimaging) that may be inappropriate or impractical for some researchers. Startle potentiation has been studied rigorously in both animals (e.g., rodents, non-human primates) and humans which facilitates animal-to-human translational research. Startle potentiation during certain and uncertain threat provides an objective measure of negative affective and distinct emotional states (fear, anxiety) to use in research on psychopathology, substance use/abuse and broadly in affective science. As such, it has been used extensively by clinical scientists interested in psychopathology etiology and by affective scientists interested in individual differences in emotion.
Behavior, Issue 91, Startle; electromyography; shock; addiction; uncertainty; fear; anxiety; humans; psychophysiology; translational
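The potentiation score described above is a simple difference of mean startle magnitudes between threat and matched no-threat cues; the sketch below shows that calculation with hypothetical EMG values (the numbers are illustrative only).

```python
import numpy as np

def startle_potentiation(threat_magnitudes, no_threat_magnitudes):
    """Startle potentiation = mean EMG startle magnitude during threat cues
    minus mean magnitude during matched no-threat cues."""
    return np.mean(threat_magnitudes) - np.mean(no_threat_magnitudes)

# Hypothetical orbicularis oculi EMG magnitudes (microvolts) for one participant
certain_threat   = [88, 95, 102, 91]    # 100% shock cues -> fear
uncertain_threat = [72, 80, 76, 69]     # 20% shock cues  -> anxiety
no_threat        = [45, 50, 48, 52]

print("fear potentiation:   ", startle_potentiation(certain_threat, no_threat))
print("anxiety potentiation:", startle_potentiation(uncertain_threat, no_threat))
```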
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
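For the oscillation part of the technique, the prosthesis is swung as a pendulum and its moment of inertia follows from the oscillation period and the distance from the pivot to the center of mass (located with the reaction board). The sketch below applies the standard compound-pendulum equation and the parallel-axis theorem to hypothetical values.

```python
import math

def moment_of_inertia_from_oscillation(mass, period, dist_pivot_to_com, g=9.81):
    """Compound-pendulum (oscillation) technique:
    I_pivot = m * g * d * T^2 / (4 * pi^2)
    then the parallel-axis theorem gives the value about the center of mass:
    I_com = I_pivot - m * d^2"""
    i_pivot = mass * g * dist_pivot_to_com * period ** 2 / (4.0 * math.pi ** 2)
    return i_pivot - mass * dist_pivot_to_com ** 2

# Hypothetical below-knee prosthesis: 1.6 kg, center of mass 0.25 m below the
# suspension point, mean oscillation period of 1.10 s over repeated swings.
print(f"I_com ~ {moment_of_inertia_from_oscillation(1.6, 1.10, 0.25):.4f} kg*m^2")
```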
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no to more continuous variables such as latency and duration of proboscis extension, can be used to test hypotheses. And we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Hyponeophagia: A Measure of Anxiety in the Mouse
Authors: Rob M.J. Deacon.
Institutions: University of Oxford.
Before fast-acting and potent rodenticides such as alpha-chloralose came into use, the work of pest controllers was often hampered by a phenomenon known as "bait shyness". Mice and rats cannot vomit, due to the tightness of the cardiac sphincter of the stomach, so to overcome the problem of potential food toxicity they have evolved a strategy of first ingesting only very small amounts of novel substances. The amounts ingested then gradually increase until the animal has determined whether the substance is safe and nutritious. So the old rat-catchers would first put a palatable substance such as oatmeal, which was to be the vehicle for the toxin, in the infested area. Only when large amounts were being readily consumed would they then add the poison, in amounts calculated not to affect the taste of the vehicle. The poisoned bait, which the animals were now readily eating in large amounts, would then swiftly perform its function. Bait shyness is now used in the behavioural laboratory as a way of measuring anxiety. A highly palatable but novel substance, such as sweet corn, nuts or sweetened condensed milk, is offered to the mice (or rats) in a novel situation, such as a new cage. The latency to consume a defined amount of the new food is then measured. Robert M.J. Deacon can be reached at
Neuroscience, Issue 51, Anxiety, hyponeophagia, bait shyness, mice, hippocampus, strain differences, plus-maze
Major Components of the Light Microscope
Authors: Victoria Centonze Frohlich.
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
The light microscope is a basic tool for the cell biologist, who should have a thorough understanding of how it works, how it should be aligned for different applications, and how it should be maintained as required to obtain maximum image-forming capacity and resolution. The components of the microscope are described in detail here.
Basic Protocols, Issue 17, Current Protocols Wiley, Microscopy, Objectives, Condenser, Eyepiece
Deep Neuromuscular Blockade Leads to a Larger Intraabdominal Volume During Laparoscopy
Authors: Astrid Listov Lindekaer, Henrik Halvor Springborg, Olav Istre.
Institutions: Aleris-Hamlet Hospitals, Soeborg, Denmark.
Shoulder pain is a commonly reported symptom following laparoscopic procedures such as myomectomy or hysterectomy, and recent studies have shown that lowering the insufflation pressure during surgery may reduce the risk of post-operative pain. In this pilot study, a method is presented for measuring the intra-abdominal space available to the surgeon during laparoscopy, in order to examine whether the relaxation produced by deep neuromuscular blockade can increase the working surgical space sufficiently to permit a reduction in the CO2 insufflation pressure. Using the laparoscopic grasper, the distance from the promontory to the skin is measured at two different insufflation pressures: 8 mm Hg and 12 mm Hg. After the initial measurements, a neuromuscular blocking agent (rocuronium) is administered to the patient and the intra-abdominal volume is measured again. Pilot data collected from 15 patients show that the intra-abdominal space at 8 mm Hg with blockade is comparable to the intra-abdominal space measured at 12 mm Hg without blockade. The impact of neuromuscular blockade was not correlated with patient height, weight, BMI, or age. Thus, using neuromuscular blockade to maintain a steady volume while reducing insufflation pressure may produce improved patient outcomes.
Medicine, Issue 76, Anatomy, Physiology, Neurobiology, Surgery, gynecology, laparoscopy, deep neuromuscular blockade, reversal, rocuronium, sugammadex, laparoscopic surgery, clinical techniques, surgical techniques

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.
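JoVE does not describe its matching algorithm here, but as a rough illustration of how abstract-to-video matching of this kind can work, the sketch below ranks a few hypothetical video descriptions against a PubMed-style abstract by TF-IDF cosine similarity (all names and texts are made up).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical, heavily abbreviated video descriptions
video_descriptions = {
    "forced_oscillation": "measurement of respiratory system mechanics in mice",
    "bronchial_thermoplasty": "bronchoscopic treatment of severe persistent asthma",
    "light_microscope": "major components and alignment of the light microscope",
}
pubmed_abstract = "combination inhaled corticosteroids for persistent asthma trials"

# Vectorize videos and abstract together, then score the abstract against each video
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(list(video_descriptions.values()) + [pubmed_abstract])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

for name, score in sorted(zip(video_descriptions, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {name}")
```

A real system would also need synonym handling and domain-specific weighting, which is one reason purely lexical matches can sometimes surface only loosely related videos.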