In recent years, it has become apparent that genomic instability is tightly linked to many developmental disorders, cancers, and aging. Given that stem cells are responsible for ensuring tissue homeostasis and repair throughout life, it is reasonable to hypothesize that the stem cell population is critical for preserving the genomic integrity of tissues. Therefore, significant interest has arisen in assessing the impact of endogenous and environmental factors on genomic integrity in stem cells and their progeny, with the aim of understanding the etiology of stem cell-based diseases.
LacI transgenic mice carry a recoverable λ phage vector encoding the LacI reporter system, in which the LacI gene serves as the mutation reporter. A mutated LacI gene fails to repress the lacZ gene, resulting in production of β-galactosidase, which cleaves a chromogenic substrate and turns it blue. The LacI reporter system is carried in all cells, including stem/progenitor cells, and can easily be recovered and used to subsequently infect E. coli. After incubating the infected E. coli on agarose containing the appropriate substrate, plaques can be scored: blue plaques indicate a mutant LacI gene, while clear plaques harbor the wild-type gene. The frequency of blue plaques among total plaques indicates the mutant frequency in the original cell population from which the DNA was extracted. Sequencing the mutant LacI gene reveals the location and type of the mutations.
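The plaque arithmetic described above is straightforward; a minimal sketch (with hypothetical plaque counts, not data from the protocol):

```python
def mutant_frequency(blue_plaques, clear_plaques):
    """Mutant frequency = mutant (blue) plaques / total plaques screened."""
    total = blue_plaques + clear_plaques
    if total == 0:
        raise ValueError("no plaques scored")
    return blue_plaques / total

# Hypothetical counts from one packaging reaction:
freq = mutant_frequency(blue_plaques=12, clear_plaques=199988)
print(f"mutant frequency: {freq:.2e}")  # 12 / 200,000 = 6.00e-05
```

In practice, spontaneous mutant frequencies are estimated over hundreds of thousands of plaques per sample, so counts from several packaging reactions are typically pooled before dividing.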
The LacI transgenic mouse model is well established as an in vivo mutagenesis assay. Moreover, the mice and the reagents for the assay are commercially available. Here we describe in detail how this model can be adapted to measure the frequency of spontaneously occurring DNA mutants in stem cell-enriched Lin-IL7R-Sca-1+cKit++ (LSK) cells and other subpopulations of the hematopoietic system.
An Experimental Paradigm for the Prediction of Post-Operative Pain (PPOP)
Institutions: University of Washington School of Medicine.
Many women undergo cesarean delivery without problems; however, some experience significant pain after cesarean section. Pain is associated with negative short-term and long-term effects on the mother. Prior to surgery, can we predict who is at risk of developing significant postoperative pain, and potentially prevent or minimize its negative consequences? These are the fundamental questions that a team from the University of Washington, Stanford University, the Catholic University in Brussels, Belgium, Santa Joana Women's Hospital in São Paulo, Brazil, and Rambam Medical Center in Israel is currently evaluating in an international research collaboration. The ultimate goal of this project is to provide optimal pain relief during and after cesarean section by offering individualized anesthetic care to women who appear to be more 'susceptible' to pain after surgery.
A significant number of women experience moderate or severe acute post-partum pain after vaginal and cesarean deliveries.1 Furthermore, 10-15% of women suffer chronic persistent pain after cesarean section.2 With the constant increase in cesarean rates in the US3 and the already high rate in Brazil, this is bound to create a significant public health problem. When women are asked about their fears and expectations regarding cesarean section, pain during and after the procedure is their greatest concern.4
Individual variability in the severity of pain after vaginal or operative delivery is influenced by multiple factors, including sensitivity to pain, psychological factors, age, and genetics. The unique birth experience leads to unpredictable requirements for analgesics, from 'none at all' to 'very high' doses of pain medication. Pain after cesarean section is an excellent model for studying post-operative pain because the surgery is performed on otherwise young and healthy women. Attenuating pain during the acute phase is therefore recommended, because severe acute pain may lead to chronic pain disorders. The impact of developing persistent pain is immense, since it may impair not only the ability of women to care for their child in the immediate postpartum period, but also their own well-being for a long period of time.
In a series of projects, an international research network is currently investigating the effect of pregnancy on pain modulation and ways to predict who will suffer acute severe pain, and potentially chronic pain, by using simple pain tests and questionnaires in combination with genetic analysis. A relatively recent approach to investigating pain modulation is the psychophysical measure of Diffuse Noxious Inhibitory Control (DNIC). This pain-modulating process is the neurophysiological basis for the well-known phenomenon of 'pain inhibits pain' from remote areas of the body. The DNIC paradigm has recently evolved into a simple clinical test and has been shown to be a predictor of post-operative pain.5
Since pregnancy is associated with decreased pain sensitivity and/or enhanced processes of pain modulation, using tests that investigate pain modulation should provide a better understanding of the pathways involved with pregnancy-induced analgesia and may help predict pain outcomes during labor and delivery. For those women delivering by cesarean section, a DNIC test performed prior to surgery along with psychosocial questionnaires and genetic tests should enable one to identify women prone to suffer severe post-cesarean pain and persistent pain. These clinical tests should allow anesthesiologists to offer not only personalized medicine to women with the promise to improve well-being and satisfaction, but also a reduction in the overall cost of perioperative and long term care due to pain and suffering. On a larger scale, these tests that explore pain modulation may become bedside screening tests to predict the development of pain disorders following surgery.
JoVE Medicine, Issue 35, diffuse noxious inhibitory control, DNIC, temporal summation, TS, psychophysical testing, endogenous analgesia, pain modulation, pregnancy-induced analgesia, cesarean section, post-operative pain, prediction
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Institutions: University of Bern, MCL Laboratories Inc.
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population,1 and almost two million people are killed by TB each year.2 Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3 The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are prerequisites for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2
Effective diagnosis of pulmonary TB requires the availability - on a global scale - of standardized, easy-to-use, and robust diagnostic tools that would allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR-TB), as RIF resistance has been reported in >95% of MDR-TB isolates.4,5
The rapid availability of reliable test results is likely to directly translate into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2
Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6,7 meets the demands outlined above in a remarkable manner. It is a nucleic acid amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene. It is designed for use with Cepheid's GeneXpert Dx System, which integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, a personal computer, a barcode scanner, and preloaded software for running tests and viewing the results.9 It employs single-use disposable Xpert MTB/RIF cartridges that hold the PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6
Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive, and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings,3 provided that such settings can afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
Fundus Photography as a Convenient Tool to Study Microvascular Responses to Cardiovascular Disease Risk Factors in Epidemiological Studies
Institutions: Flemish Institute for Technological Research (VITO), Hasselt University, Hasselt University, Leuven University.
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina is a tissue that lines the interior of the eye and it is the only tissue that allows for a non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive to apply in epidemiological studies and to monitor cardiovascular health from early age up to old age.
Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. The vessel widths are derived using image analysis software, and the widths of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). These features have been shown to be useful for studying the impact of modifiable lifestyle and environmental cardiovascular disease risk factors.
The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2% and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient and instrumental tool for epidemiological studies to study microvascular responses to cardiovascular disease risk factors.
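The coefficient of variation quoted above is simply the standard deviation of repeated measurements divided by their mean; a minimal sketch with hypothetical repeated CRAE readings (the values below are illustrative, not study data):

```python
import statistics

def coefficient_of_variation(measurements):
    """CV (%) = sample standard deviation / mean * 100."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    return sd / mean * 100

# Hypothetical repeated CRAE readings (in µm) for one participant:
crae_repeats = [148.2, 149.1, 147.8, 148.6]
cv = coefficient_of_variation(crae_repeats)
print(f"CV = {cv:.2f}%")
```

A CV below 2%, as reported for CRAE and CRVE, means repeated gradings of the same image set differ by well under 3 µm for typical vessel calibers.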
Medicine, Issue 92, retina, microvasculature, image analysis, Central Retinal Arteriolar Equivalent, Central Retinal Venular Equivalent, air pollution, particulate matter, black carbon
Making Sense of Listening: The IMAP Test Battery
Institutions: MRC Institute of Hearing Research, National Biomedical Research Unit in Hearing.
The ability to hear is only the first step towards making sense of the range of information contained in an auditory signal. Of equal importance are the abilities to extract and use the information encoded in the auditory signal. We refer to these as listening skills (or auditory processing, AP). Deficits in these skills are associated with delayed language and literacy development, though the nature of the relevant deficits and their causal connection with these delays are hotly debated.
When a child is referred to a health professional with normal hearing and unexplained difficulties in listening, or associated delays in language or literacy development, they should ideally be assessed with a combination of psychoacoustic (AP) tests, suitable for children and for use in a clinic, together with cognitive tests to measure attention, working memory, IQ, and language skills. Such a detailed examination needs to be relatively short and within the technical capability of any suitably qualified professional. Current tests for the presence of AP deficits tend to be poorly constructed and inadequately validated within the normal population. They have little or no reference to the presenting symptoms of the child, and typically include a linguistic component. Poor performance may thus reflect problems with language rather than with AP. To assist in the assessment of children with listening difficulties, pediatric audiologists need a single, standardized child-appropriate test battery based on the use of language-free stimuli.
We present the IMAP test battery, which was developed at the MRC Institute of Hearing Research to supplement tests currently used to investigate cases of suspected AP deficits. IMAP assesses a range of relevant auditory and cognitive skills and takes about one hour to complete. It has been standardized in 1500 normally hearing children, aged 6-11 years, from across the UK. Since its development, it has been successfully used in a number of large-scale studies in both the UK and the USA. IMAP provides measures for separating out sensory from cognitive contributions to hearing. It further limits confounds due to procedural effects by presenting tests in a child-friendly game format. Stimulus generation, management of test protocols, and control of test presentation are mediated by the IHR-STAR software platform. This provides a standardized methodology for a range of applications and ensures replicable procedures across testers. IHR-STAR provides a flexible, user-programmable environment that currently has additional applications for hearing screening, mapping cochlear implant electrodes, and academic research or teaching.
Neuroscience, Issue 44, Listening skills, auditory processing, auditory psychophysics, clinical assessment, child-friendly testing
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects the electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examining systems-biology aspects of migraine, since SD activates trigeminal nociceptive pathways. However, whole-animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations, where environmental conditions can be controlled, are necessary. Here, it is important to recognize the limitations of acute slices and the distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling, since preparing the slices alone triggers pro-inflammatory changes that last days, epileptiform behavior due to the high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers.
In contrast, we examine immune signaling in mature hippocampal slice cultures, since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived, and SD can be induced on consecutive days without injury, making this preparation the sole means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, the use of cDNA preamplification for detection of ultra-low-level targets such as interferon-gamma, using whole, regional, or specific cell-enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Collection, Isolation, and Flow Cytometric Analysis of Human Endocervical Samples
Institutions: University of Manitoba, University of Manitoba.
Despite the public health importance of mucosal pathogens (including HIV), relatively little is known about mucosal immunity, particularly at the female genital tract (FGT). Because heterosexual transmission now represents the dominant mechanism of HIV transmission, and given the continual spread of sexually transmitted infections (STIs), it is critical to understand the interplay between host and pathogen at the genital mucosa. The substantial gaps in knowledge around FGT immunity are partially due to the difficulty in successfully collecting and processing mucosal samples. In order to facilitate studies with sufficient sample size, collection techniques must be minimally invasive and efficient. To this end, a protocol for the collection of cervical cytobrush samples and subsequent isolation of cervical mononuclear cells (CMC) has been optimized. Using ex vivo flow cytometry-based immunophenotyping, it is possible to accurately and reliably quantify CMC lymphocyte/monocyte population frequencies and phenotypes. This technique can be coupled with the collection of cervical-vaginal lavage (CVL), which contains soluble immune mediators including cytokines, chemokines, and anti-proteases, all of which can be used to determine the anti- or pro-inflammatory environment in the vagina.
Medicine, Issue 89, mucosal, immunology, FGT, lavage, cervical, CMC
Breathing-controlled Electrical Stimulation (BreEStim) for Management of Neuropathic Pain and Spasticity
Institutions: University of Texas Health Science Center at Houston , TIRR Memorial Hermann Hospital, TIRR Memorial Hermann Hospital.
Electrical stimulation (EStim) refers to the application of electrical current to muscles or nerves in order to achieve functional and therapeutic goals. It has been extensively used in various clinical settings. Based upon recent discoveries related to the systemic effects of voluntary breathing and intrinsic physiological interactions among systems during voluntary breathing, a new EStim protocol, Breathing-controlled Electrical Stimulation (BreEStim), has been developed to augment the effects of electrical stimulation. In BreEStim, a single-pulse electrical stimulus is triggered and delivered to the target area when the airflow rate of an isolated voluntary inspiration reaches the threshold. BreEStim integrates intrinsic physiological interactions that are activated during voluntary breathing and has demonstrated excellent clinical efficacy. Two representative applications of BreEStim are reported with detailed protocols: management of post-stroke finger flexor spasticity and neuropathic pain in spinal cord injury.
Medicine, Issue 71, Neuroscience, Neurobiology, Anatomy, Physiology, Behavior, electrical stimulation, BreEStim, electrode, voluntary breathing, respiration, inspiration, pain, neuropathic pain, pain management, spasticity, stroke, spinal cord injury, brain, central nervous system, CNS, clinical, electromyogram, neuromuscular electrical stimulation
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of the net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in the bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of the 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
Assessing Cell Cycle Progression of Neural Stem and Progenitor Cells in the Mouse Developing Brain after Genotoxic Stress
Institutions: CEA DSV iRCM SCSR, INSERM, U967, Université Paris Diderot, Sorbonne Paris Cité, Université Paris Sud, UMR 967.
Neurons of the cerebral cortex are generated during brain development from different types of neural stem and progenitor cells (NSPC), which form a pseudostratified epithelium lining the lateral ventricles of the embryonic brain. Genotoxic stresses, such as ionizing radiation, have highly deleterious effects on the developing brain, related to the high sensitivity of NSPC. Elucidation of the cellular and molecular mechanisms involved depends on the characterization of the DNA damage response of these particular types of cells, which requires an accurate method to determine NSPC progression through the cell cycle in the damaged tissue. Shown here is a method based on successive intraperitoneal injections of EdU and BrdU into pregnant mice and subsequent detection of these two thymidine analogues in coronal sections of the embryonic brain. EdU and BrdU are both incorporated into the DNA of replicating cells during S phase and are detected by two different techniques (an azide or a specific antibody, respectively), which facilitates their simultaneous detection. EdU and BrdU staining are then determined for each NSPC nucleus as a function of its distance from the ventricular margin in a standard region of the dorsal telencephalon. This dual-labeling technique thus makes it possible to distinguish cells that progressed through the cell cycle from those that activated a cell cycle checkpoint leading to cell cycle arrest in response to DNA damage.
An example experiment is presented, in which EdU was injected before irradiation and BrdU immediately after, with analyses performed within the 4 hr following irradiation. This protocol provides an accurate analysis of the acute DNA damage response of NSPC as a function of the cell cycle phase at which they were irradiated. The method is easily transposable to many other systems in order to determine the impact of a particular treatment on cell cycle progression in living tissues.
Neuroscience, Issue 87, EdU, BrdU, in utero irradiation, neural stem and progenitor cells, cell cycle, embryonic cortex, immunostaining, cell cycle checkpoints, apoptosis, genotoxic stress, embryonic mouse brain
Measuring Fluxes of Mineral Nutrients and Toxicants in Plants with Radioactive Tracers
Institutions: University of Toronto.
Unidirectional influx and efflux of nutrients and toxicants, and their resultant net fluxes, are central to the nutrition and toxicology of plants. Radioisotope tracing is a major technique used to measure such fluxes, both within plants and between plants and their environments. Flux data obtained with radiotracer protocols can help elucidate the capacity, mechanism, regulation, and energetics of transport systems for specific mineral nutrients or toxicants, and can provide insight into compartmentation and turnover rates of subcellular mineral and metabolite pools. Here, we describe two major radioisotope protocols used in plant biology: direct influx (DI) and compartmental analysis by tracer efflux (CATE). We focus on flux measurement of potassium (K+) as a nutrient and ammonia/ammonium (NH3/NH4+) as a toxicant in intact seedlings of the model species barley (Hordeum vulgare L.). These protocols can be readily adapted to other experimental systems (e.g., different species, excised plant material, and other nutrients/toxicants). Advantages and limitations of these protocols are discussed.
Environmental Sciences, Issue 90, influx, efflux, net flux, compartmental analysis, radiotracers, potassium, ammonia, ammonium
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing, and thinking about letters, words, and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would; it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color, and that these associations are similar in some respects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Institutions: Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested on 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways, and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program is maximized through targeted instructor and volunteer training and a structured detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
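The keyword list below mentions minimum-norm estimation, the inverse method used to reconstruct cortical generators from scalp recordings. As a rough illustration of the idea only (not the London Baby Lab's actual pipeline), a Tikhonov-regularized minimum-norm inverse maps sensor data back onto distributed sources; the leadfield matrix and dimensions here are invented:

```python
import numpy as np

def minimum_norm_inverse(leadfield, data, lam=1e-6):
    """Tikhonov-regularized minimum-norm source estimate.

    leadfield : (n_sensors, n_sources) gain matrix from a head model
    data      : (n_sensors,) measured potentials at one time point
    lam       : regularization weight
    """
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, data)

# Toy setting: 4 sensors, 6 candidate sources (underdetermined, as in EEG)
rng = np.random.default_rng(0)
L = rng.standard_normal((4, 6))     # stand-in for a real leadfield
x_true = np.zeros(6)
x_true[2] = 1.0                     # one active source
y = L @ x_true

x_hat = minimum_norm_inverse(L, y)
# x_hat is the smallest-norm source pattern that reproduces the sensor data
print(np.allclose(L @ x_hat, y, atol=1e-2))
```

Because the problem is underdetermined, many source patterns fit the data; the minimum-norm criterion picks the one with the smallest overall energy, which is why an accurate head model (and hence an age-appropriate leadfield) matters so much.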
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Assembly, Loading, and Alignment of an Analytical Ultracentrifuge Sample Cell
Institutions: Dynamics of Macromolecular Assembly, Laboratory of Bioengineering and Physical Science.
The analytical ultracentrifuge (AUC) is a powerful biophysical tool that allows us to record macromolecular sedimentation profiles during high speed centrifugation. When properly planned and executed, an AUC sedimentation velocity or sedimentation equilibrium experiment can reveal a great deal about a protein in regards to size and shape, sample purity, sedimentation coefficient, oligomerization states and protein-protein interactions.
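The sedimentation coefficient mentioned above is derived from how fast the sedimenting boundary moves during a velocity run. A minimal sketch of that arithmetic, with illustrative numbers rather than experimental data:

```python
import math

def sedimentation_coefficient(r1, r2, t1, t2, rpm):
    """Sedimentation coefficient (in seconds) from boundary movement.

    s = ln(r2/r1) / (omega^2 * (t2 - t1)), with boundary radii r1, r2
    in cm from the rotor center and times t1, t2 in seconds.
    """
    omega = rpm * 2 * math.pi / 60          # angular velocity in rad/s
    return math.log(r2 / r1) / (omega**2 * (t2 - t1))

# Illustrative values: boundary moves from 6.0 cm to 6.5 cm
# over 30 min at 40,000 rpm
s = sedimentation_coefficient(6.0, 6.5, 0, 1800, 40000)
print(s / 1e-13)   # expressed in Svedberg units (1 S = 1e-13 s)
```

In practice the analysis software fits the full set of radial scans, but the logarithmic dependence on boundary position is the core of the calculation.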
This technique, however, requires a rigorous level of technical attention. Sample cells hold a sectored centerpiece sandwiched between two window assemblies and are sealed with a torque of around 120-140 inch-pounds. Reference buffer and sample are loaded into the centerpiece sectors; after sealing, the cells are precisely aligned in a titanium rotor so that the optical detection systems scan both sample and reference buffer along the same radial path through the midline of each centerpiece sector while rotating at speeds of up to 60,000 rpm under very high vacuum.
Not only is proper sample cell assembly critical; sample cell components are very expensive and must be properly cared for to keep them in optimum working condition and to avoid leaks and breakage during experiments. Handle windows carefully: even the slightest crack or scratch can lead to breakage in the centrifuge. The contact between centerpiece and windows must be as tight as possible, i.e., no Newton's rings should be visible after torque is applied. Dust, lint, scratches, and oils on either the windows or the centerpiece all compromise this contact and can easily lead to solutions leaking from one sector to another or out of the centerpiece altogether. Not only are precious samples lost; leaking of solutions during an experiment causes an imbalance of pressure in the cell that often leads to broken windows and centerpieces. In addition, plug gaskets and housing plugs must be securely in place to prevent solutions from being pulled out of the centerpiece sectors through the loading holes by the high vacuum in the centrifuge chamber. Window liners and gaskets must be free of breaks and cracks that could allow movement resulting in broken windows.
This video demonstrates our procedures for sample cell assembly, torquing, loading, and rotor alignment, to help minimize component damage, solution leaking, and breakage during an AUC experiment.
Basic Protocols, Issue 33, analytical ultracentrifugation, sedimentation velocity, sedimentation equilibrium, protein characterization, sedimentation coefficient
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g., carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements, or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
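The first step of the method, estimating the local orientation of tissue patterns with Gabor filters, can be illustrated with a toy sketch. The kernel parameters, the synthetic patch, and the discrete angle bank below are invented for illustration and are not those used in the study:

```python
import numpy as np

def gabor_kernel(theta, ksize=21, sigma=3.0, wavelength=8.0):
    """Even (cosine) Gabor kernel tuned to lines oriented at `theta` radians."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    u = -x * np.sin(theta) + y * np.cos(theta)   # axis across the stripe
    kernel = (np.exp(-(x**2 + y**2) / (2 * sigma**2))
              * np.cos(2 * np.pi * u / wavelength))
    return kernel - kernel.mean()                # no response to flat regions

# Synthetic patch: a single vertical line of bright "tissue"
patch = np.zeros((21, 21))
patch[:, 10] = 1.0

# Evaluate a bank of orientations and pick the strongest response
angles = np.deg2rad(np.arange(0, 180, 15))
responses = [abs((gabor_kernel(t) * patch).sum()) for t in angles]
dominant = float(np.rad2deg(angles[int(np.argmax(responses))]))
print(dominant)  # the vertical (90 degree) line orientation wins
```

In the full method, such per-pixel orientation estimates feed the phase portrait analysis that locates node-like sites of radiating tissue patterns.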
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
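As a toy illustration of the semi-automated end of this spectrum (not the custom-designed algorithms used in the study), global thresholding followed by connected-component labeling separates a grayscale slice into discrete features; the 8x8 image and intensity values are invented:

```python
import numpy as np

def threshold_and_label(image, threshold):
    """Binarize a grayscale 2D slice and label 4-connected components."""
    mask = image >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue                       # pixel already assigned
        current += 1
        stack = [start]
        while stack:                       # iterative flood fill
            r, c = stack.pop()
            if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                    and mask[r, c] and not labels[r, c]):
                labels[r, c] = current
                stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return labels, current

# Two bright "organelles" on a dark background
img = np.zeros((8, 8))
img[1:3, 1:3] = 0.9    # feature 1
img[5:7, 4:7] = 0.8    # feature 2
labels, n = threshold_and_label(img, threshold=0.5)
print(n)  # two separate features extracted
```

Real volumes rarely segment this cleanly, which is exactly why the triage among manual, semi-automated, and fully automated approaches described above is needed.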
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention, as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, which has prevented their more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
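The covariance-based logic described above can be illustrated with a toy principal-component analysis: a single spatially distributed pattern, rather than any individual voxel, carries the group difference. The simulated "network" pattern, group sizes, and effect size below are invented for the sketch and are unrelated to the ADNI data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_per_group, n_voxels = 25, 20

# Invented distributed pattern whose expression differs between groups
pattern = rng.standard_normal(n_voxels)
pattern /= np.linalg.norm(pattern)

controls = rng.standard_normal((n_per_group, n_voxels))
patients = rng.standard_normal((n_per_group, n_voxels)) + 5.0 * pattern

data = np.vstack([controls, patients])
data = data - data.mean(axis=0)          # center each voxel

# The first right singular vector is the dominant covariance pattern;
# projecting each subject onto it gives one expression score per subject.
_, _, vt = np.linalg.svd(data, full_matrices=False)
scores = data @ vt[0]

gap = abs(scores[n_per_group:].mean() - scores[:n_per_group].mean())
print(gap > 2.0)  # the single multivariate score separates the groups
```

Collapsing many voxels into one pattern-expression score is also what sidesteps the massive multiple-comparisons correction that univariate voxel-wise testing requires.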
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Combining Behavioral Endocrinology and Experimental Economics: Testosterone and Social Decision Making
Institutions: University of Zurich, Royal Holloway, University of London.
Behavioral endocrinological research in humans as well as in animals suggests that testosterone plays a key role in social interactions. Studies in rodents have shown a direct link between testosterone and aggressive behavior1, and folk wisdom adapts these findings to humans, suggesting that testosterone induces antisocial, egoistic, or even aggressive behavior2. However, many researchers doubt a direct testosterone-aggression link in humans, arguing instead that testosterone is primarily involved in status-related behavior3,4. As high status can also be achieved by aggressive and antisocial means, it can be difficult to distinguish between antisocial and status-seeking behavior.
We therefore set up an experimental environment in which status can only be achieved by prosocial means. In a double-blind and placebo-controlled experiment, we administered a single sublingual dose of 0.5 mg of testosterone (with a hydroxypropyl-β-cyclodextrin carrier) to 121 women and investigated their social interaction behavior in an economic bargaining paradigm. Real monetary incentives are at stake in this paradigm: every player A receives a certain amount of money and has to make an offer to another player B on how to share it. If B accepts, she gets what was offered and player A keeps the rest. If B rejects the offer, nobody gets anything. A status-seeking player A is expected to avoid being rejected by behaving in a prosocial way, i.e., by making higher offers.
The results show that if expectations about the hormone are controlled for, testosterone administration leads to a significant increase in fair bargaining offers compared to placebo. The role of expectations is reflected in the fact that subjects who believe they have received testosterone make lower offers than those who believe they were treated with a placebo. These findings suggest that the experimental economics approach is sensitive enough to detect neurobiological effects as subtle as those achieved by administration of hormones. Moreover, the findings point towards the importance of both psychosocial and neuroendocrine factors in determining the influence of testosterone on human social behavior.
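The bargaining paradigm described here is the classic ultimatum game, and its payoff rule is simple enough to sketch directly (the monetary amounts below are illustrative, not the stakes used in the experiment):

```python
def ultimatum_payoffs(endowment, offer, accepted):
    """Payoffs (proposer A, responder B) in one round of the ultimatum game.

    A offers `offer` out of `endowment`; if B accepts, B gets the offer
    and A keeps the rest; if B rejects, both players earn nothing.
    """
    if offer < 0 or offer > endowment:
        raise ValueError("offer must be between 0 and the endowment")
    if accepted:
        return endowment - offer, offer
    return 0, 0

# A offers 4 of 10 monetary units and B accepts
print(ultimatum_payoffs(10, 4, True))   # (6, 4)
# B rejects a low offer: both players leave with nothing
print(ultimatum_payoffs(10, 1, False))  # (0, 0)
```

The rejection branch is what gives the game its bite: a purely payoff-maximizing responder would accept any positive offer, so proposers who make generous offers are plausibly guarding their status rather than their earnings.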
Neuroscience, Issue 49, behavioral endocrinology, testosterone, social status, decision making
Maintaining Wolbachia in Cell-free Medium
Institutions: Johns Hopkins University.
In this video protocol, procedures are demonstrated to (1) purify Wolbachia symbionts out of cultured mosquito cells, (2) use a fluorescent assay to ascertain the viability of the purified Wolbachia and (3) maintain the now extracellular Wolbachia in cell-free medium. Purified Wolbachia remain alive in the extracellular phase but do not replicate until re-inoculated into eukaryotic cells. Extracellular Wolbachia purified in this manner will remain viable for at least a week at room temperature, and possibly longer. Purified Wolbachia are suitable for micro-injection, DNA extraction and other applications.
Cellular Biology, Issue 5, mosquito, Wolbachia, infectious disease
Comprehensive & Cost Effective Laboratory Monitoring of HIV/AIDS: an African Role Model
Institutions: National Health Laboratory Services (NHLS-SA), University of Witwatersrand, Lightcurve Films.
We present a video about assisting anti-retroviral therapy (ART) with an apt laboratory service, representing a South African role model for economical large-scale diagnostic testing. In low-income countries, inexpensive ART has transformed the prospects for the survival of HIV-seropositive patients, but there are doubts about whether laboratory monitoring of ART is needed, and at what cost, in situations where the overall quality of pathology services can still be very low. The appropriate answer is to establish economically sound services with better coordination and stricter internal quality assessment than seen in western countries. This video, filmed on location at the National Health Laboratory Services (NHLS-SA) at the University of the Witwatersrand, Johannesburg, South Africa, presents such a coordinated scheme expanding the original 2-color CD4-CD45 PanLeucoGating strategy (PLG). Thus the six modules of the video presentation reveal the simplicity of a 4-color flow cytometric assay that combines haematological, immunological and virology-related tests in a single tube. These video modules are: (i) the set-up of instruments; (ii) sample preparations; (iii) testing absolute counts and monitoring quality for each sample by bead-count-rate; (iv) the haematological CD45 test for white cell counts and differentials; (v) the CD4 counts; and (vi) the activation of CD8+ T cells measured by CD38 display, a viral-load-related parameter. The potential cost savings are remarkable. This arrangement is a prime example of the feasibility of performing more than 800-1,000 tests per day with stricter quality control than that applied in western laboratories, and with transfer of technology to other laboratories within the NHLS-SA network.
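The bead-based absolute counting in module (iii) rests on simple arithmetic: a known number of beads is spiked into a known blood volume, so the ratio of cell events to bead events scales the bead concentration into a cell concentration. A sketch of that calculation, with invented numbers rather than values from the assay:

```python
def absolute_count(cell_events, bead_events, beads_added, volume_ul):
    """Single-platform absolute count (cells/ul) from a bead-spiked tube.

    The ratio of acquired cell events to bead events rescales the known
    bead concentration (beads_added / volume_ul) into cells per microliter.
    """
    if bead_events == 0:
        raise ValueError("no bead events acquired")
    return (cell_events / bead_events) * (beads_added / volume_ul)

# Illustrative numbers: 50,000 beads spiked into 100 ul of blood;
# 2,500 bead events and 1,800 CD4 events acquired on the cytometer
cd4 = absolute_count(1800, 2500, beads_added=50000, volume_ul=100)
print(round(cd4))  # 360 CD4 cells/ul
```

Monitoring the bead count rate per sample, as the video describes, doubles as an internal quality check: a drifting bead rate flags pipetting or fluidics problems before they corrupt the reported counts.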
Expert advisors, laboratory managers and policy makers who carry the duty of making decisions about introducing modern medical technology are frequently not in a position to see the latest technical details as carried out in the large regional laboratories with huge burdens of workload. Hence this video shows details of these new developments.
Immunology, Issue 44, Human Immunodeficiency virus (HIV); CD4 lymphocyte count, white cell count, CD45, panleucogating, lymphocyte activation, CD38, HIV viral load, antiretroviral therapy (ART), internal quality control
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Institutions: Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patients' values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective, quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step-by-step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time.
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians, (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care
Major Components of the Light Microscope
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
The light microscope is a basic tool for the cell biologist, who should have a thorough understanding of how it works, how it should be aligned for different applications, and how it should be maintained as required to obtain maximum image-forming capacity and resolution. The components of the microscope are described in detail here.
Basic Protocols, Issue 17, Current Protocols Wiley, Microscopy, Objectives, Condenser, Eyepiece
Proper Care and Cleaning of the Microscope
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
Keeping the microscope optics clean is important for high-quality imaging. Dust, fingerprints, excess immersion oil, or mounting medium on or in a microscope causes reduction in contrast and resolution. DIC is especially sensitive to contamination and scratches on the lens surfaces. This protocol details the procedure for keeping the microscope clean.
Basic Protocols, Issue 18, Current Protocols Wiley, Microscopy, Cleaning the Microscope