There has been long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant: infants at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or factors such as poverty, poor nutrition, or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance, including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video demonstrates the NNNS procedures, shows examples of normal and abnormal performance, and presents the various clinical populations in which the exam can be used.
Fetal Echocardiography and Pulsed-wave Doppler Ultrasound in a Rabbit Model of Intrauterine Growth Restriction
Institutions: University Hospitals Leuven, Monash University, Victoria, Australia, Katholieke Universiteit Leuven, Institut d'Investigacions Biomediques August Pi i Sunyer (IDIBAPS), Universitat de Barcelona, Centro de Investigación Biomédica en Red de Enfermedades Raras (CIBERER).
Fetal intrauterine growth restriction (IUGR) results in abnormal cardiac function that is apparent antenatally thanks to advances in fetoplacental Doppler ultrasound and fetal echocardiography. Increasingly, these imaging modalities are being employed clinically to examine cardiac function and assess wellbeing in utero, thereby guiding the timing of birth decisions. Here, we used a rabbit model of IUGR that allows analysis of cardiac function in a clinically relevant way. Under isoflurane anesthesia, IUGR is surgically created at gestational age day 25 by performing a laparotomy, exposing the bicornuate uterus, and ligating 40-50% of the uteroplacental vessels supplying each gestational sac in a single uterine horn. The fetuses in the other horn of the bicornuate uterus serve as internal controls. After recovery, at gestational age day 30 (full term), the same rabbit undergoes examination of fetal cardiac function. Anesthesia is induced with intramuscular ketamine and xylazine, then maintained by a continuous intravenous infusion of ketamine and xylazine to minimize iatrogenic effects on fetal cardiac function. A repeat laparotomy is performed to expose each gestational sac, and a microultrasound examination (VisualSonics VEVO 2100) of fetal cardiac function is performed. Placental insufficiency is evident from a raised pulsatility index or an absent or reversed end-diastolic flow in the umbilical artery Doppler waveform. The ductus venosus and middle cerebral artery Doppler waveforms are then examined. Fetal echocardiography is performed by recording B-mode, M-mode, and flow velocity waveforms in lateral and apical views. Offline calculations determine standard M-mode cardiac variables, tricuspid and mitral annular plane systolic excursion, speckle tracking and strain analysis, the modified myocardial performance index, and vascular flow velocity waveforms of interest. This small animal model of IUGR therefore affords examination of in utero cardiac function in a manner consistent with current clinical practice, making it useful in a translational research setting.
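The pulsatility index mentioned above is the standard Doppler index PI = (PSV − EDV) / TAV, where PSV and EDV are the peak systolic and end-diastolic velocities and TAV is the time-averaged velocity over the cardiac cycle. A minimal sketch (the velocity values below are illustrative only, not from this study):

```python
def pulsatility_index(psv, edv, tav):
    """Doppler pulsatility index: (peak systolic - end diastolic) / time-averaged velocity."""
    return (psv - edv) / tav

# Illustrative velocities in cm/s:
print(pulsatility_index(40.0, 10.0, 20.0))  # 1.5
print(pulsatility_index(40.0, 0.0, 16.0))   # 2.5 (absent end-diastolic flow raises PI)
```

Note how absent end-diastolic flow (EDV = 0) drives the index upward, which is why a raised PI flags placental insufficiency.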
Medicine, Issue 76, Developmental Biology, Biomedical Engineering, Molecular Biology, Anatomy, Physiology, Cardiology, Fetal Therapies, Obstetric Surgical Procedures, Fetal Development, Surgical Procedures, Operative, intrauterine growth restriction, fetal echocardiography, Doppler ultrasound, fetal hemodynamics, animal model, clinical techniques
Progressive-ratio Responding for Palatable High-fat and High-sugar Food in Mice
Institutions: University of Montreal.
Foods that are rich in fat and sugar significantly contribute to over-eating and escalating rates of obesity. The consumption of palatable foods can produce a rewarding effect that strengthens action-outcome associations and reinforces future behavior directed at obtaining these foods. Increasing evidence that the rewarding effects of energy-dense foods play a profound role in overeating and the development of obesity has heightened interest in studying the genes, molecules, and neural circuitry that modulate food reward1,2. The rewarding impact of different stimuli can be studied by measuring the willingness to work to obtain them, such as in operant conditioning tasks3. Operant models of food reward measure acquired and voluntary behavioral responses that are directed at obtaining food. A commonly used measure of reward strength is an operant procedure known as the progressive ratio (PR) schedule of reinforcement4,5. In the PR task, the subject is required to make an increasing number of operant responses for each successive reward. The pioneering study of Hodos (1961) demonstrated that the number of responses made to obtain the last reward, termed the breakpoint, serves as an index of reward strength4. While operant procedures that measure changes in response rate alone cannot separate changes in reward strength from alterations in performance capacity, the breakpoint derived from the PR schedule is a well-validated measure of the rewarding effects of food. The PR task has been used extensively to assess the rewarding impact of drugs of abuse and food in rats (e.g.,6-8), but to a lesser extent in mice9. The increased use of genetically engineered mice and diet-induced obese mouse models has heightened demand for behavioral measures of food reward in mice. In the present article we detail the materials and procedures used to train mice to respond (lever-press) for high-fat and high-sugar food pellets on a PR schedule of reinforcement. We show that breakpoint response thresholds increase following acute food deprivation and decrease with peripheral administration of the anorectic hormone leptin, thereby validating the use of this food-operant paradigm in mice.
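The escalating-requirement logic of a PR schedule and its breakpoint can be sketched in a few lines. The exponential progression below (a Richardson-and-Roberts-style formula) is a common choice in the literature, but the exact progression used in any given protocol is an assumption here:

```python
import math

def pr_requirements(n, step=0.2, scale=5):
    """Response requirement for each of n successive rewards under an
    exponential progressive-ratio schedule (assumed progression:
    round(scale * e^(step * i)) - scale, floored at 1)."""
    return [max(1, round(scale * math.exp(step * (i + 1))) - scale) for i in range(n)]

def get_breakpoint(requirements, completed_ratios):
    """Breakpoint = response requirement of the last ratio the animal completed."""
    if completed_ratios == 0:
        return 0
    return requirements[completed_ratios - 1]

reqs = pr_requirements(10)
print(reqs[:3])              # [1, 2, 4] -- requirements escalate per reward
print(get_breakpoint(reqs, 7))  # 15 -- the index of reward strength
```

A leptin-treated mouse that quits after fewer completed ratios thus yields a lower breakpoint, while food deprivation shifts it upward.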
Neuroscience, Issue 63, behavioral neuroscience, operant conditioning, food, reward, obesity, leptin, mouse
A Protocol for the Production of KLRG1 Tetramer
Institutions: Brown University.
Killer cell lectin-like receptor G1 (KLRG1) is a type II transmembrane glycoprotein inhibitory receptor belonging to the C type lectin-like superfamily. KLRG1 exists both as a monomer and as a disulfide-linked homodimer. This well-conserved receptor is found on the most mature and recently activated NK cells as well as on a subset of effector/memory T cells.
Using the KLRG1 tetramer as well as other methods, E-, N-, and R-cadherins were identified as KLRG1 ligands. These Ca2+-dependent cell-cell adhesion molecules comprise an extracellular domain containing five cadherin repeats responsible for cell-cell interactions, a transmembrane domain, and a cytoplasmic domain that is linked to the actin cytoskeleton.
Generation of the KLRG1 tetramer was essential to the identification of the KLRG1 ligands. KLRG1 tetramer is also a unique tool to elucidate the roles cadherin and KLRG1 play in regulating the immune response and tissue integrity.
Microbiology, Issue 35, Immunology, Basic Protocols, Tetramer, Inclusion Bodies, Refolding, Monomer, Flow Cytometry, KLRG1, Cadherins
Optical Frequency Domain Imaging of Ex vivo Pulmonary Resection Specimens: Obtaining One to One Image to Histopathology Correlation
Institutions: Harvard Medical School, Massachusetts General Hospital, Harvard Medical School, Massachusetts General Hospital, Harvard Medical School.
Lung cancer is the leading cause of cancer-related deaths1. Squamous cell and small cell cancers typically arise in association with the conducting airways, whereas adenocarcinomas are typically more peripheral in location. Early detection of lung malignancy may be difficult due to several limitations: radiological resolution, bronchoscopic limitations in evaluating tissue underlying the airway mucosa and identifying early pathologic changes, and small sample size and/or incomplete sampling in histology biopsies. High resolution imaging modalities, such as optical frequency domain imaging (OFDI), provide non-destructive, large-area, 3-dimensional views of tissue microstructure to depths approaching 2 mm in real time (Figure 1). OFDI has been utilized in a variety of applications, including evaluation of coronary artery atherosclerosis6,7 and esophageal intestinal metaplasia and dysplasia6,8-10. Bronchoscopic OCT/OFDI has been demonstrated as a safe in vivo imaging tool for evaluating the pulmonary airways11-23. OCT has been assessed in the pulmonary airways16,23 of animal models and in vivo. OCT imaging of normal airway has demonstrated visualization of airway layering and alveolar attachments, and evaluation of dysplastic lesions has been found useful in distinguishing grades of dysplasia in the bronchial mucosa11,12,20,21. OFDI imaging of bronchial mucosa has been demonstrated in a short bronchial segment (0.8 cm)18. Additionally, volumetric OFDI spanning multiple airway generations in swine and human pulmonary airways in vivo has been described19. Endobronchial OCT/OFDI is typically performed using thin, flexible catheters, which are compatible with standard bronchoscopic access ports. Additionally, OCT and OFDI needle-based probes have recently been developed, which may be used to image regions of the lung beyond the airway wall or pleural surface17.
While OCT/OFDI has been demonstrated as feasible for in vivo pulmonary imaging, no studies with precisely matched one-to-one OFDI:histology correlation have been performed. Therefore, specific imaging criteria for various pulmonary pathologies have yet to be developed. Histopathological counterparts obtained in vivo consist of only small biopsy fragments, which are difficult to correlate with large OFDI datasets, and they do not provide the comprehensive histology needed for registration with large-volume OFDI. As a result, specific imaging features of pulmonary pathology cannot be developed in the in vivo setting. Precisely matched, one-to-one OFDI and histology correlation is vital to accurately evaluate features seen in OFDI against histology as a gold standard, in order to derive specific image interpretation criteria for pulmonary neoplasms and other pulmonary pathologies. Once specific imaging criteria have been developed and validated ex vivo with matched one-to-one histology, the criteria may then be applied to in vivo imaging studies. Here, we present a method for precise, one-to-one correlation between high-resolution optical imaging and histology in ex vivo lung resection specimens. Throughout this manuscript, we describe the techniques used to match OFDI images to histology. However, this method is not specific to OFDI and can be used to obtain histology-registered images for any optical imaging technique. We performed airway-centered OFDI with a specialized custom-built bronchoscopic 2.4 French (0.8 mm diameter) catheter. Tissue samples were marked with tissue dye, visible in both OFDI and histology. Careful orientation procedures were used to precisely correlate imaging and histological sampling locations. The techniques outlined in this manuscript were used to conduct the first demonstration of volumetric OFDI with precise correlation to tissue-based diagnosis for evaluating pulmonary pathology24. This straightforward, effective technique may be extended to other tissue types to provide the precise imaging-to-histology correlation needed to determine fine imaging features of both normal and diseased tissues.
Bioengineering, Issue 71, Medicine, Biomedical Engineering, Anatomy, Physiology, Cancer Biology, Pathology, Surgery, Bronchoscopic imaging, In vivo optical microscopy, Optical imaging, Optical coherence tomography, Optical frequency domain imaging, Histology correlation, animal model, histopathology, airway, lung, biopsy, imaging
Utility of Dissociated Intrinsic Hand Muscle Atrophy in the Diagnosis of Amyotrophic Lateral Sclerosis
Institutions: Westmead Hospital, University of Sydney, Australia.
The split hand phenomenon refers to predominant wasting of the thenar muscles and is an early and specific feature of amyotrophic lateral sclerosis (ALS). A novel split hand index (SI) was developed to quantify this phenomenon, and its diagnostic utility was assessed in ALS patients. The split hand index is derived by dividing the product of the compound muscle action potential (CMAP) amplitudes recorded over the abductor pollicis brevis and first dorsal interosseous muscles by the CMAP amplitude recorded over the abductor digiti minimi muscle. In order to assess the diagnostic utility of the split hand index, ALS patients were prospectively assessed and their results were compared to those of patients with other neuromuscular disorders. The split hand index was significantly reduced in ALS patients compared to neuromuscular disorder patients (P<0.0001). Limb-onset ALS patients exhibited the greatest reduction in the split hand index, and a value of 5.2 or less reliably differentiated ALS from other neuromuscular disorders. Consequently, the split hand index appears to be a novel diagnostic biomarker for ALS, perhaps facilitating an earlier diagnosis.
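The index described above reduces to a single formula, SI = (CMAP_APB × CMAP_FDI) / CMAP_ADM, with a reported cutoff of 5.2. A minimal sketch (the amplitude values below are hypothetical, chosen only to illustrate the arithmetic):

```python
def split_hand_index(cmap_apb, cmap_fdi, cmap_adm):
    """SI = (APB CMAP x FDI CMAP) / ADM CMAP, amplitudes in mV."""
    return (cmap_apb * cmap_fdi) / cmap_adm

# Hypothetical amplitudes (mV) for a patient with marked thenar wasting:
si = split_hand_index(2.0, 3.0, 8.0)
print(si)                     # 0.75
print(si <= 5.2)              # True -- below the cutoff reported in the abstract
```

Because APB and FDI amplitudes multiply in the numerator, selective thenar wasting drives the index down sharply relative to the preserved hypothenar (ADM) amplitude.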
Medicine, Issue 85, Amyotrophic Lateral Sclerosis (ALS), dissociated muscle atrophy, hypothenar muscles, motor neuron disease, split-hand index, thenar muscles
Determination of the Transport Rate of Xenobiotics and Nanomaterials Across the Placenta using the ex vivo Human Placental Perfusion Model
Institutions: University Hospital Zurich, EMPA Swiss Federal Laboratories for Materials Testing and Research, University of Bern.
Decades ago, the human placenta was thought to be an impenetrable barrier between mother and unborn child. However, the discovery of thalidomide-induced birth defects and many later studies proved the opposite. Today, several harmful xenobiotics such as nicotine, heroin, and methadone, as well as other drugs and environmental pollutants, have been shown to cross this barrier. With the growing use of nanotechnology, the placenta is likely to come into contact with novel nanoparticles either accidentally through exposure or intentionally in the case of potential nanomedical applications. Data from animal experiments cannot be extrapolated to humans because the placenta is the most species-specific mammalian organ1. Therefore, the ex vivo dual recirculating human placental perfusion, developed by Panigel et al. in 19672 and continuously modified by Schneider et al. in 19723, can serve as an excellent model to study the transfer of xenobiotics or particles.
Here, we focus on the ex vivo dual recirculating human placental perfusion protocol and its further development to acquire reproducible results.
The placentae were obtained, after informed consent of the mothers, from uncomplicated term pregnancies undergoing caesarean delivery. The fetal and maternal vessels of an intact cotyledon were cannulated and perfused for at least five hours. As model particles, fluorescently labelled polystyrene particles of 80 and 500 nm diameter were added to the maternal circuit. The 80 nm particles were able to cross the placental barrier, providing a perfect example of a substance that is transferred across the placenta to the fetus, while the 500 nm particles were retained in the placental tissue or maternal circuit. The ex vivo human placental perfusion model is one of the few models providing reliable information about the transport behavior of xenobiotics at an important tissue barrier, delivering predictive and clinically relevant data.
Biomedical Engineering, Issue 76, Medicine, Bioengineering, Anatomy, Physiology, Molecular Biology, Biochemistry, Biophysics, Pharmacology, Obstetrics, Nanotechnology, Placenta, Pharmacokinetics, Nanomedicine, humans, ex vivo perfusion, perfusion, biological barrier, xenobiotics, nanomaterials, clinical model
Using Chronic Social Stress to Model Postpartum Depression in Lactating Rodents
Institutions: Tufts University Cummings School of Veterinary Medicine, Manchester Metropolitan University.
Exposure to chronic stress is a reliable predictor of depressive disorders, and social stress is a common ethologically relevant stressor in both animals and humans. However, many animal models of depression were developed in males and are not applicable or effective in studies of postpartum females. Recent studies have reported significant effects of chronic social stress during lactation, an ethologically relevant and effective stressor, on maternal behavior, growth, and behavioral neuroendocrinology. This manuscript will describe this chronic social stress paradigm using repeated exposure of a lactating dam to a novel male intruder, and the assessment of the behavioral, physiological, and neuroendocrine effects of this model. Chronic social stress (CSS) is a valuable model for studying the effects of stress on the behavior and physiology of the dam as well as her offspring and future generations. The exposure of pups to CSS can also be used as an early life stress that has long term effects on behavior, physiology, and neuroendocrinology.
Behavior, Issue 76, Neuroscience, Neurobiology, Physiology, Anatomy, Medicine, Biomedical Engineering, Neurobehavioral Manifestations, Mental Health, Mood Disorders, Depressive Disorder, Anxiety Disorders, behavioral sciences, Behavior and Behavior Mechanisms, Mental Disorders, Stress, Depression, Anxiety, Postpartum, Maternal Behavior, Nursing, Growth, Transgenerational, animal model
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space, and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
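The recasting logic can be sketched as follows: for each signal region, the expected signal yield is luminosity × cross section × acceptance × efficiency, and a model point is (conservatively) excluded if any single region's expected yield exceeds that region's observed 95% CL upper limit on new-physics events. This is a hedged sketch of that bookkeeping, not the paper's full statistical prescription; all numbers below are illustrative:

```python
def expected_yield(lumi_ifb, xsec_fb, acceptance, efficiency):
    """Expected signal events = integrated luminosity x cross section x A x epsilon."""
    return lumi_ifb * xsec_fb * acceptance * efficiency

def excluded(signal_regions, lumi_ifb):
    """Conservative exclusion: any one region's expected yield exceeds its
    observed 95% CL upper limit on signal events (n95_obs)."""
    return any(
        expected_yield(lumi_ifb, sr["xsec_fb"], sr["acc"], sr["eff"]) > sr["n95_obs"]
        for sr in signal_regions
    )

# Illustrative model point evaluated against two hypothetical signal regions:
regions = [
    {"xsec_fb": 50.0, "acc": 0.10, "eff": 0.70, "n95_obs": 6.0},
    {"xsec_fb": 50.0, "acc": 0.02, "eff": 0.60, "n95_obs": 4.5},
]
print(excluded(regions, lumi_ifb=4.7))  # True: the first region predicts ~16 events
```

The acceptance and efficiency values here stand in for the published tables; a more aggressive limit would instead combine regions rather than take the single best one.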
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
From Fast Fluorescence Imaging to Molecular Diffusion Law on Live Cell Membranes in a Commercial Microscope
Institutions: Scuola Normale Superiore, Instituto Italiano di Tecnologia, University of California, Irvine.
It has become increasingly evident that the spatial distribution and the motion of membrane components like lipids and proteins are key factors in the regulation of many cellular functions. However, due to the fast dynamics and the tiny structures involved, a very high spatio-temporal resolution is required to capture the real behavior of molecules. Here we present the experimental protocol for studying the dynamics of fluorescently-labeled plasma-membrane proteins and lipids in live cells with high spatiotemporal resolution. Notably, this approach does not need to track each molecule; instead, it calculates population behavior using all molecules in a given region of the membrane. The starting point is fast imaging of a given region on the membrane. Afterwards, a complete spatio-temporal autocorrelation function is calculated by correlating acquired images at increasing time delays, for example every 2, 3, ..., n repetitions. It can be shown that the width of the peak of the spatial autocorrelation function increases with increasing time delay as a function of particle movement due to diffusion. Therefore, fitting the series of autocorrelation functions makes it possible to extract the actual protein mean square displacement from imaging (iMSD), presented here in the form of apparent diffusivity vs. average displacement. This yields a quantitative view of the average dynamics of single molecules with nanometer accuracy. By using a GFP-tagged variant of the transferrin receptor (TfR) and an ATTO488-labeled 1-palmitoyl-2-hydroxy-sn-glycero-3-phosphoethanolamine (PPE), it is possible to observe the spatiotemporal regulation of protein and lipid diffusion on µm-sized membrane regions in the micro-to-millisecond time range.
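For free diffusion, the iMSD analysis described above reduces to a linear law: the squared width of the autocorrelation peak grows as σ²(τ) = σ₀² + 4Dτ. A minimal sketch of the final fitting step, assuming the peak widths σ²(τ) have already been extracted (the names and synthetic values are illustrative, not from the protocol):

```python
def fit_diffusivity(taus, sigma2s):
    """Least-squares line through (tau, sigma^2) pairs; for sigma^2 = sigma0^2
    + 4*D*tau, the slope/4 gives the diffusion coefficient D."""
    n = len(taus)
    mt = sum(taus) / n
    ms = sum(sigma2s) / n
    slope = sum((t - mt) * (s - ms) for t, s in zip(taus, sigma2s)) / \
            sum((t - mt) ** 2 for t in taus)
    intercept = ms - slope * mt
    return slope / 4.0, intercept  # (D, sigma0^2)

# Synthetic free-diffusion data: D = 0.5 um^2/s, sigma0^2 = 0.04 um^2
taus = [0.002 * k for k in range(1, 6)]           # time delays in s
sigma2s = [0.04 + 4 * 0.5 * t for t in taus]       # peak widths in um^2
D, s0 = fit_diffusivity(taus, sigma2s)
print(D, s0)  # recovers ~0.5 and ~0.04
```

In real data the σ²(τ) curve deviates from a straight line when molecules are transiently confined, which is exactly what the apparent-diffusivity-vs-displacement plot is designed to reveal.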
Bioengineering, Issue 92, fluorescence, protein dynamics, lipid dynamics, membrane heterogeneity, transient confinement, single molecule, GFP
Detection of Invasive Pulmonary Aspergillosis in Haematological Malignancy Patients by using Lateral-flow Technology
Institutions: University of Exeter, Queen Mary University of London, St. Bartholomew's Hospital and The London NHS Trust.
Invasive pulmonary aspergillosis (IPA) is a leading cause of morbidity and mortality in haematological malignancy patients and hematopoietic stem cell transplant recipients1. Detection of IPA represents a formidable diagnostic challenge and, in the absence of a 'gold standard', relies on a combination of clinical data and microbiology and histopathology where feasible. Diagnosis of IPA must conform to the European Organization for Research and Treatment of Cancer and the National Institute of Allergy and Infectious Diseases Mycology Study Group (EORTC/MSG) consensus defining "proven", "probable", and "possible" invasive fungal diseases2. Currently, no nucleic acid-based tests have been externally validated for IPA detection, and so polymerase chain reaction (PCR) is not included in current EORTC/MSG diagnostic criteria.
Identification of Aspergillus in histological sections is problematic because of similarities in hyphal morphology with other invasive fungal pathogens3, and proven identification requires isolation of the etiologic agent in pure culture. Culture-based approaches rely on the availability of biopsy samples, but these are not always accessible in sick patients and do not always yield viable propagules for culture when obtained.
An important feature of the pathogenesis of Aspergillus is angio-invasion, a trait that provides opportunities to track the fungus immunologically using tests that detect characteristic antigenic signature molecules in serum and bronchoalveolar lavage (BAL) fluids. This has led to the development of the Platelia enzyme immunoassay (GM-EIA), which detects Aspergillus galactomannan, and a 'pan-fungal' assay (Fungitell test) that detects the conserved fungal cell wall component (1→3)-β-D-glucan, although not in the mucorales, which lack this component in their cell walls1,4. Issues surrounding the accuracy of these tests1,4-6 have led to the recent development of next-generation monoclonal antibody (MAb)-based assays that detect surrogate markers of infection1,5. The generation of an Aspergillus-specific MAb (JF5) using hybridoma technology, and its use to develop an immuno-chromatographic lateral-flow device (LFD) for the point-of-care (POC) diagnosis of IPA, was recently described5. A major advantage of the LFD is its ability to detect fungal activity, since MAb JF5 binds to an extracellular glycoprotein antigen that is secreted only during active growth of the fungus5. This is an important consideration when using fluids such as lung BAL for diagnosing IPA, since Aspergillus spores are a common component of inhaled air. The utility of the device in diagnosing IPA has been demonstrated using an animal model of infection, where the LFD displayed improved sensitivity and specificity compared to the Platelia GM and Fungitell (1→3)-β-D-glucan assays7.
Here, we present a simple LFD procedure to detect Aspergillus antigen in human serum and BAL fluids. Its speed and accuracy provide a novel adjunct point-of-care test for the diagnosis of IPA in haematological malignancy patients.
Immunology, Issue 61, Invasive pulmonary aspergillosis, acute myeloid leukemia, bone marrow transplant, diagnosis, monoclonal antibody, lateral-flow technology
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Institutions: University of Bern, MCL Laboratories Inc.
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population1, and almost two million people are killed by TB each year2. Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy3. The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are prerequisites for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities2. Effective diagnosis of pulmonary TB requires the availability, on a global scale, of standardized, easy-to-use, and robust diagnostic tools that allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR-TB) and has been reported in >95% of MDR-TB isolates4,5. The rapid availability of reliable test results is likely to translate directly into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community2.
Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6,7 meets the demands outlined above in a remarkable manner. It is a nucleic-acid amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene. It is designed for use with Cepheid's GeneXpert Dx System, which integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, a personal computer, a barcode scanner, and preloaded software for running tests and viewing the results9. It employs single-use disposable Xpert MTB/RIF cartridges that hold the PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated6. Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive, and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3, provided that these settings are able to afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web10.
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
Construction of Vapor Chambers Used to Expose Mice to Alcohol During the Equivalent of all Three Trimesters of Human Development
Institutions: University of New Mexico Health Sciences Center.
Exposure to alcohol during development can result in a constellation of morphological and behavioral abnormalities that are collectively known as Fetal Alcohol Spectrum Disorders (FASDs). At the most severe end of the spectrum is Fetal Alcohol Syndrome (FAS), characterized by growth retardation, craniofacial dysmorphology, and neurobehavioral deficits. Studies with animal models, including rodents, have elucidated many molecular and cellular mechanisms involved in the pathophysiology of FASDs. Ethanol administration to pregnant rodents has been used to model human exposure during the first and second trimesters of pregnancy. Third trimester ethanol consumption in humans has been modeled using neonatal rodents. However, few rodent studies have characterized the effect of ethanol exposure during the equivalent of all three trimesters of human pregnancy, a pattern of exposure that is common in pregnant women. Here, we show how to build vapor chambers from readily obtainable materials that can each accommodate up to six standard mouse cages. We describe a vapor chamber paradigm that can be used to model exposure to ethanol, with minimal handling, during all three trimesters. Our studies demonstrate that pregnant dams developed significant metabolic tolerance to ethanol. However, neonatal mice did not develop metabolic tolerance, and the number of fetuses, fetal weight, placental weight, number of pups/litter, number of dead pups/litter, and pup weight were not significantly affected by ethanol exposure. An important advantage of this paradigm is its applicability to studies with genetically-modified mice. Additionally, this paradigm minimizes handling of animals, a major confound in fetal alcohol research.
Medicine, Issue 89, fetal, ethanol, exposure, paradigm, vapor, development, alcoholism, teratogenic, animal, mouse, model
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Institutions: Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
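The triage idea, mapping objective data-set characteristics onto one of the four categorical approaches, can be sketched as a simple rule chain. The thresholds and input names here are invented for illustration; the proposed scheme also weighs subjective, per-user criteria.

```python
# Hypothetical triage sketch: pick one of the four segmentation approaches
# from simple data-set characteristics. Thresholds are invented for
# illustration; the published scheme also weighs subjective criteria.

def choose_approach(snr, has_characteristic_shapes, roi_volume_fraction):
    """Return a segmentation strategy (approaches 1-4 from the text)."""
    if has_characteristic_shapes and snr > 10:
        # clean data with recognizable shapes suits custom automation (4)
        return "automated custom-designed algorithm"
    if snr > 5:
        # moderately clean data can be handled semi-automatically (3)
        return "semi-automated segmentation + surface rendering"
    if roi_volume_fraction < 0.05:
        # small, sparse features are feasible to trace by hand (2)
        return "manual tracing + surface rendering"
    # noisy data with large, diffuse features: manual model building (1)
    return "fully manual model building"
```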
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Institutions: London Health Sciences Centre, Western University, London Health Sciences Centre, Lawson Health Research Institute, Western University.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g., carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements, or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. 
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
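The time-stamped event records at the heart of this pipeline can be summarized in a few lines of code. The system's own analysis language is MATLAB-based, so the Python sketch below, with invented event codes and record layout, is only a minimal illustration of the idea.

```python
# Minimal sketch of summarizing a time-stamped behavioral event record.
# Event codes and the (timestamp, code) layout are invented for illustration;
# the actual system stores richer records in a MATLAB-based data structure.

from collections import Counter

def summarize_session(records):
    """records: iterable of (timestamp_seconds, event_code) tuples.
    Returns (per-event counts, session span in seconds)."""
    counts = Counter()
    times = []
    for t, code in records:
        counts[code] += 1
        times.append(t)
    span = max(times) - min(times) if times else 0.0
    return counts, span

log = [(0.0, "head_entry"), (1.5, "pellet_delivered"), (3.0, "head_entry")]
counts, span = summarize_session(log)
# counts["head_entry"] == 2, span == 3.0
```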
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
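The software-guided setup of experiment combinations can be illustrated with a full-factorial sketch. The factor names and levels below are invented examples, and real DoE software would typically select an optimal subset of such a grid rather than run it exhaustively.

```python
# Sketch of enumerating candidate experiment combinations for a DoE screen.
# Factor names and levels are invented examples; actual DoE software selects
# an optimal subset of this grid rather than running it exhaustively.

from itertools import product

def full_factorial(factors):
    """factors: dict mapping factor name -> list of levels.
    Returns one dict per candidate experimental run."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

design = full_factorial({
    "promoter": ["35S", "nos"],        # regulatory element (illustrative)
    "plant_age_days": [35, 42, 49],    # growth/development parameter
    "incubation_temp_C": [22, 25],     # condition during expression
})
# 2 x 3 x 2 = 12 candidate runs
```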
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Vascular Occlusion Training for Inclusion Body Myositis: A Novel Therapeutic Approach
Institutions: University of São Paulo, University of São Paulo.
Inclusion body myositis (IBM) is a rare idiopathic inflammatory myopathy. It is known to produce remarkable muscle weakness and to greatly compromise function and quality of life. Moreover, clinical practice suggests that, unlike other inflammatory myopathies, the majority of IBM patients are not responsive to treatment with immunosuppressive or immunomodulatory drugs to counteract disease progression [1]. Additionally, conventional resistance training programs have proven ineffective in restoring muscle function and muscle mass in these patients [2,3]. Nevertheless, we have recently observed that restricting muscle blood flow using tourniquet cuffs in association with moderate-intensity resistance training in an IBM patient produced a significant gain in muscle mass and function, along with substantial benefits in quality of life [4]. Thus, a new non-pharmacological approach for IBM patients has been proposed. Herein, we describe the details of a proposed protocol for vascular occlusion associated with a resistance training program for this population.
Medicine, Issue 40, exercise training, therapeutical, myositis, vascular occlusion
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Universite de Montreal, Universite de Montreal, Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness [1] and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them [2]. This is not what one would predict in diseases with reduced reproductive fitness, unless there were a high new mutation rate. Finally, the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia [3]. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males [4]. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why disease genes have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing