JoVE Visualize
Related JoVE Video
PubMed Article
Volume quantification of acute infratentorial hemorrhage with computed tomography: validation of the formula 1/2ABC and 2/3SH.
PUBLISHED: 01-01-2013
To compare the accuracy of the formula 1/2ABC with that of 2/3SH for volume estimation of hypertensive infratentorial hematomas.
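As a minimal sketch of the two bedside formulas compared in this study: 1/2ABC treats the hematoma as an ellipsoid with maximal orthogonal diameters A, B and C (cm), while 2/3SH multiplies the largest cross-sectional hematoma area S (cm²) by the hematoma height H (cm). The variable names and the example numbers below are illustrative only.

def volume_abc(a_cm, b_cm, c_cm):
    """1/2ABC: ellipsoid approximation of hematoma volume (mL)."""
    return 0.5 * a_cm * b_cm * c_cm

def volume_sh(s_cm2, h_cm):
    """2/3SH: volume (mL) from the largest axial hematoma area S and height H."""
    return (2.0 / 3.0) * s_cm2 * h_cm

# Hypothetical hematoma: 3 x 2.5 x 2 cm, largest slice area 6 cm^2 over 2 cm height
print(volume_abc(3.0, 2.5, 2.0))  # 7.5 mL
print(volume_sh(6.0, 2.0))        # 4.0 mL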
Authors: Beilei Lei, Huaxin Sheng, Haichen Wang, Christopher D. Lascola, David S. Warner, Daniel T. Laskowitz, Michael L. James.
Published: 07-03-2014
Intracerebral hemorrhage (ICH) is a common form of cerebrovascular disease and is associated with significant morbidity and mortality. Lack of effective treatment and the failure of large clinical trials aimed at hemostasis and clot removal demonstrate the need for further mechanism-driven investigation of ICH. This research may be performed through the framework provided by preclinical models. Two murine models in popular use include intrastriatal (basal ganglia) injection of either autologous whole blood or clostridial collagenase. Because each model represents distinctly different pathophysiological features of ICH, a particular model may be selected according to which aspect of the disease is to be studied. For example, autologous blood injection most accurately represents the brain's response to the presence of intraparenchymal blood, and may most closely replicate lobar hemorrhage. Clostridial collagenase injection most accurately represents the small vessel rupture and hematoma evolution characteristic of deep hemorrhages. Thus, each model results in different hematoma formation, neuroinflammatory response, cerebral edema development, and neurobehavioral outcomes. Robustness of a purported therapeutic intervention is best assessed using both models. In this protocol, induction of ICH using both models, immediate post-operative demonstration of injury, and early post-operative care techniques are demonstrated. Both models result in reproducible injuries, hematoma volumes, and neurobehavioral deficits. Because of the heterogeneity of human ICH, multiple preclinical models are needed to thoroughly explore pathophysiologic mechanisms and test potential therapeutic strategies.
25 Related JoVE Articles!
Simultaneous Quantification of T-Cell Receptor Excision Circles (TRECs) and K-Deleting Recombination Excision Circles (KRECs) by Real-time PCR
Authors: Alessandra Sottini, Federico Serana, Diego Bertoli, Marco Chiarini, Monica Valotti, Marion Vaglio Tessitore, Luisa Imberti.
Institutions: Spedali Civili di Brescia.
T-cell receptor excision circles (TRECs) and K-deleting recombination excision circles (KRECs) are circularized DNA elements formed during the recombination process that creates T- and B-cell receptors. Because TRECs and KRECs are unable to replicate, they are diluted after each cell division but persist within the cell. Their quantity in peripheral blood can therefore be considered an estimate of thymic and bone marrow output. By combining the well-established and commonly used TREC assay with a modified version of the KREC assay, we have developed a duplex quantitative real-time PCR that allows quantification of both newly produced T and B lymphocytes in a single assay. The numbers of TRECs and KRECs are obtained using a standard curve prepared by serially diluting TREC and KREC signal joints cloned in a bacterial plasmid, together with a fragment of the T-cell receptor alpha constant gene that serves as a reference gene. Results are reported as the number of TRECs and KRECs per 10^6 cells or per ml of blood. The quantification of these DNA fragments has proven useful for monitoring immune reconstitution following bone marrow transplantation in both children and adults, for improved characterization of immune deficiencies, and for better understanding of the activity of certain immunomodulating drugs.
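A rough illustration of the standard-curve step described above: a line is fit to Ct values measured on serial dilutions of the cloned TREC/KREC plasmid, and copy numbers of unknown samples are read off that line. The dilution series, Ct values, and the assumption of two reference-gene copies per cell are hypothetical placeholders, not values from the published assay.

import numpy as np

# Hypothetical standard curve: plasmid dilutions (copies/reaction) and measured Ct
std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_ct     = np.array([17.1, 20.5, 23.9, 27.3, 30.8])

# Fit Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

def copies_from_ct(ct):
    """Interpolate TREC or KREC copy number from a sample Ct via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

# Normalize to copies per 10^6 cells using the reference-gene Ct to estimate cell input
sample_ct_trec, sample_ct_ref = 26.4, 22.0
cells_in_reaction = copies_from_ct(sample_ct_ref) / 2   # assumes 2 reference-gene copies/cell
trecs_per_million = copies_from_ct(sample_ct_trec) / cells_in_reaction * 1e6
print(round(trecs_per_million))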
Immunology, Issue 94, B lymphocytes, primary immunodeficiency, real-time PCR, immune recovery, T-cell homeostasis, T lymphocytes, thymic output, bone marrow output
Ultrasonic Assessment of Myocardial Microstructure
Authors: Pranoti Hiremath, Michael Bauer, Hui-Wen Cheng, Kazumasa Unno, Ronglih Liao, Susan Cheng.
Institutions: Harvard Medical School, Brigham and Women's Hospital, Harvard Medical School.
Echocardiography is a widely accessible imaging modality that is commonly used to noninvasively characterize and quantify changes in cardiac structure and function. Ultrasonic assessments of cardiac tissue can include analyses of backscatter signal intensity within a given region of interest. Previously established techniques have relied predominantly on the integrated or mean value of backscatter signal intensities, which may be susceptible to variability arising from aliased data at low frame rates and from the time delays required by algorithms based on cyclic variation. Herein, we describe an ultrasound-based imaging algorithm that extends previous methods, can be applied to a single image frame, and accounts for the full distribution of signal intensity values derived from a given myocardial sample. When applied to representative mouse and human imaging data, the algorithm distinguishes between subjects with and without exposure to chronic afterload resistance. The algorithm offers an enhanced surrogate measure of myocardial microstructure and can be performed using open-access image analysis software.
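To make the idea of "accounting for the full distribution" concrete, the sketch below summarizes the pixel-intensity distribution within a myocardial region of interest from a single frame. The specific statistic used by the published algorithm is not given in this abstract, so mean, spread, skewness and kurtosis are shown only as generic distribution-based summaries, and the ROI data are simulated.

import numpy as np
from scipy import stats

def backscatter_distribution_summary(roi_pixels):
    """Summarize the full distribution of backscatter intensities in a myocardial ROI."""
    x = np.asarray(roi_pixels, dtype=float).ravel()
    return {
        "mean": x.mean(),
        "std": x.std(ddof=1),
        "skewness": stats.skew(x),
        "kurtosis": stats.kurtosis(x),
    }

# Hypothetical 8-bit grayscale ROI sampled from one echocardiographic frame
rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64))
print(backscatter_distribution_summary(roi))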
Medicine, Issue 83, echocardiography, image analysis, myocardial fibrosis, hypertension, cardiac cycle, open-access image analysis software
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the appropriate purity and amount for the subsequent in-depth analysis by Mass Spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setting up of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTMs cross-talks
Accuracy in Dental Medicine, A New Way to Measure Trueness and Precision
Authors: Andreas Ender, Albert Mehl.
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify a wide range of procedures. The main interest is in verifying impression methods, as they serve as the basis for dental restorations. The current limitation of many reference scanners is a lack of accuracy when scanning large objects such as full dental arches, or a limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus-variation scanning technique, was evaluated with regard to highest local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions. Different model materials were also verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision in the case of full arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects such as single tooth surfaces can be scanned with an even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many areas of dental research. The different magnification levels, combined with high local and general accuracy, can be used to assess changes ranging from single teeth or restorations up to full arches.
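A simplified illustration of how trueness and precision can be summarized once repeated scans have been superimposed on a reference and reduced to mean surface deviations (in µm). The registration step itself is not shown, and all numbers are invented.

import statistics

# Hypothetical mean absolute deviations (µm) of five repeated full-arch scans
# against an independent reference measurement -> trueness
deviation_vs_reference_um = [4.8, 5.6, 5.1, 6.9, 4.1]

# Hypothetical pairwise deviations (µm) among the repeated scans -> precision
pairwise_deviation_um = [1.2, 1.9, 1.5, 2.3, 1.1, 1.6, 2.0, 1.4, 1.8, 1.7]

trueness = statistics.mean(deviation_vs_reference_um)
precision = statistics.mean(pairwise_deviation_um)
print(f"trueness  ≈ {trueness:.1f} ± {statistics.stdev(deviation_vs_reference_um):.1f} µm")
print(f"precision ≈ {precision:.1f} ± {statistics.stdev(pairwise_deviation_um):.1f} µm")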
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
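The abstract does not give the exact formula for γ, but a common way to express net trophic transfer efficiency on a per-tank basis is the congener mass gained by the fish divided by the congener mass ingested with the food. The sketch below uses that assumed definition with invented numbers; it is not the authors' stated calculation.

def net_trophic_transfer_efficiency(conc_start_ng_g, mass_start_g,
                                    conc_end_ng_g, mass_end_g,
                                    conc_food_ng_g, food_eaten_g):
    """Per-tank gamma for one PCB congener.

    Assumed definition: congener mass gained by the lake trout over the
    experiment divided by the congener mass ingested with the bloater prey.
    """
    burden_start = conc_start_ng_g * mass_start_g      # ng in fish at start
    burden_end   = conc_end_ng_g * mass_end_g          # ng in fish at end
    ingested     = conc_food_ng_g * food_eaten_g       # ng fed over the experiment
    return (burden_end - burden_start) / ingested

# Hypothetical tank: congener at 50 ng/g in prey, 2,400 g of food eaten
print(net_trophic_transfer_efficiency(12.0, 600.0, 35.0, 900.0, 50.0, 2400.0))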
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Modeling Stroke in Mice: Permanent Coagulation of the Distal Middle Cerebral Artery
Authors: Gemma Llovera, Stefan Roth, Nikolaus Plesnila, Roland Veltkamp, Arthur Liesz.
Institutions: University Hospital Munich, Munich Cluster for Systems Neurology (SyNergy), University Heidelberg, Charing Cross Hospital.
Stroke is the third most common cause of death and a main cause of acquired adult disability in developed countries. Only very limited therapeutic options are available for a small proportion of stroke patients in the acute phase. Current research is intensively searching for novel therapeutic strategies and is increasingly focusing on the sub-acute and chronic phase after stroke because more patients might be eligible for therapeutic interventions in a prolonged time window. These delayed mechanisms include important pathophysiological pathways such as post-stroke inflammation, angiogenesis, neuronal plasticity and regeneration. In order to analyze these mechanisms and to subsequently evaluate novel drug targets, experimental stroke models with clinical relevance, low mortality and high reproducibility are sought after. Moreover, mice are the smallest mammals in which a focal stroke lesion can be induced and for which a broad spectrum of transgenic models is available. Therefore, we describe here the mouse model of transcranial, permanent coagulation of the middle cerebral artery via electrocoagulation distal to the lenticulostriatal arteries, the so-called “coagulation model”. The resulting infarct in this model is located mainly in the cortex; the relative infarct volume in relation to brain size corresponds to the majority of human strokes. Moreover, the model fulfills the above-mentioned criteria of reproducibility and low mortality. In this video we demonstrate the surgical methods of stroke induction in the “coagulation model” and report histological and functional analysis tools.
Medicine, Issue 89, stroke, brain ischemia, animal model, middle cerebral artery, electrocoagulation
Identification of Key Factors Regulating Self-renewal and Differentiation in EML Hematopoietic Precursor Cells by RNA-sequencing Analysis
Authors: Shan Zong, Shuyun Deng, Kenian Chen, Jia Qian Wu.
Institutions: The University of Texas Graduate School of Biomedical Sciences at Houston.
Hematopoietic stem cells (HSCs) are used clinically in transplantation to rebuild a patient's hematopoietic system in many diseases such as leukemia and lymphoma. Elucidating the mechanisms controlling HSC self-renewal and differentiation is important for the application of HSCs in research and clinical use. However, it is not possible to obtain large quantities of HSCs due to their inability to proliferate in vitro. To overcome this hurdle, we used a mouse bone marrow derived cell line, the EML (Erythroid, Myeloid, and Lymphocytic) cell line, as a model system for this study. RNA-sequencing (RNA-Seq) has been increasingly used to replace microarrays for gene expression studies. We report here a detailed method of using RNA-Seq technology to investigate the potential key factors in the regulation of EML cell self-renewal and differentiation. The protocol provided in this paper is divided into three parts. The first part explains how to culture EML cells and separate Lin-CD34+ and Lin-CD34- cells. The second part of the protocol offers detailed procedures for total RNA preparation and the subsequent library construction for high-throughput sequencing. The last part describes the method for RNA-Seq data analysis and explains how to use the data to identify differentially expressed transcription factors between Lin-CD34+ and Lin-CD34- cells. The most significantly differentially expressed transcription factors were identified as the potential key regulators controlling EML cell self-renewal and differentiation. In the discussion section of this paper, we highlight the key steps for successful performance of this experiment. In summary, this paper offers a method of using RNA-Seq technology to identify potential regulators of self-renewal and differentiation in EML cells. The key factors identified are subjected to downstream functional analysis in vitro and in vivo.
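As a schematic of the final analysis step only, the sketch below filters a normalized expression table for transcription factors that differ between Lin-CD34+ and Lin-CD34- cells. The gene names, expression values, cutoff, and transcription-factor list are invented, and a real RNA-Seq analysis would use replicate-aware statistical tools rather than a plain fold-change filter.

import numpy as np
import pandas as pd

# Hypothetical normalized expression values (e.g., FPKM) per population
expr = pd.DataFrame(
    {"gene": ["Gata2", "Cebpa", "Actb", "Spi1"],
     "lin_cd34_pos": [48.0, 3.2, 510.0, 9.5],
     "lin_cd34_neg": [6.5, 27.0, 495.0, 60.0]}
)
transcription_factors = {"Gata2", "Cebpa", "Spi1"}  # invented subset for illustration

# Log2 fold change with a pseudocount, then keep TFs changing at least 2-fold
expr["log2_fc"] = np.log2((expr["lin_cd34_pos"] + 1) / (expr["lin_cd34_neg"] + 1))
candidates = expr[expr["gene"].isin(transcription_factors) & (expr["log2_fc"].abs() >= 1)]
print(candidates[["gene", "log2_fc"]])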
Genetics, Issue 93, EML Cells, Self-renewal, Differentiation, Hematopoietic precursor cell, RNA-Sequencing, Data analysis
The Rabbit Blood-shunt Model for the Study of Acute and Late Sequelae of Subarachnoid Hemorrhage: Technical Aspects
Authors: Lukas Andereggen, Volker Neuschmelting, Michael von Gunten, Hans Rudolf Widmer, Jukka Takala, Stephan M. Jakob, Javier Fandino, Serge Marbacher.
Institutions: University and Bern University Hospital (Inselspital), Kantonsspital Aarau, Boston Children's Hospital, Boston Children's Hospital, University and Bern University Hospital (Inselspital), University Hospital Cologne, Länggasse Bern.
Early brain injury and delayed cerebral vasospasm both contribute to unfavorable outcomes after subarachnoid hemorrhage (SAH). Reproducible and controllable animal models that simulate both conditions are presently uncommon. Therefore, new models are needed in order to mimic human pathophysiological conditions resulting from SAH. This report describes the technical nuances of a rabbit blood-shunt SAH model that enables control of intracranial pressure (ICP). An extracorporeal shunt is placed between the arterial system and the subarachnoid space, which enables examiner-independent SAH in a closed cranium. Step-by-step procedural instructions and necessary equipment are described, as well as technical considerations to produce the model with minimal mortality and morbidity. Important details required for successful surgical creation of this robust, simple and consistent ICP-controlled SAH rabbit model are described.
Medicine, Issue 92, Subarachnoid hemorrhage, animal models, rabbit, extracorporeal blood shunt, early brain injury, delayed cerebral vasospasm, microsurgery.
Dual-phase Cone-beam Computed Tomography to See, Reach, and Treat Hepatocellular Carcinoma during Drug-eluting Beads Transarterial Chemo-embolization
Authors: Vania Tacher, MingDe Lin, Nikhil Bhagat, Nadine Abi Jaoudeh, Alessandro Radaelli, Niels Noordhoek, Bart Carelsen, Bradford J. Wood, Jean-François Geschwind.
Institutions: The Johns Hopkins Hospital, Philips Research North America, National Institutes of Health, Philips Healthcare.
The advent of cone-beam computed tomography (CBCT) in the angiography suite has been revolutionary in interventional radiology. CBCT offers 3 dimensional (3D) diagnostic imaging in the interventional suite and can enhance minimally-invasive therapy beyond the limitations of 2D angiography alone. The role of CBCT has been recognized in transarterial chemo-embolization (TACE) treatment of hepatocellular carcinoma (HCC). The recent introduction of a CBCT technique, dual-phase CBCT (DP-CBCT), improves intra-arterial HCC treatment with drug-eluting beads (DEB-TACE). DP-CBCT can be used to localize liver tumors with the diagnostic accuracy of multi-phasic multidetector computed tomography (M-MDCT) and contrast enhanced magnetic resonance imaging (CE-MRI) (See the tumor), to guide the guidewire and microcatheter intra-arterially to the desired location for selective therapy (Reach the tumor), and to evaluate treatment success during the procedure (Treat the tumor). The purpose of this manuscript is to illustrate how DP-CBCT is used in DEB-TACE to see, reach, and treat HCC.
Medicine, Issue 82, Carcinoma, Hepatocellular, Tomography, X-Ray Computed, Surgical Procedures, Minimally Invasive, Digestive System Diseases, Diagnosis, Therapeutics, Surgical Procedures, Operative, Equipment and Supplies, Transarterial chemo-embolization, Hepatocellular carcinoma, Dual-phase cone-beam computed tomography, 3D roadmap, Drug-Eluting Beads
Prehospital Thrombolysis: A Manual from Berlin
Authors: Martin Ebinger, Sascha Lindenlaub, Alexander Kunz, Michal Rozanski, Carolin Waldschmidt, Joachim E. Weber, Matthias Wendt, Benjamin Winter, Philipp A. Kellner, Sabina Kaczmarek, Matthias Endres, Heinrich J. Audebert.
Institutions: Charité - Universitätsmedizin Berlin, Charité - Universitätsmedizin Berlin, Universitätsklinikum Hamburg - Eppendorf, Berliner Feuerwehr, STEMO-Consortium.
In acute ischemic stroke, time from symptom onset to intervention is a decisive prognostic factor. In order to reduce this time, prehospital thrombolysis at the emergency site would be preferable. However, apart from neurological expertise and laboratory investigations, a computed tomography (CT) scan is necessary to exclude hemorrhagic stroke prior to thrombolysis. Therefore, a specialized ambulance equipped with a CT scanner and point-of-care laboratory was designed and constructed. Further, a new stroke-identifying interview algorithm was developed and implemented in the Berlin emergency medical services. Since February 2011, the identification of suspected stroke in the dispatch center of the Berlin Fire Brigade prompts the deployment of this ambulance, a stroke emergency mobile (STEMO). On arrival, a neurologist, experienced in stroke care and with additional training in emergency medicine, performs a neurological examination. If stroke is suspected, a CT scan excludes intracranial hemorrhage. The CT scans are telemetrically transmitted to the neuroradiologist on call. If the patient's coagulation status is normal and the medical history reveals no contraindication, prehospital thrombolysis is applied according to current guidelines (intravenous recombinant tissue plasminogen activator, iv rtPA, alteplase, Actilyse). Thereafter, patients are transported to the nearest hospital with a certified stroke unit for further treatment and assessment of stroke aetiology. After a pilot phase, weeks were randomized into blocks either with or without STEMO care. The primary end-point of this study is time from alarm to the initiation of thrombolysis. We hypothesized that alarm-to-treatment time can be reduced by at least 20 min compared to regular care.
Medicine, Issue 81, Telemedicine, Emergency Medical Services, Stroke, Tomography, X-Ray Computed, Emergency Treatment, stroke, thrombolysis, prehospital, emergency medical services, ambulance
Focal Cerebral Ischemia Model by Endovascular Suture Occlusion of the Middle Cerebral Artery in the Rat
Authors: Kutluay Uluç, Amrendra Miranpuri, Gregory C. Kujoth, Erinç Aktüre, Mustafa K. Başkaya.
Institutions: University of Wisconsin-Madison.
Stroke is the leading cause of disability and the third leading cause of death in adults worldwide1. In human stroke, there exists a highly variable clinical state; in the development of animal models of focal ischemia, however, achieving reproducibility of experimentally induced infarct volume is essential. The rat is a widely used animal model for stroke due to its relatively low animal husbandry costs and to the similarity of its cranial circulation to that of humans2,3. In humans, the middle cerebral artery (MCA) is most commonly affected in stroke syndromes and multiple methods of MCA occlusion (MCAO) have been described to mimic this clinical syndrome in animal models. Because recanalization commonly occurs following an acute stroke in the human, reperfusion after a period of occlusion has been included in many of these models. In this video, we demonstrate the transient endovascular suture MCAO model in the spontaneously hypertensive rat (SHR). A filament with a silicon tip coating is placed intraluminally at the MCA origin for 60 minutes, followed by reperfusion. Note that the optimal occlusion period may vary in other rat strains, such as Wistar or Sprague-Dawley. Several behavioral indicators of stroke in the rat are shown. Focal ischemia is confirmed using T2-weighted magnetic resonance images and by staining brain sections with 2,3,5-triphenyltetrazolium chloride (TTC) 24 hours after MCAO.
Neuroscience, Issue 48, Stroke, cerebral ischemia, middle cerebral artery occlusion, intraluminal filament, rat, magnetic resonance imaging, surgery, neuroscience, brain
Autologous Blood Injection to Model Spontaneous Intracerebral Hemorrhage in Mice
Authors: Lauren H. Sansing, Scott E. Kasner, Louise McCullough, Puneet Agarwal, Frank A. Welsh, Katalin Kariko.
Institutions: University of Connecticut Health Center, School of Medicine, University of Pennsylvania, Hartford Hospital, School of Medicine, University of Pennsylvania.
Investigation of the pathophysiology of injury after intracerebral hemorrhage (ICH) requires a reproducible animal model. While ICH accounts for 10-15% of all strokes, there remains no specific effective therapy. The autologous blood injection model in mice involves the stereotaxic injection of arterial blood into the basal ganglia, mimicking a spontaneous hypertensive hemorrhage in man. The response to hemorrhage can then be studied in vivo and the neurobehavioral deficits quantified, allowing for description of the ensuing pathology and the testing of potential therapeutic agents. The procedure described in this protocol uses a double injection technique to minimize the risk of blood reflux up the needle track, avoids anticoagulants in the pumping system, and eliminates all dead space and expandable tubing in the system.
Neuroscience, Issue 54, stroke, intracerebral hemorrhage, mice, animal model
Quantification of Atherosclerotic Plaque Activity and Vascular Inflammation using [18-F] Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography (FDG-PET/CT)
Authors: Nehal N. Mehta, Drew A. Torigian, Joel M. Gelfand, Babak Saboury, Abass Alavi.
Institutions: University of Pennsylvania, Perelman School of Medicine, University of Pennsylvania, Perelman School of Medicine, University of Pennsylvania, Perelman School of Medicine.
Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC)1 and carotid intimal medial thickness (C-IMT)2 provide information about the burden of disease. However, despite multiple validation studies of CAC3-5, and C-IMT2,6, these modalities do not accurately assess plaque characteristics7,8, yet the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events9-13. [18F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism14,15. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity16, an important source of cellular inflammation in vessel walls. More recently, we17,18 and others have shown that FDG-PET/CT enables highly precise, novel measurements of inflammatory activity of atherosclerotic plaques in large and medium-sized arteries9,16,19,20. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors21,22 and is also highly associated with overall burden of atherosclerosis23. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short term (12 week) statin therapy24 as well as longer term therapeutic lifestyle changes (16 months)25. The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target to background ratio (TBR), which is calculated by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability26. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion as we have shown in several prior publications9,20,27,28.
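The TBR calculation described above reduces to a simple ratio. The sketch below also shows one plausible form of the metabolic volumetric product (mean SUV multiplied by a segmented volume); that MVP form is an assumption, since the abstract does not spell out the MVP formula, and all input values are invented.

def target_to_background_ratio(arterial_suv_mean, venous_blood_pool_suv_mean):
    """TBR = arterial SUV divided by venous blood-pool SUV (as described above)."""
    return arterial_suv_mean / venous_blood_pool_suv_mean

def metabolic_volumetric_product(mean_suv, volume_ml):
    """One plausible MVP definition (assumed here): mean SUV x segmented volume."""
    return mean_suv * volume_ml

# Hypothetical aortic measurement
print(target_to_background_ratio(2.4, 1.2))      # 2.0
print(metabolic_volumetric_product(2.4, 15.0))   # 36.0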
Medicine, Issue 63, FDG-PET/CT, atherosclerosis, vascular inflammation, quantitative radiology, imaging
A Low Mortality Rat Model to Assess Delayed Cerebral Vasospasm After Experimental Subarachnoid Hemorrhage
Authors: Rahul V. Dudhani, Michele Kyle, Christina Dedeo, Margaret Riordan, Eric M. Deshaies.
Institutions: SUNY Upstate Medical University, SUNY Upstate Medical University.
Objective: To characterize and establish a reproducible model that demonstrates delayed cerebral vasospasm after aneurysmal subarachnoid hemorrhage (SAH) in rats, in order to identify the initiating events, pathophysiological changes and potential targets for treatment. Methods: Twenty-eight male Sprague-Dawley rats (250 - 300 g) were arbitrarily assigned to one of two groups - SAH or saline control. Subarachnoid hemorrhage in the SAH group (n=15) was induced by double injection of autologous blood, 48 hr apart, into the cisterna magna. Similarly, normal saline (n=13) was injected into the cisterna magna of the saline control group. Rats were sacrificed on day five after the second blood injection and the brains were preserved for histological analysis. The degree of vasospasm was measured on sections of the basilar artery by measuring the internal luminal cross-sectional area using NIH Image-J software. Significance was tested using Tukey/Kramer's statistical analysis. Results: After analysis of histological sections, the basilar artery luminal cross-sectional area was smaller in the SAH than in the saline group, consistent with cerebral vasospasm in the former group. In the SAH group, the basilar artery internal area (.056 μm ± 3) was significantly smaller from vasospasm five days after the second blood injection (seven days after the initial blood injection), compared to the saline control group internal area (.069 ± 3; p=0.004). There were no mortalities from cerebral vasospasm. Conclusion: The rat double SAH model induces a mild, survivable basilar artery vasospasm that can be used to study the pathophysiological mechanisms of cerebral vasospasm in a small animal model. A low and acceptable mortality rate is a significant criterion to be satisfied for an ideal SAH animal model so that the mechanisms of vasospasm can be elucidated 7, 8. Further modifications of the model can be made to adjust for increased severity of vasospasm and neurological exams.
Medicine, Issue 71, Anatomy, Physiology, Neurobiology, Neuroscience, Immunology, Surgery, Aneurysm, cerebral, hemorrhage, model, mortality, rat, rodent, subarachnoid, vasospasm, animal model
Patient-specific Modeling of the Heart: Estimation of Ventricular Fiber Orientations
Authors: Fijoy Vadakkumpadan, Hermenegild Arevalo, Natalia A. Trayanova.
Institutions: Johns Hopkins University.
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4°. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
Bioengineering, Issue 71, Biomedical Engineering, Medicine, Anatomy, Physiology, Cardiology, Myocytes, Cardiac, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, MRI, Diffusion Magnetic Resonance Imaging, Cardiac Electrophysiology, computerized simulation (general), mathematical modeling (systems analysis), Cardiomyocyte, biomedical image processing, patient-specific modeling, Electrophysiology, simulation
Using High Resolution Computed Tomography to Visualize the Three Dimensional Structure and Function of Plant Vasculature
Authors: Andrew J. McElrone, Brendan Choat, Dilworth Y. Parkinson, Alastair A. MacDowell, Craig R. Brodersen.
Institutions: U.S. Department of Agriculture, University of California - Davis, University of Western Sydney, Lawrence Berkeley National Lab, University of Florida .
High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem network in three dimensions (3D) (e.g. Brodersen et al. 2010; 2011; 2012a,b). HRCT imaging is based on the same principles as medical CT systems, but a high intensity synchrotron x-ray source results in higher spatial resolution and decreased image acquisition time. Here, we demonstrate in detail how synchrotron-based HRCT (performed at the Advanced Light Source-LBNL Berkeley, CA, USA) in combination with Avizo software (VSG Inc., Burlington, MA, USA) is being used to explore plant xylem in excised tissue and living plants. This new imaging tool allows users to move beyond traditional static, 2D light or electron micrographs and study samples using virtual serial sections in any plane. An infinite number of slices in any orientation can be made on the same sample, a feature that is physically impossible using traditional microscopy methods. Results demonstrate that HRCT can be applied to both herbaceous and woody plant species, and a range of plant organs (i.e. leaves, petioles, stems, trunks, roots). Figures presented here help demonstrate both a range of representative plant vascular anatomy and the type of detail extracted from HRCT datasets, including scans for coast redwood (Sequoia sempervirens), walnut (Juglans spp.), oak (Quercus spp.), and maple (Acer spp.) tree saplings to sunflowers (Helianthus annuus), grapevines (Vitis spp.), and ferns (Pteridium aquilinum and Woodwardia fimbriata). Excised and dried samples from woody species are easiest to scan and typically yield the best images. However, recent improvements (i.e. more rapid scans and sample stabilization) have made it possible to use this visualization technique on green tissues (e.g. petioles) and in living plants. On occasion some shrinkage of hydrated green plant tissues will cause images to blur and methods to avoid these issues are described. These recent advances with HRCT provide promising new insights into plant vascular function.
Plant Biology, Issue 74, Cellular Biology, Molecular Biology, Biophysics, Structural Biology, Physics, Environmental Sciences, Agriculture, botany, environmental effects (biological, animal and plant), plants, radiation effects (biological, animal and plant), CT scans, advanced visualization techniques, xylem networks, plant vascular function, synchrotron, x-ray micro-tomography, ALS 8.3.2, xylem, phloem, tomography, imaging
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. We demonstrate a number of applications of white light interferometry for the investigation of surface modifications in mass spectrometry, and of wear phenomena in tribology and lubrication. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering) and laser irradiation (ablation), as well as ex situ measurements of wear of tribological test specimens. Specifically, we will discuss: Aspects of traditional ion sputtering-based mass spectrometry, such as sputtering rates/yields measurements on Si and Cu and subsequent time-to-depth conversion. Results of quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface. These results are important for applications such as ablation mass spectrometry, where the quantities of evaporated material can be studied and controlled via pulse duration and energy per pulse. Thus, by determining the crater geometry one can define depth and lateral resolution versus experimental setup conditions. Measurements of surface roughness parameters in two dimensions, and quantitative measurements of the surface wear that occurs as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach will be discussed and explained.
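One of the steps mentioned above, converting a sputter-time axis to depth once the crater depth has been measured by optical profilometry, amounts to a simple proportionality when a constant sputter rate is assumed; the numbers below are illustrative only.

def sputter_rate_nm_per_s(crater_depth_nm, total_sputter_time_s):
    """Average sputter rate from a profilometry-measured crater depth."""
    return crater_depth_nm / total_sputter_time_s

def time_to_depth(time_axis_s, rate_nm_per_s):
    """Convert a depth-profile time axis to depth, assuming a constant sputter rate."""
    return [t * rate_nm_per_s for t in time_axis_s]

rate = sputter_rate_nm_per_s(crater_depth_nm=850.0, total_sputter_time_s=1700.0)  # 0.5 nm/s
print(time_to_depth([0, 300, 600, 900], rate))  # depths in nm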
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques including feature selection and classification, such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, so as to recommend for or against invasive ICP monitoring.
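A toy sketch of the second component: training an SVM on a few of the feature types named above (midline shift, blood amount, a texture summary, age, injury severity score) to flag elevated ICP. The feature values, labels, and the scikit-learn pipeline are illustrative stand-ins for the RapidMiner workflow described in the paper, not a reproduction of it.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Columns: midline shift (mm), blood volume (ml), texture score, age (yr), injury severity score
X = np.array([
    [0.5,  1.0, 0.12, 34,  9],
    [6.2, 18.0, 0.41, 52, 29],
    [1.1,  3.5, 0.18, 41, 13],
    [8.4, 25.0, 0.47, 60, 34],
    [0.8,  2.0, 0.15, 27, 10],
    [5.5, 15.0, 0.39, 47, 25],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 0 = normal ICP, 1 = elevated ICP (hypothetical labels)

# Standardize features, then fit an RBF-kernel SVM classifier
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)
print(model.predict([[4.9, 12.0, 0.35, 45, 22]]))  # pre-screening suggestion only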
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Quantifying Agonist Activity at G Protein-coupled Receptors
Authors: Frederick J. Ehlert, Hinako Suga, Michael T. Griffin.
Institutions: University of California, Irvine, University of California, Chapman University.
When an agonist activates a population of G protein-coupled receptors (GPCRs), it elicits a signaling pathway that culminates in the response of the cell or tissue. This process can be analyzed at the level of a single receptor, a population of receptors, or a downstream response. Here we describe how to analyze the downstream response to obtain an estimate of the agonist affinity constant for the active state of single receptors. Receptors behave as quantal switches that alternate between active and inactive states (Figure 1). The active state interacts with specific G proteins or other signaling partners. In the absence of ligands, the inactive state predominates. The binding of agonist increases the probability that the receptor will switch into the active state because its affinity constant for the active state (Kb) is much greater than that for the inactive state (Ka). The summation of the random outputs of all of the receptors in the population yields a constant level of receptor activation in time. The reciprocal of the concentration of agonist eliciting half-maximal receptor activation is equivalent to the observed affinity constant (Kobs), and the fraction of agonist-receptor complexes in the active state is defined as efficacy (ε) (Figure 2). Methods for analyzing the downstream responses of GPCRs have been developed that enable the estimation of the Kobs and relative efficacy of an agonist 1,2. In this report, we show how to modify this analysis to estimate the agonist Kb value relative to that of another agonist. For assays that exhibit constitutive activity, we show how to estimate Kb in absolute units of M^-1. Our method of analyzing agonist concentration-response curves 3,4 consists of global nonlinear regression using the operational model 5. We describe a procedure using the software application, Prism (GraphPad Software, Inc., San Diego, CA). The analysis yields an estimate of the product of Kobs and a parameter proportional to efficacy (τ). The estimate of τKobs of one agonist, divided by that of another, is a relative measure of Kb (RAi) 6. For any receptor exhibiting constitutive activity, it is possible to estimate a parameter proportional to the efficacy of the free receptor complex (τsys). In this case, the Kb value of an agonist is equivalent to τKobs/τsys 3. Our method is useful for determining the selectivity of an agonist for receptor subtypes and for quantifying agonist-receptor signaling through different G proteins.
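The quantitative relationships stated above can be restated compactly in equation form (subscripts A and B denote a test and a reference agonist; this is only a restatement of the text, not an additional derivation):

RA_i = \frac{(\tau K_{obs})_A}{(\tau K_{obs})_B}, \qquad
K_b = \frac{\tau K_{obs}}{\tau_{sys}} \quad \text{(for receptors exhibiting constitutive activity)}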
Molecular Biology, Issue 58, agonist activity, active state, ligand bias, constitutive activity, G protein-coupled receptor
Methods for ECG Evaluation of Indicators of Cardiac Risk, and Susceptibility to Aconitine-induced Arrhythmias in Rats Following Status Epilepticus
Authors: Steven L. Bealer, Cameron S. Metcalf, Jason G. Little.
Institutions: University of Utah.
Lethal cardiac arrhythmias contribute to mortality in a number of pathological conditions. Several parameters obtained from a non-invasive, easily obtained electrocardiogram (ECG) are established, well-validated prognostic indicators of cardiac risk in patients suffering from a number of cardiomyopathies. Increased heart rate, decreased heart rate variability (HRV), and increased duration and variability of cardiac ventricular electrical activity (QT interval) are all indicative of enhanced cardiac risk 1-4. In animal models, it is valuable to compare these ECG-derived variables and susceptibility to experimentally induced arrhythmias. Intravenous infusion of the arrhythmogenic agent aconitine has been widely used to evaluate susceptibility to arrhythmias in a range of experimental conditions, including animal models of depression 5 and hypertension 6, following exercise 7 and exposure to air pollutants 8, as well as determination of the antiarrhythmic efficacy of pharmacological agents 9,10. It should be noted that QT dispersion in humans is a measure of QT interval variation across the full set of leads from a standard 12-lead ECG. Consequently, the measure of QT dispersion from the 2-lead ECG in the rat described in this protocol is different from that calculated from human ECG records. This represents a limitation in the translation of the data obtained from rodents to human clinical medicine. Status epilepticus (SE) is a single seizure or series of continuously recurring seizures lasting more than 30 min 11,12, and results in mortality in 20% of cases 13. Many individuals survive the SE, but die within 30 days 14,15. The mechanism(s) of this delayed mortality is not fully understood. It has been suggested that lethal ventricular arrhythmias contribute to many of these deaths 14-17. In addition to SE, patients experiencing spontaneously recurring seizures, i.e. epilepsy, are at risk of premature sudden and unexpected death associated with epilepsy (SUDEP) 18. As with SE, the precise mechanisms mediating SUDEP are not known. It has been proposed that ventricular abnormalities and resulting arrhythmias make a significant contribution 18-22. To investigate the mechanisms of seizure-related cardiac death, and the efficacy of cardioprotective therapies, it is necessary to obtain both ECG-derived indicators of risk and evaluate susceptibility to cardiac arrhythmias in animal models of seizure disorders 23-25. Here we describe methods for implanting ECG electrodes in the Sprague-Dawley laboratory rat (Rattus norvegicus) following SE, for collection and analysis of ECG recordings, and for induction of arrhythmias during iv infusion of aconitine. These procedures can be used to directly determine the relationships between ECG-derived measures of cardiac electrical activity and susceptibility to ventricular arrhythmias in rat models of seizure disorders, or any pathology associated with increased risk of sudden cardiac death.
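As a concrete example of the ECG-derived risk indicators mentioned above, the sketch below computes heart rate, two standard heart-rate-variability statistics, and QT interval variability from beat-to-beat interval series. The interval values are invented for illustration, and no rat-specific rate correction is applied.

import numpy as np

# Hypothetical beat-to-beat intervals from a rat ECG recording (ms)
rr_ms = np.array([165, 172, 168, 170, 166, 174, 169, 171])
qt_ms = np.array([58, 60, 57, 61, 59, 62, 58, 60])

heart_rate_bpm = 60000.0 / rr_ms.mean()
sdnn = rr_ms.std(ddof=1)                            # overall heart rate variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))       # short-term heart rate variability
qt_mean, qt_sd = qt_ms.mean(), qt_ms.std(ddof=1)    # QT duration and its variability

print(f"HR {heart_rate_bpm:.0f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms, "
      f"QT {qt_mean:.1f} ± {qt_sd:.1f} ms")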
Medicine, Issue 50, cardiac, seizure disorders, QTc, QTd, cardiac arrhythmias, rat
Concentration Determination of Nucleic Acids and Proteins Using the Micro-volume Bio-spec Nano Spectrophotometer
Authors: Suja Sukumaran.
Institutions: Scientific Instruments.
Nucleic acid quantitation procedures have advanced significantly in the last three decades. More and more, molecular biologists require consistent small-volume analysis of nucleic acid samples for their experiments. The BioSpec-nano provides a potential solution to the problems of inaccurate, non-reproducible results inherent in current DNA quantitation methods, via specialized optics and a sensitive PDA detector. The BioSpec-nano also has automated functionality such that mounting, measurement, and cleaning are done by the instrument, thereby eliminating tedious, repetitive, and inconsistent placement of the fiber optic element and manual cleaning. In this study, data are presented on the quantification of DNA and protein, as well as on measurement reproducibility and accuracy. Automated sample contact and rapid scanning allow measurement in three seconds, resulting in excellent throughput. Data analysis is carried out using the built-in features of the software. The formula used for calculating DNA concentration is: Sample Concentration = DF · (OD260 − OD320) · NACF, where DF is the sample dilution factor and NACF is the nucleic acid concentration factor. The nucleic acid concentration factor is set in accordance with the analyte selected1. Protein concentration results can be expressed as μg/mL or as mol/L by entering e280 and molecular weight values, respectively. When residue values for Tyr, Trp and cysteine (S-S bond) are entered in the e280Calc tab, the extinction coefficient is calculated as e280 = 5500 × (Trp residues) + 1490 × (Tyr residues) + 125 × (cysteine S-S bonds). The e280 value is used by the software for concentration calculation. In addition to concentration determination of nucleic acids and protein, the BioSpec-nano can be used as an ultra micro-volume spectrophotometer for many other analytes or as a standard spectrophotometer using 5 mm pathlength cells.
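The two formulas quoted above translate directly into code. In the sketch below the OD readings, residue counts, and pathlength are invented; the NACF value of 50 is the conventional factor for double-stranded DNA and should be replaced by the factor for the analyte actually selected on the instrument.

def nucleic_acid_conc_ug_per_ml(od260, od320, dilution_factor, nacf=50.0):
    """Sample concentration = DF * (OD260 - OD320) * NACF, as quoted above.
    nacf=50 is the conventional factor for double-stranded DNA."""
    return dilution_factor * (od260 - od320) * nacf

def extinction_coeff_e280(trp, tyr, cys_ss_bonds):
    """e280 (M^-1 cm^-1) from residue counts, using the coefficients quoted above."""
    return 5500 * trp + 1490 * tyr + 125 * cys_ss_bonds

def protein_conc_mol_per_l(od280, pathlength_cm, e280):
    """Beer's law rearranged for molar concentration; inputs are illustrative."""
    return od280 / (e280 * pathlength_cm)

print(nucleic_acid_conc_ug_per_ml(od260=0.25, od320=0.01, dilution_factor=10))  # 120 µg/mL
print(protein_conc_mol_per_l(0.6, 0.1, extinction_coeff_e280(trp=4, tyr=11, cys_ss_bonds=2)))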
Molecular Biology, Issue 48, Nucleic acid quantitation, protein quantitation, micro-volume analysis, label quantitation
Linearization of the Bradford Protein Assay
Authors: Orna Ernst, Tsaffrir Zor.
Institutions: Tel Aviv University.
Determination of microgram quantities of protein in the Bradford Coomassie brilliant blue assay is accomplished by measurement of absorbance at 590 nm. This most common assay enables rapid and simple protein quantification in cell lysates, cellular fractions, or recombinant protein samples, for the purpose of normalization of biochemical measurements. However, an intrinsic nonlinearity compromises the sensitivity and accuracy of this method. It is shown that under standard assay conditions, the ratio of the absorbance measurements at 590 nm and 450 nm is strictly linear with protein concentration. This simple procedure increases the accuracy and improves the sensitivity of the assay about 10-fold, permitting quantification down to 50 ng of bovine serum albumin. Furthermore, the interference commonly introduced by detergents that are used to create the cell lysates is greatly reduced by the new protocol. A linear equation developed on the basis of mass action and Beer's law perfectly fits the experimental data.
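A minimal sketch of the linearization described above: instead of fitting A590 alone, the A590/A450 ratio is fit linearly against the BSA standard amounts and unknowns are read off that calibration. All absorbance values shown are invented.

import numpy as np

# Hypothetical BSA standards (µg protein per assay) with paired absorbance readings
protein_ug = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
a590 = np.array([0.466, 0.521, 0.574, 0.680, 0.883, 1.250])
a450 = np.array([0.750, 0.720, 0.692, 0.640, 0.552, 0.420])

ratio = a590 / a450                      # this ratio is linear in protein amount
slope, intercept = np.polyfit(protein_ug, ratio, 1)

def protein_from_ratio(a590_sample, a450_sample):
    """Read an unknown off the linear A590/A450 calibration."""
    return (a590_sample / a450_sample - intercept) / slope

print(round(protein_from_ratio(0.62, 0.67), 2))  # estimated µg protein in the sample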
Cellular Biology, Issue 38, Bradford, protein assay, protein quantification, Coomassie brilliant blue

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithm still attempts to display videos with relevant content, which can sometimes result in matches that are only loosely related.