Pubmed Article
Global DNA methylation of ischemic stroke subtypes.
PUBLISHED: 01-01-2014
Ischemic stroke (IS), a heterogeneous multifactorial disorder, is among the leading causes of mortality and long-term disability in the western world. Epidemiological data provide evidence for a genetic component to the disease, but its epigenetic involvement is still largely unknown. Epigenetic mechanisms, such as DNA methylation, change over time and may be associated with aging processes and with modulation of the risk of various pathologies, such as cardiovascular disease and stroke. We analyzed 2 independent cohorts of IS patients. Global DNA methylation was measured by luminometric methylation assay (LUMA) of DNA blood samples. Univariate and multivariate regression analyses were used to assess the methylation differences between the 3 most common IS subtypes: large-artery atherosclerosis (LAA), small-artery disease (SAD), and cardio-aortic embolism (CE). A total of 485 IS patients from 2 independent hospital cohorts (n = 281 and n = 204) were included, distributed across 3 IS subtypes: LAA (78/281, 59/204), SAD (97/281, 53/204), and CE (106/281, 89/204). In univariate analyses, no statistical differences in LUMA levels were observed between the 3 etiologies in either cohort. Multivariate analysis, adjusted by age, sex, hyperlipidemia, and smoking habit, confirmed the lack of differences in methylation levels between the analyzed IS subtypes in both cohorts. Despite differences in pathogenesis, our results showed no global methylation differences between the LAA, SAD, and CE subtypes of IS. Further work is required to establish whether the epigenetic mechanism of methylation might play a role in this complex disease.
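The univariate comparison described above - testing whether mean LUMA methylation differs across the three subtype groups - can be sketched as a one-way ANOVA. The function below is a minimal, self-contained illustration; the group values are hypothetical, and the study's multivariate adjustment for age, sex, hyperlipidemia, and smoking is not reproduced here.

```python
# Minimal sketch: one-way ANOVA F-statistic for comparing mean global
# methylation (e.g., LUMA levels) across stroke subtypes (LAA, SAD, CE).
# Illustrative only -- group values below are hypothetical.

def one_way_anova_f(groups):
    """Return the F-statistic for a one-way ANOVA over a list of groups."""
    all_values = [x for g in groups for x in g]
    n_total = len(all_values)
    k = len(groups)
    grand_mean = sum(all_values) / n_total

    # Between-group sum of squares: how far each group mean sits from the
    # grand mean, weighted by group size.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of values around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within

# Hypothetical LUMA methylation percentages for three subtype groups.
laa = [72.1, 73.4, 71.8, 72.9]
sad = [72.5, 73.0, 72.2, 73.1]
ce = [72.0, 72.8, 73.3, 72.4]
f_stat = one_way_anova_f([laa, sad, ce])
```

A small F-statistic (relative to the F-distribution's critical value for the given degrees of freedom) corresponds to the "no statistical differences" outcome reported above.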
Chromatin is a highly dynamic nucleoprotein complex, made of DNA and proteins, that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins (including transcription factors), and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the purity and amount required for in-depth analysis by mass spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), in which a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results demonstrate the potential of ChroP for thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Methylated DNA Immunoprecipitation
Authors: Kelsie L. Thu, Emily A. Vucic, Jennifer Y. Kennett, Cameron Heryet, Carolyn J. Brown, Wan L. Lam, Ian M. Wilson.
Institutions: BC Cancer Research Centre, University of British Columbia - UBC, BC Cancer Agency. These authors contributed equally.
The identification of DNA methylation patterns is a common procedure in the study of epigenetics, as methylation is known to have significant effects on gene expression and is involved in normal development as well as disease 1-4. Thus, the ability to discriminate between methylated and non-methylated DNA is essential for generating methylation profiles for such studies. Methylated DNA immunoprecipitation (MeDIP) is an efficient technique for the extraction of methylated DNA from a sample of interest 5-7. As little as 200 ng of DNA is sufficient for the antibody, or immunoprecipitation (IP), reaction. DNA is sonicated into fragments ranging in size from 300-1000 bp and divided into immunoprecipitated (IP) and input (IN) portions. IP DNA is heat denatured and then incubated with an anti-5-methylcytosine (anti-5-mC) monoclonal antibody, which binds methylated DNA. Magnetic beads carrying a secondary antibody with affinity for the primary antibody are then added and incubated; these bead-linked antibodies bind the monoclonal antibody used in the first step. DNA bound to the antibody complex (methylated DNA) is separated from the rest of the DNA by using a magnet to pull the complexes out of solution. Several washes with IP buffer are then performed to remove the unbound, non-methylated DNA. The methylated DNA/antibody complexes are then treated with proteinase K to digest the antibodies, leaving only the methylated DNA intact. The enriched DNA is purified by phenol:chloroform extraction to remove protein, then precipitated and resuspended in water for later use. PCR can be used to validate the efficiency of the MeDIP procedure by comparing the amplification of IP and IN DNA at regions known to lack or to contain methylated sequences.
The purified methylated DNA can then be used for locus-specific (PCR) or genome-wide (microarray and sequencing) methylation studies, and is particularly useful when applied in conjunction with other research tools such as gene expression profiling and array comparative genomic hybridization (CGH) 8. Further investigation into DNA methylation will lead to the discovery of new epigenetic targets, which, in turn, may be useful in developing new therapeutic or prognostic research tools for diseases such as cancer that are characterized by aberrantly methylated DNA 2, 4, 9-11.
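The qPCR validation of MeDIP enrichment is commonly summarized as percent recovery relative to input, using the standard 2^ΔCt relationship. The sketch below assumes that framing; the function name and the example Ct values are hypothetical, not taken from the protocol.

```python
import math

def percent_of_input(ct_input, ct_ip, input_fraction):
    """Percent of input recovered in the IP, from qPCR Ct values.

    ct_input       -- Ct of the input (IN) sample
    ct_ip          -- Ct of the immunoprecipitated (IP) sample
    input_fraction -- fraction of starting material the IN sample
                      represents (e.g., 0.5 if 50% was kept as input)
    """
    # Adjust the input Ct as if 100% of the material had been assayed,
    # then compare to the IP via the usual 2^(delta Ct) relationship.
    adjusted_input_ct = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2.0 ** (adjusted_input_ct - ct_ip)

# A methylated region should show much higher recovery than an
# unmethylated control region (hypothetical Ct values).
methylated_recovery = percent_of_input(ct_input=25.0, ct_ip=24.0, input_fraction=0.5)
unmethylated_recovery = percent_of_input(ct_input=25.0, ct_ip=30.0, input_fraction=0.5)
```

Comparing recovery at a region known to be methylated against one known to be unmethylated gives a simple numeric readout of IP efficiency.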
Cell Biology, Issue 23, DNA methylation, immunoprecipitation, epigenomics, epigenetics, methylcytosine, MeDIP protocol, 5-methylcytosine antibody, anti-5-methylcytosine, microarray
High Sensitivity 5-hydroxymethylcytosine Detection in Balb/C Brain Tissue
Authors: Theodore Davis, Romualdas Vaisvila.
Institutions: New England Biolabs.
DNA hydroxymethylation is a long-known modification of DNA that has recently become a focus of epigenetic research. Mammalian DNA is enzymatically modified at the 5th carbon position of cytosine (C) residues to 5-mC, predominantly in the context of CpG dinucleotides. 5-mC is amenable to enzymatic oxidation to 5-hmC by the Tet family of enzymes, which are believed to be involved in development and disease. Currently, the biological role of 5-hmC is not fully understood, but it is generating a lot of interest due to its potential as a biomarker, following several groundbreaking studies identifying 5-hydroxymethylcytosine in mouse embryonic stem (ES) and neuronal cells. Research techniques, including bisulfite sequencing methods, are unable to easily distinguish between 5-mC and 5-hmC. A few protocols exist that can measure global amounts of 5-hydroxymethylcytosine in the genome, including liquid chromatography coupled with mass spectrometry analysis, or thin layer chromatography of single nucleosides digested from genomic DNA. Antibodies that target 5-hydroxymethylcytosine also exist and can be used for dot blot analysis, immunofluorescence, or precipitation of hydroxymethylated DNA, but these antibodies do not have single-base resolution. In addition, resolution depends on the size of the immunoprecipitated DNA and, for microarray experiments, on probe design. Since it is unknown exactly where 5-hydroxymethylcytosine exists in the genome, or what its role in epigenetic regulation is, new techniques are required that can identify locus-specific hydroxymethylation. The EpiMark 5-hmC and 5-mC Analysis Kit provides a solution for distinguishing between these two modifications at specific loci: it is a simple and robust method for the identification and quantitation of 5-methylcytosine and 5-hydroxymethylcytosine within a specific DNA locus.
This enzymatic approach utilizes the differential methylation sensitivity of the isoschizomers MspI and HpaII in a simple 3-step protocol. Genomic DNA of interest is treated with T4-BGT, which adds a glucose moiety to 5-hydroxymethylcytosine. This reaction is sequence-independent, so all 5-hmC will be glucosylated; unmodified or 5-mC-containing DNA will not be affected. Glucosylation is followed by restriction endonuclease digestion. MspI and HpaII recognize the same sequence (CCGG) but are sensitive to different methylation states. HpaII cleaves only a completely unmodified site: any modification (5-mC, 5-hmC or 5-ghmC) at either cytosine blocks cleavage. MspI cleaves sites containing 5-mC and 5-hmC, but not 5-ghmC. The third part of the protocol is interrogation of the locus by PCR; as little as 20 ng of input DNA can be used. The experimental (glucosylated and digested) and control (mock-glucosylated and digested) target DNA are amplified with primers flanking a CCGG site of interest (100-200 bp). If the CpG site contains 5-hydroxymethylcytosine, a band is detected after glucosylation and digestion, but not in the non-glucosylated control reaction. Real-time PCR gives an approximation of how much hydroxymethylcytosine is present at this particular site. In this experiment, we analyze the amount of 5-hydroxymethylcytosine in a mouse Balb/C brain sample by end-point PCR.
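The enzymatic logic above reduces to a small decision table, which the sketch below encodes directly. The function and label names are our own; only the cleavage rules themselves are taken from the protocol as stated.

```python
# Modification states at a CCGG cytosine: unmodified C, 5-mC, 5-hmC,
# or glucosylated 5-hmC (5-ghmC, produced from 5-hmC by T4-BGT).

def glucosylate(mod):
    """T4-BGT adds glucose to 5-hmC only; other states are unaffected."""
    return "5-ghmC" if mod == "5-hmC" else mod

def cleaves(enzyme, mod):
    """Does the enzyme cut a CCGG site carrying this modification?"""
    if enzyme == "HpaII":
        # HpaII cleaves only a completely unmodified site.
        return mod == "C"
    if enzyme == "MspI":
        # MspI cleaves C, 5-mC and 5-hmC, but not 5-ghmC.
        return mod in ("C", "5-mC", "5-hmC")
    raise ValueError("unknown enzyme: " + enzyme)

def pcr_band(mod, treated_with_t4_bgt, enzyme="MspI"):
    """A PCR band appears only if the site survives digestion uncut."""
    state = glucosylate(mod) if treated_with_t4_bgt else mod
    return not cleaves(enzyme, state)

# The 5-hmC signature: a band in the glucosylated reaction, none in the
# mock-glucosylated control.
band_glucosylated = pcr_band("5-hmC", treated_with_t4_bgt=True)
band_control = pcr_band("5-hmC", treated_with_t4_bgt=False)
```

Running the same comparison with a 5-mC site shows why the control matters: 5-mC is cut by MspI in both reactions, so neither yields a band.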
Neuroscience, Issue 48, EpiMark, Epigenetics, 5-hydroxymethylcytosine, 5-methylcytosine, methylation, hydroxymethylation
A Chromatin Assay for Human Brain Tissue
Authors: Anouch Matevossian, Schahram Akbarian.
Institutions: University of Massachusetts Medical School.
Chronic neuropsychiatric illnesses such as schizophrenia, bipolar disease and autism are thought to result from a combination of genetic and environmental factors that might result in epigenetic alterations of gene expression and other molecular pathology. Traditionally, however, expression studies in postmortem brain were confined to quantification of mRNA or protein. The limitations encountered in postmortem brain research, such as variability in autolysis time and tissue integrity, are also likely to impact any studies of higher-order chromatin structures. However, the nucleosomal organization of genomic DNA - including DNA:core histone binding - appears to be largely preserved in representative samples provided by various brain banks. Therefore, it is possible to study the methylation pattern and other covalent modifications of the core histones at defined genomic loci in postmortem brain. Here, we present a simplified native chromatin immunoprecipitation (NChIP) protocol for frozen (never-fixed) human brain specimens. Starting with micrococcal nuclease digestion of brain homogenates, NChIP followed by qPCR can be completed within three days. The methodology presented here should be useful for elucidating epigenetic mechanisms of gene expression in normal and diseased human brain.
Neuroscience, Issue 13, Postmortem brain, Nucleosome, Histone, Methylation, Epigenetic, Chromatin, Human Brain
DNA Extraction from Paraffin Embedded Material for Genetic and Epigenetic Analyses
Authors: Larissa A. Pikor, Katey S. S. Enfield, Heryet Cameron, Wan L. Lam.
Institutions: BC Cancer Research Centre, University of British Columbia - UBC, BC Cancer Agency.
Disease development and progression are characterized by frequent genetic and epigenetic aberrations, including chromosomal rearrangements, copy number gains and losses, and DNA methylation. Advances in high-throughput, genome-wide profiling technologies, such as microarrays, have significantly improved our ability to identify and detect these specific alterations. However, as technology continues to improve, sample quality and availability remain limiting factors. Furthermore, follow-up clinical information and disease outcome are often collected years after the initial specimen collection. Specimens, typically formalin-fixed and paraffin-embedded (FFPE), are stored in hospital archives for years to decades. DNA can be efficiently and effectively recovered from paraffin-embedded specimens if the appropriate method of extraction is applied. High-quality DNA extracted from properly preserved and stored specimens can support quantitative assays for comparisons of normal and diseased tissues and generation of genetic and epigenetic signatures 1. To extract DNA from paraffin-embedded samples, tissue cores or microdissected tissue are subjected to xylene treatment, which dissolves the paraffin from the tissue, and then rehydrated using a series of ethanol washes. Proteins and harmful enzymes such as nucleases are subsequently digested by proteinase K. The addition of lysis buffer, which contains denaturing agents such as sodium dodecyl sulfate (SDS), facilitates digestion 2. Nucleic acids are purified from the tissue lysate using buffer-saturated phenol and high-speed centrifugation, which generates a biphasic solution. DNA and RNA remain in the upper aqueous phase, while proteins, lipids and polysaccharides are sequestered in the interphase and organic phase, respectively. Retention of the aqueous phase and repeated phenol extractions generate a clean sample. Following phenol extractions, RNase A is added to eliminate contaminating RNA.
Additional phenol extractions following incubation with RNase A are used to remove any remaining enzyme. The addition of sodium acetate and isopropanol precipitates DNA, and high speed centrifugation is used to pellet the DNA and facilitate isopropanol removal. Excess salts carried over from precipitation can interfere with subsequent enzymatic assays, but can be removed from the DNA by washing with 70% ethanol, followed by centrifugation to re-pellet the DNA 3. DNA is re-suspended in distilled water or the buffer of choice, quantified and stored at -20°C. Purified DNA can subsequently be used in downstream applications which include, but are not limited to, PCR, array comparative genomic hybridization 4 (array CGH), methylated DNA Immunoprecipitation (MeDIP) and sequencing, allowing for an integrative analysis of tissue/tumor samples.
Genetics, Issue 49, DNA extraction, paraffin embedded tissue, phenol:chloroform extraction, genetic analysis, epigenetic analysis
Determination of DNA Methylation of Imprinted Genes in Arabidopsis Endosperm
Authors: Matthew Rea, Ming Chen, Shan Luan, Drutdaman Bhangu, Max Braud, Wenyan Xiao.
Institutions: Saint Louis University.
Arabidopsis thaliana is an excellent model organism for studying epigenetic mechanisms, in part because loss-of-function null mutants of DNA methyltransferases are viable, providing a system to study how loss of DNA methylation in a genome affects growth and development. Imprinting refers to the differential expression of maternal and paternal alleles and plays an important role in reproductive development in both mammals and plants. DNA methylation is critical for determining whether the maternal or paternal allele of an imprinted gene is expressed or silenced. In flowering plants, reproduction involves a double fertilization event: one sperm cell fertilizes the egg cell to form the embryo, and a second sperm fuses with the central cell to give rise to the endosperm. Endosperm is the tissue where imprinting occurs in plants. MEDEA, a SET-domain Polycomb group gene, and FWA, a transcription factor regulating flowering, are the first two genes shown to be imprinted in endosperm; their expression is controlled by DNA methylation and demethylation in plants. In order to determine the imprinting status of a gene and its methylation pattern in endosperm, the endosperm must first be isolated. Since Arabidopsis seeds are tiny, it remains challenging to isolate Arabidopsis endosperm and examine its methylation. In this video protocol, we demonstrate how to conduct a genetic cross, isolate endosperm tissue from seeds, and determine methylation status by bisulfite sequencing.
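After bisulfite conversion and sequencing, methylation status at each CpG is read off by comparing the read to the reference: a retained C indicates the cytosine was methylated, while a C read as T indicates it was unmethylated. A minimal, hypothetical calling sketch (top strand only, no error handling; function name is our own):

```python
def cpg_methylation_fraction(reference, bisulfite_read):
    """Fraction of CpG cytosines (top strand) read as methylated.

    In bisulfite-converted data, an unmethylated C is sequenced as T,
    while a methylated C is protected and still reads as C.
    """
    assert len(reference) == len(bisulfite_read)
    methylated = 0
    total = 0
    for i in range(len(reference) - 1):
        if reference[i] == "C" and reference[i + 1] == "G":  # a CpG site
            total += 1
            if bisulfite_read[i] == "C":
                methylated += 1
    return methylated / total if total else 0.0

# Two CpGs: the first retained as C (methylated), the second read as T.
fraction = cpg_methylation_fraction("ACGTCGA", "ACGTTGA")
```

For imprinting studies, running this separately on reads carrying maternal versus paternal sequence polymorphisms would show allele-specific methylation.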
Plant Biology, Issue 47, DNA methylation, imprinting, bisulfite sequencing, endosperm, Arabidopsis
2-Vessel Occlusion/Hypotension: A Rat Model of Global Brain Ischemia
Authors: Thomas H. Sanderson, Joseph M. Wider.
Institutions: Wayne State University School of Medicine.
Cardiac arrest followed by resuscitation often results in dramatic brain damage caused by ischemia and subsequent reperfusion of the brain. Global brain ischemia produces damage to specific brain regions shown to be highly sensitive to ischemia 1. Hippocampal neurons have higher sensitivity to ischemic insults compared to other cell populations, and specifically, the CA1 region of the hippocampus is particularly vulnerable to ischemia/reperfusion 2. The design of therapeutic interventions, or study of mechanisms involved in cerebral damage, requires a model that produces damage similar to the clinical condition and in a reproducible manner. Bilateral carotid vessel occlusion with hypotension (2VOH) is a model that produces reversible forebrain ischemia, emulating the cerebral events that can occur during cardiac arrest and resuscitation. We describe a model modified from Smith et al. (1984) 2, as first presented in its current form in Sanderson et al. (2008) 3, which produces reproducible injury to selectively vulnerable brain regions 3-6. The reliability of this model is dictated by precise control of systemic blood pressure during applied hypotension, the duration of ischemia, close temperature control, a specific anesthesia regimen, and diligent post-operative care. An 8-minute ischemic insult produces cell death of CA1 hippocampal neurons that progresses over the course of 6 to 24 hr of reperfusion, while less vulnerable brain regions are spared. This progressive cell death is easily quantified after 7-14 days of reperfusion, as a near complete loss of CA1 neurons is evident at this time. In addition to this brain injury model, we present a method for CA1 damage quantification using a simple, yet thorough, methodology.
Importantly, quantification can be accomplished using a simple camera-mounted microscope, and a free ImageJ (NIH) software plugin, obviating the need for cost-prohibitive stereology software programs and a motorized microscopic stage for damage assessment.
Medicine, Issue 76, Biomedical Engineering, Neurobiology, Neuroscience, Immunology, Anatomy, Physiology, Cardiology, Brain Ischemia, ischemia, reperfusion, cardiac arrest, resuscitation, 2VOH, brain injury model, CA1 hippocampal neurons, brain, neuron, blood vessel, occlusion, hypotension, animal model
Compensatory Limb Use and Behavioral Assessment of Motor Skill Learning Following Sensorimotor Cortex Injury in a Mouse Model of Ischemic Stroke
Authors: Abigail L. Kerr, Kelly A. Tennant.
Institutions: Illinois Wesleyan University, University of Victoria.
Mouse models have become increasingly popular in the field of behavioral neuroscience, and specifically in studies of experimental stroke. As models advance, it is important to develop sensitive behavioral measures specific to the mouse. The present protocol describes a skilled motor task for use in mouse models of stroke. The Pasta Matrix Reaching Task functions as a versatile and sensitive behavioral assay that permits experimenters to collect accurate outcome data and manipulate limb use to mimic human clinical phenomena including compensatory strategies (i.e., learned non-use) and focused rehabilitative training. When combined with neuroanatomical tools, this task also permits researchers to explore the mechanisms that support behavioral recovery of function (or lack thereof) following stroke. The task is both simple and affordable to set up and conduct, offering a variety of training and testing options for numerous research questions concerning functional outcome following injury. Though the task has been applied to mouse models of stroke, it may also be beneficial in studies of functional outcome in other upper extremity injury models.
Behavior, Issue 89, Upper extremity impairment, Murine model, Rehabilitation, Reaching, Non-paretic limb training, Good limb training, Less-affected limb training, Learned non-use, Pasta matrix reaching task
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Embolic Middle Cerebral Artery Occlusion (MCAO) for Ischemic Stroke with Homologous Blood Clots in Rats
Authors: Rong Jin, Xiaolei Zhu, Guohong Li.
Institutions: Louisiana State University Health Science Center, Shreveport.
Clinically, thrombolytic therapy with recombinant tissue plasminogen activator (tPA) remains the most effective treatment for acute ischemic stroke. However, the use of tPA is limited by its narrow therapeutic window and by an increased risk of hemorrhagic transformation. There is an urgent need to develop suitable stroke models to study new thrombolytic agents and strategies for the treatment of ischemic stroke. At present, two major types of ischemic stroke model have been developed in rats and mice: intraluminal suture MCAO and embolic MCAO. Although MCAO models based on the intraluminal suture technique have been widely used in mechanism-driven stroke research, these suture models do not mimic the clinical situation and are not suitable for thrombolytic studies. In contrast, the embolic MCAO model closely mimics human ischemic stroke and is suitable for preclinical investigation of thrombolytic therapy. This embolic model was first developed in rats by Overgaard et al.1 in 1992 and further characterized by Zhang et al. in 19972. Although embolic MCAO has gained increasing attention, many laboratories face technical problems with it. To meet the increasing needs of thrombolytic research, we present a highly reproducible model of embolic MCAO in the rat that develops a predictable infarct volume within the MCA territory. In brief, a modified PE-50 tube is gently advanced from the external carotid artery (ECA) into the lumen of the internal carotid artery (ICA) until the tip of the catheter reaches the origin of the MCA. Through the catheter, a single homologous blood clot is placed at the origin of the MCA. To verify the success of MCA occlusion, regional cerebral blood flow is monitored, and neurological deficits and infarct volumes are measured. The techniques presented in this paper should help investigators overcome technical problems in establishing this model for stroke research.
Medicine, Issue 91, ischemic stroke, model, embolus, middle cerebral artery occlusion, thrombolytic therapy
Permanent Cerebral Vessel Occlusion via Double Ligature and Transection
Authors: Melissa F. Davis, Christopher Lay, Ron D. Frostig.
Institutions: University of California, Irvine.
Stroke is a leading cause of death, disability, and socioeconomic loss worldwide. The majority of all strokes result from an interruption in blood flow (ischemia) 1. The middle cerebral artery (MCA) delivers the great majority of blood to the lateral surface of the cortex 2, is the most common site of human stroke 3, and ischemia within its territory can result in extensive dysfunction or death 1,4,5. Survivors of ischemic stroke often suffer loss or disruption of motor capabilities, sensory deficits, and infarct. In an effort to capture these key characteristics of stroke, and thereby develop effective treatment, a great deal of emphasis is placed upon animal models of ischemia in the MCA. Here we present a method of permanently occluding a cortical surface blood vessel, illustrated with a vessel occlusion that models the most common type, location, and outcome of human stroke: permanent middle cerebral artery occlusion (pMCAO). In this model, we surgically expose the MCA in the adult rat and subsequently occlude it via double ligature and transection of the vessel. This pMCAO blocks the proximal cortical branch of the MCA, causing ischemia in all of the MCA's cortical territory, a large portion of the cortex. This method of occlusion can also be used to occlude more distal portions of cortical vessels in order to achieve more focal ischemia targeting a smaller region of cortex. The primary disadvantage of pMCAO is that the surgical procedure is somewhat invasive, as a small craniotomy is required to access the MCA, though this results in minimal tissue damage. The primary advantages of this model, however, are: the site of occlusion is well defined, the degree of blood flow reduction is consistent, functional and neurological impairment occurs rapidly, infarct size is consistent, and the high rate of survival allows for long-term chronic assessment.
Medicine, Issue 77, Biomedical Engineering, Anatomy, Physiology, Neurobiology, Neuroscience, Behavior, Surgery, Therapeutics, Surgical Procedures, Operative, Investigative Techniques, Life Sciences (General), Behavioral Sciences, Animal models, Stroke, ischemia, imaging, middle cerebral artery, vessel occlusion, rodent model, surgical techniques, animal model
Modeling Stroke in Mice: Permanent Coagulation of the Distal Middle Cerebral Artery
Authors: Gemma Llovera, Stefan Roth, Nikolaus Plesnila, Roland Veltkamp, Arthur Liesz.
Institutions: University Hospital Munich, Munich Cluster for Systems Neurology (SyNergy), University Heidelberg, Charing Cross Hospital.
Stroke is the third most common cause of death and a main cause of acquired adult disability in developed countries. Only very limited therapeutic options are available for a small proportion of stroke patients in the acute phase. Current research is intensively searching for novel therapeutic strategies and is increasingly focusing on the sub-acute and chronic phases after stroke, because more patients might be eligible for therapeutic interventions in a prolonged time window. These delayed mechanisms include important pathophysiological pathways such as post-stroke inflammation, angiogenesis, neuronal plasticity and regeneration. In order to analyze these mechanisms and to subsequently evaluate novel drug targets, experimental stroke models with clinical relevance, low mortality and high reproducibility are sought after. Moreover, mice are the smallest mammals in which a focal stroke lesion can be induced and for which a broad spectrum of transgenic models is available. Therefore, we describe here the mouse model of transcranial, permanent coagulation of the middle cerebral artery via electrocoagulation distal to the lenticulostriatal arteries, the so-called “coagulation model”. The resulting infarct in this model is located mainly in the cortex; the relative infarct volume in relation to brain size corresponds to that of the majority of human strokes. Moreover, the model fulfills the above-mentioned criteria of reproducibility and low mortality. In this video we demonstrate the surgical methods of stroke induction in the “coagulation model” and report histological and functional analysis tools.
Medicine, Issue 89, stroke, brain ischemia, animal model, middle cerebral artery, electrocoagulation
Optimized Analysis of DNA Methylation and Gene Expression from Small, Anatomically-defined Areas of the Brain
Authors: Marc Bettscheider, Arleta Kuczynska, Osborne Almeida, Dietmar Spengler.
Institutions: Max Planck Institute of Psychiatry.
Exposure to diet, drugs and early life adversity during sensitive windows of life 1,2 can lead to lasting changes in gene expression that contribute to the display of physiological and behavioural phenotypes. Such environmental programming is likely to increase susceptibility to metabolic, cardiovascular and mental diseases 3,4. DNA methylation and histone modifications are considered key processes in the mediation of the gene-environment dialogue and appear also to underlie environmental programming 5. In mammals, DNA methylation typically comprises the covalent addition of a methyl group at the 5-position of cytosine within the context of CpG dinucleotides. CpG methylation occurs in a highly tissue- and cell-specific manner, making it a challenge to study discrete, small regions of the brain where cellular heterogeneity is high and tissue quantity is limited. Moreover, because gene expression and methylation are closely linked events, increased value can be gained by comparing both parameters in the same sample. Here, a step-by-step protocol (Figure 1) for the investigation of epigenetic programming in the brain is presented, using the 'maternal separation' paradigm of early life adversity for illustrative purposes. The protocol describes the preparation of micropunches from differentially-aged mouse brains, from which DNA and RNA can be simultaneously isolated, allowing DNA methylation and gene expression analyses in the same sample.
Neuroscience, Issue 65, Genetics, Physiology, Epigenetics, DNA methylation, early-life stress, maternal separation, bisulfite sequencing
Play Button
DNA Methylation: Bisulphite Modification and Analysis
Authors: Kate Patterson, Laura Molloy, Wenjia Qu, Susan Clark.
Institutions: Garvan Institute of Medical Research, University of NSW.
Epigenetics describes the heritable changes in gene function that occur independently of the DNA sequence. The molecular basis of epigenetic gene regulation is complex, but essentially involves modifications to the DNA itself or to the proteins with which DNA associates. The predominant epigenetic modification of DNA in mammalian genomes is methylation of cytosine nucleotides (5-MeC). DNA methylation provides instruction to the gene expression machinery as to where and when a gene should be expressed. The primary target sequence for DNA methylation in mammals is 5'-CpG-3' dinucleotides (Figure 1). CpG dinucleotides are not uniformly distributed throughout the genome, but are concentrated in regions of repetitive genomic sequences and in CpG "islands" commonly associated with gene promoters (Figure 1). DNA methylation patterns are established early in development, modulated during tissue-specific differentiation and disrupted in many disease states including cancer. To understand the biological role of DNA methylation and its role in human disease, precise, efficient and reproducible methods are required to detect and quantify individual 5-MeCs. This protocol for bisulphite conversion is the "gold standard" for DNA methylation analysis and facilitates identification and quantification of DNA methylation at single-nucleotide resolution. The chemistry of cytosine deamination by sodium bisulphite involves three steps (Figure 2): (1) Sulphonation: addition of bisulphite to the 5-6 double bond of cytosine; (2) Hydrolytic deamination: deamination of the resulting cytosine-bisulphite derivative to give a uracil-bisulphite derivative; (3) Alkali desulphonation: removal of the sulphonate group by alkali treatment, to give uracil. Bisulphite preferentially deaminates cytosine to uracil in single-stranded DNA, whereas 5-MeC is refractory to bisulphite-mediated deamination.
Upon PCR amplification, uracil is amplified as thymine while 5-MeC residues remain as cytosines, allowing methylated CpGs to be distinguished from unmethylated CpGs by the presence of a cytosine "C" versus thymine "T" residue during sequencing. DNA modification by bisulphite conversion is a well-established protocol that can be exploited for many methods of DNA methylation analysis. Since the detection of 5-MeC by bisulphite conversion was first demonstrated by Frommer et al.1 and Clark et al.2, methods based on bisulphite conversion of genomic DNA account for the majority of new data on DNA methylation. Different methods of post-PCR analysis may be utilized, depending on the degree of specificity and resolution of methylation required. Cloning and sequencing is still the most readily available method that can give single-nucleotide resolution for methylation across the DNA molecule.
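The C-versus-T readout described above can be sketched as a small simulation; the sequence and methylated positions here are illustrative, not taken from the protocol:

```python
def bisulfite_pcr_readout(seq, methylated_positions):
    """Simulate the sequence read out after bisulfite conversion and PCR.

    Unmethylated cytosines are deaminated to uracil and amplified as
    thymine; 5-MeC is refractory to deamination and remains cytosine.
    """
    out = []
    for i, base in enumerate(seq.upper()):
        if base == "C" and i not in methylated_positions:
            out.append("T")  # C -> U -> amplified as T
        else:
            out.append(base)  # 5-MeC (or any non-C base) unchanged
    return "".join(out)

# CpG at index 2 methylated (protected), CpG at index 6 unmethylated
print(bisulfite_pcr_readout("ATCGATCGAT", {2}))  # -> ATCGATTGAT
```

A "C" in the sequenced product therefore marks a methylated cytosine, a "T" an unmethylated one.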
Genetics, Issue 56, epigenetics, DNA methylation, Bisulphite, 5-methylcytosine (5-MeC), PCR
Play Button
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that can escape only with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection and, if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene, isolated at acute time points from subtype C-infected Zambians. This method uses restriction enzyme-based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Play Button
Single Oocyte Bisulfite Mutagenesis
Authors: Michelle M. Denomme, Liyue Zhang, Mellissa R.W. Mann.
Institutions: Schulich School of Medicine and Dentistry, University of Western Ontario; Children's Health Research Institute.
Epigenetics encompasses all heritable and reversible modifications to chromatin that alter gene accessibility, and thus are the primary mechanisms for regulating gene transcription1. DNA methylation is an epigenetic modification that acts predominantly as a repressive mark. Through the covalent addition of a methyl group onto cytosines in CpG dinucleotides, it can recruit additional repressive proteins and histone modifications to initiate processes involved in condensing chromatin and silencing genes2. DNA methylation is essential for normal development as it plays a critical role in developmental programming, cell differentiation, repression of retroviral elements, X-chromosome inactivation and genomic imprinting. One of the most powerful methods for DNA methylation analysis is bisulfite mutagenesis. Sodium bisulfite is a DNA mutagen that deaminates cytosines into uracils. Following PCR amplification and sequencing, these conversion events are detected as thymines. Methylated cytosines are protected from deamination and thus remain as cytosines, enabling identification of DNA methylation at the individual nucleotide level3. Bisulfite mutagenesis assays have advanced from those originally reported4-6 towards ones that are more sensitive and reproducible7. One key advancement was embedding smaller amounts of DNA in an agarose bead, thereby protecting DNA from the harsh bisulfite treatment8. This enabled methylation analysis to be performed on pools of oocytes and blastocyst-stage embryos9. The most sophisticated bisulfite mutagenesis protocol to date is for individual blastocyst-stage embryos10. However, since blastocysts have on average 64 cells (containing 120-720 pg of genomic DNA), this method is not efficacious for methylation studies on individual oocytes or cleavage-stage embryos.
Taking cues from agarose embedding of minute DNA amounts, including oocytes11, here we present a method whereby oocytes are directly embedded in an agarose and lysis solution bead immediately following retrieval and removal of the zona pellucida from the oocyte. This enables us to bypass the two main challenges of single oocyte bisulfite mutagenesis: protecting a minute amount of DNA from degradation, and preventing its subsequent loss during the numerous protocol steps. Importantly, as data are obtained from single oocytes, the issue of PCR bias within pools is eliminated. Furthermore, inadvertent cumulus cell contamination is detectable by this method, since any sample with more than one methylation pattern may be excluded from analysis12. This protocol provides an improved method for successful and reproducible analyses of DNA methylation at the single-cell level and is ideally suited for individual oocytes as well as cleavage-stage embryos.
Genetics, Issue 64, Developmental Biology, Biochemistry, Bisulfite mutagenesis, DNA methylation, individual oocyte, individual embryo, mouse model, PCR, epigenetics
Play Button
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. 
We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
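The log conversion, mean centering, and PCA steps described above can be sketched numerically. This is an illustrative NumPy reconstruction on synthetic data, not the authors' in-house software; the data dimensions and variable names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative data: 20 subjects x 500 voxels of positive image values
data = rng.lognormal(mean=1.0, sigma=0.3, size=(20, 500))

# SSM preprocessing: logarithmic conversion, then double mean centering
logged = np.log(data)
centered = logged - logged.mean(axis=1, keepdims=True)  # remove each subject's global mean
centered -= centered.mean(axis=0, keepdims=True)        # remove the group mean image

# PCA via SVD: rows of vt are orthogonal spatial covariance patterns
# (the GIS), and u * s gives each subject's score on each pattern
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u * s

explained = (s ** 2) / (s ** 2).sum()
print(explained[:3])  # fraction of variance carried by the leading patterns
```

Logistic regression on the `scores` columns would then correspond to the step of combining principal components into a single disease-related pattern.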
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Play Button
A Zebrafish Model of Diabetes Mellitus and Metabolic Memory
Authors: Robert V. Intine, Ansgar S. Olsen, Michael P. Sarras Jr..
Institutions: Rosalind Franklin University of Medicine and Science.
Diabetes mellitus currently affects 346 million individuals, and this is projected to increase to 400 million by 2030. Evidence from both the laboratory and large scale clinical trials has revealed that diabetic complications progress unimpeded via the phenomenon of metabolic memory even when glycemic control is pharmaceutically achieved. Gene expression can be stably altered through epigenetic changes, which not only allow cells and organisms to quickly respond to changing environmental stimuli but also confer the ability of the cell to "memorize" these encounters once the stimulus is removed. As such, the roles that these mechanisms play in the metabolic memory phenomenon are currently being examined. We have recently reported the development of a zebrafish model of type I diabetes mellitus and characterized this model to show that diabetic zebrafish not only display the known secondary complications, including the changes associated with diabetic retinopathy, diabetic nephropathy and impaired wound healing, but also exhibit impaired caudal fin regeneration. This model is unique in that the zebrafish is capable of regenerating its damaged pancreas and restoring a euglycemic state, similar to what would be expected in post-transplant human patients. Moreover, multiple rounds of caudal fin amputation allow for the separation and study of pure epigenetic effects in an in vivo system without potential complicating factors from the previous diabetic state. Although euglycemia is achieved following pancreatic regeneration, the diabetic secondary complications of impaired fin regeneration and skin wound healing persist indefinitely. In the case of impaired fin regeneration, this pathology is retained even after multiple rounds of fin regeneration in the daughter fin tissues. These observations point to an underlying epigenetic process existing in the metabolic memory state.
Here we present the methods needed to successfully generate the diabetic and metabolic memory groups of fish and discuss the advantages of this model.
Medicine, Issue 72, Genetics, Genomics, Physiology, Anatomy, Biomedical Engineering, Metabolomics, Zebrafish, diabetes, metabolic memory, tissue regeneration, streptozocin, epigenetics, Danio rerio, animal model, diabetes mellitus, diabetes, drug discovery, hyperglycemia
Play Button
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
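FA, the central voxelwise metric named above, is computed per voxel from the three eigenvalues of the diffusion tensor. A minimal sketch of the standard FA formula (not the authors' processing pipeline; eigenvalues below are illustrative):

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """FA from the three eigenvalues of the diffusion tensor.

    FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||, where MD is the
    mean diffusivity; 0 for isotropic diffusion, approaching 1 for
    strongly directional diffusion along a fiber tract.
    """
    lam = np.array([l1, l2, l3], dtype=float)
    md = lam.mean()  # mean diffusivity
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    return np.sqrt(1.5) * num / den

print(fractional_anisotropy(1.0, 1.0, 1.0))  # isotropic -> 0.0
print(fractional_anisotropy(1.7, 0.2, 0.2))  # anisotropic, close to 1
```

TFAS-style statistics then compare such FA values sampled along tracked fibers between groups.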
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Play Button
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Authors: Barry M. Lester, Lynne Andreozzi-Fontaine, Edward Tronick, Rosemarie Bigsby.
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant. These are infants who are at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or factors such as poverty, poor nutrition or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor, not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video demonstrates the NNNS procedures and shows examples of normal and abnormal performance, as well as the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
Play Button
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco; Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that lines the interior of blood vessels and serves both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound; the endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve.
This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
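The percent diameter change that defines FMD is a straightforward calculation; a sketch with illustrative diameters (the example values are not from the article):

```python
def fmd_percent(baseline_diameter_mm, peak_diameter_mm):
    """Flow-mediated dilation as percent change from baseline diameter.

    Baseline is measured at rest; peak is measured during reactive
    hyperemia after cuff release.
    """
    return 100.0 * (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm

# e.g. a brachial artery dilating from 4.00 mm to 4.30 mm
print(fmd_percent(4.00, 4.30))  # -> 7.5 (%)
```

The operator dependence discussed above enters through the diameter measurements themselves, which is why standardized image acquisition matters more than the arithmetic.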
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Play Button
Microarray-based Identification of Individual HERV Loci Expression: Application to Biomarker Discovery in Prostate Cancer
Authors: Philippe Pérot, Valérie Cheynet, Myriam Decaussin-Petrucci, Guy Oriol, Nathalie Mugnier, Claire Rodriguez-Lafrasse, Alain Ruffion, François Mallet.
Institutions: Joint Unit Hospices de Lyon-bioMérieux; bioMérieux; Hospices Civils de Lyon; Lyon 1 University.
The prostate-specific antigen (PSA) is the main diagnostic biomarker for prostate cancer in clinical use, but it lacks specificity and sensitivity, particularly at low values1. 'How to use PSA' remains a current issue, either for diagnosis, as the gray zone corresponding to a serum concentration of 2.5-10 ng/ml does not allow a clear differentiation to be made between cancer and noncancer2, or for patient follow-up, as analysis of post-operative PSA kinetic parameters can pose considerable challenges for practical application3,4. Alternatively, noncoding RNAs (ncRNAs) are emerging as key molecules in human cancer, with the potential to serve as novel markers of disease, e.g. PCA3 in prostate cancer5,6, and to reveal uncharacterized aspects of tumor biology. Moreover, data from the ENCODE project published in 2012 showed that different RNA types cover about 62% of the genome. It also appears that the amount of transcriptional regulatory motifs is at least 4.5x higher than that corresponding to protein-coding exons. Thus, long terminal repeats (LTRs) of human endogenous retroviruses (HERVs) constitute a wide range of putative/candidate transcriptional regulatory sequences, transcriptional regulation being their primary function in infectious retroviruses. HERVs, which are spread throughout the human genome, originate from ancestral and independent infections within the germ line, followed by copy-paste propagation processes leading to multicopy families occupying 8% of the human genome (note that exons span 2% of our genome). Some HERV loci still express proteins that have been associated with several pathologies including cancer7-10. We have designed a high-density microarray, in Affymetrix format, aiming to optimally characterize individual HERV loci expression, in order to better understand whether they can be active, whether they drive ncRNA transcription or whether they modulate coding gene expression. This tool has been applied in the prostate cancer field (Figure 1).
Medicine, Issue 81, Cancer Biology, Genetics, Molecular Biology, Prostate, Retroviridae, Biomarkers, Pharmacological, Tumor Markers, Biological, Prostatectomy, Microarray Analysis, Gene Expression, Diagnosis, Human Endogenous Retroviruses, HERV, microarray, Transcriptome, prostate cancer, Affymetrix
Play Button
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
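The software-guided setup of experiment combinations mentioned above can be illustrated with the simplest DoE building block, a full factorial design. The factor names and levels below are hypothetical, not those used in the study, and real DoE software would typically reduce this to a fractional or optimal subset:

```python
from itertools import product

# Hypothetical factors and levels for a transient-expression screen
factors = {
    "promoter": ["35S", "nos"],
    "plant_age_days": [35, 42, 49],
    "incubation_temp_C": [22, 25],
}

# Full factorial design: one run per combination of factor levels
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 2 * 3 * 2 = 12 runs
```

Step-wise design augmentation, as described in the abstract, would then add runs only where the fitted model is most uncertain instead of enumerating every combination.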
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Play Button
Application of MassSQUIRM for Quantitative Measurements of Lysine Demethylase Activity
Authors: Lauren P. Blair, Nathan L. Avaritt, Alan J. Tackett.
Institutions: University of Arkansas for Medical Sciences.
Recently, epigenetic regulators have been discovered as key players in many different diseases 1-3. As a result, these enzymes are prime targets for small molecule studies and drug development 4. Many epigenetic regulators have only recently been discovered and are still in the process of being classified. Among these enzymes are lysine demethylases, which remove methyl groups from lysines on histones and other proteins. Due to the novel nature of this class of enzymes, few assays have been developed to study their activity. This has been a roadblock to both the classification and high throughput study of histone demethylases. Currently, very few demethylase assays exist. Those that do exist tend to be qualitative in nature and cannot simultaneously discern between the different lysine methylation states (un-, mono-, di- and tri-). Mass spectrometry is commonly used to determine demethylase activity, but current mass spectrometric assays do not address whether differentially methylated peptides ionize differently. Differential ionization of methylated peptides makes comparison of methylation states difficult and far from quantitative (Figure 1A). Thus available assays are not optimized for the comprehensive analysis of demethylase activity. Here we describe a method called MassSQUIRM (mass spectrometric quantitation using isotopic reductive methylation) that is based on reductive methylation of amine groups with deuterated formaldehyde to force all lysines to be di-methylated, thus making them essentially the same chemical species so that they ionize identically (Figure 1B). The only chemical difference following the reductive methylation is hydrogen versus deuterium, which does not affect MALDI ionization efficiencies. The MassSQUIRM assay is specific for demethylase reaction products with un-, mono- or di-methylated lysines. The assay is also applicable to lysine methyltransferases giving the same reaction products.
Here, we use a combination of reductive methylation chemistry and MALDI mass spectrometry to measure the activity of LSD1, a lysine demethylase capable of removing di- and mono-methyl groups, on a synthetic peptide substrate 5. This assay is simple and easily amenable to any lab with access to a MALDI mass spectrometer in lab or through a proteomics facility. The assay has ~8-fold dynamic range and is readily scalable to plate format 5.
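Once ionization has been equalized by isotopic reductive methylation, the quantitative readout reduces to normalizing the MALDI peak intensities of the mass-shifted species. An illustrative sketch (state labels and intensity values are invented, not measured data):

```python
def relative_abundance(peak_intensities):
    """Relative abundance of each methylation state from MALDI peak intensities.

    After isotopic reductive methylation, the un-, mono- and di-methyl
    reaction products ionize equivalently and differ only in mass, so
    their peak intensities can be compared directly.
    """
    total = sum(peak_intensities.values())
    return {state: i / total for state, i in peak_intensities.items()}

peaks = {"unmethylated": 1.2e4, "monomethyl": 3.6e4, "dimethyl": 1.2e4}
print(relative_abundance(peaks))  # fractions summing to 1
```

Comparing these fractions before and after the demethylase reaction gives the activity measurement over the stated ~8-fold dynamic range.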
Molecular Biology, Issue 61, LSD1, lysine demethylase, mass spectrometry, reductive methylation, demethylase quantification
Play Button
Photothrombotic Ischemia: A Minimally Invasive and Reproducible Photochemical Cortical Lesion Model for Mouse Stroke Studies
Authors: Vivien Labat-gest, Simone Tomasi.
Institutions: University of Turin.
The photothrombotic stroke model aims to induce an ischemic damage within a given cortical area by means of photo-activation of a previously injected light-sensitive dye. Following illumination, the dye is activated and produces singlet oxygen that damages components of endothelial cell membranes, with subsequent platelet aggregation and thrombi formation, eventually interrupting local blood flow. This approach, initially proposed by Rosenblum and El-Sabban in 1977, was later improved by Watson in 1985 in rat brain and set the basis of the current model. Also, the increased availability of transgenic mouse lines further contributed to raising interest in the photothrombosis model. Briefly, a photosensitive dye (Rose Bengal) is injected intraperitoneally and enters the blood stream. When illuminated by a cold light source, the dye becomes activated and induces endothelial damage with platelet activation and thrombosis, resulting in local blood flow interruption. The light source can be applied on the intact skull with no need for craniotomy, which allows targeting of any cortical area of interest in a reproducible and non-invasive way. The mouse is then sutured and allowed to wake up. The evaluation of ischemic damage can be quickly accomplished by triphenyl-tetrazolium chloride or cresyl violet staining. This technique produces infarctions of small size and well-delimited boundaries, which is highly advantageous for precise cell characterization or functional studies. Furthermore, it is particularly suitable for studying cellular and molecular responses underlying brain plasticity in transgenic mice.
Medicine, Issue 76, Biomedical Engineering, Immunology, Anatomy, Physiology, Neuroscience, Neurobiology, Surgery, Cerebral Cortex, Brain Ischemia, Stroke, Brain Injuries, Thrombosis, Photothrombosis, Rose Bengal, experimental stroke, animal models, cortex, injury, protocol, method, technique, video, ischemia
Play Button
Neuronal Nuclei Isolation from Human Postmortem Brain Tissue
Authors: Anouch Matevossian, Schahram Akbarian.
Institutions: University of Massachusetts Medical School.
Neurons in the human brain become postmitotic largely during prenatal development, and thus maintain their nuclei throughout the full lifespan. However, little is known about changes in neuronal chromatin and nuclear organization during the course of development and aging, or in chronic neuropsychiatric disease. Moreover, to date most chromatin and DNA based assays (other than FISH) lack single cell resolution. To this end, the considerable cellular heterogeneity of brain tissue poses a significant limitation, because typically various subpopulations of neurons are intermingled with different types of glia and other non-neuronal cells. One possible solution would be to grow cell-type specific cultures, but most CNS cells, including neurons, can be sustained ex vivo for, at best, only a few weeks and thus would provide an incomplete model for epigenetic mechanisms potentially operating across the full lifespan. Here, we provide a protocol to extract and purify nuclei from frozen (never fixed) human postmortem brain. The method involves extraction of nuclei in hypotonic lysis buffer, followed by ultracentrifugation and immunotagging with anti-NeuN antibody. Labeled neuronal nuclei are then collected separately using fluorescence-activated sorting. This method should be applicable to any brain region in a wide range of species and suitable for chromatin immunoprecipitation studies with site- and modification-specific anti-histone antibodies, and for DNA methylation and other assays.
Neuroscience, Issue 20, FACS, postmortem brain, epigenetic, human brain, neuronal nuclei, immunotagging
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques [1, 5-9]. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
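The contrast drawn above (one statistic per voxel vs. a covariance pattern spanning regions) can be illustrated with a minimal, purely didactic sketch. The toy data, region count, and iteration settings below are all assumptions for illustration, not part of the article's method: two of six simulated "regions" share a common network signal, and the leading eigenvector of the region covariance matrix recovers that pattern.

```python
import math
import random

random.seed(0)

# Hypothetical toy data: 40 "subjects" x 6 "brain regions".
# Regions 0 and 1 share a common network signal; the rest are noise.
n_subj, n_reg = 40, 6
data = []
for _ in range(n_subj):
    signal = random.gauss(0, 1)
    row = [random.gauss(0, 0.5) for _ in range(n_reg)]
    row[0] += signal
    row[1] += signal
    data.append(row)

# Region-by-region sample covariance matrix (the multivariate view);
# a univariate analysis would instead test each region in isolation.
means = [sum(col) / n_subj for col in zip(*data)]
cov = [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in data) / (n_subj - 1)
        for j in range(n_reg)] for i in range(n_reg)]

# Leading eigenvector via power iteration: a single covariance
# "pattern" across all regions, interpretable as a network signature.
v = [1.0] * n_reg
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(n_reg)) for i in range(n_reg)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# The pattern loads most heavily on the two covarying regions.
top2 = sorted(range(n_reg), key=lambda i: abs(v[i]))[-2:]
print(sorted(top2))  # -> [0, 1]
```

In practice one would use a linear-algebra library rather than hand-rolled power iteration; the sketch only shows why a covariance eigen-pattern, unlike per-voxel statistics, directly exposes interregional structure.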
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms still attempt to display the most relevant videos available, which can sometimes result in matches with only a slight relation.
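The matching described above can be sketched, purely illustratively, as bag-of-words cosine similarity between an abstract and each video description. JoVE's actual algorithm is not specified here, and every description and title below is a made-up placeholder; the sketch only shows why sparse vocabulary overlap produces weak matches.

```python
import math
from collections import Counter

def tokens(text: str) -> Counter:
    """Lowercased bag-of-words vector for a piece of text."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical video descriptions (placeholders, not real entries).
videos = {
    "nuclei-isolation": "neuronal nuclei isolation postmortem brain facs sorting",
    "multivariate-fmri": "multivariate analysis neuroimaging fmri covariance",
    "photothrombosis": "photothrombosis rose bengal cortex stroke model",
}
abstract = "global dna methylation of ischemic stroke subtypes in blood"

# Rank videos by similarity to the abstract; a single shared term
# ("stroke") is enough to put one video on top, but the score stays
# low -- exactly the "slight relation" case described above.
ranked = sorted(videos,
                key=lambda v: cosine(tokens(videos[v]), tokens(abstract)),
                reverse=True)
print(ranked[0])  # -> photothrombosis
```

A production system would add TF-IDF weighting, stemming, and a relevance cutoff, but the failure mode is the same: when no description shares vocabulary with the abstract, every score is near zero and the top match is only weakly related.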