Pubmed Article
Normalization of high dimensional genomics data where the distribution of the altered variables is skewed.
PLoS ONE
PUBLISHED: 05-25-2011
Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). 
Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher sensitivity and lower bias than can be attained using standard and invariant normalization methods.
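Step (2) of the work-flow, deciding whether the distribution of altered variables is skewed, can be illustrated with a generic moment-based skewness check. This is a simplified stand-in, not the published DSE-test; the function names and the 0.5 cutoff are illustrative assumptions.

```python
import math

def sample_skewness(values):
    """Adjusted Fisher-Pearson sample skewness (bias-corrected g1)."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n
    m3 = sum((v - mean) ** 3 for v in values) / n
    g1 = m3 / m2 ** 1.5
    return g1 * math.sqrt(n * (n - 1)) / (n - 2)

def needs_renormalization(log_ratios, threshold=0.5):
    """Flag an experiment whose log-ratio distribution looks skewed."""
    return abs(sample_skewness(log_ratios)) > threshold
```

In the proposed work-flow, a flagged experiment would then be re-normalized before downstream analysis.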
Authors: Natalie J. Saez, Hervé Nozach, Marilyne Blemont, Renaud Vincentelli.
Published: 07-30-2014
ABSTRACT
Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, purifying proteins is sometimes challenging since many proteins are expressed in an insoluble form. When working with difficult or multiple targets it is therefore recommended to use high throughput (HTP) protein expression screening on a small scale (1-4 ml cultures) to quickly identify conditions for soluble expression. To cope with the various structural genomics programs of the lab, a quantitative (within a range of 0.1-100 mg/L culture of recombinant protein) and HTP protein expression screening protocol was implemented and validated on thousands of proteins. The protocols were automated with the use of a liquid handling robot but can also be performed manually without specialized equipment. Disulfide-rich venom proteins are gaining increasing recognition for their potential as therapeutic drug leads. They can be highly potent and selective, but their complex disulfide bond networks make them challenging to produce. As a member of the FP7 European Venomics project (www.venomics.eu), our challenge is to develop successful production strategies with the aim of producing thousands of novel venom proteins for functional characterization. Aided by the redox properties of disulfide bond isomerase DsbC, we adapted our HTP production pipeline for the expression of oxidized, functional venom peptides in the E. coli cytoplasm. The protocols are also applicable to the production of diverse disulfide-rich proteins. Here we demonstrate our pipeline applied to the production of animal venom proteins. With the protocols described herein it is likely that soluble disulfide-rich proteins will be obtained in as little as a week. 
Even from a small scale, there is the potential to use the purified proteins for validating the oxidation state by mass spectrometry, for characterization in pilot studies, or for sensitive micro-assays.
Efficient Chromatin Immunoprecipitation using Limiting Amounts of Biomass
Authors: Dean Tantin, Warren P. Voth, Arvind Shakya.
Institutions: University of Utah School of Medicine.
Chromatin immunoprecipitation (ChIP) is a widely used method for determining the interactions of different proteins with DNA in the chromatin of living cells. Examples include sequence-specific DNA-binding transcription factors, histones and their different modification states, enzymes such as RNA polymerases and ancillary factors, and DNA repair components. Despite its ubiquity, there is a lack of up-to-date, detailed methodologies both for bench preparation of material and for accurate analysis allowing quantitative metrics of interaction. Because of this lack of information, and also because, like any immunoprecipitation, conditions must be re-optimized for each new set of experimental conditions, the ChIP assay is susceptible to inaccurate or poorly quantitative results. Our protocol is ultimately derived from seminal work on transcription factor:DNA interactions1,2, but incorporates a number of improvements in sensitivity and reproducibility for difficult-to-obtain cell types. The protocol has been used successfully3,4, either with qPCR to quantify DNA enrichment or with a semi-quantitative variant of the protocol below. This quantitative analysis of the PCR-amplified material is performed computationally and represents a limiting factor in the assay. Important controls and other considerations include the use of an isotype-matched antibody, as well as evaluation of a control region of genomic DNA, such as an intergenic region predicted not to be bound by the protein under study (or anticipated not to show changes under the experimental conditions). In addition, a standard curve of input material for every ChIP sample is used to derive absolute levels of enrichment in the experimental material. Use of standard curves helps to account for differences between primer sets, regardless of how carefully they are designed, as well as for efficiency differences across the range of template concentrations for a single primer set.
Our protocol differs from others that are available5-8 in that we extensively cover the later, analysis phase.
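The standard-curve quantification described above can be sketched in a few lines: fit Ct against log10(input) across the input dilution series, then convert an IP sample's Ct to a percent-input value. The dilution series, Ct values, and function names here are hypothetical illustrations, not values from the protocol.

```python
import math

def fit_standard_curve(dilutions, cts):
    """Least-squares fit of Ct against log10(input fraction)."""
    xs = [math.log10(d) for d in dilutions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, intercept)

def percent_input(ct_ip, slope, intercept):
    """Convert an IP Ct to enrichment as a percentage of input."""
    return 10 ** ((ct_ip - intercept) / slope) * 100
```

Fitting a curve per primer set, as the abstract recommends, absorbs primer-to-primer efficiency differences into the slope.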
Molecular Biology, Issue 75, Genetics, Cellular Biology, Biomedical Engineering, Microbiology, Immunology, Biochemistry, Proteins, life sciences, animal models, chromatin immunoprecipitation, ChIP, chromatin, immunoprecipitation, gene regulation, T lymphocyte, transcription factor, chromatin modification, DNA, quantitative PCR, PCR, cells, isolation, animal model
Assaying β-amyloid Toxicity using a Transgenic C. elegans Model
Authors: Vishantie Dostal, Christopher D. Link.
Institutions: University of Colorado.
Accumulation of the β-amyloid peptide (Aβ) is generally believed to be central to the induction of Alzheimer's disease, but the relevant mechanism(s) of toxicity are still unclear. Aβ is also deposited intramuscularly in Inclusion Body Myositis, a severe human myopathy. The intensely studied nematode worm Caenorhabditis elegans can be transgenically engineered to express human Aβ. Depending on the tissue or timing of Aβ expression, transgenic worms can have readily measurable phenotypes that serve as a read-out of Aβ toxicity. For example, transgenic worms with pan-neuronal Aβ expression have defects in associative learning (Dosanjh et al. 2009), while transgenic worms with constitutive muscle-specific expression show a progressive, age-dependent paralysis phenotype (Link, 1995; Cohen et al. 2006). One particularly useful C. elegans model employs a temperature-sensitive mutation in the mRNA surveillance system to engineer temperature-inducible muscle expression of an Aβ transgene, resulting in a reproducible paralysis phenotype upon temperature upshift (Link et al. 2003). Treatments that counter Aβ toxicity in this model [e.g., expression of a protective transgene (Hassan et al. 2009) or exposure to Ginkgo biloba extracts (Wu et al. 2006)] reproducibly alter the rate of paralysis induced by temperature upshift of these transgenic worms. Here we describe our protocol for measuring the rate of paralysis in this transgenic C. elegans model, with particular attention to experimental variables that can influence this measurement.
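The paralysis read-out described above can be summarized per experiment as the fraction of worms paralyzed at each scoring time, with a half-paralysis time interpolated from the curve. This is a minimal sketch, assuming worms are scored at fixed times after upshift and that never-paralyzed worms are recorded as None; the helper names are illustrative, not from the protocol.

```python
def paralysis_curve(paralysis_times, observation_times):
    """Fraction of worms paralyzed at each observation time (hours).
    Worms never observed paralyzed are recorded as None."""
    n = len(paralysis_times)
    return [sum(1 for t in paralysis_times if t is not None and t <= obs) / n
            for obs in observation_times]

def pt50(times, fractions):
    """Linearly interpolate the time at which half the worms are paralyzed."""
    for (t0, f0), (t1, f1) in zip(zip(times, fractions),
                                  zip(times[1:], fractions[1:])):
        if f0 < 0.5 <= f1:
            return t0 + (0.5 - f0) * (t1 - t0) / (f1 - f0)
    return None  # the cohort never reached 50% paralysis
```

Comparing pt50 values between treated and untreated cohorts gives a single-number summary of a protective or toxic effect.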
Neuroscience, Issue 44, Alzheimer's disease, paralysis, compound screening, Inclusion Body Myositis, invertebrate model
Large Scale Non-targeted Metabolomic Profiling of Serum by Ultra Performance Liquid Chromatography-Mass Spectrometry (UPLC-MS)
Authors: Corey D. Broeckling, Adam L. Heuberger, Jessica E. Prenni.
Institutions: Colorado State University.
Non-targeted metabolite profiling by ultra performance liquid chromatography coupled with mass spectrometry (UPLC-MS) is a powerful technique to investigate metabolism. The approach offers an unbiased and in-depth analysis that can enable the development of diagnostic tests, novel therapies, and further our understanding of disease processes. The inherent chemical diversity of the metabolome creates significant analytical challenges and there is no single experimental approach that can detect all metabolites. Additionally, the biological variation in individual metabolism and the dependence of metabolism on environmental factors necessitates large sample numbers to achieve the appropriate statistical power required for meaningful biological interpretation. To address these challenges, this tutorial outlines an analytical workflow for large scale non-targeted metabolite profiling of serum by UPLC-MS. The procedure includes guidelines for sample organization and preparation, data acquisition, quality control, and metabolite identification and will enable reliable acquisition of data for large experiments and provide a starting point for laboratories new to non-targeted metabolite profiling by UPLC-MS.
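A common quality-control step in workflows like the one outlined above is filtering features by their coefficient of variation (CV) across pooled-QC injections, keeping only features measured reproducibly. This is a minimal sketch; the 30% cutoff and the data layout (feature → per-injection intensities) are assumptions for illustration, not values from this tutorial.

```python
import statistics

def qc_cv_filter(feature_intensities, max_cv=0.30):
    """Keep features whose CV across pooled-QC injections is at most max_cv."""
    kept = {}
    for feature, values in feature_intensities.items():
        mean = statistics.mean(values)
        cv = statistics.stdev(values) / mean if mean else float("inf")
        if cv <= max_cv:
            kept[feature] = values
    return kept
```

Running the filter before statistics removes features dominated by analytical rather than biological variation.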
Chemistry, Issue 73, Biochemistry, Genetics, Molecular Biology, Physiology, Genomics, Proteins, Proteomics, Metabolomics, Metabolite Profiling, Non-targeted metabolite profiling, mass spectrometry, Ultra Performance Liquid Chromatography, UPLC-MS, serum, spectrometry
Light/dark Transition Test for Mice
Authors: Keizo Takao, Tsuyoshi Miyakawa.
Institutions: Graduate School of Medicine, Kyoto University.
Although the mouse genome has been fully sequenced, the functions of most genes remain unknown. Gene-targeting techniques, however, can be used to delete or manipulate a specific gene in mice. The influence of a given gene on a specific behavior can then be determined by conducting behavioral analyses of the mutant mice. Among tests for behavioral phenotyping of mutant mice, the light/dark transition test is one of the most widely used to measure anxiety-like behavior in mice. The test is based on the natural aversion of mice to brightly illuminated areas and on their spontaneous exploratory behavior in novel environments. The test is sensitive to anxiolytic drug treatment. The apparatus consists of a dark chamber and a brightly illuminated chamber. Mice are allowed to move freely between the two chambers. The number of entries into the bright chamber and the duration of time spent there are indices of bright-space anxiety in mice. To obtain phenotyping results of a strain of mutant mice that can be readily reproduced and compared with those of other mutants, the behavioral test methods should be as identical as possible between laboratories. The procedural differences that exist between laboratories, however, make it difficult to replicate or compare results among laboratories. Here, we present our protocol for the light/dark transition test as a movie so that the details of the protocol can be demonstrated. In our laboratory, we have assessed more than 60 strains of mutant mice using the protocol shown in the movie. Those data will be disclosed as part of a public database that we are now constructing. Visualization of the protocol will facilitate understanding of the details of the entire experimental procedure, allowing for standardization of the protocols used across laboratories and comparisons of the behavioral phenotypes of various strains of mutant mice assessed using this test.
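The two indices described, entries into the light chamber and time spent there, can be computed directly from a per-sample chamber trace. A minimal sketch, assuming the tracking software emits one 'L' (light) or 'D' (dark) sample per fixed time step; the encoding and function name are illustrative assumptions.

```python
def light_dark_indices(chamber_trace, dt=1.0):
    """Count light-chamber entries and time in light from a chamber trace.
    chamber_trace: sequence of 'L'/'D' samples taken every dt seconds."""
    entries = sum(1 for prev, cur in zip(chamber_trace, chamber_trace[1:])
                  if prev == "D" and cur == "L")
    time_in_light = sum(1 for c in chamber_trace if c == "L") * dt
    return entries, time_in_light
```

Lower values of both indices, relative to control mice, are interpreted as higher bright-space anxiety.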
Neuroscience, Issue 1, knockout mice, transgenic mice, behavioral test, phenotyping
Performing Custom MicroRNA Microarray Experiments
Authors: Xiaoxiao Zhang, Yan Zeng.
Institutions: University of Minnesota.
microRNAs (miRNAs) are a large family of ~22-nucleotide (nt) RNA molecules that are widely expressed in eukaryotes1. Complex genomes encode at least hundreds of miRNAs, which primarily inhibit the expression of a vast number of target genes post-transcriptionally2,3. miRNAs control a broad range of biological processes1. In addition, altered miRNA expression has been associated with human diseases such as cancers, and miRNAs may serve as biomarkers for diseases and prognosis4,5. It is important, therefore, to understand the expression and functions of miRNAs under many different conditions. Three major approaches have been employed to profile miRNA expression: real-time PCR, microarray, and deep sequencing. The technique of miRNA microarray has the advantage of being high-throughput and generally less expensive, and most of the experimental and analysis steps can be carried out in a molecular biology laboratory at most universities, medical schools and associated hospitals. Here, we describe a method for performing custom miRNA microarray experiments. A miRNA probe set will be printed on glass slides to produce miRNA microarrays. RNA is isolated using a method or reagent that preserves small RNA species, and then labeled with a fluorescence dye. As a control, reference DNA oligonucleotides corresponding to a subset of miRNAs are also labeled with a different fluorescence dye. The reference DNA will serve to demonstrate the quality of the slide and hybridization and will also be used for data normalization. The RNA and DNA are mixed and hybridized to a microarray slide containing probes for most of the miRNAs in the database. After washing, the slide is scanned to obtain images, and intensities of the individual spots quantified. These raw signals will be further processed and analyzed as the expression data of the corresponding miRNAs.
Microarray slides can be stripped and regenerated to reduce the cost of microarrays and to enhance the consistency of microarray experiments. The same principles and procedures are applicable to other types of custom microarray experiments.
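The reference-channel normalization mentioned above can be sketched as a simple global scaling: every spot's sample-channel signal is divided by a factor derived from the reference-DNA channel. This is a minimal median-based illustration, not necessarily the normalization scheme the authors apply.

```python
def normalize_to_reference(sample_signal, reference_signal):
    """Scale sample-channel intensities by the median of the
    reference (DNA oligo) channel, so slides become comparable."""
    ref = sorted(reference_signal)
    n = len(ref)
    median = ref[n // 2] if n % 2 else (ref[n // 2 - 1] + ref[n // 2]) / 2
    return [s / median for s in sample_signal]
```

Because the same reference oligos are spiked into every hybridization, the derived factor cancels slide-to-slide labeling and scanning differences.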
Molecular Biology, Issue 56, Genetics, microRNA, custom microarray, oligonucleotide probes, RNA labeling
Rapid and Efficient Zebrafish Genotyping Using PCR with High-resolution Melt Analysis
Authors: Lingyan Xing, Tyler S. Quist, Tamara J. Stevenson, Timothy J. Dahlem, Joshua L. Bonkowsky.
Institutions: University of Utah School of Medicine.
Zebrafish is a powerful vertebrate model system for studying development, modeling disease, and performing drug screening. Recently a variety of genetic tools have been introduced, including multiple strategies for inducing mutations and generating transgenic lines. However, large-scale screening is limited by traditional genotyping methods, which are time-consuming and labor-intensive. Here we describe a technique to analyze zebrafish genotypes by PCR combined with high-resolution melting analysis (HRMA). This approach is rapid, sensitive, and inexpensive, with lower risk of contamination artifacts. Genotyping by PCR with HRMA can be used for embryos or adult fish, including in high-throughput screening protocols.
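A melt-curve genotype call can be sketched as a distance between a sample's normalized melt curve and a wild-type reference curve. This is a deliberately simplified illustration of the HRMA idea; real analysis is done in instrument software, and the 0.05 threshold here is an arbitrary assumption.

```python
def melt_curve_distance(curve, reference):
    """Mean absolute difference between two normalized melt curves
    sampled at the same temperatures."""
    return sum(abs(a - b) for a, b in zip(curve, reference)) / len(curve)

def call_genotype(curve, wt_reference, threshold=0.05):
    """Call 'variant' when a curve deviates from the wild-type reference
    beyond a plate-specific threshold (illustrative value)."""
    if melt_curve_distance(curve, wt_reference) > threshold:
        return "variant"
    return "wild-type"
```

Heteroduplexes in heterozygotes shift the melt profile, which is why a simple curve distance can separate genotype classes.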
Basic Protocol, Issue 84, genotyping, high-resolution melting analysis (HRMA), PCR, zebrafish, mutation, transgenes
Using an Automated 3D-tracking System to Record Individual and Shoals of Adult Zebrafish
Authors: Hans Maaswinkel, Liqun Zhu, Wei Weng.
Institutions: xyZfish.
Like many aquatic animals, the zebrafish (Danio rerio) moves in a 3D space, so it is preferable to use a 3D recording system to study its behavior. The automated video tracking system presented here accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced by the transition of light from water to air. With this system it is possible to record both individual adult zebrafish and shoals. Before use, the system must be calibrated. It consists of three modules: Recording, Path Reconstruction, and Data Processing. Step-by-step protocols for calibration and for using the three modules are presented. Depending on the experimental setup, the system can be used for testing neophobia, white aversion, social cohesion, motor impairments, novel object exploration, etc. It is especially promising as a first-step tool to study the effects of drugs or mutations on basic behavioral patterns. The system provides information about the vertical and horizontal distribution of the zebrafish and about the xyz-components of kinematic parameters (such as locomotion, velocity, acceleration, and turning angle), and it provides the data necessary to calculate parameters of social cohesion when testing shoals.
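The xyz kinematic parameters mentioned (velocity, turning angle) can be computed from a reconstructed 3D path as follows. A minimal sketch assuming evenly sampled (x, y, z) points; function and variable names are illustrative, not from the system's software.

```python
import math

def kinematics(points, dt):
    """Per-step speed and turning angle (degrees) from an xyz track
    sampled every dt seconds."""
    vecs = [tuple(b - a for a, b in zip(p, q))
            for p, q in zip(points, points[1:])]
    speeds = [math.sqrt(sum(c * c for c in v)) / dt for v in vecs]
    turns = []
    for u, v in zip(vecs, vecs[1:]):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(a * a for a in v))
        # Clamp for floating-point safety before acos.
        turns.append(math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv))))))
    return speeds, turns
```

With the water-to-air refraction corrected by the calibration, these quantities can be computed on the true 3D coordinates rather than the raw camera coordinates.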
Behavior, Issue 82, neuroscience, Zebrafish, Danio rerio, anxiety, Shoaling, Pharmacology, 3D-tracking, MK801
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Authors: Noah S. Philip, S. Louisa Carpenter, Lawrence H. Sweet.
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD). Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices and medial temporal regions. Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
How to Create and Use Binocular Rivalry
Authors: David Carmel, Michael Arcaro, Sabine Kastner, Uri Hasson.
Institutions: New York University, Princeton University.
Each of our eyes normally sees a slightly different image of the world around us. The brain can combine these two images into a single coherent representation. However, when the eyes are presented with images that are sufficiently different from each other, an interesting thing happens: Rather than fusing the two images into a combined conscious percept, what transpires is a pattern of perceptual alternations where one image dominates awareness while the other is suppressed; dominance alternates between the two images, typically every few seconds. This perceptual phenomenon is known as binocular rivalry. Binocular rivalry is considered useful for studying perceptual selection and awareness in both human and animal models, because unchanging visual input to each eye leads to alternations in visual awareness and perception. To create a binocular rivalry stimulus, all that is necessary is to present each eye with a different image at the same perceived location. There are several ways of doing this, but newcomers to the field are often unsure which method would best suit their specific needs. The purpose of this article is to describe a number of inexpensive and straightforward ways to create and use binocular rivalry. We detail methods that do not require expensive specialized equipment and describe each method's advantages and disadvantages. The methods described include the use of red-blue goggles, mirror stereoscopes and prism goggles.
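The red-blue goggles method can be illustrated by composing the two eyes' images into one RGB frame: the left eye's image goes to the red channel and the right eye's to the blue, so the filters deliver a different image to each eye. A minimal sketch with images as nested lists of 0-255 grayscale values; a real stimulus program would use a graphics library with proper luminance calibration.

```python
def make_anaglyph(left_gray, right_gray):
    """Combine two same-sized grayscale images into one RGB image:
    left image -> red channel, right image -> blue channel."""
    return [[(l, 0, r) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left_gray, right_gray)]
```

Presenting two sufficiently different images this way at the same perceived location is what induces the rivalry alternations described above.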
Neuroscience, Issue 45, Binocular rivalry, continuous flash suppression, vision, visual awareness, perceptual competition, unconscious processing, neuroimaging
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Imaging of Biological Tissues by Desorption Electrospray Ionization Mass Spectrometry
Authors: Rachel V. Bennett, Chaminda M. Gamage, Facundo M. Fernández.
Institutions: Georgia Institute of Technology.
Mass spectrometry imaging (MSI) provides untargeted molecular information with the highest specificity and spatial resolution for investigating biological tissues at the hundreds to tens of microns scale. When performed under ambient conditions, sample pre-treatment becomes unnecessary, thus simplifying the protocol while maintaining the high quality of information obtained. Desorption electrospray ionization (DESI) is a spray-based ambient MSI technique that allows for the direct sampling of surfaces in the open air, even in vivo. When used with a software-controlled sample stage, the sample is rastered underneath the DESI ionization probe, and through the time domain, m/z information is correlated with the chemical species' spatial distribution. The fidelity of the DESI-MSI output depends on the source orientation and positioning with respect to the sample surface and mass spectrometer inlet. Herein, we review how to prepare tissue sections for DESI imaging and additional experimental conditions that directly affect image quality. Specifically, we describe the protocol for the imaging of rat brain tissue sections by DESI-MSI.
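The correlation of the time domain with spatial position can be sketched as folding the per-scan intensities of one m/z channel into rows of the raster. This assumes a serpentine (back-and-forth) stage raster, which may well differ from the actual DESI stage program; unidirectional rasters simply skip the row reversal.

```python
def build_ion_image(intensities, pixels_per_row):
    """Fold a 1D list of per-scan ion intensities, acquired while
    rastering the stage line by line, into a 2D ion image.
    Every other row is reversed, assuming a serpentine raster."""
    rows = [intensities[i:i + pixels_per_row]
            for i in range(0, len(intensities), pixels_per_row)]
    return [row[::-1] if r % 2 else row for r, row in enumerate(rows)]
```

One such image is built per m/z value of interest, giving the spatial distribution of that chemical species across the tissue section.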
Bioengineering, Issue 77, Molecular Biology, Biomedical Engineering, Chemistry, Biochemistry, Biophysics, Physics, Cellular Biology, Molecular Imaging, Mass Spectrometry, MS, MSI, Desorption electrospray ionization, DESI, Ambient mass spectrometry, tissue, sectioning, biomarker, imaging
Photobleaching Assays (FRAP & FLIP) to Measure Chromatin Protein Dynamics in Living Embryonic Stem Cells
Authors: Malka Nissim-Rafinia, Eran Meshorer.
Institutions: The Hebrew University of Jerusalem.
Fluorescence Recovery After Photobleaching (FRAP) and Fluorescence Loss In Photobleaching (FLIP) enable the study of protein dynamics in living cells with good spatial and temporal resolution. Here we describe how to perform FRAP and FLIP assays of chromatin proteins, including H1 and HP1, in mouse embryonic stem (ES) cells. In a FRAP experiment, cells are transfected, either transiently or stably, with a protein of interest fused with the green fluorescent protein (GFP) or derivatives thereof (YFP, CFP, Cherry, etc.). In the transfected, fluorescing cells, an intense focused laser beam bleaches a relatively small region of interest (ROI). The laser wavelength is selected according to the fluorescent protein used for fusion. The laser light irreversibly bleaches the fluorescent signal of molecules in the ROI and, immediately following bleaching, the recovery of the fluorescent signal in the bleached area - mediated by the replacement of the bleached molecules with the unbleached molecules - is monitored using time lapse imaging. The generated fluorescence recovery curves provide information on the protein's mobility. If the fluorescent molecules are immobile, no fluorescence recovery will be observed. In a complementary approach, Fluorescence Loss in Photobleaching (FLIP), the laser beam bleaches the same spot repeatedly and the signal intensity is measured elsewhere in the fluorescing cell. FLIP experiments therefore measure signal decay rather than fluorescence recovery and are useful to determine protein mobility as well as protein shuttling between cellular compartments. Transient binding is a common property of chromatin-associated proteins. Although the major fraction of each chromatin protein is bound to chromatin at any given moment at steady state, the binding is transient and most chromatin proteins have a high turnover on chromatin, with a residence time in the order of seconds. 
These properties are crucial for generating high plasticity in genome expression1. Photobleaching experiments are therefore particularly useful to determine chromatin plasticity using GFP-fusion versions of chromatin structural proteins, especially in ES cells, where the dynamic exchange of chromatin proteins (including heterochromatin protein 1 (HP1), linker histone H1 and core histones) is higher than in differentiated cells2,3.
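Two model-free summaries of a FRAP trace, the mobile fraction and the recovery half-time, can be estimated directly from the bleach-corrected intensities. A minimal sketch; published analyses usually fit an explicit recovery model instead, and taking the last time point as the plateau is an illustrative simplification.

```python
def frap_metrics(times, intensities, prebleach=1.0):
    """Mobile fraction and recovery half-time from a bleach-corrected
    FRAP trace; intensities[0] is the first post-bleach point and the
    final point is treated as the recovery plateau."""
    i0, iplateau = intensities[0], intensities[-1]
    mobile_fraction = (iplateau - i0) / (prebleach - i0)
    half = i0 + (iplateau - i0) / 2
    for (t0, f0), (t1, f1) in zip(zip(times, intensities),
                                  zip(times[1:], intensities[1:])):
        if f0 < half <= f1:
            t_half = t0 + (half - f0) * (t1 - t0) / (f1 - f0)
            return mobile_fraction, t_half
    return mobile_fraction, None
```

For highly dynamic chromatin proteins such as HP1 in ES cells, one expects a large mobile fraction and a short half-time relative to differentiated cells.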
Developmental Biology, Issue 52, Live imaging, FRAP, FLIP, embryonic stem (ES) cells, chromatin, chromatin plasticity, protein dynamics
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
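The fractional anisotropy used in these voxelwise comparisons is a standard function of the three diffusion-tensor eigenvalues, computable per voxel as:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three eigenvalues of the diffusion tensor."""
    mean = (l1 + l2 + l3) / 3
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den) if den else 0.0
```

FA is 0 for isotropic diffusion and approaches 1 for purely axial diffusion along coherent fiber bundles, which is why it serves as a marker of WM integrity.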
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
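The ~10-30 nm localization precision quoted above is commonly estimated with the Thompson-Larson-Webb formula, which combines photon shot noise, pixelation, and background noise. The sketch below illustrates this calculation; the PSF width, pixel size, and background values are illustrative assumptions, not the authors' parameters.

```python
import math

def localization_precision(n_photons, psf_sigma=150.0, pixel_size=100.0, background=10.0):
    """Estimate 2D localization precision (nm) via the Thompson-Larson-Webb formula.

    psf_sigma: standard deviation of the point-spread function (nm)
    pixel_size: camera pixel size in sample space (nm)
    background: background noise (photons per pixel, standard deviation)
    All defaults are hypothetical but typical values for illustration.
    """
    s2 = psf_sigma ** 2
    a2 = pixel_size ** 2
    variance = (s2 / n_photons                                   # photon shot noise
                + a2 / (12.0 * n_photons)                        # pixelation
                + 8.0 * math.pi * s2 ** 2 * background ** 2 / (a2 * n_photons ** 2))  # background
    return math.sqrt(variance)

localization_precision(500)  # ≈ 23.6 nm, within the ~10-30 nm range cited above
```

Because the shot-noise term scales as 1/N, collecting more photons per molecule tightens the localization, which is why bright, photostable probes matter for FPALM.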
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
50680

Microarray-based Identification of Individual HERV Loci Expression: Application to Biomarker Discovery in Prostate Cancer
Authors: Philippe Pérot, Valérie Cheynet, Myriam Decaussin-Petrucci, Guy Oriol, Nathalie Mugnier, Claire Rodriguez-Lafrasse, Alain Ruffion, François Mallet.
Institutions: Joint Unit Hospices de Lyon-bioMérieux, BioMérieux, Hospices Civils de Lyon, Lyon 1 University, BioMérieux, Hospices Civils de Lyon, Hospices Civils de Lyon.
The prostate-specific antigen (PSA) is the main diagnostic biomarker for prostate cancer in clinical use, but it lacks specificity and sensitivity, particularly at low dosage values1. 'How to use PSA' remains a current issue, either for diagnosis, as the gray zone corresponding to a serum concentration of 2.5-10 ng/ml does not allow a clear differentiation to be made between cancer and noncancer2, or for patient follow-up, as analysis of post-operative PSA kinetic parameters can pose considerable challenges for practical application3,4. Alternatively, noncoding RNAs (ncRNAs) are emerging as key molecules in human cancer, with the potential to serve as novel markers of disease, e.g. PCA3 in prostate cancer5,6, and to reveal uncharacterized aspects of tumor biology. Moreover, data from the ENCODE project published in 2012 showed that different RNA types cover about 62% of the genome. It also appears that the amount of transcriptional regulatory motifs is at least 4.5x higher than that corresponding to protein-coding exons. Thus, long terminal repeats (LTRs) of human endogenous retroviruses (HERVs) constitute a wide range of candidate transcriptional regulatory sequences, as this is their primary function in infectious retroviruses. HERVs, which are spread throughout the human genome, originate from ancestral and independent infections within the germ line, followed by copy-paste propagation processes, leading to multicopy families occupying 8% of the human genome (note that exons span 2% of our genome). Some HERV loci still express proteins that have been associated with several pathologies including cancer7-10. We have designed a high-density microarray, in Affymetrix format, aiming to optimally characterize individual HERV loci expression, in order to better understand whether they can be active, whether they drive ncRNA transcription, and whether they modulate coding gene expression. This tool has been applied in the prostate cancer field (Figure 1).
Medicine, Issue 81, Cancer Biology, Genetics, Molecular Biology, Prostate, Retroviridae, Biomarkers, Pharmacological, Tumor Markers, Biological, Prostatectomy, Microarray Analysis, Gene Expression, Diagnosis, Human Endogenous Retroviruses, HERV, microarray, Transcriptome, prostate cancer, Affymetrix
50713
An Experimental and Bioinformatics Protocol for RNA-seq Analyses of Photoperiodic Diapause in the Asian Tiger Mosquito, Aedes albopictus
Authors: Monica F. Poelchau, Xin Huang, Allison Goff, Julie Reynolds, Peter Armbruster.
Institutions: Georgetown University, The Ohio State University.
Photoperiodic diapause is an important adaptation that allows individuals to escape harsh seasonal environments via a series of physiological changes, most notably developmental arrest and reduced metabolism. Global gene expression profiling via RNA-Seq can provide important insights into the transcriptional mechanisms of photoperiodic diapause. The Asian tiger mosquito, Aedes albopictus, is an outstanding organism for studying the transcriptional bases of diapause due to its ease of rearing, easily induced diapause, and the genomic resources available. This manuscript presents a general experimental workflow for identifying diapause-induced transcriptional differences in A. albopictus. Rearing techniques, conditions necessary to induce diapause and non-diapause development, methods to estimate percent diapause in a population, and RNA extraction and integrity assessment for mosquitoes are documented. A workflow to process RNA-Seq data from Illumina sequencers culminates in a list of differentially expressed genes. The representative results demonstrate that this protocol can be used to effectively identify genes differentially regulated at the transcriptional level in A. albopictus due to photoperiodic differences. With modest adjustments, this workflow can be readily adapted to study the transcriptional bases of diapause or other important life history traits in other mosquitoes.
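The workflow above culminates in a list of differentially expressed genes. As a simplified, hypothetical illustration (not the authors' pipeline, which uses a full RNA-Seq statistical framework), the core comparison can be sketched as counts-per-million normalization followed by a log2 fold-change filter on a toy count table; the transcript names and threshold are invented for illustration.

```python
import math

def cpm(counts):
    """Normalize raw read counts to counts per million for one library."""
    total = sum(counts.values())
    return {g: 1e6 * c / total for g, c in counts.items()}

def log2_fold_changes(diapause, nondiapause, pseudocount=0.5):
    """Per-gene log2(diapause / non-diapause) on CPM-normalized counts.

    The pseudocount guards against division by zero for unexpressed genes.
    """
    a, b = cpm(diapause), cpm(nondiapause)
    return {g: math.log2((a[g] + pseudocount) / (b[g] + pseudocount)) for g in a}

# Toy counts for hypothetical transcripts (illustrative, not real data):
dia = {"tx1": 900, "tx2": 50, "tx3": 50}
non = {"tx1": 100, "tx2": 450, "tx3": 450}
lfc = log2_fold_changes(dia, non)
up = [g for g, v in lfc.items() if v > 1]   # >2-fold up under diapause conditions
```

A real analysis would additionally model biological replicate variance and correct for multiple testing, which is what dedicated differential-expression tools provide.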
Genetics, Issue 93, Aedes albopictus, Asian tiger mosquito, photoperiodic diapause, RNA-Seq, de novo transcriptome assembly, mosquito husbandry
51961
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
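The authors use software-guided optimal design and step-wise augmentation rather than exhaustive testing; a full-factorial enumeration, shown below purely to illustrate the combinatorial growth that motivates DoE, quickly becomes impractical as factors multiply. The factor names and levels here are hypothetical, loosely inspired by the text.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels as a list of run dicts."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical factors (names and levels are illustrative assumptions):
design = full_factorial({
    "promoter": ["35S", "nos"],
    "incubation_temp_C": [22, 25],
    "plant_age_weeks": [5, 6, 7],
})
len(design)  # 2 * 2 * 3 = 12 runs; each extra factor multiplies the count
```

DoE software instead selects an information-rich subset of such combinations, which is what keeps characterization of many-factor systems experimentally tractable.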
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
51216
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions is regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has been so far hampered by the lack of efficient protocols to enrich such domains at the appropriate purity and amount for the subsequent in-depth analysis by Mass Spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTM cross-talks
51220
Profiling of Estrogen-regulated MicroRNAs in Breast Cancer Cells
Authors: Anne Katchy, Cecilia Williams.
Institutions: University of Houston.
Estrogen plays vital roles in mammary gland development and breast cancer progression. It mediates its function by binding to and activating the estrogen receptors (ERs), ERα, and ERβ. ERα is frequently upregulated in breast cancer and drives the proliferation of breast cancer cells. The ERs function as transcription factors and regulate gene expression. Whereas ERα's regulation of protein-coding genes is well established, its regulation of noncoding microRNA (miRNA) is less explored. miRNAs play a major role in the post-transcriptional regulation of genes, inhibiting their translation or degrading their mRNA. miRNAs can function as oncogenes or tumor suppressors and are also promising biomarkers. Among the miRNA assays available, microarray and quantitative real-time polymerase chain reaction (qPCR) have been extensively used to detect and quantify miRNA levels. To identify miRNAs regulated by estrogen signaling in breast cancer, their expression in ERα-positive breast cancer cell lines was compared before and after estrogen activation using both the µParaflo-microfluidic microarrays and Dual Labeled Probes-low density arrays. Results were validated using specific qPCR assays, applying both Cyanine dye-based and Dual Labeled Probes-based chemistry. Furthermore, a time-point assay was used to identify regulations over time. An advantage of the miRNA assay approach used in this study is that it enables fast screening of mature miRNA regulations in numerous samples, even with limited sample amounts. The layout, including the specific conditions for cell culture and estrogen treatment, biological and technical replicates, and large-scale screening followed by in-depth confirmations using separate techniques, ensures a robust detection of miRNA regulations and eliminates false positives and other artifacts. However, mutated or unknown miRNAs, or regulations at the primary and precursor transcript level, will not be detected.
The method presented here represents a thorough investigation of estrogen-mediated miRNA regulation.
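qPCR validation of miRNA regulation is typically quantified with the comparative 2^-ΔΔCt method, which normalizes the target's Ct to a reference gene in both the treated and control samples. The sketch below illustrates that arithmetic; the Ct values are hypothetical, and the formula assumes ~100% primer efficiency, which real assays must verify.

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method (assumes ~100% primer efficiency)."""
    dct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control                   # treated relative to control
    return 2.0 ** (-ddct)

# Hypothetical Ct values for an estrogen-treated vs. vehicle-treated sample:
fold = ddct_fold_change(24.0, 18.0, 26.0, 18.0)  # → 4.0, i.e. the miRNA is up 4-fold
```

Because each Ct unit corresponds to one doubling of template, a 2-cycle earlier crossing in the treated sample translates directly into a 4-fold increase.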
Medicine, Issue 84, breast cancer, microRNA, estrogen, estrogen receptor, microarray, qPCR
51285
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers: pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. 
Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the sole means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed via multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
2910
Dithranol as a Matrix for Matrix Assisted Laser Desorption/Ionization Imaging on a Fourier Transform Ion Cyclotron Resonance Mass Spectrometer
Authors: Cuong H. Le, Jun Han, Christoph H. Borchers.
Institutions: University of Victoria, University of Victoria.
Mass spectrometry imaging (MSI) determines the spatial localization and distribution patterns of compounds on the surface of a tissue section, mainly using MALDI (matrix assisted laser desorption/ionization)-based analytical techniques. New matrices for small-molecule MSI, which can improve the analysis of low-molecular weight (MW) compounds, are needed. These matrices should provide increased analyte signals while decreasing MALDI background signals. In addition, the use of ultrahigh-resolution instruments, such as Fourier transform ion cyclotron resonance (FTICR) mass spectrometers, has the ability to resolve analyte signals from matrix signals, and this can partially overcome many problems associated with the background originating from the MALDI matrix. The reduction in the intensities of the metastable matrix clusters by FTICR MS can also help to overcome some of the interferences associated with matrix peaks on other instruments. High-resolution instruments such as the FTICR mass spectrometers are advantageous as they can produce distribution patterns of many compounds simultaneously while still providing confidence in chemical identifications. Dithranol (DT; 1,8-dihydroxy-9,10-dihydroanthracen-9-one) has previously been reported as a MALDI matrix for tissue imaging. In this work, a protocol for the use of DT for MALDI imaging of endogenous lipids from the surfaces of mammalian tissue sections, by positive-ion MALDI-MS, on an ultrahigh-resolution hybrid quadrupole FTICR instrument has been provided.
Basic Protocol, Issue 81, eye, molecular imaging, chemistry technique, analytical, mass spectrometry, matrix assisted laser desorption/ionization (MALDI), tandem mass spectrometry, lipid, tissue imaging, bovine lens, dithranol, matrix, FTICR (Fourier Transform Ion Cyclotron Resonance)
50733
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Authors: Haipeng Xing, Willey Liao, Yifan Mo, Michael Q. Zhang.
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by using next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein1. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment2. Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics3-5 to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)6-8. We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse followed by subsequent application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model, which can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs9, which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment.
Our analysis revealed that our Bayesian Change Point (BCP) algorithm has reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters. This illustrated both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor10,11 and epigenetic data12 to illustrate its usefulness.
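The BCP algorithm itself is a Bayesian infinite-state HMM with explicit-formula inference, which is beyond a short sketch. To illustrate the underlying idea of change-point segmentation of a read-density track, the following is a deliberately simplified, hypothetical least-squares single-split search (not the authors' method): it finds the position where splitting the track into two constant-mean segments best explains the data.

```python
def best_split(y):
    """Return the index minimizing within-segment squared error for one change point.

    A toy stand-in for change-point detection: real ChIPseq segmentation
    methods (HMMs, Bayesian change-point models) handle many change points
    and count noise, but share this notion of segment-wise fit.
    """
    n = len(y)
    best_k, best_cost = None, float("inf")
    for k in range(1, n):
        left, right = y[:k], y[k:]
        mean_l, mean_r = sum(left) / k, sum(right) / (n - k)
        cost = (sum((v - mean_l) ** 2 for v in left)
                + sum((v - mean_r) ** 2 for v in right))
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic read-density track: background around 1, enriched region around 8
track = [1, 2, 1, 1, 2, 8, 9, 8, 7, 9]
best_split(track)  # → 5, the boundary where enrichment begins
```

Applying such a split recursively within each segment (binary segmentation) yields multiple change points, whose bounded segments then correspond to candidate regions of enrichment.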
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
4273
Using an Automated Cell Counter to Simplify Gene Expression Studies: siRNA Knockdown of IL-4 Dependent Gene Expression in Namalwa Cells
Authors: Adam M. McCoy, Claudia Litterst, Michelle L. Collins, Luis A. Ugozzoli.
Institutions: Bio-Rad Laboratories.
The use of siRNA-mediated gene knockdown continues to be an important tool in studies of gene expression. siRNA studies are being conducted not only to study the effects of downregulating single genes, but also to interrogate signaling pathways and other complex interaction networks. These pathway analyses require both the use of relevant cellular models and methods that cause less perturbation to the cellular physiology. Electroporation is increasingly being used as an effective way to introduce siRNA and other nucleic acids into difficult-to-transfect cell lines and primary cells without altering the signaling pathway under investigation. There are multiple critical steps to a successful siRNA experiment, and there are ways to simplify the work while improving the data quality at several experimental stages. To help you get started with your siRNA-mediated gene knockdown project, we will demonstrate how to perform a complete pathway study, from collecting and counting the cells prior to electroporation through post-transfection real-time PCR gene expression analysis. The following study investigates the role of the transcriptional activator STAT6 in IL-4 dependent gene expression of CCL17 in a Burkitt lymphoma cell line (Namalwa). The techniques demonstrated are useful for a wide range of siRNA-based experiments on both adherent and suspension cells. We will also show how to streamline cell counting with the TC10 automated cell counter, how to electroporate multiple samples simultaneously using the MXcell electroporation system, and how to simultaneously assess RNA quality and quantity with the Experion automated electrophoresis system.
Cellular Biology, Issue 38, Cell Counting, Gene Silencing, siRNA, Namalwa Cells, IL4, Gene Expression, Electroporation, Real Time PCR
1904
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Universite de Montreal, Universite de Montreal, Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. 
This is not what one would predict in diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, consider the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4-6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been linked to a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining if the genetic basis of a disease may result in part from de novo mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
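The targeted approach described above rests on comparing variants found in an affected subject against both parents: a candidate de novo mutation is one the child carries that neither parent does. A minimal, hypothetical sketch of that set logic (variant identifiers and the sets themselves are invented for illustration):

```python
def candidate_de_novo(child, mother, father):
    """Return variants present in the child but absent from both parents.

    Each argument is a set of variant identifiers, e.g. 'chr1:12345:A>G'.
    Real pipelines must additionally filter sequencing errors and typically
    validate candidates by Sanger sequencing before calling them de novo.
    """
    return child - mother - father

# Hypothetical trio genotypes:
child = {"chr1:12345:A>G", "chr7:555:C>T", "chrX:42:G>A"}
mother = {"chr7:555:C>T"}
father = {"chrX:42:G>A"}
candidate_de_novo(child, mother, father)  # → {'chr1:12345:A>G'}
```

Inherited variants subtract out, so only the child-specific allele survives as a candidate, which is then subject to validation.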
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
2534
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.