Leveraging non-targeted metabolite profiling via statistical genomics.
PUBLISHED: 01-15-2013
One of the challenges of systems biology is to integrate multiple sources of data in order to build a cohesive view of the system of study. Here we describe the mass spectrometry based profiling of maize kernels, a model system for genomic studies and a cornerstone of the agroeconomy. Using a network analysis, we can include 97.5% of the 8,710 features detected from 210 varieties into a single framework. More conservatively, 47.1% of compounds detected can be organized into a network with 48 distinct modules. Eigenvalues were calculated for each module and then used as inputs for genome-wide association studies. Nineteen modules returned significant results, illustrating the genetic control of biochemical networks within the maize kernel. Our approach leverages the correlations between the genome and metabolome to mutually enhance their annotation and thus enable biological interpretation. This method is applicable to any organism with sufficient bioinformatic resources.
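The module-summarization step described above (one eigenvalue per network module, used as a GWAS phenotype) can be sketched as follows. This is a minimal illustration with randomly generated data and made-up module labels, not the study's actual pipeline; `profiles`, `modules`, and the module count are all stand-ins:

```python
import numpy as np

# Hypothetical inputs: rows = maize varieties, columns = detected MS features.
rng = np.random.default_rng(0)
profiles = rng.normal(size=(210, 50))   # 210 varieties x 50 toy features
modules = rng.integers(0, 5, size=50)   # toy module label for each feature

def module_eigentrait(data, member_cols):
    """Per-sample score on the first principal component of one module's
    features -- a single summary trait usable as a GWAS phenotype."""
    sub = data[:, member_cols]
    sub = (sub - sub.mean(axis=0)) / sub.std(axis=0)  # standardize features
    u, s, _ = np.linalg.svd(sub, full_matrices=False)
    return u[:, 0] * s[0]                             # PC1 scores

# One summary trait per module
eigen_traits = {int(m): module_eigentrait(profiles, np.where(modules == m)[0])
                for m in np.unique(modules)}
```

Each entry of `eigen_traits` is a length-210 vector (one value per variety) that would then be tested for association against genome-wide markers.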
Non-targeted metabolite profiling by ultra performance liquid chromatography coupled with mass spectrometry (UPLC-MS) is a powerful technique to investigate metabolism. The approach offers an unbiased and in-depth analysis that can enable the development of diagnostic tests and novel therapies, and can further our understanding of disease processes. The inherent chemical diversity of the metabolome creates significant analytical challenges, and there is no single experimental approach that can detect all metabolites. Additionally, the biological variation in individual metabolism and the dependence of metabolism on environmental factors necessitate large sample numbers to achieve the statistical power required for meaningful biological interpretation. To address these challenges, this tutorial outlines an analytical workflow for large-scale non-targeted metabolite profiling of serum by UPLC-MS. The procedure includes guidelines for sample organization and preparation, data acquisition, quality control, and metabolite identification; it will enable reliable acquisition of data for large experiments and provide a starting point for laboratories new to non-targeted metabolite profiling by UPLC-MS.
Annotation of Plant Gene Function via Combined Genomics, Metabolomics and Informatics
Authors: Takayuki Tohge, Alisdair R. Fernie.
Institutions: Max-Planck-Institut.
Given the ever-expanding number of model plant species for which complete genome sequences are available and the abundance of bio-resources such as knockout mutants, wild accessions and advanced breeding populations, there is a rising burden of gene functional annotation. In this protocol, annotation of plant gene function using combined gene co-expression analysis, metabolomics and informatics is provided (Figure 1). This approach is based on using target genes of known function to identify non-annotated genes likely to be involved in a given metabolic process, with target compounds identified via metabolomics. Strategies are put forward for applying this information to populations generated by both forward and reverse genetics approaches, although none of these is effortless. By corollary, this approach can also be used to characterise unknown peaks representing new or specific secondary metabolites found in particular tissues, plant species or stress treatments, which is currently an important challenge in understanding plant metabolism.
Plant Biology, Issue 64, Genetics, Bioinformatics, Metabolomics, Plant metabolism, Transcriptome analysis, Functional annotation, Computational biology, Plant biology, Theoretical biology, Spectroscopy and structural analysis
Untargeted Metabolomics from Biological Sources Using Ultraperformance Liquid Chromatography-High Resolution Mass Spectrometry (UPLC-HRMS)
Authors: Nathaniel W. Snyder, Maya Khezam, Clementina A. Mesaros, Andrew Worth, Ian A. Blair.
Institutions: University of Pennsylvania.
Here we present a workflow to analyze the metabolic profiles of biological samples of interest, including cells, serum, or tissue. The sample is first separated into polar and non-polar fractions by liquid-liquid phase extraction, and partially purified to facilitate downstream analysis. Both the aqueous (polar metabolites) and organic (non-polar metabolites) phases of the initial extraction are processed to survey a broad range of metabolites. Metabolites are separated by different liquid chromatography methods based upon their partition properties. In this method, we present microflow ultra-performance (UP)LC methods, but the protocol is scalable to higher flows and lower pressures. Introduction into the mass spectrometer can be through either general or compound-optimized source conditions. Detection of a broad range of ions is carried out in full-scan mode, in both positive and negative modes, over a broad m/z range using high resolution on a recently calibrated instrument. Label-free differential analysis is carried out on bioinformatics platforms. Applications of this approach include metabolic pathway screening, biomarker discovery, and drug development.
Biochemistry, Issue 75, Chemistry, Molecular Biology, Cellular Biology, Physiology, Medicine, Pharmacology, Genetics, Genomics, Mass Spectrometry, MS, Metabolism, Metabolomics, untargeted, extraction, lipids, accurate mass, liquid chromatography, ultraperformance liquid chromatography, UPLC, high resolution mass spectrometry, HRMS, spectrometry
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Detecting Somatic Genetic Alterations in Tumor Specimens by Exon Capture and Massively Parallel Sequencing
Authors: Helen H Won, Sasinya N Scott, A. Rose Brannon, Ronak H Shah, Michael F Berger.
Institutions: Memorial Sloan-Kettering Cancer Center.
Efforts to detect and investigate key oncogenic mutations have proven valuable to facilitate the appropriate treatment for cancer patients. The establishment of high-throughput, massively parallel "next-generation" sequencing has aided the discovery of many such mutations. To enhance the clinical and translational utility of this technology, platforms must be high-throughput, cost-effective, and compatible with formalin-fixed paraffin embedded (FFPE) tissue samples that may yield small amounts of degraded or damaged DNA. Here, we describe the preparation of barcoded and multiplexed DNA libraries followed by hybridization-based capture of targeted exons for the detection of cancer-associated mutations in fresh frozen and FFPE tumors by massively parallel sequencing. This method enables the identification of sequence mutations, copy number alterations, and select structural rearrangements involving all targeted genes. Targeted exon sequencing offers the benefits of high throughput, low cost, and deep sequence coverage, thus conferring high sensitivity for detecting low frequency mutations.
Molecular Biology, Issue 80, Molecular Diagnostic Techniques, High-Throughput Nucleotide Sequencing, Genetics, Neoplasms, Diagnosis, Massively parallel sequencing, targeted exon sequencing, hybridization capture, cancer, FFPE, DNA mutations
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the purity and amount required for in-depth analysis by mass spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP for thoroughly characterizing the heterochromatin proteome and establish it as a powerful analytical strategy for understanding how distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTM cross-talk
A Strategy for Sensitive, Large Scale Quantitative Metabolomics
Authors: Xiaojing Liu, Zheng Ser, Ahmad A. Cluntun, Samantha J. Mentch, Jason W. Locasale.
Institutions: Cornell University.
Metabolite profiling has been a valuable asset in the study of metabolism in health and disease. However, current platforms have different limiting factors, such as labor-intensive sample preparation, high detection limits, slow scan speeds, intensive method optimization for each metabolite, and the inability to measure both positively and negatively charged ions in a single experiment. Therefore, a novel protocol could advance metabolomics studies. Amide-based hydrophilic chromatography enables polar metabolite analysis without any chemical derivatization. High-resolution MS using the Q-Exactive (QE-MS) offers improved ion optics, increased scan speeds (256 msec at resolution 70,000), and the capability of carrying out positive/negative switching. Using a cold methanol extraction strategy and coupling an amide column with QE-MS enables robust detection of 168 targeted polar metabolites and thousands of additional features simultaneously. Data processing is carried out with commercially available software in a highly efficient way, and unknown features extracted from the mass spectra can be queried in databases.
Chemistry, Issue 87, high-resolution mass spectrometry, metabolomics, positive/negative switching, low mass calibration, Orbitrap
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include, but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation, as inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low-abundance molecules, with a resulting increase in sensitivity, is obtained, ultimately resulting in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and a strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification, since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Authors: Shahram Jevin Poureetezadi, Eric K. Donahue, Rebecca A. Wingert.
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism to investigate the mechanisms that underlie developmental biology and to study human disease pathology due to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are particularly appealing for chemical genetics because of their ability to produce large clutches of transparent embryos, which are externally fertilized. Furthermore, zebrafish embryos can be easily drug-treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, using chemical genetics and WISH, the zebrafish becomes a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are not accessible or remain cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires basic resources and can be accomplished by a single individual or small team in an efficient time frame. Thus, this protocol provides a feasible strategy that can be implemented by research groups to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways with medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Purification of Transcripts and Metabolites from Drosophila Heads
Authors: Kurt Jensen, Jonatan Sanchez-Garcia, Caroline Williams, Swati Khare, Krishanu Mathur, Rita M. Graze, Daniel A. Hahn, Lauren M. McIntyre, Diego E. Rincon-Limas, Pedro Fernandez-Funez.
Institutions: University of Florida.
For the last decade, we have tried to understand the molecular and cellular mechanisms of neuronal degeneration using Drosophila as a model organism. Although fruit flies provide obvious experimental advantages, research on neurodegenerative diseases has mostly relied on traditional techniques, including genetic interaction, histology, immunofluorescence, and protein biochemistry. These techniques are effective for mechanistic, hypothesis-driven studies, which lead to a detailed understanding of the role of single genes in well-defined biological problems. However, neurodegenerative diseases are highly complex and affect multiple cellular organelles and processes over time. The advent of new technologies and the omics age provides a unique opportunity to understand the global cellular perturbations underlying complex diseases. Flexible model organisms such as Drosophila are ideal for adapting these new technologies because of their strong annotation and high tractability. One challenge with these small animals, though, is the purification of enough informational molecules (DNA, mRNA, protein, metabolites) from highly relevant tissues such as fly brains. Other challenges consist of collecting large numbers of flies for experimental replicates (critical for statistical robustness) and developing consistent procedures for the purification of high-quality biological material. Here, we describe the procedures for collecting thousands of fly heads and the extraction of transcripts and metabolites to understand how global changes in gene expression and metabolism contribute to neurodegenerative diseases. These procedures are easily scalable and can be applied to the study of proteomic and epigenomic contributions to disease.
Genetics, Issue 73, Biochemistry, Molecular Biology, Neurobiology, Neuroscience, Bioengineering, Cellular Biology, Anatomy, Neurodegenerative Diseases, Biological Assay, Drosophila, fruit fly, head separation, purification, mRNA, RNA, cDNA, DNA, transcripts, metabolites, replicates, SCA3, neurodegeneration, NMR, gene expression, animal model
Mapping Bacterial Functional Networks and Pathways in Escherichia Coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and the ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
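The scoring logic above can be condensed into the standard multiplicative-fitness model for genetic interactions. The in-house image processing and scoring system is more elaborate than this, so treat the following as a sketch of the underlying idea only; the fitness values are invented:

```python
def gi_score(f_a, f_b, f_ab):
    """Genetic-interaction score under a multiplicative fitness model.

    f_a, f_b : normalized fitness (e.g. colony size) of each single mutant
    f_ab     : observed fitness of the double mutant
    Returns observed minus expected fitness: significantly negative values
    indicate aggravating (synthetic sick/lethal) interactions, significantly
    positive values indicate alleviating (same pathway/complex) interactions.
    """
    return f_ab - f_a * f_b

# Invented examples:
gi_score(0.9, 0.8, 0.1)   # strongly negative -> aggravating
gi_score(0.5, 0.5, 0.5)   # positive -> alleviating
```

In practice a score is called a GI only when it clears a significance threshold estimated from replicate variation.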
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Stable Isotopic Profiling of Intermediary Metabolic Flux in Developing and Adult Stage Caenorhabditis elegans
Authors: Marni J. Falk, Meera Rao, Julian Ostrovsky, Evgueni Daikhin, Ilana Nissim, Marc Yudkoff.
Institutions: The Children's Hospital of Philadelphia, University of Pennsylvania.
Stable isotopic profiling has long permitted sensitive investigations of the metabolic consequences of genetic mutations and/or pharmacologic therapies in cellular and mammalian models. Here, we describe detailed methods to perform stable isotopic profiling of intermediary metabolism and metabolic flux in the nematode, Caenorhabditis elegans. Methods are described for profiling whole-worm free amino acids, labeled carbon dioxide, labeled organic acids, and labeled amino acids in animals exposed to stable isotopes either from early development on nematode growth media agar plates or beginning as young adults while exposed to various pharmacologic treatments in liquid culture. Free amino acids are quantified by high performance liquid chromatography (HPLC) in whole-worm aliquots extracted in 4% perchloric acid. Uniformly labeled 13C-glucose or 1,6-13C2-glucose is utilized as the stable isotopic precursor whose labeled carbon is traced by mass spectrometry in carbon dioxide (both atmospheric and dissolved) as well as in metabolites indicative of flux through glycolysis, pyruvate metabolism, and the tricarboxylic acid cycle. Representative results are included to demonstrate the effects of isotope exposure time, various bacterial clearing protocols, and alternative worm disruption methods in wild-type nematodes, as well as the relative extent of isotopic incorporation in mitochondrial complex III mutant worms (isp-1(qm150)) relative to wild-type worms. Application of stable isotopic profiling in living nematodes provides a novel capacity to investigate, at the whole-animal level, real-time metabolic alterations that are caused by individual genetic disorders and/or pharmacologic therapies.
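As an illustration of the kind of calculation behind tracing labeled carbon, fractional 13C enrichment can be estimated from a metabolite's mass-isotopomer intensities. The generic formula below omits natural-abundance correction and is not the authors' exact analysis pipeline:

```python
def fractional_enrichment(intensities):
    """Fractional 13C enrichment from mass-isotopomer intensities.

    intensities[i] is the abundance of the M+i isotopologue; a metabolite
    with n traceable carbons has n+1 entries. Natural-abundance correction,
    which a real analysis applies first, is omitted for simplicity.
    """
    n = len(intensities) - 1
    total = sum(intensities)
    return sum(i * m for i, m in enumerate(intensities)) / (n * total)

# Toy 3-carbon metabolite: mostly unlabeled with some M+1/M+2/M+3 signal
fractional_enrichment([70.0, 15.0, 10.0, 5.0])  # = 50/300, about 0.167
```

Comparing such enrichment values across metabolites (e.g. pyruvate vs. TCA-cycle intermediates) is what reveals relative flux through the pathways named above.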
Developmental Biology, Issue 48, Stable isotope, amino acid quantitation, organic acid quantitation, nematodes, metabolism
Environmentally Induced Heritable Changes in Flax
Authors: Cory Johnson, Tiffanie Moss, Christopher Cullis.
Institutions: Case Western Reserve University.
Some flax varieties respond to nutrient stress by modifying their genome, and these modifications can be inherited through many generations. Also associated with these genomic changes are heritable phenotypic variations1,2. The flax variety Stormont Cirrus (Pl), when grown under three different nutrient conditions, can either remain inducible (under the control conditions) or become stably modified to either the large or small genotroph by growth under high or low nutrient conditions, respectively. The lines resulting from the initial growth under each of these conditions appear to grow better when grown under the same conditions in subsequent generations; notably, the Pl line grows best under the control treatment, indicating that the plants growing under both the high and low nutrients are under stress. One of the genomic changes associated with the induction of heritable changes is the appearance of an insertion element (LIS-1)3,4 while the plants are growing under the nutrient stress. With respect to this insertion event, the flax variety Stormont Cirrus (Pl), when grown under three different nutrient conditions, can either remain unchanged (under the control conditions), have the insertion appear in all the plants (under low nutrients) and have this transmitted to the next generation, or have the insertion (or parts of it) appear but not be transmitted through generations (under high nutrients)4. The frequency of the appearance of this insertion indicates that it is under positive selection, which is also consistent with the growth response in subsequent generations. Leaves or meristems harvested at various stages of growth are used for DNA and RNA isolation. The RNA is used to identify variation in expression associated with the various growth environments and/or the presence/absence of LIS-1. The isolated DNA is used to identify those plants in which the insertion has occurred.
Plant Biology, Issue 47, Flax, genome variation, environmental stress, small RNAs, altered gene expression
Concentration of Metabolites from Low-density Planktonic Communities for Environmental Metabolomics using Nuclear Magnetic Resonance Spectroscopy
Authors: R. Craig Everroad, Seiji Yoshida, Yuuri Tsuboi, Yasuhiro Date, Jun Kikuchi, Shigeharu Moriya.
Institutions: RIKEN Advanced Science Institute, Yokohama City University, RIKEN Plant Science Center, Nagoya University.
Environmental metabolomics is an emerging field that is promoting new understanding of how organisms respond to and interact with the environment and each other at the biochemical level1. Nuclear magnetic resonance (NMR) spectroscopy is one of several technologies, including gas chromatography–mass spectrometry (GC-MS), with considerable promise for such studies. Advantages of NMR are that it is suitable for untargeted analyses, provides structural information, and its spectra can be queried quantitatively and statistically against recently available databases of individual metabolite spectra2,3. In addition, NMR spectral data can be combined with data from other omics levels (e.g. transcriptomics, genomics) to provide a more comprehensive understanding of the physiological responses of taxa to each other and the environment4,5,6. However, NMR is less sensitive than other metabolomic techniques, making it difficult to apply to natural microbial systems, where sample populations can be low-density and metabolite concentrations low compared to metabolites from well-defined and readily extractable sources such as whole tissues, biofluids or cell cultures. Consequently, the few direct environmental metabolomic studies of microbes performed to date have been limited to culture-based or easily defined high-density ecosystems such as host-symbiont systems, constructed co-cultures or manipulations of the gut environment, where stable isotope labeling can additionally be used to enhance NMR signals7,8,9,10,11,12. Methods that facilitate the concentration and collection of environmental metabolites at concentrations suitable for NMR are lacking.
Recent attention has been given to the environmental metabolomics of organisms within the aquatic environment, where much of the energy and material flow is mediated by the planktonic community13,14. We have therefore developed a method for the concentration and extraction of whole-community metabolites from planktonic microbial systems by filtration. Commercially available hydrophilic polyvinylidene difluoride (PVDF) filters are specially treated to completely remove extractables, which can otherwise appear as contaminants in subsequent analyses. These treated filters are then used to filter environmental or experimental samples of interest. Filters containing the wet sample material are lyophilized, and aqueous-soluble metabolites are extracted directly for conventional NMR spectroscopy using a standardized potassium phosphate extraction buffer2. Data derived from these methods can be analyzed statistically to identify meaningful patterns, or integrated with other omics levels for comprehensive understanding of community and ecosystem function.
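The keywords below cite PCA as the statistical approach. As a hedged sketch of that kind of pattern-finding step, the following illustrates principal component analysis of binned NMR intensities; the sample data, bin values, and function names are all invented for illustration and are not part of the published protocol.

```python
# Hypothetical sketch: PCA of binned NMR spectra to separate
# metabolite patterns across samples. All intensities are invented;
# real inputs would be processed, binned spectra.
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the mean-centered data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = (S ** 2) / np.sum(S ** 2)
    return scores, explained[:n_components]

# Rows = samples, columns = spectral bins (arbitrary intensities);
# the first two and last two rows mimic two community states.
X = np.array([
    [1.0, 0.2, 3.1, 0.0],
    [1.1, 0.1, 3.0, 0.1],
    [0.2, 2.9, 0.3, 2.8],
    [0.1, 3.1, 0.2, 3.0],
])
scores, explained = pca(X)
print(scores.shape)  # (4, 2)
```

On such data the first component typically captures the dominant between-group difference, which is what would then be interpreted against metabolite databases.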
Molecular Biology, Issue 62, environmental metabolomics, metabolic profiling, microbial ecology, plankton, NMR spectroscopy, PCA
Genome Editing with CompoZr Custom Zinc Finger Nucleases (ZFNs)
Authors: Keith Hansen, Matthew J. Coussens, Jack Sago, Shilpi Subramanian, Monika Gjoka, Dave Briner.
Institutions: Sigma Life Science.
Genome editing is a powerful technique that can be used to elucidate gene function and the genetic basis of disease. Traditional gene editing methods, such as chemical-based mutagenesis or random integration of DNA sequences, confer indiscriminate genetic changes in an overall inefficient manner and require incorporation of undesirable synthetic sequences or use of aberrant culture conditions, potentially confounding biological study. By contrast, transient ZFN expression in a cell can facilitate precise, heritable gene editing in a highly efficient manner without the need for administration of chemicals or integration of synthetic transgenes. Zinc finger nucleases (ZFNs) are enzymes that bind and cut distinct sequences of double-stranded DNA (dsDNA). A functional CompoZr ZFN unit consists of two individual monomeric proteins that bind a DNA "half-site" of approximately 15-18 nucleotides (see Figure 1). When two ZFN monomers "home" to their adjacent target sites, the DNA-cleavage domains dimerize and create a double-strand break (DSB) in the DNA.1 Introduction of ZFN-mediated DSBs in the genome lays a foundation for highly efficient genome editing. Imperfect repair of DSBs in a cell via the non-homologous end-joining (NHEJ) DNA repair pathway can result in small insertions and deletions (indels). Creation of indels within the gene coding sequence of a cell can result in a frameshift and subsequent functional knockout of a gene locus at high efficiency.2 While this protocol describes the use of ZFNs to create a gene knockout, integration of transgenes may also be conducted via homology-directed repair at the ZFN cut site. The CompoZr Custom ZFN Service represents a systematic, comprehensive, and well-characterized approach to targeted gene editing for the scientific community with ZFN technology.
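The knockout mechanism above rests on a simple fact of the genetic code: an indel whose length is not a multiple of three shifts the reading frame of every downstream codon. This toy sketch (invented sequence, illustrative only) makes that concrete:

```python
# Toy illustration of why a ZFN-induced 1-bp deletion knocks out a
# gene: every codon downstream of the indel is read in the wrong
# frame. The sequence is invented for illustration.
def codons(seq):
    """Split a sequence into complete triplet codons."""
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

wild_type = "ATGGCTGAAGTTCTGTAA"        # ATG GCT GAA GTT CTG TAA
mutant = wild_type[:4] + wild_type[5:]  # NHEJ-style 1-bp deletion

print(codons(wild_type))  # ['ATG', 'GCT', 'GAA', 'GTT', 'CTG', 'TAA']
print(codons(mutant))     # frame shifted after the first codon
```

A 3-bp (in-frame) indel, by contrast, removes one codon but leaves the downstream frame intact, which is why frameshifting indels are the efficient route to functional knockout.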
Sigma scientists work closely with investigators to 1) perform due diligence, including analysis of the relevant gene structure, biology, and model system pursuant to the project goals, 2) apply this knowledge to develop a sound targeting strategy, and 3) design, build, and functionally validate ZFNs for activity in a relevant cell line. The investigator receives positive-control genomic DNA and primers, and ready-to-use ZFN reagents supplied in both plasmid DNA and in-vitro-transcribed mRNA formats. These reagents may then be delivered for transient expression in the investigator's cell line or cell type of choice. Samples are then tested for gene editing at the locus of interest by standard molecular biology techniques, including PCR amplification, enzymatic digestion, and electrophoresis. After a positive signal for gene editing is detected in the initial population, cells are single-cell cloned and genotyped to identify mutant clones/alleles.
Genetics, Issue 64, Molecular Biology, Zinc Finger Nuclease, Genome Engineering, Genomic Editing, Gene Modification, Gene Knockout, Gene Integration, non-homologous end joining, homologous recombination, targeted genome editing
Quantification of Fungal Colonization, Sporogenesis, and Production of Mycotoxins Using Kernel Bioassays
Authors: Shawn Christensen, Eli Borrego, Won-Bo Shim, Tom Isakeit, Michael Kolomiets.
Institutions: Texas A&M University.
The rotting of grains by seed-infecting fungi poses one of the greatest economic challenges to cereal production worldwide, not to mention serious risks to human and animal health. Among cereal crops, maize is arguably the most affected, due to pathogen-induced losses in grain integrity and mycotoxin seed contamination. The two most prevalent and problematic mycotoxins for maize growers and food and feed processors are aflatoxin and fumonisin, produced by Aspergillus flavus and Fusarium verticillioides, respectively. Recent studies in molecular plant-pathogen interactions have demonstrated promise in understanding specific mechanisms associated with plant responses to fungal infection and mycotoxin contamination1,2,3,4,5,6. Because many labs are using kernel assays to study plant-pathogen interactions, there is a need for a standardized method for quantifying different biological parameters, so results from different laboratories can be cross-interpreted. To provide a robust and reproducible means for quantitative analyses on seeds, we have developed in-lab kernel assays and subsequent methods to quantify fungal growth, biomass, and mycotoxin contamination. Four sterilized maize kernels are inoculated in glass vials with a fungal suspension (10⁶) and incubated for a predetermined period. Sample vials are then selected for enumeration of conidia by hemocytometer, ergosterol-based biomass analysis by high performance liquid chromatography (HPLC), aflatoxin quantification using an AflaTest fluorometer method, and fumonisin quantification by HPLC.
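The hemocytometer enumeration step above reduces to standard counting-chamber arithmetic: each large 1 mm² square under the coverslip holds 0.1 µl, so the mean count per square scales by 10⁴ (and any dilution factor) to give conidia per ml. A minimal sketch, with invented counts and dilution:

```python
# Standard hemocytometer arithmetic for enumerating conidia; the
# square counts and dilution below are invented for illustration.
def conidia_per_ml(square_counts, dilution_factor=1):
    """Mean count per large (1 mm^2) square x dilution x 1e4.
    Each large square holds 0.1 uL, hence the 1e4-per-ml factor."""
    mean_count = sum(square_counts) / len(square_counts)
    return mean_count * dilution_factor * 1e4

# Example: four squares counted on a 1:10 dilution of the suspension.
print(conidia_per_ml([52, 48, 50, 50], dilution_factor=10))  # 5000000.0
```

The same calculation is what one would use to standardize the inoculum to the 10⁶ suspension the protocol calls for.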
Immunology, Issue 62, Mycotoxins, sporogenesis, Aspergillus flavus, Fusarium verticillioides, aflatoxin, fumonisin, plant-microbe interactions, plant biology
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. 
This is not what one would predict for diseases with reduced reproductive fitness, unless there were a high new-mutation rate. Finally, there is the effect of paternal age: the risk of disease increases significantly with paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of the mutation rate is estimated at about 4-6:1, presumably due to the higher number of germ-cell divisions with age in males. One would therefore predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases such as autism and schizophrenia, and why disease associations have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
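The "mutational load as a balance" argument above has a classical quantitative form: at mutation-selection balance, a dominant deleterious allele settles at an equilibrium frequency of roughly mu/s, where new mutations (rate mu) offset selective loss (coefficient s). A back-of-the-envelope sketch, with illustrative rates that are not estimates for any particular disease:

```python
# Classical mutation-selection balance for a dominant deleterious
# allele: equilibrium frequency ~ mu / s. The numbers below are
# illustrative only, not estimates for autism or schizophrenia.
def equilibrium_frequency(mu, s):
    """mu: per-generation mutation rate to the allele;
    s: selection coefficient against carriers (0 < s <= 1)."""
    return mu / s

# Strong selection (s = 0.5) balanced by a high mutation rate still
# maintains the allele in the population each generation.
print(equilibrium_frequency(1e-4, 0.5))  # 0.0002
```

The point the abstract makes follows directly: a constant ~1% prevalence despite strong negative selection implies the numerator, the de novo mutation rate, must be substantial.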
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American Lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus as described by Braitenberg and are particularly well suited for investigation using the NXT platform.1 The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
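The decussating wiring described above can be summarized in a few lines: in a Braitenberg-style crossed scheme, each sensor excites the contralateral motor, so the robot turns toward the stimulus. The sketch below is illustrative only; the sensor values, gain, and function name are invented, and the actual NXT implementation runs as a LabVIEW nervous system simulation on the robot.

```python
# Minimal sketch of a crossed (decussating) sensorimotor reflex in
# the style of a Braitenberg vehicle; readings and gain are invented.
def crossed_reflex(left_sensor, right_sensor, gain=1.0):
    """Each sensor excites the contralateral motor, so the wheel
    opposite the stronger stimulus spins faster and the robot
    turns toward the source (positive taxis)."""
    left_motor = gain * right_sensor
    right_motor = gain * left_sensor
    return left_motor, right_motor

# Stimulus stronger on the left: the right wheel speeds up,
# steering the robot toward the stimulus.
left_motor, right_motor = crossed_reflex(0.9, 0.1)
print(left_motor, right_motor)  # 0.1 0.9
```

Swapping to uncrossed (ipsilateral) connections inverts the behavior to negative taxis, which is the kind of hypothesis-level manipulation the robot platform makes easy to test against the animal.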
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and intracranial pressure (ICP) pre-screening. To estimate the midline shift, an estimate of the ideal midline is first obtained from the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide to identify the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated. Machine learning techniques, including feature selection and classification with methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. Evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend invasive ICP monitoring.
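The authors build their classifier with SVMs in RapidMiner. As a dependency-free stand-in for that step, the sketch below uses a nearest-centroid rule over invented CT-derived features (midline shift in mm, estimated blood volume in ml, age) to illustrate the features-in, ICP-class-out pre-screening idea; none of the numbers come from the paper.

```python
# Illustrative stand-in for the paper's SVM classifier: a
# nearest-centroid rule over invented CT-derived feature vectors.
def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def predict(X_train, y_train, x):
    """Assign x to the class with the nearest feature centroid."""
    classes = sorted(set(y_train))
    cents = {c: centroid([r for r, label in zip(X_train, y_train)
                          if label == c]) for c in classes}
    sq_dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(classes, key=lambda c: sq_dist(x, cents[c]))

# Toy training data: [midline shift (mm), blood (ml), age];
# labels 0 = normal ICP, 1 = elevated ICP (all values invented).
X = [[0.5, 1.0, 25], [0.8, 2.0, 40], [6.0, 30.0, 55], [5.5, 25.0, 60]]
y = [0, 0, 1, 1]
print(predict(X, y, [6.5, 28.0, 50]))  # 1
```

A real pipeline would standardize features and use a margin-based classifier such as an SVM, but the decision structure (CT features mapped to an ICP risk class before any invasive monitoring) is the same.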
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Fabrication of Micro-tissues using Modules of Collagen Gel Containing Cells
Authors: M. Dean Chamberlain, Mark J. Butler, Ema C. Ciucurel, Lindsay E. Fitzpatrick, Omar F. Khan, Brendan M. Leung, Chuen Lo, Ritesh Patel, Alexandra Velchinskaya, Derek N. Voice, Michael V. Sefton.
Institutions: University of Toronto.
This protocol describes the fabrication of a type of micro-tissue called a module. The module approach generates uniform, scalable, and vascularized tissues. The modules can be made of collagen as well as other gelable or crosslinkable materials. They are approximately 2 mm in length and 0.7 mm in diameter upon fabrication but shrink with embedded cells or when the modules are coated with endothelial cells. Individually, the modules are small enough that the embedded cells are within the diffusion limit of oxygen and other nutrients, yet modules can be packed together to form larger tissues that are perfusable. These tissues are modular in construction because different cell types can be embedded in, or coated on, the modules before they are packed together to form complex tissues. There are three main steps to making the modules: (1) neutralizing the collagen and embedding cells in it, (2) gelling the collagen in the tube and cutting the modules, and (3) coating the modules with endothelial cells.
Bioengineering, Issue 46, Tissue engineering, micro-tissue, endothelial cells, collagen gels, modules, 3D tissue culture.
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
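JoVE does not spell out its matching algorithm here. As a hedged illustration of one common approach to this kind of abstract-to-video matching (not necessarily the one JoVE uses), documents can be compared by TF-IDF weighting with cosine similarity; the toy documents below are invented:

```python
# Hedged illustration of text matching via TF-IDF cosine similarity.
# This is a generic technique, not JoVE's disclosed algorithm.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build a sparse TF-IDF vector (dict) for each tokenized doc."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    n = len(docs)
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(doc).items()}
            for doc in docs]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["metabolite profiling mass spectrometry".split(),
        "zinc finger nuclease genome editing".split(),
        "genome editing gene knockout".split()]
vecs = tfidf_vectors(docs)
# The two genome-editing descriptions score higher with each other
# than either does with the metabolomics abstract.
print(cosine(vecs[1], vecs[2]) > cosine(vecs[0], vecs[1]))  # True
```

Ranking every video by similarity to an abstract and keeping the top 10 to 30 hits would produce exactly the kind of related-methods list described above, including the occasional loosely related match when vocabulary overlap is thin.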

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos that are only loosely related.