JoVE Visualize
Pubmed Article
MAFCO: a compression tool for MAF files.
PLoS ONE
PUBLISHED: 03-30-2015
In the last decade, the cost of genomic sequencing has been decreasing so much that researchers all over the world accumulate huge amounts of data for present and future use. These genomic data need to be efficiently stored, because storage cost is not decreasing as fast as the cost of sequencing. To address this problem, general-purpose compression tools, most commonly gzip, are usually used. However, these tools were not specifically designed to compress this kind of data and often fall short when the intention is to reduce the data size as much as possible. Several compression algorithms are available, even for genomic data, but very few have been designed to deal with Whole Genome Alignments, which contain alignments between entire genomes of several species. In this paper, we present a lossless compression tool, MAFCO, specifically designed to compress MAF (Multiple Alignment Format) files. Compared to gzip, the proposed tool attains a compression gain from 34% to 57%, depending on the data set. When compared to a recent dedicated method, which is not compatible with some data sets, the compression gain of MAFCO is about 9%. Both source code and binaries for several operating systems are freely available for non-commercial use at: http://bioinformatics.ua.pt/software/mafco.
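The gain figures quoted above compare MAFCO output sizes against gzip output sizes. As a minimal sketch of how such a gain can be computed from compressed file sizes (the exact definition used by the MAFCO authors may differ, and the file names below are hypothetical):

```python
# Illustrative only: one way to compute a "compression gain" over gzip from
# compressed file sizes. The exact definition used by the MAFCO authors may
# differ; the file names below are hypothetical.
import os

def compression_gain(reference_path, candidate_path):
    """Fractional size reduction of candidate_path relative to reference_path."""
    ref = os.path.getsize(reference_path)
    cand = os.path.getsize(candidate_path)
    return 1.0 - cand / ref

if __name__ == "__main__":
    # e.g. chr1.maf.gz produced by gzip, chr1.maf.mafco produced by MAFCO
    gain = compression_gain("chr1.maf.gz", "chr1.maf.mafco")
    print(f"Compression gain over gzip: {gain:.1%}")
```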
Authors: Francesco Vallania, Enrique Ramos, Sharon Cresci, Robi D. Mitra, Todd E. Druley.
Published: 06-23-2012
ABSTRACT
As DNA sequencing technology has markedly advanced in recent years2, it has become increasingly evident that the amount of genetic variation between any two individuals is greater than previously thought3. In contrast, array-based genotyping has failed to identify a significant contribution of common sequence variants to the phenotypic variability of common disease4,5. Taken together, these observations have led to the evolution of the Common Disease / Rare Variant hypothesis suggesting that the majority of the "missing heritability" in common and complex phenotypes is instead due to an individual's personal profile of rare or private DNA variants6-8. However, characterizing how rare variation impacts complex phenotypes requires the analysis of many affected individuals at many genomic loci, and is ideally compared to a similar survey in an unaffected cohort. Despite the sequencing power offered by today's platforms, a population-based survey of many genomic loci and the subsequent computational analysis required remains prohibitive for many investigators. To address this need, we have developed a pooled sequencing approach1,9 and a novel software package1 for highly accurate rare variant detection from the resulting data. The ability to pool genomes from entire populations of affected individuals and survey the degree of genetic variation at multiple targeted regions in a single sequencing library provides excellent cost and time savings over traditional single-sample sequencing methodology. With a mean sequencing coverage per allele of 25-fold, our custom algorithm, SPLINTER, uses an internal variant calling control strategy to call insertions, deletions and substitutions up to four base pairs in length with high sensitivity and specificity from pools of up to 1 mutant allele in 500 individuals. Here we describe the method for preparing the pooled sequencing library followed by step-by-step instructions on how to use the SPLINTER package for pooled sequencing analysis (http://www.ibridgenetwork.org/wustl/splinter). We show a comparison between pooled sequencing of 947 individuals, all of whom also underwent genome-wide array genotyping, at over 20 kb of sequencing per person. Concordance between genotyping of tagged and novel variants called in the pooled sample was excellent. This method can be easily scaled up to any number of genomic loci and any number of individuals. By incorporating the internal positive and negative amplicon controls at ratios that mimic the population under study, the algorithm can be calibrated for optimal performance. This strategy can also be modified for use with hybridization capture or individual-specific barcodes and can be applied to the sequencing of naturally heterogeneous samples, such as tumor DNA.
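The numbers quoted above (pools of up to 500 individuals, ~25-fold coverage per allele) imply specific expectations for allele frequency and read support. A back-of-the-envelope sketch, which ignores sampling noise and the internal controls and error modeling that SPLINTER itself applies:

```python
# Back-of-the-envelope expectations for the pooled design described above,
# using the numbers quoted in the abstract. This ignores sampling noise and
# the internal control/error-correction strategy used by SPLINTER.
def pooled_expectations(n_individuals=500, coverage_per_allele=25, ploidy=2):
    n_alleles = n_individuals * ploidy                 # 1,000 alleles in the pool
    singleton_freq = 1.0 / n_alleles                   # frequency of 1 mutant allele
    total_coverage = coverage_per_allele * n_alleles   # total reads per position
    expected_variant_reads = singleton_freq * total_coverage
    return singleton_freq, total_coverage, expected_variant_reads

freq, depth, reads = pooled_expectations()
print(f"Singleton allele frequency: {freq:.2%}")             # 0.10%
print(f"Total coverage per position: {depth:,}x")            # 25,000x
print(f"Expected reads supporting a singleton: {reads:.0f}") # ~25
```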
24 Related JoVE Articles!
The Preparation of Drosophila Embryos for Live-Imaging Using the Hanging Drop Protocol
Authors: Bruce H. Reed, Stephanie C. McMillan, Roopali Chaudhary.
Institutions: University of Waterloo.
Green fluorescent protein (GFP)-based timelapse live-imaging is a powerful technique for studying the genetic regulation of dynamic processes such as tissue morphogenesis, cell-cell adhesion, or cell death. Drosophila embryos expressing GFP are readily imaged using either stereoscopic or confocal microscopy. A goal of any live-imaging protocol is to minimize detrimental effects such as dehydration and hypoxia. Previous protocols for preparing Drosophila embryos for live-imaging analysis have involved placing dechorionated embryos in halocarbon oil and sandwiching them between a halocarbon gas-permeable membrane and a coverslip1-3. The introduction of compression through mounting embryos in this manner represents an undesirable complication for any biomechanical-based analysis of morphogenesis. Our method, which we call the hanging drop protocol, results in excellent viability of embryos during live imaging and does not require that embryos be compressed. Briefly, the hanging drop protocol involves the placement of embryos in a drop of halocarbon oil that is suspended from a coverslip, which is, in turn, fixed in position over a humid chamber. In addition to providing gas exchange and preventing dehydration, this arrangement takes advantage of the buoyancy of embryos in halocarbon oil to prevent them from drifting out of position during timelapse acquisition. This video describes in detail how to collect and prepare Drosophila embryos for live imaging using the hanging drop protocol. This protocol is suitable for imaging dechorionated embryos using stereomicroscopy or any upright compound fluorescence microscope.
Developmental Biology, Issue 25, Drosophila, embryos, live-imaging, GFP
A Coupled Experiment-finite Element Modeling Methodology for Assessing High Strain Rate Mechanical Response of Soft Biomaterials
Authors: Rajkumar Prabhu, Wilburn R. Whittington, Sourav S. Patnaik, Yuxiong Mao, Mark T. Begonia, Lakiesha N. Williams, Jun Liao, M. F. Horstemeyer.
Institutions: Mississippi State University, Mississippi State University.
This study offers a combined experimental and finite element (FE) simulation approach for examining the mechanical behavior of soft biomaterials (e.g. brain, liver, tendon, fat, etc.) when exposed to high strain rates. This study utilized a Split-Hopkinson Pressure Bar (SHPB) to generate strain rates of 100-1,500 sec-1. The SHPB employed a striker bar consisting of a viscoelastic material (polycarbonate). A sample of the biomaterial was obtained shortly postmortem and prepared for SHPB testing. The specimen was interposed between the incident and transmitted bars, and the pneumatic components of the SHPB were activated to drive the striker bar toward the incident bar. The resulting impact generated a compressive stress wave (i.e. incident wave) that traveled through the incident bar. When the compressive stress wave reached the end of the incident bar, a portion continued forward through the sample and transmitted bar (i.e. transmitted wave) while another portion reversed through the incident bar as a tensile wave (i.e. reflected wave). These waves were measured using strain gages mounted on the incident and transmitted bars. The true stress-strain behavior of the sample was determined from equations based on wave propagation and dynamic force equilibrium. The experimental stress-strain response was three dimensional in nature because the specimen bulged. As such, the hydrostatic stress (first invariant) was used to generate the stress-strain response. In order to extract the uniaxial (one-dimensional) mechanical response of the tissue, an iterative coupled optimization was performed using experimental results and Finite Element Analysis (FEA), which contained an Internal State Variable (ISV) material model used for the tissue. The ISV material model used in the FE simulations of the experimental setup was iteratively calibrated (i.e. optimized) to the experimental data such that the experiment and FEA strain gage values and first invariant of stresses were in good agreement.
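For orientation, the conventional one-dimensional SHPB relations (assuming elastic bars and dynamic force equilibrium) connect the specimen response to the measured reflected and transmitted strain pulses. These are the standard starting point only; the viscoelastic polycarbonate bars used in this study require additional dispersion and attenuation corrections, and the study ultimately reports the first stress invariant rather than a purely uniaxial stress.

```latex
% Conventional elastic-bar SHPB relations (one-wave analysis); the viscoelastic
% polycarbonate bars used in this study require additional corrections.
\dot{\varepsilon}_s(t) = -\frac{2 c_0}{L_s}\,\varepsilon_r(t), \qquad
\varepsilon_s(t) = -\frac{2 c_0}{L_s}\int_0^t \varepsilon_r(\tau)\, d\tau, \qquad
\sigma_s(t) = \frac{A_b}{A_s}\, E_b\, \varepsilon_t(t)
```

Here c_0 = sqrt(E_b/rho_b) is the bar wave speed, L_s and A_s are the specimen length and cross-sectional area, A_b and E_b are the bar cross-section and modulus, and epsilon_r and epsilon_t are the reflected and transmitted strain-gage signals.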
Bioengineering, Issue 99, Split-Hopkinson Pressure Bar, High Strain Rate, Finite Element Modeling, Soft Biomaterials, Dynamic Experiments, Internal State Variable Modeling, Brain, Liver, Tendon, Fat
Rapid and Low-cost Prototyping of Medical Devices Using 3D Printed Molds for Liquid Injection Molding
Authors: Philip Chung, J. Alex Heller, Mozziyar Etemadi, Paige E. Ottoson, Jonathan A. Liu, Larry Rand, Shuvo Roy.
Institutions: University of California, San Francisco, University of California, San Francisco, University of Southern California.
Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.
Bioengineering, Issue 88, liquid injection molding, reaction injection molding, molds, 3D printing, fused deposition modeling, rapid prototyping, medical devices, low cost, low volume, rapid turnaround time.
Synergetic Use of Neural Precursor Cells and Self-assembling Peptides in Experimental Cervical Spinal Cord Injury
Authors: Klaus Zweckberger, Yang Liu, Jian Wang, Nicole Forgione, Michael G. Fehlings.
Institutions: University Health Network, Krembil Neuroscience Centre, University of Toronto, University of Toronto.
Spinal cord injuries (SCI) cause serious neurological impairment and psychological, economic, and social consequences for patients and their families. Clinically, more than 50% of SCI affect the cervical spine1. As a consequence of the primary injury, a cascade of secondary mechanisms including inflammation, apoptosis, and demyelination occurs, ultimately leading to tissue scarring and the development of intramedullary cavities2,3. Both represent physical and chemical barriers to cell transplantation, integration, and regeneration. Therefore, shaping the inhibitory environment and bridging cavities to create a supportive milieu for cell transplantation and regeneration is a promising therapeutic target4. Here, a contusion/compression model of cervical SCI using an aneurysm clip is described. This model is more clinically relevant than other experimental models, since complete transections or ruptures of the cord are rare. Also, in comparison to the weight drop model, which primarily damages the dorsal columns, circumferential compression of the spinal cord appears advantageous. Clip closing force and duration can be adjusted to achieve different injury severities. A ring spring facilitates precise calibration and constancy of clip force. Under physiological conditions, synthetic self-assembling peptides (SAP) self-assemble into nanofibers and are thus appealing for application in SCI5. They can be injected directly into the lesion, minimizing damage to the cord. SAPs are biocompatible structures that erect scaffolds to bridge intramedullary cavities and thus equip the damaged cord for regenerative treatments. K2(QL)6K2 (QL6) is a novel SAP introduced by Dong et al.6 In comparison to other peptides, QL6 self-assembles into β-sheets at neutral pH6. Fourteen days after SCI, after the acute stage, SAPs are injected into the center of the lesion and neural precursor cells (NPC) are injected into the adjacent dorsal columns. In order to support cell survival, transplantation is combined with continuous subdural administration of growth factors by osmotic micro pumps for 7 days.
Medicine, Issue 96, Spinal cord injury (SCI), cervical trauma, aneurysm clip SCI model, intraspinal injection, stem cells, self-assembling peptides, neural precursor cells, growth factors
Purifying the Impure: Sequencing Metagenomes and Metatranscriptomes from Complex Animal-associated Samples
Authors: Yan Wei Lim, Matthew Haynes, Mike Furlan, Charles E. Robertson, J. Kirk Harris, Forest Rohwer.
Institutions: San Diego State University, DOE Joint Genome Institute, University of Colorado, University of Colorado.
The accessibility of high-throughput sequencing has revolutionized many fields of biology. In order to better understand host-associated viral and microbial communities, a comprehensive workflow for DNA and RNA extraction was developed. The workflow concurrently generates viral and microbial metagenomes, as well as metatranscriptomes, from a single sample for next-generation sequencing. The coupling of these approaches provides an overview of both the taxonomical characteristics and the community encoded functions. The presented methods use Cystic Fibrosis (CF) sputum, a problematic sample type, because it is exceptionally viscous and contains high amounts of mucins, free neutrophil DNA, and other unknown contaminants. The protocols described here target these problems and successfully recover viral and microbial DNA with minimal human DNA contamination. To complement the metagenomics studies, a metatranscriptomics protocol was optimized to recover both microbial and host mRNA that contains relatively few ribosomal RNA (rRNA) sequences. An overview of the data characteristics is presented to serve as a reference for assessing the success of the methods. Additional CF sputum samples were also collected to (i) evaluate the consistency of the microbiome profiles across seven consecutive days within a single patient, and (ii) compare the consistency of the metagenomic approach to 16S ribosomal RNA gene-based sequencing. The results showed that daily fluctuation of microbial profiles without antibiotic perturbation was minimal and the taxonomy profiles of the common CF-associated bacteria were highly similar between the 16S rDNA libraries and metagenomes generated from the hypotonic lysis (HL)-derived DNA. However, the differences between 16S rDNA taxonomical profiles generated from total DNA and HL-derived DNA suggest that the hypotonic lysis and washing steps help remove not only human-derived DNA, but also microbial extracellular DNA that may misrepresent the actual microbial profiles.
Molecular Biology, Issue 94, virome, microbiome, metagenomics, metatranscriptomics, cystic fibrosis, mucosal-surface
Surgical Fixation of Sternal Fractures: Preoperative Planning and a Safe Surgical Technique Using Locked Titanium Plates and Depth Limited Drilling
Authors: Stefan Schulz-Drost, Pascal Oppel, Sina Grupp, Sonja Schmitt, Roman Th. Carbon, Andreas Mauerer, Friedrich F. Hennig, Thomas Buder.
Institutions: University Hospital Erlangen, University Hospital Erlangen, St.-Theresien Hospital, University Erlangen-Nuremberg.
Different ways to stabilize a sternal fracture are described in the literature. Given the different mechanisms of trauma, such as direct impact to the anterior chest wall or flexion-compression injury of the trunk, there is a need to retain each sternal fragment in the correct position while neutralizing shearing forces on the sternum. Anterior sternal plating provides the best stability and is therefore increasingly used in most cases. However, many surgeons are reluctant to perform sternal osteosynthesis due to possible complications such as difficulties in preoperative planning, severe injuries to mediastinal organs, or failure of the performed method. This manuscript describes one safe way to stabilize different types of sternal fractures, providing step-by-step guidance for anterior sternal plating using low-profile locking titanium plates. Before surgical treatment, a detailed examination of the patient and a three-dimensionally reconstructed computed tomography scan are carried out to obtain detailed information about the fracture's morphology. The surgical approach is usually a midline incision. Its position can be determined by measuring the distance from the upper sternal edge to the fracture, and its length can be approximated as the sum of 60 mm for the basic incision, the thickness of the presternal soft tissue and, in case of multiple fractures, the greatest distance between the fragments (see the worked example below). Performing subperiosteal dissection along the sternum while reducing the fracture, using depth-limited drilling, and fixing the plates prevents injuries to mediastinal organs and vessels. Transverse and oblique fractures of the corpus sterni are plated longitudinally, whereas oblique fractures of the manubrium, sternocostal separations, and any longitudinal fracture need to be stabilized by a transverse plate running from rib to sternum to rib. Follow-up usually shows a high level of patient comfort as well as precise reconstruction of the sternal morphology.
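The incision-length rule described above amounts to a simple sum; as a worked example (the soft-tissue thickness and fragment distance below are hypothetical values, not measurements from the article):

```latex
% Incision-length rule of thumb from the text; the numbers in the example are hypothetical.
L_{\mathrm{incision}} \approx 60~\mathrm{mm} + t_{\mathrm{presternal\ soft\ tissue}} + d_{\mathrm{max,\ fragments}},
\qquad \text{e.g.}\quad 60 + 15 + 10 = 85~\mathrm{mm}
```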
Medicine, Issue 95, Sternal fracture, sternum fracture, locked plate, low profile plate, MatrixRib, depth limited drilling, surgical procedure, preoperative CT planning
Evaluation of a Novel Laser-assisted Coronary Anastomotic Connector - the Trinity Clip - in a Porcine Off-pump Bypass Model
Authors: David Stecher, Glenn Bronkers, Jappe O.T. Noest, Cornelis A.F. Tulleken, Imo E. Hoefer, Lex A. van Herwerden, Gerard Pasterkamp, Marc P. Buijsrogge.
Institutions: University Medical Center Utrecht, Vascular Connect b.v., University Medical Center Utrecht, University Medical Center Utrecht.
To simplify and facilitate beating heart (i.e., off-pump), minimally invasive coronary artery bypass surgery, a new coronary anastomotic connector, the Trinity Clip, is developed based on the excimer laser-assisted nonocclusive anastomosis technique. The Trinity Clip connector enables simplified, sutureless, and nonocclusive connection of the graft to the coronary artery, and an excimer laser catheter laser-punches the opening of the anastomosis. Consequently, owing to the complete nonocclusive anastomosis construction, coronary conditioning (i.e., occluding or shunting) is not necessary, in contrast to the conventional anastomotic technique, hence simplifying the off-pump bypass procedure. Prior to clinical application in coronary artery bypass grafting, the safety and quality of this novel connector will be evaluated in a long-term experimental porcine off-pump coronary artery bypass (OPCAB) study. In this paper, we describe how to evaluate the coronary anastomosis in the porcine OPCAB model using various techniques to assess its quality. Representative results are summarized and visually demonstrated.
Medicine, Issue 93, Anastomosis, coronary, anastomotic connector, anastomotic coupler, excimer laser-assisted nonocclusive anastomosis (ELANA), coronary artery bypass graft (CABG), off-pump coronary artery bypass (OPCAB), beating heart surgery, excimer laser, porcine model, experimental, medical device
Enhanced Reduced Representation Bisulfite Sequencing for Assessment of DNA Methylation at Base Pair Resolution
Authors: Francine E. Garrett-Bakelman, Caroline K. Sheridan, Thadeous J. Kacmarczyk, Jennifer Ishii, Doron Betel, Alicia Alonso, Christopher E. Mason, Maria E. Figueroa, Ari M. Melnick.
Institutions: Weill Cornell Medical College, Weill Cornell Medical College, Weill Cornell Medical College, University of Michigan.
DNA methylation pattern mapping is heavily studied in normal and diseased tissues. A variety of methods have been established to interrogate the cytosine methylation patterns in cells. Reduced representation of whole genome bisulfite sequencing was developed to detect quantitative base pair resolution cytosine methylation patterns at GC-rich genomic loci. This is accomplished by combining the use of a restriction enzyme followed by bisulfite conversion. Enhanced Reduced Representation Bisulfite Sequencing (ERRBS) increases the biologically relevant genomic loci covered and has been used to profile cytosine methylation in DNA from human, mouse and other organisms. ERRBS initiates with restriction enzyme digestion of DNA to generate low molecular weight fragments for use in library preparation. These fragments are subjected to standard library construction for next generation sequencing. Bisulfite conversion of unmethylated cytosines prior to the final amplification step allows for quantitative base resolution of cytosine methylation levels in covered genomic loci. The protocol can be completed within four days. Despite low complexity in the first three bases sequenced, ERRBS libraries yield high quality data when using a designated sequencing control lane. Mapping and bioinformatics analysis is then performed and yields data that can be easily integrated with a variety of genome-wide platforms. ERRBS can utilize small input material quantities making it feasible to process human clinical samples and applicable in a range of research applications. The video produced demonstrates critical steps of the ERRBS protocol.
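Because bisulfite conversion leaves methylated cytosines as C while unmethylated cytosines read as T, the per-cytosine methylation level is typically computed as the fraction of reads retaining C at that position. A minimal sketch, assuming per-site C/T base counts have already been extracted from the aligned ERRBS reads (the positions and counts below are hypothetical):

```python
# Minimal sketch: per-cytosine methylation level from bisulfite-converted reads.
# Assumes C/T base counts per covered cytosine have already been extracted from
# the alignments; the positions and counts below are hypothetical.
def methylation_level(c_count, t_count):
    """Fraction of reads reporting C (methylated) at a cytosine position."""
    covered = c_count + t_count
    return c_count / covered if covered else float("nan")

sites = {("chr1", 1_234_567): (18, 2),   # (C reads, T reads)
         ("chr1", 1_234_601): (3, 27)}
for (chrom, pos), (c, t) in sites.items():
    print(f"{chrom}:{pos}\tcoverage={c + t}\tmethylation={methylation_level(c, t):.1%}")
```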
Genetics, Issue 96, Epigenetics, bisulfite sequencing, DNA methylation, genomic DNA, 5-methylcytosine, high-throughput
Calibrated Forceps Model of Spinal Cord Compression Injury
Authors: Ashley McDonough, Angela Monterrubio, Jeanelle Ariza, Verónica Martínez-Cerdeño.
Institutions: University of California, Davis, Shriners Hospitals for Children (Northern California).
Compression injuries of the murine spinal cord are valuable animal models for the study of spinal cord injury (SCI) and spinal regenerative therapy. The calibrated forceps model of compression injury is a convenient, low cost, and very reproducible animal model for SCI. We used a pair of modified forceps in accordance with the method published by Plemel et al. (2008) to laterally compress the spinal cord to a distance of 0.35 mm. In this video, we will demonstrate a dorsal laminectomy to expose the spinal cord, followed by compression of the spinal cord with the modified forceps. In the video, we will also address issues related to the care of paraplegic laboratory animals. This injury model produces mice that exhibit impairment in sensation, as well as impaired hindlimb locomotor function. Furthermore, this method of injury produces consistent aberrations in the pathology of the SCI, as determined by immunohistochemical methods. After watching this video, viewers should be able to determine the necessary supplies and methods for producing SCI of various severities in the mouse for studies on SCI and/or treatments designed to mitigate impairment after injury.
Medicine, Issue 98, SCI, compression model, compression injury, modified forceps, laminectomy, neurological deficit, murine spinal cord, reproducible animal model, reproducible deficit
A Rat Model of Ventricular Fibrillation and Resuscitation by Conventional Closed-chest Technique
Authors: Lorissa Lamoureux, Jeejabai Radhakrishnan, Raúl J. Gazmuri.
Institutions: Rosalind Franklin University of Medicine and Science.
A rat model of electrically-induced ventricular fibrillation followed by cardiac resuscitation using a closed chest technique that incorporates the basic components of cardiopulmonary resuscitation in humans is herein described. The model was developed in 1988 and has been used in approximately 70 peer-reviewed publications examining a myriad of resuscitation aspects including its physiology and pathophysiology, determinants of resuscitability, pharmacologic interventions, and even the effects of cell therapies. The model featured in this presentation includes: (1) vascular catheterization to measure aortic and right atrial pressures, to measure cardiac output by thermodilution, and to electrically induce ventricular fibrillation; and (2) tracheal intubation for positive pressure ventilation with oxygen enriched gas and assessment of the end-tidal CO2. A typical sequence of intervention entails: (1) electrical induction of ventricular fibrillation, (2) chest compression using a mechanical piston device concomitantly with positive pressure ventilation delivering oxygen-enriched gas, (3) electrical shocks to terminate ventricular fibrillation and reestablish cardiac activity, (4) assessment of post-resuscitation hemodynamic and metabolic function, and (5) assessment of survival and recovery of organ function. A robust inventory of measurements is available that includes – but is not limited to – hemodynamic, metabolic, and tissue measurements. The model has been highly effective in developing new resuscitation concepts and examining novel therapeutic interventions before their testing in larger and translationally more relevant animal models of cardiac arrest and resuscitation.
Medicine, Issue 98, Cardiopulmonary resuscitation, Hemodynamics, Myocardial ischemia, Rats, Reperfusion, Ventilation, Ventricular fibrillation, Ventricular function, Translational medical research
Genome-wide Snapshot of Chromatin Regulators and States in Xenopus Embryos by ChIP-Seq
Authors: George E. Gentsch, Ilya Patrushev, James C. Smith.
Institutions: MRC National Institute for Medical Research.
The recruitment of chromatin regulators and the assignment of chromatin states to specific genomic loci are pivotal to cell fate decisions and tissue and organ formation during development. Determining the locations and levels of such chromatin features in vivo will provide valuable information about the spatio-temporal regulation of genomic elements, and will support aspirations to mimic embryonic tissue development in vitro. The most commonly used method for genome-wide and high-resolution profiling is chromatin immunoprecipitation followed by next-generation sequencing (ChIP-Seq). This protocol outlines how yolk-rich embryos such as those of the frog Xenopus can be processed for ChIP-Seq experiments, and it offers simple command lines for post-sequencing analysis. Because of the high efficiency with which the protocol extracts nuclei from formaldehyde-fixed tissue, the method allows easy upscaling to obtain enough ChIP material for genome-wide profiling. Our protocol has been used successfully to map various DNA-binding proteins such as transcription factors, signaling mediators, components of the transcription machinery, chromatin modifiers and post-translational histone modifications, and for this to be done at various stages of embryogenesis. Lastly, this protocol should be widely applicable to other model and non-model organisms as more and more genome assemblies become available.
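The protocol's own command lines are given in the article; as a generic illustration of what such a post-sequencing pipeline looks like (alignment, sorting/indexing, peak calling), here is a minimal sketch. The tools shown (bowtie2, samtools, MACS2), the index name, and the effective genome size are assumptions for illustration, not the authors' exact commands, and the tools must be installed with a genome index built beforehand.

```python
# Illustrative ChIP-Seq post-sequencing pipeline (not the authors' exact commands):
# align reads, sort/index the BAM, and call peaks. Assumes bowtie2, samtools and
# MACS2 are installed and a genome index ("xenla_index") already exists.
import subprocess

def run(cmd):
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["bowtie2", "-x", "xenla_index", "-U", "chip_reads.fastq.gz", "-S", "chip.sam"])
run(["samtools", "sort", "-o", "chip.sorted.bam", "chip.sam"])
run(["samtools", "index", "chip.sorted.bam"])
# The effective genome size (-g) below is a placeholder; set it for the assembly used.
run(["macs2", "callpeak", "-t", "chip.sorted.bam", "-c", "input.sorted.bam",
     "-f", "BAM", "-g", "1.4e9", "-n", "chip_sample", "--outdir", "peaks"])
```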
Developmental Biology, Issue 96, Chromatin immunoprecipitation, next-generation sequencing, ChIP-Seq, developmental biology, Xenopus embryos, cross-linking, transcription factor, post-sequencing analysis, DNA occupancy, metagene, binding motif, GO term
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research question(s). However, many researchers are new to this topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. The vision for this article is that it can serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
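As a concrete illustration of the alignment, model-selection, and tree-reconstruction steps outlined above, a minimal sketch driving two widely used command-line programs from Python. MAFFT and IQ-TREE are illustrative choices, not necessarily the programs covered in the article, and they must be installed separately; the input file name is hypothetical.

```python
# One possible concrete instantiation of the pipeline described above:
# multiple sequence alignment, best-fit model selection, and a maximum-likelihood
# tree with bootstrap support. MAFFT and IQ-TREE are illustrative choices, not
# necessarily the programs used in the article, and must be installed on the system.
import subprocess

# 1. Multiple sequence alignment with MAFFT (automatic strategy selection).
with open("aligned.fasta", "w") as aln:
    subprocess.run(["mafft", "--auto", "sequences.fasta"], stdout=aln, check=True)

# 2. Best-fit model selection plus a maximum-likelihood tree with 1,000
#    ultrafast bootstrap replicates using IQ-TREE.
subprocess.run(["iqtree", "-s", "aligned.fasta", "-m", "MFP", "-bb", "1000"],
               check=True)
# The resulting tree (aligned.fasta.treefile) can then be opened in a tree
# viewer such as FigTree for visualization.
```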
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Using Flatbed Scanners to Collect High-resolution Time-lapsed Images of the Arabidopsis Root Gravitropic Response
Authors: Halie C Smith, Devon J Niewohner, Grant D Dewey, Autumn M Longo, Tracy L Guy, Bradley R Higgins, Sarah B Daehling, Sarah C. Genrich, Christopher D Wentworth, Tessa L Durham Brooks.
Institutions: Doane College, Doane College.
Research efforts in biology increasingly require use of methodologies that enable high-volume collection of high-resolution data. A challenge laboratories can face is the development and attainment of these methods. Observation of phenotypes in a process of interest is a typical objective of research labs studying gene function and this is often achieved through image capture. A particular process that is amenable to observation using imaging approaches is the corrective growth of a seedling root that has been displaced from alignment with the gravity vector. Imaging platforms used to measure the root gravitropic response can be expensive, relatively low in throughput, and/or labor intensive. These issues have been addressed by developing a high-throughput image capture method using inexpensive, yet high-resolution, flatbed scanners. Using this method, images can be captured every few minutes at 4,800 dpi. The current setup enables collection of 216 individual responses per day. The image data collected is of ample quality for image analysis applications.
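A minimal sketch of how time-lapsed scanner capture might be scripted with the SANE scanimage utility; this is an assumption for illustration, not the authors' acquisition software. The available options (resolution, mode, output format) depend on the scanner backend, and the interval, frame count, and file naming below are hypothetical.

```python
# Illustrative time-lapse capture loop using the SANE "scanimage" utility.
# Not the authors' acquisition software: available options depend on the scanner
# backend, and the interval, frame count, and file naming here are hypothetical.
import subprocess
import time

INTERVAL_S = 120        # capture every 2 minutes
N_FRAMES = 30           # one hour of imaging

for frame in range(N_FRAMES):
    outfile = f"root_timelapse_{frame:03d}.tiff"
    with open(outfile, "wb") as fh:
        subprocess.run(
            ["scanimage", "--resolution", "4800", "--mode", "Color", "--format=tiff"],
            stdout=fh, check=True)
    print(f"Captured {outfile}")
    time.sleep(INTERVAL_S)
```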
Basic Protocol, Issue 83, root gravitropism, Arabidopsis, high-throughput phenotyping, flatbed scanners, image analysis, undergraduate research
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Authors: Mosmi Surati, Matthew Robinson, Suvobroto Nandi, Leonardo Faoro, Carley Demchuk, Rajani Kanteti, Benjamin Ferguson, Tara Gangadhar, Thomas Hensing, Rifat Hasina, Aliya Husain, Mark Ferguson, Theodore Karrison, Ravi Salgia.
Institutions: University of Chicago, University of Chicago, Northshore University Health Systems, University of Chicago, University of Chicago, University of Chicago.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using a protocol for prospective tissue banking and another protocol for retrospective banking, tumor and normal tissue samples from patients consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients were abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
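The key design idea above is that clinical and laboratory tables are linked through relationships so they can be queried together. A minimal sketch of the same idea using SQLite from Python; the schema, marker name, and values here are purely hypothetical, and the actual project uses a Microsoft Access database with its own tables and relationship definitions.

```python
# Minimal sketch of linking clinical and laboratory tables for a joined query.
# The schema and values are purely hypothetical; the actual project uses a
# Microsoft Access database with its own tables and relationship definitions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, age INTEGER, stage TEXT);
CREATE TABLE proteomics (sample_id INTEGER PRIMARY KEY,
                         patient_id INTEGER REFERENCES patients(patient_id),
                         marker TEXT, level REAL);
INSERT INTO patients VALUES (1, 63, 'IIIA'), (2, 58, 'IB');
INSERT INTO proteomics VALUES (10, 1, 'MET', 2.4), (11, 2, 'MET', 0.9);
""")

# Join clinical and proteomic information, as an Access relationship query would.
for row in con.execute("""
        SELECT p.patient_id, p.stage, pr.marker, pr.level
        FROM patients p JOIN proteomics pr ON pr.patient_id = p.patient_id"""):
    print(row)
```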
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Universite de Montreal, Universite de Montreal, Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. This is not what one would predict in diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, there are the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been associated with a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining if the genetic basis of a disease may result in part from de novo mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Measurement of Aggregate Cohesion by Tissue Surface Tensiometry
Authors: Christine M. Butler, Ramsey A. Foty.
Institutions: UMDNJ-Robert Wood Johnson Medical School.
Rigorous measurement of intercellular binding energy can only be made using methods grounded in thermodynamic principles in systems at equilibrium. We have developed tissue surface tensiometry (TST) specifically to measure the surface free energy of interaction between cells. The biophysical concepts underlying TST have been previously described in detail1,2. The method is based on the observation that mutually cohesive cells, if maintained in shaking culture, will spontaneously assemble into clusters. Over time, these clusters will round up to form spheres. This rounding-up behavior mimics the behavior characteristic of liquid systems. Intercellular binding energy is measured by compressing spherical aggregates between parallel plates in a custom-designed tissue surface tensiometer. The same mathematical equation used to measure the surface tension of a liquid droplet is used to measure surface tension of 3D tissue-like spherical aggregates. The cellular equivalent of liquid surface tension is intercellular binding energy, or more generally, tissue cohesivity. Previous studies from our laboratory have shown that tissue surface tension (1) predicts how two groups of embryonic cells will interact with one another1-5, (2) can strongly influence the ability of tissues to interact with biomaterials6, (3) can be altered not only through direct manipulation of cadherin-based intercellular cohesion7, but also by manipulation of key ECM molecules such as FN8-11 and 4) correlates with invasive potential of lung cancer12, fibrosarcoma13, brain tumor14 and prostate tumor cell lines15. In this article we will describe the apparatus, detail the steps required to generate spheroids, to load the spheroids into the tensiometer chamber, to initiate aggregate compression, and to analyze and validate the tissue surface tension measurements generated.
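The analysis rests on the Young-Laplace law for a liquid droplet: at equilibrium, the excess pressure inside the compressed aggregate, obtained from the applied force and the contact area against the plate, is balanced by surface tension acting through the two principal radii of curvature. The general relation is shown below as orientation; the specific geometric form applied to aggregates compressed between parallel plates is derived in refs. 1 and 2 cited above, and the force/contact-area expression here is an approximation for illustration.

```latex
% Young-Laplace relation underlying the tissue surface tension measurement; the
% geometric form used for aggregates compressed between parallel plates is
% derived in refs. 1 and 2 of the abstract.
\Delta P = \sigma \left( \frac{1}{R_1} + \frac{1}{R_2} \right),
\qquad \Delta P \approx \frac{F}{\pi r_c^{2}}
```

Here sigma is the apparent tissue surface tension, R_1 and R_2 are the principal radii of curvature of the free aggregate surface, F is the measured compressive force, and r_c is the radius of the circular contact area with the plate.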
Bioengineering, Issue 50, 3D, aggregate cohesion, tissue surface tension, parallel plate compression
The ITS2 Database
Authors: Benjamin Merget, Christian Koetschan, Thomas Hackl, Frank Förster, Thomas Dandekar, Tobias Müller, Jörg Schultz, Matthias Wolf.
Institutions: University of Würzburg, University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the very variable ITS2 sequence, it confined this marker to low-level phylogenetics only. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8. The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11 accurately reannotated10. Following an annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum energy based fold12 (direct fold) results in a correct, four helix conformation. If this is not the case, the structure is predicted by homology modeling13. In homology modeling, an already known secondary structure is transferred to another ITS2 sequence, whose secondary structure was not able to fold correctly in a direct fold. The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14 search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16 and ProfDistS17 for multiple sequence-structure alignment calculation and Neighbor Joining18 tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure. In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
RNA-seq Analysis of Transcriptomes in Thrombin-treated and Control Human Pulmonary Microvascular Endothelial Cells
Authors: Dilyara Cheranova, Margaret Gibson, Suman Chaudhary, Li Qin Zhang, Daniel P. Heruth, Dmitry N. Grigoryev, Shui Qing Ye.
Institutions: Children's Mercy Hospital and Clinics, School of Medicine, University of Missouri-Kansas City.
The characterization of gene expression in cells via measurement of mRNA levels is a useful tool in determining how the transcriptional machinery of the cell is affected by external signals (e.g. drug treatment), or how cells differ between a healthy state and a diseased state. With the advent and continuous refinement of next-generation DNA sequencing technology, RNA-sequencing (RNA-seq) has become an increasingly popular method of transcriptome analysis to catalog all species of transcripts, to determine the transcriptional structure of all expressed genes and to quantify the changing expression levels of the total set of transcripts in a given cell, tissue or organism1,2 . RNA-seq is gradually replacing DNA microarrays as a preferred method for transcriptome analysis because it has the advantages of profiling a complete transcriptome, providing a digital type datum (copy number of any transcript) and not relying on any known genomic sequence3. Here, we present a complete and detailed protocol to apply RNA-seq to profile transcriptomes in human pulmonary microvascular endothelial cells with or without thrombin treatment. This protocol is based on our recent published study entitled "RNA-seq Reveals Novel Transcriptome of Genes and Their Isoforms in Human Pulmonary Microvascular Endothelial Cells Treated with Thrombin,"4 in which we successfully performed the first complete transcriptome analysis of human pulmonary microvascular endothelial cells treated with thrombin using RNA-seq. It yielded unprecedented resources for further experimentation to gain insights into molecular mechanisms underlying thrombin-mediated endothelial dysfunction in the pathogenesis of inflammatory conditions, cancer, diabetes, and coronary heart disease, and provides potential new leads for therapeutic targets to those diseases. The descriptive text of this protocol is divided into four parts. The first part describes the treatment of human pulmonary microvascular endothelial cells with thrombin and RNA isolation, quality analysis and quantification. The second part describes library construction and sequencing. The third part describes the data analysis. The fourth part describes an RT-PCR validation assay. Representative results of several key steps are displayed. Useful tips or precautions to boost success in key steps are provided in the Discussion section. Although this protocol uses human pulmonary microvascular endothelial cells treated with thrombin, it can be generalized to profile transcriptomes in both mammalian and non-mammalian cells and in tissues treated with different stimuli or inhibitors, or to compare transcriptomes in cells or tissues between a healthy state and a disease state.
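The abstract notes that RNA-seq yields a digital measure (read counts) for every transcript. A common way to make such counts comparable across genes and libraries is reads-per-kilobase-per-million (RPKM) normalization; the sketch below is a standard illustration of that idea, not necessarily the exact metric used in the cited study, and the gene names, counts, and lengths are hypothetical.

```python
# Minimal sketch of RPKM normalization, a common way to turn raw RNA-seq read
# counts into values comparable across genes and libraries. Illustrative only;
# gene names, counts, and lengths below are hypothetical.
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    return read_count * 1e9 / (gene_length_bp * total_mapped_reads)

total_mapped = 30_000_000
genes = {"F3":       (4_500, 2_413),    # (reads mapped to gene, gene length in bp)
         "SERPINE1": (12_000, 3_900)}
for name, (reads, length) in genes.items():
    print(f"{name}\t{rpkm(reads, length, total_mapped):.1f} RPKM")
```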
Genetics, Issue 72, Molecular Biology, Immunology, Medicine, Genomics, Proteins, RNA-seq, Next Generation DNA Sequencing, Transcriptome, Transcription, Thrombin, Endothelial cells, high-throughput, DNA, genomic DNA, RT-PCR, PCR
Assessing Neurodegenerative Phenotypes in Drosophila Dopaminergic Neurons by Climbing Assays and Whole Brain Immunostaining
Authors: Maria Cecilia Barone, Dirk Bohmann.
Institutions: University of Rochester Medical Center .
Drosophila melanogaster is a valuable model organism to study aging and pathological degenerative processes in the nervous system. The advantages of the fly as an experimental system include its genetic tractability, short life span and the possibility to observe and quantitatively analyze complex behaviors. The expression of disease-linked genes in specific neuronal populations of the Drosophila brain, can be used to model human neurodegenerative diseases such as Parkinson's and Alzheimer's 5. Dopaminergic (DA) neurons are among the most vulnerable neuronal populations in the aging human brain. In Parkinson's disease (PD), the most common neurodegenerative movement disorder, the accelerated loss of DA neurons leads to a progressive and irreversible decline in locomotor function. In addition to age and exposure to environmental toxins, loss of DA neurons is exacerbated by specific mutations in the coding or promoter regions of several genes. The identification of such PD-associated alleles provides the experimental basis for the use of Drosophila as a model to study neurodegeneration of DA neurons in vivo. For example, the expression of the PD-linked human α-synuclein gene in Drosophila DA neurons recapitulates some features of the human disease, e.g. progressive loss of DA neurons and declining locomotor function 2. Accordingly, this model has been successfully used to identify potential therapeutic targets in PD 8. Here we describe two assays that have commonly been used to study age-dependent neurodegeneration of DA neurons in Drosophila: a climbing assay based on the startle-induced negative geotaxis response and tyrosine hydroxylase immunostaining of whole adult brain mounts to monitor the number of DA neurons at different ages. In both cases, in vivo expression of UAS transgenes specifically in DA neurons can be achieved by using a tyrosine hydroxylase (TH) promoter-Gal4 driver line 3, 10.
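Climbing performance in startle-induced negative geotaxis assays is commonly summarized as the fraction of flies that climb above a threshold height within a fixed time after the vial is tapped. A minimal scoring sketch follows; the threshold, time window, and counts are hypothetical, and the exact scoring scheme may differ from the one used in the article.

```python
# Minimal sketch of scoring a startle-induced negative geotaxis (climbing) assay:
# the fraction of flies crossing a threshold height within a fixed time after the
# vial is tapped. Threshold, time window, and counts are hypothetical, and the
# exact scoring scheme may differ from the one used in the article.
def climbing_index(n_above_threshold, n_total):
    return n_above_threshold / n_total if n_total else float("nan")

trials = [(17, 20), (15, 20), (18, 20)]   # (flies above 8 cm within 10 s, flies tested)
indices = [climbing_index(a, n) for a, n in trials]
print(f"Mean climbing index: {sum(indices) / len(indices):.2f}")
```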
Neuroscience, Issue 74, Genetics, Neurobiology, Molecular Biology, Cellular Biology, Biomedical Engineering, Medicine, Developmental Biology, Drosophila melanogaster, neurodegenerative diseases, negative geotaxis, tyrosine hydroxylase, dopaminergic neuron, α-synuclein, neurons, immunostaining, animal model
Design of a Biaxial Mechanical Loading Bioreactor for Tissue Engineering
Authors: Bahar Bilgen, Danielle Chu, Robert Stefani, Roy K. Aaron.
Institutions: The Warren Alpert Brown Medical School of Brown University and the Rhode Island Hospital, VA Medical Center, Providence, RI, University of Texas Southwestern Medical Center .
We designed a loading device that is capable of applying uniaxial or biaxial mechanical strain to tissue-engineered biocomposites fabricated for transplantation. While the device primarily functions as a bioreactor that mimics the native mechanical strains, it is also outfitted with a load cell for providing force feedback or mechanical testing of the constructs. The device subjects engineered cartilage constructs to biaxial mechanical loading with great precision of loading dose (amplitude and frequency) and is compact enough to fit inside a standard tissue culture incubator. It loads samples directly in a tissue culture plate, and multiple plate sizes are compatible with the system. The device has been designed using components manufactured for precision-guided laser applications. Biaxial loading is accomplished by two orthogonal stages. The stages have a 50 mm travel range and are driven independently by stepper motor actuators, controlled by a closed-loop stepper motor driver that features micro-stepping capabilities, enabling step sizes of less than 50 nm. A polysulfone loading platen is coupled to the biaxial moving platform. Movements of the stages are controlled by Thorlabs Advanced Positioning Technology (APT) software. The stepper motor driver is used with the software to adjust load parameters of frequency and amplitude of both shear and compression independently and simultaneously. Positional feedback is provided by linear optical encoders that have a bidirectional repeatability of 0.1 μm and a resolution of 20 nm, translating to a positional accuracy of less than 3 μm over the full 50 mm of travel. These encoders provide the necessary position feedback to the drive electronics to ensure true nanopositioning capabilities. In order to provide force feedback to detect contact and evaluate loading responses, a precision miniature load cell is positioned between the loading platen and the moving platform. The load cell has a high accuracy of 0.15% to 0.25% of full scale.
Bioengineering, Issue 74, Biomedical Engineering, Biophysics, Cellular Biology, Medicine, Anatomy, Physiology, Cell Engineering, Bioreactors, Culture Techniques, Cell Engineering, Tissue Engineering, compression loads, shear loads, Tissues, bioreactor, mechanical loading, compression, shear, musculoskeletal, cartilage, bone, transplantation, cell culture
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
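The ~10-30 nm precision quoted above is usually estimated with the widely cited approximation of Thompson, Larson and Webb, which combines photon shot noise, pixelation, and background. A minimal sketch with illustrative parameter values; the exact estimator used in a given FPALM analysis may differ.

```python
# Localization precision estimate using the widely cited Thompson-Larson-Webb
# approximation (shot noise + pixelation + background). Parameter values are
# illustrative; the exact estimator used in a given FPALM analysis may differ.
import math

def localization_precision(s_nm, a_nm, n_photons, b_photons):
    """s: PSF standard deviation (nm), a: pixel size (nm), N: detected photons,
    b: background photons per pixel."""
    shot = (s_nm**2 + a_nm**2 / 12) / n_photons
    background = 8 * math.pi * s_nm**4 * b_photons**2 / (a_nm**2 * n_photons**2)
    return math.sqrt(shot + background)

# ~16 nm for a typical single-molecule detection under these assumed conditions.
print(f"{localization_precision(s_nm=130, a_nm=100, n_photons=300, b_photons=5):.1f} nm")
```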
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Detecting Somatic Genetic Alterations in Tumor Specimens by Exon Capture and Massively Parallel Sequencing
Authors: Helen H Won, Sasinya N Scott, A. Rose Brannon, Ronak H Shah, Michael F Berger.
Institutions: Memorial Sloan-Kettering Cancer Center, Memorial Sloan-Kettering Cancer Center.
Efforts to detect and investigate key oncogenic mutations have proven valuable to facilitate the appropriate treatment for cancer patients. The establishment of high-throughput, massively parallel "next-generation" sequencing has aided the discovery of many such mutations. To enhance the clinical and translational utility of this technology, platforms must be high-throughput, cost-effective, and compatible with formalin-fixed paraffin embedded (FFPE) tissue samples that may yield small amounts of degraded or damaged DNA. Here, we describe the preparation of barcoded and multiplexed DNA libraries followed by hybridization-based capture of targeted exons for the detection of cancer-associated mutations in fresh frozen and FFPE tumors by massively parallel sequencing. This method enables the identification of sequence mutations, copy number alterations, and select structural rearrangements involving all targeted genes. Targeted exon sequencing offers the benefits of high throughput, low cost, and deep sequence coverage, thus conferring high sensitivity for detecting low frequency mutations.
Molecular Biology, Issue 80, Molecular Diagnostic Techniques, High-Throughput Nucleotide Sequencing, Genetics, Neoplasms, Diagnosis, Massively parallel sequencing, targeted exon sequencing, hybridization capture, cancer, FFPE, DNA mutations
Phage Phenomics: Physiological Approaches to Characterize Novel Viral Proteins
Authors: Savannah E. Sanchez, Daniel A. Cuevas, Jason E. Rostron, Tiffany Y. Liang, Cullen G. Pivaroff, Matthew R. Haynes, Jim Nulton, Ben Felts, Barbara A. Bailey, Peter Salamon, Robert A. Edwards, Alex B. Burgin, Anca M. Segall, Forest Rohwer.
Institutions: San Diego State University, San Diego State University, San Diego State University, San Diego State University, San Diego State University, Argonne National Laboratory, Broad Institute.
Current investigations into phage-host interactions are dependent on extrapolating knowledge from (meta)genomes. Interestingly, 60 - 95% of all phage sequences share no homology to current annotated proteins. As a result, a large proportion of phage genes are annotated as hypothetical. This reality heavily affects the annotation of both structural and auxiliary metabolic genes. Here we present phenomic methods designed to capture the physiological response(s) of a selected host during expression of one of these unknown phage genes. Multi-phenotype Assay Plates (MAPs) are used to monitor the diversity of host substrate utilization and subsequent biomass formation, while metabolomics provides bi-product analysis by monitoring metabolite abundance and diversity. Both tools are used simultaneously to provide a phenotypic profile associated with expression of a single putative phage open reading frame (ORF). Representative results for both methods are compared, highlighting the phenotypic profile differences of a host carrying either putative structural or metabolic phage genes. In addition, the visualization techniques and high throughput computational pipelines that facilitated experimental analysis are presented.
Immunology, Issue 100, phenomics, phage, viral metagenome, Multi-phenotype Assay Plates (MAPs), continuous culture, metabolomics

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.