JoVE Visualize
Related JoVE Video
Pubmed Article
The Barcode of Life Data Portal: bridging the biodiversity informatics divide for DNA barcoding.
Published: 01-20-2011
With the volume of molecular sequence data that is systematically being generated globally, there is a need for centralized resources for data exploration and analytics. DNA Barcode initiatives are on track to generate a compendium of molecular sequence-based signatures for identifying animals and plants. To date, the data exploration and analytic tools available for these data have existed only in boutique form, often representing a frustrating hurdle for researchers who may not have the resources to install or implement algorithms described by the analytic community. The Barcode of Life Data Portal (BDP) is a first step towards integrating the latest biodiversity informatics innovations with molecular sequence data from DNA barcoding. Through the establishment of community-driven standards, based on discussion with the Data Analysis Working Group (DAWG) of the Consortium for the Barcode of Life (CBOL), the BDP provides an infrastructure for incorporating existing and next-generation DNA barcode analytic applications in an open forum.
Authors: Sylvie Sanschagrin, Etienne Yergeau.
Published: 08-29-2014
One of the major questions in microbial ecology is “who is there?” This question can be answered using various tools, but one of the long-standing gold standards is to sequence 16S ribosomal RNA (rRNA) gene amplicons generated by domain-level PCR reactions amplifying from genomic DNA. Traditionally, this was performed by cloning and Sanger (capillary electrophoresis) sequencing of PCR amplicons. The advent of next-generation sequencing has tremendously simplified and increased the sequencing depth for 16S rRNA gene sequencing. The introduction of benchtop sequencers now allows small labs to perform their 16S rRNA sequencing in-house in a matter of days. Here, an approach for 16S rRNA gene amplicon sequencing using a benchtop next-generation sequencer is detailed. The environmental DNA is first amplified by PCR using primers that contain sequencing adapters and barcodes. The amplicons are then coupled to spherical particles via emulsion PCR. The particles are loaded on a disposable chip, the chip is inserted into the sequencing machine, and the sequencing is performed. The sequences are retrieved in fastq format and filtered, and the barcodes are used to establish the sample membership of the reads. The filtered and binned reads are then further analyzed using publicly available tools. An example analysis is given in which the reads were classified with a taxonomy-finding algorithm in the software package Mothur. The method outlined here is simple, inexpensive and straightforward and should help smaller labs take advantage of the ongoing genomic revolution.
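As an illustration of the filtering and barcode-binning step described above, the following minimal Python sketch demultiplexes FASTQ reads by their 5' barcode and applies a simple average-quality cutoff before downstream classification (e.g., in Mothur). The barcode sequences, quality threshold, and file name are hypothetical placeholders, not values from the protocol.

from collections import defaultdict

BARCODES = {"ACGTACGT": "sample_A", "TGCATGCA": "sample_B"}  # hypothetical barcode-to-sample map
MIN_MEAN_QUALITY = 20                                        # hypothetical Phred cutoff

def read_fastq(path):
    """Yield (header, sequence, quality) records from a FASTQ file."""
    with open(path) as handle:
        while True:
            header = handle.readline().rstrip()
            if not header:
                break
            seq = handle.readline().rstrip()
            handle.readline()                    # '+' separator line
            qual = handle.readline().rstrip()
            yield header, seq, qual

def mean_quality(qual):
    """Average Phred score, assuming Sanger (offset 33) encoding."""
    return sum(ord(c) - 33 for c in qual) / len(qual)

binned = defaultdict(list)
for header, seq, qual in read_fastq("run1.fastq"):           # hypothetical file name
    if mean_quality(qual) < MIN_MEAN_QUALITY:
        continue                                             # drop low-quality reads
    for barcode, sample in BARCODES.items():
        if seq.startswith(barcode):
            # trim the barcode before downstream classification (e.g. in Mothur)
            binned[sample].append((header, seq[len(barcode):]))
            break

for sample, reads in binned.items():
    print(sample, len(reads), "reads")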
24 Related JoVE Articles!
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Authors: Thomas Bodmer, Angelika Ströhle.
Institutions: University of Bern, MCL Laboratories Inc..
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population1, and almost two million people are killed by TB each year.2 Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3 The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are prerequisites for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2 Effective diagnosis of pulmonary TB requires the availability - on a global scale - of standardized, easy-to-use, and robust diagnostic tools that would allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR TB) and has been reported in > 95% of the MDR-TB isolates.4, 5 The rapid availability of reliable test results is likely to directly translate into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2 Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6, 7 meets the demands outlined above in a remarkable manner. It is a nucleic acid amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene.8 It is designed for use with Cepheid's GeneXpert Dx System, which integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, personal computer, barcode scanner, and preloaded software for running tests and viewing the results.9 It employs single-use disposable Xpert MTB/RIF cartridges that hold PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6 Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3, provided that these settings can afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
Identification of Sleeping Beauty Transposon Insertions in Solid Tumors using Linker-mediated PCR
Authors: Callie L. Janik, Timothy K. Starr.
Institutions: University of Minnesota, Minneapolis, University of Minnesota, Minneapolis.
Genomic, proteomic, transcriptomic, and epigenomic analyses of human tumors indicate that there are thousands of anomalies within each cancer genome compared to matched normal tissue. Based on these analyses, it is evident that there are many undiscovered genetic drivers of cancer1. Unfortunately, these drivers are hidden within a much larger number of passenger anomalies in the genome that do not directly contribute to tumor formation. Another aspect of the cancer genome is that there is considerable genetic heterogeneity within similar tumor types. Each tumor can harbor different mutations that provide a selective advantage for tumor formation2. Performing an unbiased forward genetic screen in mice provides the tools to generate tumors and analyze their genetic composition, while reducing the background of passenger mutations. The Sleeping Beauty (SB) transposon system is one such method3. The SB system utilizes mobile vectors (transposons) that can be inserted throughout the genome by the transposase enzyme. Mutations are limited to a specific cell type through the use of a conditional transposase allele that is activated by Cre Recombinase. Many mouse lines exist that express Cre Recombinase in specific tissues. By crossing one of these lines to the conditional transposase allele (e.g. Lox-stop-Lox-SB11), the SB system is activated only in cells that express Cre Recombinase. The Cre Recombinase will excise a stop cassette that blocks expression of the transposase allele, thereby activating transposon mutagenesis within the designated cell type. An SB screen is initiated by breeding three strains of transgenic mice so that the experimental mice carry a conditional transposase allele, a concatemer of transposons, and a tissue-specific Cre Recombinase allele. These mice are allowed to age until tumors form and they become moribund. The mice are then necropsied and genomic DNA is isolated from the tumors. Next, the genomic DNA is subjected to linker-mediated PCR (LM-PCR), which results in amplification of genomic loci containing an SB transposon. LM-PCR performed on a single tumor will result in hundreds of distinct amplicons representing the hundreds of genomic loci containing transposon insertions in a single tumor4. The transposon insertions in all tumors are analyzed and common insertion sites (CISs) are identified using an appropriate statistical method5. Genes within the CIS are highly likely to be oncogenes or tumor suppressor genes, and are considered candidate cancer genes. The advantages of using the SB system to identify candidate cancer genes are: 1) the transposon can easily be located in the genome because its sequence is known, 2) transposition can be directed to almost any cell type, and 3) the transposon is capable of introducing both gain- and loss-of-function mutations6. The following protocol describes how to devise and execute a forward genetic screen using the SB transposon system to identify candidate cancer genes (Figure 1).
Genetics, Issue 72, Medicine, Cancer Biology, Biomedical Engineering, Genomics, Mice, Genetic Techniques, life sciences, animal models, Neoplasms, Genetic Phenomena, Forward genetic screen, cancer drivers, mouse models, oncogenes, tumor suppressor genes, Sleeping Beauty transposons, insertions, DNA, PCR, animal model
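The article relies on an appropriate statistical method (its reference 5) to call common insertion sites; purely as an illustration of the underlying idea, the Python sketch below counts how many distinct tumors contribute mapped transposon insertions to fixed-size genomic windows. The coordinates, window size, and tumor threshold are hypothetical and this is not the published CIS statistics.

from collections import defaultdict

WINDOW = 50_000          # bp, hypothetical window size
MIN_TUMORS = 3           # minimum distinct tumors per window, hypothetical

# (tumor_id, chromosome, insertion position) tuples, e.g. parsed from LM-PCR read alignments
insertions = [
    ("tumor1", "chr5", 101_200), ("tumor2", "chr5", 103_900),
    ("tumor3", "chr5", 110_450), ("tumor1", "chr12", 55_000),
]

windows = defaultdict(set)
for tumor, chrom, pos in insertions:
    windows[(chrom, pos // WINDOW)].add(tumor)   # bin each insertion into a window

for (chrom, idx), tumors in sorted(windows.items()):
    if len(tumors) >= MIN_TUMORS:
        start, end = idx * WINDOW, (idx + 1) * WINDOW
        print(f"candidate CIS: {chrom}:{start}-{end} ({len(tumors)} tumors)")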
BEST: Barcode Enabled Sequencing of Tetrads
Authors: Adrian C. Scott, Catherine L. Ludlow, Gareth A. Cromie, Aimée M. Dudley.
Institutions: Pacific Northwest Diabetes Research Institute.
Tetrad analysis is a valuable tool for yeast genetics, but the laborious manual nature of the process has hindered its application on large scales. Barcode Enabled Sequencing of Tetrads (BEST)1 replaces the manual processes of isolating, disrupting and spacing tetrads. BEST isolates tetrads by virtue of a sporulation-specific GFP fusion protein that permits fluorescence-activated cell sorting of tetrads directly onto agar plates, where the ascus is enzymatically digested and the spores are disrupted and randomly arrayed by glass bead plating. The haploid colonies are then assigned sister spore relationships, i.e. information about which spores originated from the same tetrad, using molecular barcodes read during genotyping. By removing the bottleneck of manual dissection, hundreds or even thousands of tetrads can be isolated in minutes. Here we present a detailed description of the experimental procedures required to perform BEST in the yeast Saccharomyces cerevisiae, starting with a heterozygous diploid strain through the isolation of colonies derived from the haploid meiotic progeny.
Genetics, Issue 87, Yeast, Tetrad, Genetics, DNA sequencing
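Once each haploid colony's tetrad barcode has been read during genotyping, assigning sister-spore relationships is essentially a grouping operation; the Python sketch below illustrates that bookkeeping with hypothetical colony names and barcodes (it is not the BEST analysis pipeline itself).

from collections import defaultdict

colony_barcodes = {            # colony id -> tetrad barcode read at genotyping (hypothetical)
    "c01": "BC0007", "c02": "BC0007", "c03": "BC0007", "c04": "BC0007",
    "c05": "BC0021", "c06": "BC0021",
}

tetrads = defaultdict(list)
for colony, barcode in colony_barcodes.items():
    tetrads[barcode].append(colony)             # colonies sharing a barcode are sister spores

for barcode, colonies in sorted(tetrads.items()):
    status = "complete" if len(colonies) == 4 else f"incomplete ({len(colonies)} spores)"
    print(barcode, sorted(colonies), status)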
Surgical Procedures for a Rat Model of Partial Orthotopic Liver Transplantation with Hepatic Arterial Reconstruction
Authors: Kazuyuki Nagai, Shintaro Yagi, Shinji Uemoto, Rene H. Tolba.
Institutions: RWTH-Aachen University, Kyoto University .
Orthotopic liver transplantation (OLT) in rats using a whole or partial graft is an indispensable experimental model for transplantation research, such as studies on graft preservation and ischemia-reperfusion injury 1,2, immunological responses 3,4, hemodynamics 5,6, and small-for-size syndrome 7. The rat OLT is among the most difficult animal models in experimental surgery and demands advanced microsurgical skills that take a long time to learn. Consequently, the use of this model has been limited. Since the reliability and reproducibility of results are key components of the experiments in which such complex animal models are used, it is essential for surgeons who are involved in rat OLT to be trained in well-standardized and sophisticated procedures for this model. While various techniques and modifications of OLT in rats have been reported 8 since the first model was described by Lee et al. 9 in 1973, the elimination of the hepatic arterial reconstruction 10 and the introduction of the cuff anastomosis technique by Kamada et al. 11 were major advancements in this model, because they simplified the reconstruction procedures to a great degree. In the model by Kamada et al., the hepatic rearterialization was also eliminated. Since rats could survive without hepatic arterial flow after liver transplantation, there was considerable controversy over the value of hepatic arterialization. However, the physiological superiority of the arterialized model has been increasingly acknowledged, especially in terms of preserving the bile duct system 8,12 and the liver integrity 8,13,14. In this article, we present detailed surgical procedures for a rat model of OLT with hepatic arterial reconstruction using a 50% partial graft after ex vivo liver resection. The reconstruction procedures for each vessel and the bile duct are performed by the following methods: a 7-0 polypropylene continuous suture for the supra- and infrahepatic vena cava; a cuff technique for the portal vein; and a stent technique for the hepatic artery and the bile duct.
Medicine, Issue 73, Biomedical Engineering, Anatomy, Physiology, Immunology, Surgery, liver transplantation, liver, hepatic, partial, orthotopic, split, rat, graft, transplantation, microsurgery, procedure, clinical, technique, artery, arterialization, arterialized, anastomosis, reperfusion, rat, animal model
Heterotopic Auxiliary Rat Liver Transplantation With Flow-regulated Portal Vein Arterialization in Acute Hepatic Failure
Authors: Karina Schleimer, Johannes Kalder, Jochen Grommes, Houman Jalaie, Samir Tawadros, Andreas Greiner, Michael Jacobs, Maria Kokozidou.
Institutions: University Hospital RWTH Aachen.
In acute hepatic failure, auxiliary liver transplantation is an interesting alternative approach. The aim is to provide temporary support until the failing native liver has regenerated.1-3 The APOLT method, the orthotopic implantation of auxiliary segments, averts most of the technical problems. However, this method necessitates extensive resections of both the native liver and the graft.4 In 1998, Erhard developed the heterotopic auxiliary liver transplantation (HALT) utilizing portal vein arterialization (PVA) (Figure 1). This technique showed promising initial clinical results.5-6 We developed a HALT technique with flow-regulated PVA in the rat to examine the influence of flow-regulated PVA on graft morphology and function (Figure 2). A liver graft reduced to 30% of its original size was heterotopically implanted in the right renal region of the recipient after explantation of the right kidney. The infrahepatic caval vein of the graft was anastomosed with the infrahepatic caval vein of the recipient. The arterialization of the donor's portal vein was carried out via the recipient's right renal artery with the stent technique. The blood-flow regulation of the arterialized portal vein was achieved with the use of a stent with an internal diameter of 0.3 mm. The celiac trunk of the graft was anastomosed end-to-side with the recipient's aorta, and the bile duct was implanted into the duodenum. A subtotal resection of the native liver was performed to induce acute hepatic failure.7 In this manner, 112 transplantations were performed. The perioperative survival rate was 90% and the 6-week survival rate was 80%. Six weeks after the operation, the native liver had regenerated, showing an increase in weight from 2.3±0.8 g to 9.8±1 g. At this time, the graft's weight had decreased from 3.3±0.8 g to 2.3±0.8 g. We were able to obtain promising long-term results in terms of graft morphology and function. HALT with flow-regulated PVA reliably bridges acute hepatic failure until the native liver regenerates.
Medicine, Issue 91, auxiliary liver transplantation, rat, portal vein arterialization, flow-regulation, acute hepatic failure
Monitoring of Systemic and Hepatic Hemodynamic Parameters in Mice
Authors: Chichi Xie, Weiwei Wei, Tao Zhang, Olaf Dirsch, Uta Dahmen.
Institutions: Jena University Hospital, Jena University Hospital, The First Affiliated Hospital of Wenzhou Medical University.
The use of mouse models in experimental research is of enormous importance for the study of hepatic physiology and pathophysiological disturbances. However, due to the small size of the mouse, technical details of an intraoperative monitoring procedure suitable for the mouse have rarely been described. Previously, we reported a monitoring procedure to obtain hemodynamic parameters for rats. Now, we have adapted the procedure to acquire systemic and hepatic hemodynamic parameters in mice, a species ten-fold smaller than rats. This film demonstrates the instrumentation of the animals as well as the data acquisition process needed to assess systemic and hepatic hemodynamics in mice. Vital parameters, including body temperature, respiratory rate and heart rate, were recorded throughout the whole procedure. Systemic hemodynamic parameters consist of carotid artery pressure (CAP) and central venous pressure (CVP). Hepatic perfusion parameters include portal vein pressure (PVP), portal flow rate, as well as the flow rate of the common hepatic artery (Table 1). Instrumentation and data acquisition to record the normal values were completed within 1.5 h. Systemic and hepatic hemodynamic parameters remained within normal ranges during this procedure. This procedure is challenging but feasible. We have already applied this procedure to assess hepatic hemodynamics in normal mice as well as during 70% partial hepatectomy and in liver lobe clamping experiments. Mean PVP after resection (n = 20) was 11.41±2.94 cmH2O, which was significantly higher (P<0.05) than before resection (6.87±2.39 cmH2O). The results of the liver lobe clamping experiments indicated that this monitoring procedure is sensitive and suitable for detecting small changes in portal pressure and portal flow rate. In conclusion, this procedure is reliable in the hands of an experienced microsurgeon but should be limited to experiments where mice are absolutely needed.
Medicine, Issue 92, mice, hemodynamics, hepatic perfusion, CAP, CVP, surgery, intraoperative monitoring, portal vein pressure, blood flow
Detecting Somatic Genetic Alterations in Tumor Specimens by Exon Capture and Massively Parallel Sequencing
Authors: Helen H Won, Sasinya N Scott, A. Rose Brannon, Ronak H Shah, Michael F Berger.
Institutions: Memorial Sloan-Kettering Cancer Center, Memorial Sloan-Kettering Cancer Center.
Efforts to detect and investigate key oncogenic mutations have proven valuable to facilitate the appropriate treatment for cancer patients. The establishment of high-throughput, massively parallel "next-generation" sequencing has aided the discovery of many such mutations. To enhance the clinical and translational utility of this technology, platforms must be high-throughput, cost-effective, and compatible with formalin-fixed paraffin embedded (FFPE) tissue samples that may yield small amounts of degraded or damaged DNA. Here, we describe the preparation of barcoded and multiplexed DNA libraries followed by hybridization-based capture of targeted exons for the detection of cancer-associated mutations in fresh frozen and FFPE tumors by massively parallel sequencing. This method enables the identification of sequence mutations, copy number alterations, and select structural rearrangements involving all targeted genes. Targeted exon sequencing offers the benefits of high throughput, low cost, and deep sequence coverage, thus conferring high sensitivity for detecting low frequency mutations.
Molecular Biology, Issue 80, Molecular Diagnostic Techniques, High-Throughput Nucleotide Sequencing, Genetics, Neoplasms, Diagnosis, Massively parallel sequencing, targeted exon sequencing, hybridization capture, cancer, FFPE, DNA mutations
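The abstract above does not spell out how copy number alterations are derived from the capture data; as a generic illustration of the usual idea, the Python sketch below compares library-size-normalized target coverage between a tumor and its matched normal on a log2 scale. Target names and read counts are hypothetical, and real pipelines add GC correction, panel-wide normalization, and segmentation.

import math

# hypothetical per-target read counts from tumor and matched normal libraries
tumor_counts  = {"TP53_ex5": 180, "ERBB2_ex20": 2400, "PTEN_ex7": 40}
normal_counts = {"TP53_ex5": 200, "ERBB2_ex20": 220,  "PTEN_ex7": 190}

tumor_total = sum(tumor_counts.values())
normal_total = sum(normal_counts.values())

for target in tumor_counts:
    t = tumor_counts[target] / tumor_total      # normalize for library size
    n = normal_counts[target] / normal_total
    log2_ratio = math.log2(t / n)               # >0 suggests gain, <0 suggests loss
    print(f"{target}: log2(tumor/normal) = {log2_ratio:+.2f}")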
Steps for the Autologous Ex vivo Perfused Porcine Liver-kidney Experiment
Authors: Wen Yuan Chung, Amar M. Eltweri, John Isherwood, Jonathan Haqq, Seok Ling Ong, Gianpiero Gravante, David M. Lloyd, Matthew S. Metcalfe, Ashley R. Dennison.
Institutions: University Hospitals of Leicester.
The use of ex vivo perfused models can mimic the physiological conditions of the liver for short periods, but to maintain normal homeostasis for an extended perfusion period is challenging. We have added the kidney to our previous ex vivo perfused liver experiment model to reproduce a more accurate physiological state for prolonged experiments without using live animals. Five intact livers and kidneys were retrieved post-mortem from sacrificed pigs on different days and perfused for a minimum of 6 hr. Hourly arterial blood gases were obtained to analyze pH, lactate, glucose and renal parameters. The primary endpoint was to investigate the effect of adding one kidney to the model on the acid base balance, glucose, and electrolyte levels. The result of this liver-kidney experiment was compared to the results of five previous liver only perfusion models. In summary, with the addition of one kidney to the ex vivo liver circuit, hyperglycemia and metabolic acidosis were improved. In addition this model reproduces the physiological and metabolic responses of the liver sufficiently accurately to obviate the need for the use of live animals. The ex vivo liver-kidney perfusion model can be used as an alternative method in organ specific studies. It provides a disconnection from numerous systemic influences and allows specific and accurate adjustments of arterial and venous pressures and flow.
Medicine, Issue 82, Ex vivo, porcine, perfusion model, acid base balance, glucose, liver function, kidney function, cytokine response
Genomic MRI - a Public Resource for Studying Sequence Patterns within Genomic DNA
Authors: Ashwin Prakash, Jason Bechtel, Alexei Fedorov.
Institutions: University of Toledo Health Science Campus.
Non-coding genomic regions in complex eukaryotes, including intergenic areas, introns, and untranslated segments of exons, are profoundly non-random in their nucleotide composition and consist of a complex mosaic of sequence patterns. These patterns include so-called Mid-Range Inhomogeneity (MRI) regions -- sequences 30-10000 nucleotides in length that are enriched by a particular base or combination of bases (e.g. (G+T)-rich, purine-rich, etc.). MRI regions are associated with unusual (non-B-form) DNA structures that are often involved in regulation of gene expression, recombination, and other genetic processes (Fedorova & Fedorov 2010). The existence of a strong fixation bias within MRI regions against mutations that tend to reduce their sequence inhomogeneity additionally supports the functionality and importance of these genomic sequences (Prakash et al. 2009). Here we demonstrate a freely available Internet resource -- the Genomic MRI program package -- designed for computational analysis of genomic sequences in order to find and characterize various MRI patterns within them (Bechtel et al. 2008). This package also allows generation of randomized sequences with various properties and levels of correspondence to the natural input DNA sequences. The main goal of this resource is to facilitate examination of vast regions of non-coding DNA that are still scarcely investigated and await thorough exploration and recognition.
Genetics, Issue 51, bioinformatics, computational biology, genomics, non-randomness, signals, gene regulation, DNA conformation
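The Genomic MRI package itself is a web resource; to make the notion of Mid-Range Inhomogeneity concrete, the Python sketch below scans a sequence with a fixed-length sliding window and reports windows enriched in a chosen base combination, here (G+T). The window length, enrichment threshold, and example sequence are hypothetical and far simpler than the program's actual options.

def gt_rich_windows(seq, window=100, threshold=0.70):
    """Return (start, fraction) for windows whose (G+T) content meets the threshold."""
    seq = seq.upper()
    hits = []
    for start in range(0, len(seq) - window + 1):
        chunk = seq[start:start + window]
        gt_fraction = sum(chunk.count(b) for b in "GT") / window
        if gt_fraction >= threshold:
            hits.append((start, gt_fraction))
    return hits

example = ("GT" * 80) + ("ACAC" * 60)          # toy sequence: a (G+T)-rich stretch then an (A+C)-rich one
for start, frac in gt_rich_windows(example)[:3]:
    print(f"(G+T)-rich window at {start}: {frac:.2f}")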
Technique of Subnormothermic Ex Vivo Liver Perfusion for the Storage, Assessment, and Repair of Marginal Liver Grafts
Authors: Jan M. Knaak, Vinzent N. Spetzler, Nicolas Goldaracena, Kristine S. Louis, Nazia Selzner, Markus Selzner.
Institutions: Toronto General Hospital, Toronto General Hospital, Toronto General Hospital.
The success of liver transplantation has resulted in a dramatic organ shortage. In most transplant regions, 20-30% of patients on the waiting list for liver transplantation die without receiving an organ transplant or are delisted for disease progression. One strategy to increase the donor pool is the utilization of marginal grafts, such as fatty livers, grafts from older donors, or donation after cardiac death (DCD). The current preservation technique of cold static storage is only poorly tolerated by marginal livers, resulting in significant organ damage. In addition, cold static organ storage does not allow graft assessment or repair prior to transplantation. These shortcomings of cold static preservation have triggered an interest in warm perfused organ preservation to reduce cold ischemic injury, assess liver grafts during preservation, and explore the opportunity to repair marginal livers prior to transplantation. The optimal pressure and flow conditions, perfusion temperature, composition of the perfusion solution, and the need for an oxygen carrier have been controversial in the past. In spite of promising results in several animal studies, the complexity and the costs have prevented a broader clinical application so far. Recently, with enhanced technology and a better understanding of liver physiology during ex vivo perfusion, the outcome of warm liver perfusion has improved and consistently good results can be achieved. This paper will provide information about liver retrieval, storage techniques, and isolated liver perfusion in pigs. We will illustrate a) the requirements to ensure sufficient oxygen supply to the organ, b) technical considerations about the perfusion machine and the perfusion solution, and c) biochemical aspects of isolated organs.
Medicine, Issue 90, ex vivo liver perfusion, marginal grafts, DCD
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
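As a concrete, minimal example of the fully automated end of the segmentation spectrum discussed above, the Python sketch below applies a global intensity threshold and connected-component labelling to a synthetic 3D volume using NumPy and SciPy. Real stained-EM volumes require the far more tailored manual, semi-automated, or custom approaches the article describes; the volume, threshold, and bright object here are invented for illustration only.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
volume = rng.normal(loc=100.0, scale=10.0, size=(64, 64, 64))  # synthetic "image" volume
volume[20:40, 20:40, 20:40] += 80.0                            # one bright toy object

mask = volume > 150                         # global intensity threshold (hypothetical value)
labels, n_objects = ndimage.label(mask)     # connected-component labelling of the binary mask

# quantify each labelled object by its voxel count
sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
print("objects found:", n_objects)
print("voxel counts per object:", sizes)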
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
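The DoE software used in the study selects an optimal subset of experiments and supports step-wise design augmentation; purely to illustrate what a structured screening design looks like, the Python sketch below enumerates a small full-factorial design over a few hypothetical factors and levels.

from itertools import product

factors = {
    "promoter":        ["35S", "nos"],          # hypothetical regulatory-element levels
    "incubation_temp": [22, 25, 28],            # degrees C, hypothetical
    "plant_age_days":  [35, 42],                # hypothetical
}

# every combination of factor levels = one experimental run
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(runs)} runs in the full factorial design")
for i, run in enumerate(runs[:4], start=1):
    print(f"run {i}: {run}")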
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences, there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to: ● Set up reactions and thermal cycling conditions for a conventional PCR experiment ● Understand the function of various reaction components and their overall effect on a PCR experiment ● Design and optimize a PCR experiment for any DNA template ● Troubleshoot failed PCR experiments
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
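To complement the primer design and annealing temperature guidance in this protocol, the Python sketch below implements two common rule-of-thumb Tm estimates (the Wallace rule for short oligos and a %GC-based approximation for longer primers) and derives a starting annealing temperature 5 °C below the lower Tm. The primer sequences are examples only and are not taken from the protocol; for real experiments a nearest-neighbor calculator is preferable.

def wallace_tm(primer):
    """Wallace rule: Tm = 2(A+T) + 4(G+C); reasonable only for short oligos (< ~14 nt)."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def long_oligo_tm(primer):
    """Tm = 64.9 + 41*(G+C - 16.4)/N, a common approximation for longer primers."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    return 64.9 + 41 * (gc - 16.4) / len(p)

forward = "AGAGTTTGATCCTGGCTCAG"   # example 20-mer (not from this protocol)
reverse = "GGTTACCTTGTTACGACTT"    # example 19-mer (not from this protocol)

tms = [long_oligo_tm(p) for p in (forward, reverse)]
annealing = min(tms) - 5           # common starting point: 5 degrees below the lower Tm
print(f"Tm forward {tms[0]:.1f} C, Tm reverse {tms[1]:.1f} C, "
      f"suggested annealing {annealing:.1f} C")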
Infinium Assay for Large-scale SNP Genotyping Applications
Authors: Adam J. Adler, Graham B. Wiley, Patrick M. Gaffney.
Institutions: Oklahoma Medical Research Foundation.
Genotyping variants in the human genome has proven to be an efficient method to identify genetic associations with phenotypes. The distribution of variants within families or populations can facilitate identification of the genetic factors of disease. Illumina's panel of genotyping BeadChips allows investigators to genotype thousands or millions of single nucleotide polymorphisms (SNPs) or to analyze other genomic variants, such as copy number, across a large number of DNA samples. These SNPs can be spread throughout the genome or targeted in specific regions in order to maximize potential discovery. The Infinium assay has been optimized to yield high-quality, accurate results quickly. With proper setup, a single technician can process from a few hundred to over a thousand DNA samples per week, depending on the type of array. This assay guides users through every step, starting with genomic DNA and ending with the scanning of the array. Using proprietary reagents, samples are amplified, fragmented, precipitated, resuspended, hybridized to the chip, extended by a single base, stained, and scanned on either an iScan or HiScan high-resolution optical imaging system. One overnight step is required to amplify the DNA. The DNA is denatured and isothermally amplified by whole-genome amplification; therefore, no PCR is required. Samples are hybridized to the arrays during a second overnight step. By the third day, the samples are ready to be scanned and analyzed. Amplified DNA may be stockpiled in large quantities, allowing bead arrays to be processed every day of the week, thereby maximizing throughput.
Basic Protocol, Issue 81, genomics, SNP, Genotyping, Infinium, iScan, HiScan, Illumina
Chromatin Interaction Analysis with Paired-End Tag Sequencing (ChIA-PET) for Mapping Chromatin Interactions and Understanding Transcription Regulation
Authors: Yufen Goh, Melissa J. Fullwood, Huay Mei Poh, Su Qin Peh, Chin Thing Ong, Jingyao Zhang, Xiaoan Ruan, Yijun Ruan.
Institutions: Agency for Science, Technology and Research, Singapore, A*STAR-Duke-NUS Neuroscience Research Partnership, Singapore, National University of Singapore, Singapore.
Genomes are organized into three-dimensional structures, adopting higher-order conformations inside the micron-sized nuclear spaces 7, 2, 12. Such architectures are not random and involve interactions between gene promoters and regulatory elements 13. The binding of transcription factors to specific regulatory sequences brings about a network of transcription regulation and coordination 1, 14. Chromatin Interaction Analysis by Paired-End Tag Sequencing (ChIA-PET) was developed to identify these higher-order chromatin structures 5,6. Cells are fixed and interacting loci are captured by covalent DNA-protein cross-links. To minimize non-specific noise and reduce complexity, as well as to increase the specificity of the chromatin interaction analysis, chromatin immunoprecipitation (ChIP) is used against specific protein factors to enrich chromatin fragments of interest before proximity ligation. Ligation involving half-linkers subsequently forms covalent links between pairs of DNA fragments tethered together within individual chromatin complexes. The flanking MmeI restriction enzyme sites in the half-linkers allow extraction of paired end tag-linker-tag constructs (PETs) upon MmeI digestion. As the half-linkers are biotinylated, these PET constructs are purified using streptavidin-magnetic beads. The purified PETs are ligated with next-generation sequencing adaptors and a catalog of interacting fragments is generated via next-generation sequencers such as the Illumina Genome Analyzer. Mapping and bioinformatics analysis is then performed to identify ChIP-enriched binding sites and ChIP-enriched chromatin interactions 8. We have produced a video to demonstrate critical aspects of the ChIA-PET protocol, especially the preparation of ChIP as the quality of ChIP plays a major role in the outcome of a ChIA-PET library. As the protocols are very long, only the critical steps are shown in the video.
Genetics, Issue 62, ChIP, ChIA-PET, Chromatin Interactions, Genomics, Next-Generation Sequencing
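As a toy illustration of the PET extraction step described above (recovering the two tags that flank the linker before mapping), the Python sketch below splits a read around a linker sequence. The linker sequence and tag length are hypothetical stand-ins; real ChIA-PET pipelines additionally handle linker barcodes, both read orientations, and sequencing errors.

LINKER = "GTTGGATAAGATATCGCGG"      # hypothetical composite linker sequence
TAG_LEN = 20                        # approximate tag length left by MmeI digestion

def split_pet(read):
    """Return the (5' tag, 3' tag) flanking the linker, or None if no linker is found."""
    idx = read.find(LINKER)
    if idx == -1:
        return None
    tag5 = read[max(0, idx - TAG_LEN):idx]
    tag3 = read[idx + len(LINKER):idx + len(LINKER) + TAG_LEN]
    return tag5, tag3

read = "ACGTACGTACGTACGTACGT" + LINKER + "TTGACCTTGACCTTGACCTT" + "AAAA"
print(split_pet(read))   # -> ('ACGTACGTACGTACGTACGT', 'TTGACCTTGACCTTGACCTT')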
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Authors: Andreas Florian Haas, Ben Knowles, Yan Wei Lim, Tracey McDole Somera, Linda Wegley Kelly, Mark Hatay, Forest Rohwer.
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a dispersed field of scientific disciplines, comprising both established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed by these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. The protocols give easy-to-follow guidelines for conducting comprehensive surveys and discuss critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Competitive Genomic Screens of Barcoded Yeast Libraries
Authors: Andrew M. Smith, Tanja Durbic, Julia Oh, Malene Urbanus, Michael Proctor, Lawrence E. Heisler, Guri Giaever, Corey Nislow.
Institutions: University of Toronto, University of Toronto, University of Toronto, National Human Genome Research Institute, NIH, Stanford University , University of Toronto.
By virtue of advances in next generation sequencing technologies, we have access to new genome sequences almost daily. The tempo of these advances is accelerating, promising greater depth and breadth. In light of these extraordinary advances, the need for fast, parallel methods to define gene function becomes ever more important. Collections of genome-wide deletion mutants in yeasts and E. coli have served as workhorses for functional characterization of gene function, but this approach is not scalable: current gene-deletion approaches require each of the thousands of genes that comprise a genome to be deleted and verified. Only after this work is complete can we pursue high-throughput phenotyping. Over the past decade, our laboratory has refined a portfolio of competitive, miniaturized, high-throughput genome-wide assays that can be performed in parallel. This parallelization is possible because of the inclusion of DNA 'tags', or 'barcodes', in each mutant, with the barcode serving as a proxy for the mutation; barcode abundance can then be measured to assess mutant fitness. In this study, we seek to fill the gap between DNA sequence and barcoded mutant collections. To accomplish this we introduce a combined transposon disruption-barcoding approach that opens up parallel barcode assays to newly sequenced, but poorly characterized, microbes. To illustrate this approach we present a new Candida albicans barcoded disruption collection and describe how both microarray-based and next generation sequencing-based platforms can be used to collect 10,000 - 1,000,000 gene-gene and drug-gene interactions in a single experiment.
Biochemistry, Issue 54, chemical biology, chemogenomics, chemical probes, barcode microarray, next generation sequencing
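The core readout of these competitive screens, measuring barcode abundance as a proxy for mutant fitness, can be illustrated with the short Python sketch below, which converts barcode counts from a control and a treatment pool into log2 fitness ratios. The barcode-to-gene assignments and read counts are hypothetical, and real analyses add normalization across samples and replicate statistics.

import math

barcode_to_gene = {"ATTGCCAT": "YFG1", "GGCATTCA": "YFG2", "CCTAGGAT": "YFG3"}  # hypothetical

control_counts   = {"ATTGCCAT": 950, "GGCATTCA": 1010, "CCTAGGAT": 980}   # hypothetical counts
treatment_counts = {"ATTGCCAT": 120, "GGCATTCA": 990,  "CCTAGGAT": 2100}

control_total = sum(control_counts.values())
treatment_total = sum(treatment_counts.values())

for barcode, gene in barcode_to_gene.items():
    c = control_counts[barcode] / control_total        # relative abundance in control pool
    t = treatment_counts[barcode] / treatment_total    # relative abundance after treatment
    print(f"{gene}: log2(treatment/control) = {math.log2(t / c):+.2f}")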
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Biology of Microbial Communities - Interview
Authors: Roberto Kolter.
Institutions: Harvard Medical School.
Microbiology, Issue 4, microbial community, DNA, extraction, gut, termite
Investigating the Microbial Community in the Termite Hindgut - Interview
Authors: Jared Leadbetter.
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter explains why the termite-gut microbial community is an excellent system for studying the complex interactions between microbes. The symbiotic relationship existing between the host insect and lignocellulose-degrading gut microbes is explained, as well as the industrial uses of these microbes for degrading plant biomass and generating biofuels.
Microbiology, Issue 4, microbial community, diversity
Principles of Site-Specific Recombinase (SSR) Technology
Authors: Frank Buchholz.
Institutions: Max Planck Institute for Molecular Cell Biology and Genetics, Dresden.
Site-specific recombinase (SSR) technology allows the manipulation of gene structure to explore gene function and has become an integral tool of molecular biology. Site-specific recombinases are proteins that bind to distinct DNA target sequences. The Cre/lox system was first described in bacteriophages during the 1980s. Cre recombinase is a Type I topoisomerase that catalyzes site-specific recombination of DNA between two loxP (locus of X-over P1) sites. The Cre/lox system does not require any cofactors. LoxP sequences contain distinct binding sites for Cre recombinases that surround a directional core sequence where recombination and rearrangement take place. When cells contain loxP sites and express the Cre recombinase, a recombination event occurs. Double-stranded DNA is cut at both loxP sites by the Cre recombinase, rearranged, and ligated ("scissors and glue"). The products of the recombination event depend on the relative orientation of the asymmetric sequences. SSR technology is frequently used as a tool to explore gene function. Here the gene of interest is flanked with Cre target (loxP) sites ("floxed"). Animals are then crossed with animals expressing the Cre recombinase under the control of a tissue-specific promoter. In tissues that express the Cre recombinase, it binds to its target sequences and excises the floxed gene. Controlled gene deletion allows the investigation of gene function in specific tissues and at distinct time points. Analysis of gene function employing SSR technology (conditional mutagenesis) has significant advantages over traditional knock-outs, where gene deletion is frequently lethal.
Cellular Biology, Issue 15, Molecular Biology, Site-Specific Recombinase, Cre recombinase, Cre/lox system, transgenic animals, transgenic technology
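To make the "scissors and glue" outcome described above concrete, the Python sketch below simulates Cre-mediated excision between two directly repeated loxP sites in a plain sequence string. The 34-bp loxP sequence is the canonical site; everything else (the flanked placeholder sequence and the simplification that ignores inversion, re-integration, and the excised circular product) is illustrative only.

LOXP = "ATAACTTCGTATAATGTATGCTATACGAAGTTAT"   # canonical 34-bp loxP site

def cre_excise(seq):
    """Simulate excision between the first two loxP sites found in the sequence."""
    first = seq.find(LOXP)
    second = seq.find(LOXP, first + len(LOXP))
    if first == -1 or second == -1:
        return seq                            # fewer than two sites: nothing happens
    # excision of the intervening ("floxed") segment leaves a single loxP site behind
    return seq[:first] + LOXP + seq[second + len(LOXP):]

floxed = "CCCCC" + LOXP + "GENEOFINTEREST" + LOXP + "GGGGG"   # hypothetical floxed construct
print(cre_excise(floxed))                     # -> CCCCC + loxP + GGGGG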
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Authors: David C. Shih, Kevin C. Ho, Kyle M. Melnick, Ronald A. Rensink, Tobias R. Kollmann, Edgardo S. Fortuno III.
Institutions: University of British Columbia, University of British Columbia, University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analytical power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Molecular Evolution of the Tre Recombinase
Authors: Frank Buchholz.
Institutions: Max Planck Institute for Molecular Cell Biology and Genetics, Dresden.
Here we report the generation of Tre recombinase through directed molecular evolution. Tre recombinase recognizes a pre-defined target sequence within the LTR sequences of the HIV-1 provirus, resulting in the excision and eradication of the provirus from infected human cells. We started with Cre, a 38-kDa recombinase that recognizes a 34-bp double-stranded DNA sequence known as loxP. Because Cre can effectively eliminate genomic sequences, we set out to tailor a recombinase that could remove the sequence between the 5'-LTR and 3'-LTR of an integrated HIV-1 provirus. As a first step we identified sequences within the LTR sites that were similar to loxP and tested them for recombination activity. Initially, Cre and mutagenized Cre libraries failed to recombine the chosen loxLTR sites of the HIV-1 provirus. Because the start of any directed molecular evolution process requires at least residual activity, the original asymmetric loxLTR sequences were split into subsets and tested again for recombination activity. Recombination activity was shown with these subsets, which acted as intermediates. Next, recombinase libraries were enriched through reiterative evolution cycles. Subsequently, enriched libraries were shuffled and recombined. The combination of different mutations proved synergistic, and recombinases were created that were able to recombine loxLTR1 and loxLTR2. This was evidence that an evolutionary strategy through intermediates can be successful. After a total of 126 evolution cycles, individual recombinases were functionally and structurally analyzed. The most active recombinase, Tre, had 19 amino acid changes compared to Cre. Tre recombinase was able to excise the HIV-1 provirus from the genome of HIV-1-infected HeLa cells (see "HIV-1 Proviral DNA Excision Using an Evolved Recombinase", Hauber J., Heinrich-Pette-Institute for Experimental Virology and Immunology, Hamburg, Germany). While still in its infancy, directed molecular evolution will allow the creation of custom enzymes that will serve as tools of "molecular surgery" and molecular medicine.
Cell Biology, Issue 15, HIV-1, Tre recombinase, Site-specific recombination, molecular evolution

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos that are only loosely related.
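JoVE does not describe its matching algorithm here; purely as a sketch of how abstract-to-video matching of this general kind can work, the Python example below ranks two placeholder video descriptions against one placeholder abstract by cosine similarity over simple word counts. A production system comparing millions of abstracts would use proper text normalization, term weighting (e.g., TF-IDF), and indexing.

import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words vector as a Counter over lowercase word tokens."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

abstract = "DNA barcoding and sequence analysis for biodiversity informatics"   # placeholder text
videos = {                                                                       # placeholder descriptions
    "BEST: Barcode Enabled Sequencing of Tetrads": "barcode sequencing yeast tetrad genetics",
    "Basic PCR Protocol": "polymerase chain reaction amplification troubleshooting",
}

query = vectorize(abstract)
ranked = sorted(videos.items(), key=lambda kv: cosine(query, vectorize(kv[1])), reverse=True)
for title, _ in ranked:
    print(title)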