Targeted genetic manipulation using homologous recombination is the method of choice for functional genomic analysis to obtain a detailed view of gene function and phenotype(s). The development of mutant strains with targeted gene deletions, targeted mutations, complemented gene function, and/or tagged genes provides powerful strategies to address gene function, particularly if these genetic manipulations can be efficiently targeted to the gene locus of interest using integration mediated by double-crossover homologous recombination.
Due to very high rates of nonhomologous recombination, functional genomic analysis of Toxoplasma gondii has been previously limited by the absence of efficient methods for targeting gene deletions and gene replacements to specific genetic loci. Recently, we abolished the major pathway of nonhomologous recombination in type I and type II strains of T. gondii by deleting the gene encoding the KU80 protein1,2. The Δku80 strains behave normally during tachyzoite (acute) and bradyzoite (chronic) stages in vitro and in vivo and exhibit essentially a 100% frequency of homologous recombination. The Δku80 strains make functional genomic studies feasible on the single gene as well as on the genome scale1-4.
Here, we report methods for using type I and type II Δku80Δhxgprt strains to advance gene targeting approaches in T. gondii. We outline efficient methods for generating gene deletions, gene replacements, and tagged genes by targeted insertion or deletion of the hypoxanthine-xanthine-guanine phosphoribosyltransferase (HXGPRT) selectable marker. The described gene targeting protocol can be used in a variety of ways in Δku80 strains to advance functional analysis of the parasite genome and to develop single strains that carry multiple targeted genetic manipulations. The application of this genetic method and subsequent phenotypic assays will reveal fundamental and unique aspects of the biology of T. gondii and related significant human pathogens that cause malaria (Plasmodium sp.) and cryptosporidiosis (Cryptosporidium).
GENPLAT: an Automated Platform for Biomass Enzyme Discovery and Cocktail Optimization
Institutions: Michigan State University.
The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high-throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex lattice (fractional factorial) mixture models are designed using commercial Design of Experiments statistical software. Enzyme mixtures of high complexity are constructed by robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations.
GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such as T. reesei. Proteins can also be purified from commercial enzyme cocktails (e.g., Multifect Xylanase, Novozyme 188). An increasing number of pure enzymes, including glycosyl hydrolases, cell wall-active esterases, proteases, and lyases, are available from commercial sources, e.g., Megazyme, Inc. (www.megazyme.com), NZYTech (www.nzytech.com), and PROZOMIX (www.prozomix.com).
Design-Expert software (Stat-Ease, Inc.) is used to create simplex-lattice designs and to analyze responses (in this case, Glc and Xyl release). Mixtures contain 4-20 components, which can vary in proportion between 0 and 100%. Assay points typically include the extreme vertices with a sufficient number of intervening points to generate a valid model. In the terminology of experimental design, most of our studies are "mixture" experiments, meaning that the sum of all components adds to a total fixed protein loading (expressed as mg/g glucan). The number of mixtures in the simplex-lattice depends on both the number of components in the mixture and the degree of polynomial (quadratic or cubic). For example, a 6-component experiment will entail 63 separate reactions with an augmented special cubic model, which can detect three-way interactions, whereas only 23 individual reactions are necessary with an augmented quadratic model. For mixtures containing more than eight components, a quadratic experimental design is more practical, and in our experience such models are usually statistically valid.
All enzyme loadings are expressed as a percentage of the final total loading (which for our experiments is typically 15 mg protein/g glucan). For "core" enzymes, the lower percentage limit is set to 5%. This limit was derived from our experience that yields of Glc and/or Xyl were very low if any core enzyme was present at 0%; poor models result from too many samples showing very low Glc or Xyl yields. Setting a lower limit in turn determines an upper limit: for a six-component experiment, if the lower limit for each single component is set to 5%, then the upper limit of each single component will be 75%. The lower limits of all other enzymes, considered "accessory", are set to 0%. "Core" and "accessory" are somewhat arbitrary designations and will differ depending on the substrate, but in our studies the core enzymes for release of Glc from corn stover comprise the following enzymes from T. reesei: CBH1 (also known as Cel7A), CBH2 (Cel6A), EG1 (Cel7B), BG (β-glucosidase), EX3 (endo-β-1,4-xylanase, GH10), and BX (β-xylosidase).
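The lattice-and-constraint arithmetic above can be sketched in a few lines of Python. This is an illustrative reconstruction of a standard simplex-lattice mixture design with a pseudo-component lower bound, not the Design-Expert internals; the function names and the transform are our own assumptions.

```python
from itertools import combinations_with_replacement
from math import comb

def simplex_lattice(q, m):
    """Enumerate the {q, m} simplex-lattice: every blend whose q
    proportions are multiples of 1/m and sum to 1."""
    points = set()
    for combo in combinations_with_replacement(range(q), m):
        counts = [combo.count(i) for i in range(q)]
        points.add(tuple(c / m for c in counts))
    return sorted(points)

def to_constrained(blend, lower=0.05):
    """Pseudo-component transform: impose a common lower bound on every
    component while keeping the proportions summing to 1."""
    slack = 1.0 - len(blend) * lower
    return tuple(lower + slack * z for z in blend)

# quadratic lattice for a 6-enzyme mixture with the 5% floor from the text
design = [to_constrained(p) for p in simplex_lattice(6, 2)]
# C(6+2-1, 2) = 21 blends; with a 5% floor, each component tops out at 75%,
# matching the six-component example above (augmented designs add further
# centroid/axial points on top of this lattice)
```

Loadings in mg protein/g glucan follow by multiplying each proportion by the fixed total (e.g., 15 mg/g).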
Bioengineering, Issue 56, cellulase, cellobiohydrolase, glucanase, xylanase, hemicellulase, experimental design, biomass, bioenergy, corn stover, glycosyl hydrolase
An Allele-specific Gene Expression Assay to Test the Functional Basis of Genetic Associations
Institutions: University of Oxford.
The number of significant genetic associations with common complex traits is constantly increasing. However, most of these associations have not been understood at the molecular level. One of the mechanisms mediating the effect of DNA variants on phenotypes is gene expression, which has been shown to be particularly relevant for complex traits1.
This method tests in a cellular context the effect of specific DNA sequences on gene expression. The principle is to measure the relative abundance of transcripts arising from the two alleles of a gene, analysing cells which carry one copy of the DNA sequences associated with disease (the risk variants)2,3. Therefore, the cells used for this method should meet two fundamental genotypic requirements: they have to be heterozygous both for the DNA risk variants and for DNA markers, typically coding polymorphisms, which can distinguish transcripts based on their chromosomal origin (Figure 1). DNA risk variants and DNA markers do not need to have the same allele frequency, but the phase (haplotypic) relationship between them needs to be understood. It is also important to choose cell types which express the gene of interest. This protocol refers specifically to the procedure adopted to extract nucleic acids from fibroblasts, but the method is equally applicable to other cell types, including primary cells.
DNA and RNA are extracted from the selected cell lines and cDNA is generated. DNA and cDNA are analysed with a primer extension assay designed to target the coding DNA markers4. The primer extension assay is carried out on the MassARRAY (Sequenom)5 platform according to the manufacturer's specifications. Primer extension products are then analysed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/MS). Because the selected markers are heterozygous, they will generate two peaks on the MS profiles. The area of each peak is proportional to the transcript abundance and can be measured with a function of the MassARRAY Typer software to generate an allelic ratio (allele 1:allele 2). The allelic ratio obtained for cDNA is normalized against that measured from genomic DNA, where the allelic ratio is expected to be 1:1, to correct for technical artifacts. Markers with a normalized allelic ratio significantly different from 1 indicate that the amounts of transcript generated from the two chromosomes in the same cell differ, suggesting that the DNA variants associated with the phenotype have an effect on gene expression. Experimental controls should be used to confirm the results.
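The normalization step described above reduces to a small calculation. The sketch below uses invented peak areas purely for illustration; the function name is ours, not part of the MassARRAY Typer software.

```python
def normalized_allelic_ratio(cdna_peaks, gdna_peaks):
    """Allele-specific expression readout: the cDNA peak-area ratio
    (allele 1 : allele 2) divided by the genomic-DNA ratio, which is
    expected to be 1:1 and corrects for technical artifacts."""
    cdna_ratio = cdna_peaks[0] / cdna_peaks[1]
    gdna_ratio = gdna_peaks[0] / gdna_peaks[1]
    return cdna_ratio / gdna_ratio

# hypothetical MALDI-TOF peak areas (allele 1, allele 2)
ratio = normalized_allelic_ratio(cdna_peaks=(1500.0, 1000.0),
                                 gdna_peaks=(1210.0, 1100.0))
# a normalized ratio well above (or below) 1 suggests the two chromosomes
# express different amounts of transcript
```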
Cellular Biology, Issue 45, Gene expression, regulatory variant, haplotype, association study, primer extension, MALDI-TOF mass spectrometry, single nucleotide polymorphism, allele-specific
Environmentally Induced Heritable Changes in Flax
Institutions: Case Western Reserve University.
Some flax varieties respond to nutrient stress by modifying their genome, and these modifications can be inherited through many generations. Heritable phenotypic variations are also associated with these genomic changes1,2. The flax variety Stormont Cirrus (Pl), when grown under three different nutrient conditions, can either remain inducible (under the control conditions) or become stably modified to either the large or small genotroph by growth under high or low nutrient conditions, respectively. The lines resulting from the initial growth under each of these conditions appear to grow better when grown under the same conditions in subsequent generations; notably, the Pl line grows best under the control treatment, indicating that the plants growing under both the high and low nutrients are under stress. One of the genomic changes associated with the induction of heritable changes is the appearance of an insertion element (LIS-1)3,4 while the plants are growing under the nutrient stress. With respect to this insertion event, the flax variety Stormont Cirrus (Pl), when grown under the three different nutrient conditions, can either remain unchanged (under the control conditions), have the insertion appear in all the plants and be transmitted to the next generation (under low nutrients), or have the insertion (or parts of it) appear but not be transmitted through generations (under high nutrients)4. The frequency of the appearance of this insertion indicates that it is under positive selection, which is also consistent with the growth response in subsequent generations. Leaves or meristems harvested at various stages of growth are used for DNA and RNA isolation. The RNA is used to identify variation in expression associated with the various growth environments and/or the presence/absence of LIS-1. The isolated DNA is used to identify those plants in which the insertion has occurred.
Plant Biology, Issue 47, Flax, genome variation, environmental stress, small RNAs, altered gene expression
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Institutions: University of North Carolina School of Medicine, Emory University School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and in their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the role of oncogenic mutations and cell type in tumorigenesis in vivo. Unlike most established human glioblastoma cell line xenografts, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was used to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo, and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
High-throughput, Automated Extraction of DNA and RNA from Clinical Samples using TruTip Technology on Common Liquid Handling Robots
Institutions: Akonni Biosystems, Inc.
TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using an Eppendorf epMotion 5070, Hamilton STAR and STARplus liquid handling robots, including RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma (respectively).
Genetics, Issue 76, Bioengineering, Biomedical Engineering, Molecular Biology, Automation, Laboratory, Clinical Laboratory Techniques, Molecular Diagnostic Techniques, Analytic Sample Preparation Methods, Genetic Techniques, Chemistry, Clinical, DNA/RNA extraction, automation, nucleic acid isolation, sample preparation, nasopharyngeal aspirate, blood, plasma, high-throughput, sequencing
Isolation of Fidelity Variants of RNA Viruses and Characterization of Virus Mutation Frequency
Institutions: Institut Pasteur.
RNA viruses use RNA-dependent RNA polymerases to replicate their genomes. The intrinsically high error rate of these enzymes is a large contributor to the generation of the extreme population diversity that facilitates virus adaptation and evolution. Increasing evidence shows that the intrinsic error rates, and the resulting mutation frequencies, of RNA viruses can be modulated by subtle amino acid changes to the viral polymerase. Although biochemical assays exist for some viral RNA polymerases that permit quantitative measurement of incorporation fidelity, here we describe a simple method of measuring mutation frequencies of RNA viruses that has proven to be as accurate as biochemical approaches in identifying fidelity-altering mutations. The approach uses conventional virological and sequencing techniques that can be performed in most biology laboratories. Based on our experience with a number of different viruses, we have identified the key steps that must be optimized to increase the likelihood of isolating fidelity variants and generating data of statistical significance. The isolation and characterization of fidelity-altering mutations can provide new insights into polymerase structure and function1-3. Furthermore, these fidelity variants can be useful tools in characterizing mechanisms of virus adaptation and evolution4-7.
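The quantity being measured here reduces to simple arithmetic: mutations counted across sequenced clones, scaled to a fixed number of nucleotides. The per-10,000-nt convention and the example numbers below are our assumptions for illustration, not values from the protocol.

```python
def mutation_frequency(total_mutations, total_nt_sequenced, per=10_000):
    """Mutation frequency expressed as mutations per `per` nucleotides
    sequenced across a set of clonal sequences from a virus population."""
    return total_mutations / total_nt_sequenced * per

# hypothetical example: 12 mutations found in 96 clones of 800 nt each
freq = mutation_frequency(12, 96 * 800)
# = 1.5625 mutations per 10,000 nt sequenced
```

Comparing such frequencies between a candidate variant and the wild-type virus, with enough clones for statistical power, is how fidelity-altering polymerase mutations are flagged.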
Immunology, Issue 52, Polymerase fidelity, RNA virus, mutation frequency, mutagen, RNA polymerase, viral evolution
Primer-Free Aptamer Selection Using A Random DNA Library
Institutions: Pennsylvania State University.
Aptamers are highly structured oligonucleotides (DNA or RNA) that can bind to targets with affinities comparable to those of antibodies1. They are identified through an in vitro selection process called Systematic Evolution of Ligands by EXponential enrichment (SELEX) to recognize a wide variety of targets, from small molecules to proteins and other macromolecules2-4. Aptamers have properties that are well suited for in vivo diagnostic and/or therapeutic applications: besides good specificity and affinity, they are easily synthesized, survive rigorous processing conditions, are poorly immunogenic, and their relatively small size can facilitate penetration of tissues.
Aptamers identified through the standard SELEX process usually comprise ~80 nucleotides (nt), since they are typically selected from nucleic acid libraries with ~40-nt randomized regions plus fixed primer sites of ~20 nt on each side. The fixed primer sequences can thus comprise nearly 50% of the library sequence and may therefore compromise identification of aptamers in the selection process3, although bioinformatics approaches suggest that the fixed sequences do not contribute significantly to aptamer structure after selection5. To address these potential problems, primer sequences have been blocked with complementary oligonucleotides or switched to different sequences midway through the rounds of SELEX6, or they have been trimmed to 6-9 nt7,8. Wen and Gray9 designed a primer-free genomic SELEX method in which the primer sequences were completely removed from the library before selection and were then regenerated to allow amplification of the selected genomic fragments. However, to employ the technique, a unique genomic library has to be constructed, which possesses limited diversity, and regeneration after rounds of selection relies on a linear reamplification step. Alternatively, efforts to circumvent problems caused by fixed primer sequences using high-efficiency partitioning are met with problems regarding PCR amplification10. We have developed a primer-free (PF) selection method that significantly simplifies SELEX procedures and effectively eliminates primer-interference problems11,12. The protocols work in a straightforward manner. The central random region of the library is purified without extraneous flanking sequences and is bound to a suitable target (for example, a purified protein or a complex mixture such as a cell line). The bound sequences are then recovered, reunited with the flanking sequences, and re-amplified to generate selected sub-libraries. As an example, here we selected aptamers to S100B, a protein marker for melanoma. Binding assays showed Kd values in the 10⁻⁷ M range after a few rounds of selection, and we demonstrate that the aptamers function effectively in a sandwich binding format.
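For intuition about what a Kd in that range means, a simple 1:1 equilibrium binding model gives the fraction of target occupied at a given aptamer concentration. The 1:1 model and the concentrations below are our assumptions for illustration; the source reports only the Kd range.

```python
def fraction_bound(aptamer_conc_M, kd_M):
    """Equilibrium fraction of target occupied by aptamer under a simple
    1:1 binding model with aptamer in excess: f = [A] / (Kd + [A])."""
    return aptamer_conc_M / (kd_M + aptamer_conc_M)

# with a Kd of 1e-7 M (the range reported for the S100B aptamers),
# an aptamer concentration equal to Kd half-saturates the target
half = fraction_bound(1e-7, 1e-7)    # 0.5
ninety = fraction_bound(9e-7, 1e-7)  # 0.9: ~10x Kd gives ~90% occupancy
```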
Cellular Biology, Issue 41, aptamer, selection, S100B, sandwich
Identification of Sleeping Beauty Transposon Insertions in Solid Tumors using Linker-mediated PCR
Institutions: University of Minnesota, Minneapolis.
Genomic, proteomic, transcriptomic, and epigenomic analyses of human tumors indicate that there are thousands of anomalies within each cancer genome compared to matched normal tissue. Based on these analyses, it is evident that there are many undiscovered genetic drivers of cancer1. Unfortunately, these drivers are hidden within a much larger number of passenger anomalies in the genome that do not directly contribute to tumor formation. Another aspect of the cancer genome is that there is considerable genetic heterogeneity within similar tumor types. Each tumor can harbor different mutations that provide a selective advantage for tumor formation2. Performing an unbiased forward genetic screen in mice provides the tools to generate tumors and analyze their genetic composition, while reducing the background of passenger mutations. The Sleeping Beauty (SB) transposon system is one such method3. The SB system utilizes mobile vectors (transposons) that can be inserted throughout the genome by the transposase enzyme. Mutations are limited to a specific cell type through the use of a conditional transposase allele that is activated by Cre recombinase. Many mouse lines exist that express Cre recombinase in specific tissues. By crossing one of these lines to the conditional transposase allele (e.g. Lox-stop-Lox-SB11), the SB system is activated only in cells that express Cre recombinase: the recombinase excises a stop cassette that blocks expression of the transposase allele, thereby activating transposon mutagenesis within the designated cell type. An SB screen is initiated by breeding three strains of transgenic mice so that the experimental mice carry a conditional transposase allele, a concatamer of transposons, and a tissue-specific Cre recombinase allele. These mice are allowed to age until tumors form and they become moribund. The mice are then necropsied and genomic DNA is isolated from the tumors. Next, the genomic DNA is subjected to linker-mediated PCR (LM-PCR), which amplifies genomic loci containing an SB transposon. LM-PCR performed on a single tumor will yield hundreds of distinct amplicons representing the hundreds of genomic loci containing transposon insertions in that tumor4. The transposon insertions in all tumors are analyzed and common insertion sites (CISs) are identified using an appropriate statistical method5. Genes within a CIS are highly likely to be oncogenes or tumor suppressor genes, and are considered candidate cancer genes. The advantages of using the SB system to identify candidate cancer genes are: 1) the transposon can easily be located in the genome because its sequence is known, 2) transposition can be directed to almost any cell type, and 3) the transposon is capable of introducing both gain- and loss-of-function mutations6. The following protocol describes how to devise and execute a forward genetic screen using the SB transposon system to identify candidate cancer genes (Figure 1).
Genetics, Issue 72, Medicine, Cancer Biology, Biomedical Engineering, Genomics, Mice, Genetic Techniques, life sciences, animal models, Neoplasms, Genetic Phenomena, Forward genetic screen, cancer drivers, mouse models, oncogenes, tumor suppressor genes, Sleeping Beauty transposons, insertions, DNA, PCR, animal model
Regular Care and Maintenance of a Zebrafish (Danio rerio) Laboratory: An Introduction
Institutions: Edith Cowan University, Graylands Hospital, University of Western Australia, McCusker Alzheimer's Research Foundation, University of Adelaide, Curtin University of Technology.
This protocol describes the regular care and maintenance of a zebrafish laboratory. Zebrafish are gaining popularity in genetic, pharmacological, and behavioural research. As vertebrates, zebrafish share considerable genetic sequence similarity with humans and are being used as an animal model for various human disease conditions. The advantages of zebrafish in comparison to other common vertebrate models include high fecundity, low maintenance cost, transparent embryos, and rapid development. Due to the surge of interest in zebrafish research, the need to establish and maintain a productive zebrafish housing facility is also increasing. Although literature is available for the maintenance of a zebrafish laboratory, a concise video protocol is lacking. This video illustrates the protocol for regular housing, feeding, breeding, and raising of zebrafish larvae. This process will help researchers to understand the natural behaviour and optimal husbandry conditions of zebrafish and hence troubleshoot experimental issues that originate from fish husbandry. This protocol will be of immense help to researchers planning to establish a zebrafish laboratory, and also to graduate students intending to use zebrafish as an animal model.
Basic Protocols, Issue 69, Biology, Marine Biology, Zebrafish, Danio rerio, maintenance, breeding, feeding, raising, larvae, animal model, aquarium
Experimental Manipulation of Body Size to Estimate Morphological Scaling Relationships in Drosophila
Institutions: University of Houston, Michigan State University.
The scaling of body parts is a central feature of animal morphology1-7. Within species, morphological traits need to be correctly proportioned to the body for the organism to function; larger individuals typically have larger body parts and smaller individuals generally have smaller body parts, such that overall body shape is maintained across a range of adult body sizes. The requirement for correct proportions means that individuals within species usually exhibit low variation in relative trait size. In contrast, relative trait size can vary dramatically among species and is a primary mechanism by which morphological diversity is produced. Over a century of comparative work has established these intra- and interspecific patterns3,4. Perhaps the most widely used approach to describe this variation is to calculate the scaling relationship between the sizes of two morphological traits using the allometric equation y = bx^α, where x and y are the sizes of the two traits, such as organ and body size8,9. This equation describes the within-group (e.g., species, population) scaling relationship between two traits as both vary in size. Log-transformation produces a simple linear equation, log(y) = log(b) + α·log(x), and log-log plots of the sizes of different traits among individuals of the same species typically reveal linear scaling with an intercept of log(b) and a slope of α, called the 'allometric coefficient'9,10. Morphological variation among groups is described by differences in scaling-relationship intercepts or slopes for a given trait pair. Consequently, variation in the parameters of the allometric equation (b and α) elegantly describes the shape variation captured in the relationship between organ and body size within and among biological groups (see11,12).
Not all traits scale linearly with each other or with body size (e.g.,13,14). Hence, morphological scaling relationships are most informative when the data are taken from the full range of trait sizes. Here we describe how a simple experimental manipulation of diet can be used to produce the full range of body sizes in insects. This permits estimation of the full scaling relationship for any given pair of traits, allowing a complete description of how shape covaries with size and a robust comparison of scaling-relationship parameters among biological groups. Although we focus on Drosophila, our methodology should be applicable to nearly any fully metamorphic insect.
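The log-log fit described above can be sketched in plain Python: ordinary least squares on log-transformed trait sizes recovers the allometric coefficient α as the slope and b as the exponentiated intercept. The data below are synthetic and the function name is ours.

```python
from math import log, exp

def fit_allometry(x, y):
    """Estimate the allometric coefficient a (slope) and intercept b by
    ordinary least squares on log-log axes: log(y) = log(b) + a*log(x)."""
    lx = [log(v) for v in x]
    ly = [log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    b = exp(my - a * mx)
    return a, b

# synthetic trait data following y = 2 * x**0.75 exactly,
# so the fit should recover a = 0.75 and b = 2
x = [1.0, 2.0, 4.0, 8.0, 16.0]
y = [2.0 * v ** 0.75 for v in x]
a, b = fit_allometry(x, y)
```

With real measurements, differences in the fitted a or b between groups correspond to the slope and intercept differences the text uses to describe morphological variation.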
Developmental Biology, Issue 56, Drosophila, allometry, morphology, body size, scaling, insect
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene, or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to that of the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9,10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format.
Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor' strains into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential-gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9,12,13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and the ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics. After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14.
GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often arise between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable; however, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2, as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex, such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
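The expected-versus-observed comparison described above is commonly formalized with a multiplicative null model. The sketch below illustrates that scoring logic with hypothetical fitness values; it is not the authors' in-house image-scoring system.

```python
def gi_score(w_a, w_b, w_ab):
    """Genetic interaction score under the multiplicative model.

    w_a, w_b -- relative fitness of each single mutant (wild type = 1.0)
    w_ab     -- observed relative fitness of the double mutant
    Returns epsilon = observed - expected: negative values indicate an
    aggravating (synthetic sick/lethal) interaction, positive values an
    alleviating one, and values near zero no interaction.
    """
    expected = w_a * w_b
    return w_ab - expected

# Hypothetical colony-size-derived fitness values:
print(gi_score(0.8, 0.9, 0.3))  # negative: aggravating interaction
print(gi_score(0.7, 0.6, 0.7))  # positive: alleviating (e.g. same pathway)
```

In practice the observed colony sizes are normalized for plate position and batch effects before such a score is computed.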
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Lignin Down-regulation of Zea mays via dsRNAi and Klason Lignin Analysis
Institutions: University of Arizona, Michigan State University, The Institute for Advanced Learning and Research, Michigan State University.
To facilitate the use of lignocellulosic biomass as an alternative bioenergy resource, a pretreatment step is needed during biological conversion processes to open up the structure of the plant cell wall and increase the accessibility of the cell wall carbohydrates. Lignin, a polyphenolic material present in many cell wall types, is known to be a significant hindrance to enzyme access. Reducing lignin content to a level that does not interfere with the structural integrity and defense system of the plant might be a valuable step toward reducing the costs of bioethanol production. In this study, we genetically down-regulated one of the lignin biosynthesis-related genes, cinnamoyl-CoA reductase (ZmCCR1), via a double-stranded RNA interference technique. The ZmCCR1_RNAi construct was integrated into the maize genome using the particle bombardment method. Transgenic maize plants grew normally compared to the wild-type control plants, without compromised biomass growth or defense mechanisms, with the exception of brown coloration in the leaf mid-ribs, husks, and stems of transgenic plants. Microscopic analyses, in conjunction with a histological assay, revealed that the leaf sclerenchyma fibers were thinned but the structure and size of other major vascular system components were not altered. The lignin content in the transgenic maize was reduced by 7-8.7%, the crystalline cellulose content increased in response to lignin reduction, and hemicelluloses remained unchanged. These analyses may indicate that carbon flow was shifted from lignin biosynthesis to cellulose biosynthesis. This article delineates the procedures used to down-regulate the lignin content in maize via RNAi technology, and the cell wall compositional analyses used to verify the effect of the modifications on the cell wall structure.
Bioengineering, Issue 89, Zea mays, cinnamoyl-CoA reductase (CCR), dsRNAi, Klason lignin measurement, cell wall carbohydrate analysis, gas chromatography (GC)
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a dispersed field of scientific disciplines, comprising already established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement, which has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed by these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. It provides easy-to-follow guidelines for conducting comprehensive surveys and discusses critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
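The "software-guided setup of optimal experiment combinations" mentioned above typically starts from a factorial grid over the selected factors. The sketch below enumerates a full factorial design for three hypothetical factors (the factor names and levels are illustrative, not those used in the study); DoE software would then prune such a grid to an optimal subset.

```python
from itertools import product

# Hypothetical factors for a transient-expression experiment:
factors = {
    "promoter":     ["35S", "nos"],   # regulatory element in the construct
    "incubation_C": [22, 25, 28],     # incubation temperature (deg C)
    "plant_age_d":  [35, 42],         # plant age at infiltration (days)
}

# Full factorial design: every combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 2 * 3 * 2 = 12 runs
for run in design[:2]:
    print(run)
```

Step-wise design augmentation, as described in the abstract, would add further runs to this base design only where the fitted model is still uncertain.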
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Combining Magnetic Sorting of Mother Cells and Fluctuation Tests to Analyze Genome Instability During Mitotic Cell Aging in Saccharomyces cerevisiae
Institutions: Rensselaer Polytechnic Institute.
Saccharomyces cerevisiae has been an excellent model system for examining mechanisms and consequences of genome instability. Information gained from this yeast model is relevant to many organisms, including humans, since DNA repair and DNA damage response factors are well conserved across diverse species. However, S. cerevisiae has not yet been used to fully address whether the rate of accumulating mutations changes with increasing replicative (mitotic) age due to technical constraints. For instance, measurements of yeast replicative lifespan through micromanipulation involve very small populations of cells, which prohibit detection of rare mutations. Genetic methods to enrich for mother cells in populations by inducing death of daughter cells have been developed, but population sizes are still limited by the frequency with which random mutations that compromise the selection systems occur. The current protocol takes advantage of magnetic sorting of surface-labeled yeast mother cells to obtain large enough populations of aging mother cells to quantify rare mutations through phenotypic selections. Mutation rates, measured through fluctuation tests, and mutation frequencies are first established for young cells and used to predict the frequency of mutations in mother cells of various replicative ages. Mutation frequencies are then determined for sorted mother cells, and the age of the mother cells is determined using flow cytometry by staining with a fluorescent reagent that detects bud scars formed on their cell surfaces during cell division. Comparison of predicted mutation frequencies based on the number of cell divisions to the frequencies experimentally observed for mother cells of a given replicative age can then identify whether there are age-related changes in the rate of accumulating mutations. Variations of this basic protocol provide the means to investigate the influence of alterations in specific gene functions or specific environmental conditions on mutation accumulation to address mechanisms underlying genome instability during replicative aging.
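As an illustration of the fluctuation-test arithmetic mentioned above, the classic Luria-Delbrück p0 method estimates a mutation rate from the fraction of parallel cultures that yield no mutant colonies. This is a minimal sketch with hypothetical numbers, not the analysis pipeline used in the protocol.

```python
import math

def mutation_rate_p0(n_cultures, n_no_mutants, final_pop_size):
    """Estimate mutation rate with the Luria-Delbruck p0 method.

    p0   = fraction of parallel cultures with zero mutant colonies
    m    = -ln(p0), the expected number of mutation events per culture
    rate = m / N_final, approximating mutations per cell per division
    """
    p0 = n_no_mutants / n_cultures
    m = -math.log(p0)
    return m / final_pop_size

# Hypothetical fluctuation test: 50 parallel cultures, 18 of which
# produced no mutant colonies, ~2e8 cells per culture at plating.
rate = mutation_rate_p0(50, 18, 2e8)
print(f"{rate:.2e} mutations per cell per division")
```

The p0 method is only valid when a reasonable fraction of cultures contain no mutants; otherwise maximum-likelihood estimators of m are used instead.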
Microbiology, Issue 92, Aging, mutations, genome instability, Saccharomyces cerevisiae, fluctuation test, magnetic sorting, mother cell, replicative aging
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq polymerase.
PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that can complicate the reaction and produce spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
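As a concrete example of one primer-design parameter listed above, a rough melting temperature (Tm) for a short primer can be estimated with the Wallace rule (2 °C per A/T pair, 4 °C per G/C pair). Real designs usually rely on nearest-neighbor thermodynamics, and the primer sequence below is hypothetical.

```python
def wallace_tm(primer):
    """Rough Tm (deg C) via the Wallace rule: 2*(A+T) + 4*(G+C).
    Reasonable only for short oligos (~14-20 nt)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

primer = "AGCGGATAACAATTTCAC"  # hypothetical 18-mer
print(wallace_tm(primer), "degC")  # -> 50 degC
```

A common rule of thumb is to design primer pairs with Tm values within a few degrees of each other and to set the annealing temperature a few degrees below the lower Tm.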
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Evaluation of Integrated Anaerobic Digestion and Hydrothermal Carbonization for Bioenergy Production
Institutions: Leibniz Institute for Agricultural Engineering.
Lignocellulosic biomass is one of the most abundant yet underutilized renewable energy resources. Both anaerobic digestion (AD) and hydrothermal carbonization (HTC) are promising technologies for bioenergy production from biomass, in the form of biogas and HTC biochar, respectively. In this study, the combination of AD and HTC is proposed to increase overall bioenergy production. Wheat straw was anaerobically digested in a novel upflow anaerobic solid state reactor (UASS) under both mesophilic (37 °C) and thermophilic (55 °C) conditions. Wet digestate from thermophilic AD was hydrothermally carbonized at 230 °C for 6 hr for HTC biochar production. At the thermophilic temperature, the UASS system yielded an average of 165 L CH4 per kg VS (VS: volatile solids), versus 121 L CH4 per kg VS for mesophilic AD, over 200 days of continuous operation. Meanwhile, 43.4 g of HTC biochar with a calorific value of 29.6 MJ/kg (dry biochar) was obtained from HTC of 1 kg digestate (dry basis) from mesophilic AD. The combination of AD and HTC, in this particular set of experiments, yielded 13.2 MJ of energy per 1 kg of dry wheat straw, which is at least 20% higher than HTC alone and 60.2% higher than AD alone.
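The stated comparison can be checked with back-of-the-envelope arithmetic; the snippet below simply inverts the reported percentages to recover the approximate stand-alone yields, and is an illustration rather than the authors' full energy balance.

```python
combined = 13.2                # MJ per kg dry wheat straw (AD + HTC combined)
htc_alone = combined / 1.20    # "at least 20% higher than HTC alone"
ad_alone = combined / 1.602    # "60.2% higher than AD only"

print(f"HTC alone <= {htc_alone:.1f} MJ/kg dry straw")
print(f"AD alone  ~  {ad_alone:.1f} MJ/kg dry straw")
```

In other words, HTC alone would yield at most about 11 MJ/kg and AD alone about 8.2 MJ/kg, consistent with the reported percentage gains.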
Environmental Sciences, Issue 88, Biomethane, Hydrothermal Carbonization (HTC), Calorific Value, Lignocellulosic Biomass, UASS, Anaerobic Digestion
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded stained samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics of biochar. Six biochars, made from three different feedstocks at two temperatures, were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements, or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Large Volume (20L+) Filtration of Coastal Seawater Samples
Institutions: University of British Columbia - UBC.
The workflow begins with the collection of coastal marine waters for downstream microbial community, nutrient, and trace gas analyses. For this method, samples were collected from the deck of the HMS John Strickland operating in Saanich Inlet. This video documents large volume (≥20 L) filtration of microbial biomass, ranging between 0.22 μm and 2.7 μm in diameter, from the water column. Two 20 L samples can be filtered simultaneously using a single pump unit equipped with four rotating heads. Filtration is done in the field on extended trips, or immediately upon return for day trips. It is important to record the amount of water passing through each Sterivex filter unit. To prevent biofilm formation between sampling trips, all filtration equipment must be rinsed with dilute HCl and deionized water and autoclaved immediately after use. This procedure takes approximately 5 hours, plus an additional hour for cleanup.
Molecular Biology, Issue 28, microbial biomass, filtration, sterivex, GF/D, nucleic acids, seawater, fjord, hypoxic, Saanich Inlet
Quantitative Comparison of cis-Regulatory Element (CRE) Activities in Transgenic Drosophila melanogaster
Institutions: University of Dayton, University of Dayton.
Gene expression patterns are specified by cis-regulatory element (CRE) sequences, which are also called enhancers or cis-regulatory modules. A typical CRE possesses an arrangement of binding sites for several transcription factor proteins that confer a regulatory logic specifying when, where, and at what level the regulated gene(s) is expressed. The full set of CREs within an animal genome encodes the organism's program for development1, and empirical as well as theoretical studies indicate that mutations in CREs played a prominent role in morphological evolution2-4. Moreover, human genome-wide association studies indicate that genetic variation in CREs contributes substantially to phenotypic variation5,6. Thus, understanding regulatory logic and how mutations affect such logic is a central goal of genetics.
Reporter transgenes provide a powerful method to study the in vivo function of CREs. Here, a known or suspected CRE sequence is coupled to heterologous promoter and coding sequences for a reporter gene encoding an easily observable protein product. When a reporter transgene is inserted into a host organism, the CRE's activity becomes visible in the form of the encoded reporter protein. P-element-mediated transgenesis in the fruit fly species Drosophila (D.) melanogaster7 has been used for decades to introduce reporter transgenes into this model organism, though the genomic placement of transgenes is random. Hence, reporter gene activity is strongly influenced by the local chromatin and gene environment, limiting CRE comparisons to being qualitative. In recent years, the phiC31-based integration system was adapted for use in D. melanogaster to insert transgenes into specific genomic landing sites8-10. This capability has made the quantitative measurement of gene and, relevant here, CRE activity11-13 feasible. The production of transgenic fruit flies, including phiC31-based integration, can be outsourced, eliminating the need to purchase expensive equipment and/or be proficient at specialized transgene injection protocols.
Here, we present a general protocol to quantitatively evaluate a CRE's activity, and show how this approach can be used to measure the effects of an introduced mutation on a CRE's activity and to compare the activities of orthologous CREs. Although the examples given are for a CRE active during fruit fly metamorphosis, the approach can be applied to other developmental stages, fruit fly species, or model organisms. Ultimately, more widespread use of this approach to study CREs should advance our understanding of regulatory logic and how that logic can vary and evolve.
Developmental Biology, Issue 58, Cis-regulatory element, CRE, cis-regulatory module, enhancer, site-specific integration, reporter transgenes, confocal microscopy, regulatory logic, transcription factors, binding sites, Drosophila melanogaster, Drosophila
Isolation of Genomic DNA from Mouse Tails
Institutions: University of California, Irvine (UCI).
Basic Protocols, Issue 6, genomic, DNA, genotyping, mouse
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Universite de Montreal, Universite de Montreal, Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases, where there is strong negative selection against the phenotype, such as lethality at embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries, despite strong negative selection against them2. This is not what one would predict for diseases with reduced reproductive fitness, unless there was a high new-mutation rate. Finally, there are the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4-6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases such as autism and schizophrenia, and why diseases have been linked to a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Small Volume (1-3L) Filtration of Coastal Seawater Samples
Institutions: University of British Columbia - UBC.
The workflow begins with the collection of coastal marine waters for downstream microbial community, nutrient, and trace gas analyses. For this demonstration, samples were collected from the deck of the HMS John Strickland operating in Saanich Inlet. This video documents small volume (~1 L) filtration of microbial biomass from the water column. The protocol is an extension of the large volume sampling protocol described earlier, with one major difference: here there is no pre-filtration step, so all size classes of biomass down to the 0.22 μm filter cut-off are collected. Samples collected this way are ideal for nucleic acid analysis. The set-up, filtration, and clean-up steps each take about 20-30 minutes. If using two peristaltic pumps simultaneously, up to 8 samples may be filtered at the same time. To prevent biofilm formation between sampling trips, all filtration equipment must be rinsed with dilute HCl and deionized water and autoclaved immediately after use.
Molecular Biology, Issue 28, microbiology, seawater, filtration, biomass concentration
Investigating the Microbial Community in the Termite Hindgut - Interview
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter explains why the termite-gut microbial community is an excellent system for studying the complex interactions between microbes. The symbiotic relationship existing between the host insect and lignocellulose-degrading gut microbes is explained, as well as the industrial uses of these microbes for degrading plant biomass and generating biofuels.
Microbiology, issue 4, microbial community, diversity