As DNA sequencing technology has advanced markedly in recent years [2], it has become increasingly evident that the amount of genetic variation between any two individuals is greater than previously thought [3]. In contrast, array-based genotyping has failed to identify a significant contribution of common sequence variants to the phenotypic variability of common disease [4,5]. Taken together, these observations have led to the Common Disease / Rare Variant hypothesis, which suggests that the majority of the "missing heritability" in common and complex phenotypes is instead due to an individual's personal profile of rare or private DNA variants [6-8]. However, characterizing how rare variation impacts complex phenotypes requires the analysis of many affected individuals at many genomic loci, ideally compared against a similar survey in an unaffected cohort. Despite the sequencing power offered by today's platforms, a population-based survey of many genomic loci, and the subsequent computational analysis required, remains prohibitive for many investigators.
To address this need, we have developed a pooled sequencing approach [1,9] and a novel software package [1] for highly accurate rare variant detection from the resulting data. Pooling genomes from entire populations of affected individuals and surveying the degree of genetic variation at multiple targeted regions in a single sequencing library offers substantial cost and time savings over traditional single-sample sequencing. With a mean sequencing coverage per allele of 25-fold, our custom algorithm, SPLINTER, uses an internal variant calling control strategy to call insertions, deletions, and substitutions up to four base pairs in length with high sensitivity and specificity, down to 1 mutant allele in pools of up to 500 individuals. Here we describe the method for preparing the pooled sequencing library, followed by step-by-step instructions on how to use the SPLINTER package for pooled sequencing analysis (http://www.ibridgenetwork.org/wustl/splinter). We compare pooled sequencing of 947 individuals, at over 20 kb of targeted sequence per person, with genome-wide array genotyping of the same individuals. Concordance between array genotypes and both tagged and novel variants called in the pooled sample was excellent. This method can be easily scaled up to any number of genomic loci and any number of individuals. By incorporating internal positive and negative amplicon controls at ratios that mimic the population under study, the algorithm can be calibrated for optimal performance. The strategy can also be modified for use with hybridization capture or individual-specific barcodes, and can be applied to the sequencing of naturally heterogeneous samples, such as tumor DNA.
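The detection problem above can be made concrete with a back-of-the-envelope calculation. The sketch below estimates the expected number of variant-supporting reads for a single mutant allele in a pool of 947 diploid individuals at 25-fold per-allele coverage, and the probability that sequencing error alone would produce that many mismatching reads at one site. The 1e-4 effective (post-filtering) error rate is an illustrative assumption, not a SPLINTER parameter.

```python
from math import comb

def expected_minor_fraction(n_individuals, mutant_alleles=1):
    """Fraction of reads expected to carry the variant in a pool of diploid genomes."""
    return mutant_alleles / (2 * n_individuals)

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), via the complement of the lower tail."""
    return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

n = 947                                   # individuals in the pool
f = expected_minor_fraction(n)            # ~0.000528 for a single mutant allele
depth = 2 * n * 25                        # total read depth at 25x per allele
expected_reads = round(depth * f)         # reads expected from the mutant allele

# Chance that an effective per-base error rate of 1e-4 (illustrative) alone
# yields at least that many mismatching reads at one site:
p_false = binom_tail(expected_reads, depth, 1e-4)
```

Even at the lowest possible allele frequency, the true variant contributes about 25 supporting reads, which random error at this rate essentially never matches, illustrating why deep pooled coverage plus an error model can separate rare variants from noise.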
23 Related JoVE Articles
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
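The fitting step described above can be sketched with a single-site saturation model, Tm([L]) = Tm0 + dTm * [L] / (Kd + [L]). The data below are hypothetical and noise-free, and the model choice and 50 uM Kd are illustrative assumptions; the sketch exploits the fact that for a fixed Kd the other two parameters enter linearly, so a one-dimensional search over Kd suffices.

```python
def tm_model(conc, tm0, dtm, kd):
    """Single-site saturation model for the observed melting temperature."""
    return tm0 + dtm * conc / (kd + conc)

# hypothetical, noise-free "measurements": Tm0 = 55 C, dTm = 6 C, Kd = 50 uM
concs = [0, 5, 10, 25, 50, 100, 250, 500, 1000]      # ligand concentration, uM
tms = [tm_model(c, 55.0, 6.0, 50.0) for c in concs]

def fit_linear_params(kd):
    """For a fixed Kd, Tm0 and dTm are linear: solve the 2x2 normal equations."""
    xs = [c / (kd + c) for c in concs]
    n, sx, sy = len(xs), sum(xs), sum(tms)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, tms))
    dtm = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    tm0 = (sy - dtm * sx) / n
    return tm0, dtm

def sse(kd):
    """Sum of squared residuals for the best linear parameters at this Kd."""
    tm0, dtm = fit_linear_params(kd)
    return sum((tm_model(c, tm0, dtm, kd) - y) ** 2 for c, y in zip(concs, tms))

# grid search for Kd on a log scale from 1 to ~10,000 uM
best_kd = min((10 ** (e / 50) for e in range(200)), key=sse)
```

With real, noisy data the same structure applies; a standard nonlinear least-squares routine would simply replace the grid search.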
Infinium Assay for Large-scale SNP Genotyping Applications
Institutions: Oklahoma Medical Research Foundation.
Genotyping variants in the human genome has proven to be an efficient method to identify genetic associations with phenotypes. The distribution of variants within families or populations can facilitate identification of the genetic factors of disease. Illumina's panel of genotyping BeadChips allows investigators to genotype thousands or millions of single nucleotide polymorphisms (SNPs) or to analyze other genomic variants, such as copy number, across a large number of DNA samples. These SNPs can be spread throughout the genome or targeted in specific regions in order to maximize potential discovery. The Infinium assay has been optimized to yield high-quality, accurate results quickly. With proper setup, a single technician can process from a few hundred to over a thousand DNA samples per week, depending on the type of array. This protocol guides users through every step, starting with genomic DNA and ending with the scanning of the array. Using proprietary reagents, samples are amplified, fragmented, precipitated, resuspended, hybridized to the chip, extended by a single base, stained, and scanned on either an iScan or HiScan high-resolution optical imaging system. One overnight step is required to amplify the DNA. The DNA is denatured and isothermally amplified by whole-genome amplification; therefore, no PCR is required. Samples are hybridized to the arrays during a second overnight step. By the third day, the samples are ready to be scanned and analyzed. Amplified DNA may be stockpiled in large quantities, allowing bead arrays to be processed every day of the week, thereby maximizing throughput.
Basic Protocol, Issue 81, genomics, SNP, Genotyping, Infinium, iScan, HiScan, Illumina
Detecting Somatic Genetic Alterations in Tumor Specimens by Exon Capture and Massively Parallel Sequencing
Institutions: Memorial Sloan-Kettering Cancer Center.
Efforts to detect and investigate key oncogenic mutations have proven valuable to facilitate the appropriate treatment for cancer patients. The establishment of high-throughput, massively parallel "next-generation" sequencing has aided the discovery of many such mutations. To enhance the clinical and translational utility of this technology, platforms must be high-throughput, cost-effective, and compatible with formalin-fixed, paraffin-embedded (FFPE) tissue samples that may yield small amounts of degraded or damaged DNA. Here, we describe the preparation of barcoded and multiplexed DNA libraries followed by hybridization-based capture of targeted exons for the detection of cancer-associated mutations in fresh frozen and FFPE tumors by massively parallel sequencing. This method enables the identification of sequence mutations, copy number alterations, and select structural rearrangements involving all targeted genes. Targeted exon sequencing offers the benefits of high throughput, low cost, and deep sequence coverage, thus conferring high sensitivity for detecting low frequency mutations.
Molecular Biology, Issue 80, Molecular Diagnostic Techniques, High-Throughput Nucleotide Sequencing, Genetics, Neoplasms, Diagnosis, Massively parallel sequencing, targeted exon sequencing, hybridization capture, cancer, FFPE, DNA mutations
Simulation of the Planetary Interior Differentiation Processes in the Laboratory
Institutions: Carnegie Institution of Washington.
A planetary interior is under high-pressure and high-temperature conditions and has a layered structure. Two important processes led to that layered structure: (1) percolation of liquid metal through a solid silicate matrix during planetary differentiation, and (2) inner core crystallization during subsequent planetary cooling. We conduct high-pressure and high-temperature experiments to simulate both processes in the laboratory. Formation of a percolative planetary core depends on the efficiency of melt percolation, which is controlled by the dihedral (wetting) angle. The percolation simulation includes heating the sample at high pressure to a target temperature at which the iron-sulfur alloy is molten while the silicate remains solid, and then determining the true dihedral angle to evaluate the style of liquid migration in a crystalline matrix by 3D visualization. The 3D volume rendering is achieved by slicing the recovered sample with a focused ion beam (FIB) and taking an SEM image of each slice with a FIB/SEM crossbeam instrument. The second set of experiments is designed to understand the inner core crystallization and element distribution between the liquid outer core and solid inner core by determining the melting temperature and element partitioning at high pressure. The melting experiments are conducted in the multi-anvil apparatus up to 27 GPa and extended to higher pressure in the diamond-anvil cell with laser heating. We have developed techniques to recover small heated samples by precision FIB milling and obtain high-resolution images of the laser-heated spot that show melting texture at high pressure. By analyzing the chemical compositions of the coexisting liquid and solid phases, we precisely determine the liquidus curve, providing necessary data to understand the inner core crystallization process.
Physics, Issue 81, Geophysics, Planetary Science, Geochemistry, Planetary interior, high-pressure, planet differentiation, 3D tomography
Investigating the Effects of Probiotics on Pneumococcal Colonization Using an In Vitro Adherence Assay
Institutions: Murdoch Childrens Research Institute, The University of Melbourne.
Adherence of Streptococcus pneumoniae (the pneumococcus) to the epithelial lining of the nasopharynx can result in colonization and is considered a prerequisite for pneumococcal infections such as pneumonia and otitis media. In vitro adherence assays can be used to study the attachment of pneumococci to epithelial cell monolayers and to investigate potential interventions, such as the use of probiotics, to inhibit pneumococcal colonization. The protocol described here is used to investigate the effects of the probiotic Streptococcus salivarius on the adherence of pneumococci to the human epithelial cell line CCL-23 (sometimes referred to as HEp-2 cells). The assay involves three main steps: 1) preparation of epithelial and bacterial cells, 2) addition of bacteria to epithelial cell monolayers, and 3) detection of adherent pneumococci by viable counts (serial dilution and plating) or quantitative real-time PCR (qPCR). This technique is relatively straightforward and does not require specialized equipment other than a tissue culture setup. The assay can be used to test other probiotic species and/or potential inhibitors of pneumococcal colonization and can be easily modified to address other scientific questions regarding pneumococcal adherence and invasion.
Immunology, Issue 86, Gram-Positive Bacterial Infections, Pneumonia, Bacterial, Lung Diseases, Respiratory Tract Infections, Streptococcus pneumoniae, adherence, colonization, probiotics, Streptococcus salivarius, In Vitro assays
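The viable-count readout in step 3 reduces to a standard back-calculation from a countable plate. A minimal sketch, with hypothetical colony counts and dilutions (the numbers below are illustrative, not from the protocol):

```python
def cfu_per_ml(colonies, dilution_factor, plated_volume_ml):
    """Back-calculate viable bacteria per ml of the undiluted sample."""
    return colonies / plated_volume_ml * dilution_factor

# hypothetical counts: 150 colonies from the 10^-5 dilution of the inoculum,
# 75 colonies from the 10^-2 dilution of the lysed, washed monolayer
inoculum_cfu = cfu_per_ml(150, 1e5, 0.1)     # CFU/ml added to the wells
adherent_cfu = cfu_per_ml(75, 1e2, 0.1)      # CFU/ml recovered as adherent
percent_adherence = 100 * adherent_cfu / inoculum_cfu
```

Reporting adherence as a percentage of the inoculum, as here, makes runs with slightly different starting densities comparable.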
Rapid and Efficient Zebrafish Genotyping Using PCR with High-resolution Melt Analysis
Institutions: University of Utah School of Medicine.
Zebrafish is a powerful vertebrate model system for studying development, modeling disease, and performing drug screening. Recently a variety of genetic tools have been introduced, including multiple strategies for inducing mutations and generating transgenic lines. However, large-scale screening is limited by traditional genotyping methods, which are time-consuming and labor-intensive. Here we describe a technique to analyze zebrafish genotypes by PCR combined with high-resolution melting analysis (HRMA). This approach is rapid, sensitive, and inexpensive, with lower risk of contamination artifacts. Genotyping by PCR with HRMA can be used for embryos or adult fish, including in high-throughput screening protocols.
Basic Protocol, Issue 84, genotyping, high-resolution melting analysis (HRMA), PCR, zebrafish, mutation, transgenes
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing, while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: the first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high-throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation, with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes, and include discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the ViroSeq genotyping method. Limitations of the method described here include the fact that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
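Of the four approaches above, the semi-automated route often begins with a global intensity threshold. A minimal sketch of Otsu's method, a generic histogram-based thresholding algorithm (not one of the custom tools used in the article), on a toy bimodal histogram:

```python
def otsu_threshold(hist):
    """Pick the bin that maximizes between-class variance (Otsu's method)."""
    total = sum(hist)
    grand_sum = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w_bg += h                       # background = bins 0..t (inclusive)
        sum_bg += t * h
        w_fg = total - w_bg
        if w_bg == 0 or w_fg == 0:
            continue                    # both classes must be non-empty
        mu_bg = sum_bg / w_bg
        mu_fg = (grand_sum - sum_bg) / w_fg
        between_var = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

# toy bimodal histogram: background peak near bin 2, stained features near bin 10
hist = [0, 5, 40, 30, 5, 0, 0, 0, 2, 10, 35, 25, 8, 0, 0, 0]
threshold = otsu_threshold(hist)   # voxels above this bin would be "feature"
```

In practice such a threshold only seeds the segmentation; the surface rendering and manual correction steps described above still follow.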
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
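The first step of the method, probing local orientation with Gabor filters, can be sketched in a few lines. The patch, kernel size, and parameters below are illustrative assumptions; the article's actual filter bank, node-map, and phase-portrait analysis are considerably more elaborate.

```python
import math

def gabor_kernel(size, theta, wavelength, sigma):
    """Real Gabor kernel: a sinusoid at angle theta under a Gaussian envelope."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
            row.append(envelope * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

def response(patch, kernel):
    """Correlation of a patch with a kernel of the same size."""
    return sum(p * k for prow, krow in zip(patch, kernel)
                     for p, k in zip(prow, krow))

# toy 9x9 patch: stripes oscillating along x, i.e. oriented structure at theta = 0
patch = [[math.cos(2 * math.pi * x / 4) for x in range(-4, 5)]
         for _ in range(-4, 5)]

# estimate the dominant local orientation from a small filter bank
angles = [i * math.pi / 8 for i in range(8)]
dominant = max(angles, key=lambda th: response(patch, gabor_kernel(9, th, 4, 2.0)))
```

Applying such a bank at every pixel yields the orientation field that the phase-portrait step then searches for radiating, node-like patterns.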
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Quantifying Yeast Chronological Life Span by Outgrowth of Aged Cells
Institutions: University of Washington.
The budding yeast Saccharomyces cerevisiae has proven to be an important model organism in the field of aging research [1]. The replicative and chronological life spans are two established paradigms used to study aging in yeast. Replicative aging is defined as the number of daughter cells a single yeast mother cell produces before senescence; chronological aging is defined by the length of time cells can survive in a non-dividing, quiescence-like state [2]. We have developed a high-throughput method for quantitative measurement of chronological life span. This method involves aging the cells in a defined medium under agitation and at constant temperature. At each age-point, a sub-population of cells is removed from the aging culture and inoculated into rich growth medium. A high-resolution growth curve is then obtained for this sub-population of aged cells using a Bioscreen C MBR machine. An algorithm is then applied to determine the relative proportion of viable cells in each sub-population based on the growth kinetics at each age-point. This method requires substantially less time and resources compared to other chronological life span assays while maintaining reproducibility and precision. The high-throughput nature of this assay should allow for large-scale genetic and chemical screens to identify novel longevity modifiers for further testing in more complex organisms.
Microbiology, Issue 27, longevity, aging, chronological life span, yeast, Bioscreen C MBR, stationary phase
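The viability algorithm mentioned above rests on a simple idea: if only a fraction v of the inoculated cells is viable, the outgrowth curve is delayed by log2(1/v) doubling times relative to a fully viable culture. A sketch under that assumption, using synthetic exponential growth curves with hypothetical parameters (this is the general time-shift logic, not the published implementation):

```python
def time_to_threshold(times, ods, threshold=0.3):
    """Linearly interpolated time at which a growth curve crosses the threshold."""
    for (t0, y0), (t1, y1) in zip(zip(times, ods), zip(times[1:], ods[1:])):
        if y0 < threshold <= y1:
            return t0 + (threshold - y0) * (t1 - t0) / (y1 - y0)
    raise ValueError("curve never reaches the threshold")

def survival_fraction(shift_hr, doubling_time_hr):
    """Each doubling time of outgrowth delay implies half as many viable cells."""
    return 2.0 ** (-shift_hr / doubling_time_hr)

# synthetic Bioscreen-style curves: OD = min(1, OD0 * v * 2^(t/Td)), Td = 2 hr
td, od0 = 2.0, 0.001
times = [i * 0.5 for i in range(49)]                 # 0-24 hr, 30 min reads

def curve(viable_frac):
    return [min(1.0, od0 * viable_frac * 2 ** (t / td)) for t in times]

t_young = time_to_threshold(times, curve(1.0))       # age-point at 100% viability
t_aged = time_to_threshold(times, curve(0.25))       # later age-point
estimated_viability = survival_fraction(t_aged - t_young, td)
```

Plotting estimated viability against culture age across all age-points then yields the chronological survival curve.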
Profiling of Pre-micro RNAs and microRNAs using Quantitative Real-time PCR (qPCR) Arrays
Institutions: University of North Carolina at Chapel Hill.
Quantitative real-time PCR (qPCR) has emerged as an accurate and valuable tool in profiling gene expression levels. One of its many advantages is a lower detection limit compared to other methods of gene expression profiling, while using smaller amounts of input for each assay. Automated qPCR setup has improved this field by allowing for greater reproducibility. Its convenient and rapid setup allows for high-throughput experiments, enabling the profiling of many different genes simultaneously in each experiment. This method, along with internal plate controls, also reduces experimental variables common to other techniques.
We recently developed a qPCR assay for profiling of pre-microRNAs (pre-miRNAs) using a set of 186 primer pairs. MicroRNAs have emerged as a novel class of small, non-coding RNAs with the ability to regulate many mRNA targets at the post-transcriptional level. These small RNAs are first transcribed by RNA polymerase II as a primary miRNA (pri-miRNA) transcript, which is then cleaved into the precursor miRNA (pre-miRNA). Pre-miRNAs are exported to the cytoplasm where Dicer cleaves the hairpin loop to yield mature miRNAs. Increases in miRNA levels can be observed at both the precursor and mature miRNA levels and profiling of both of these forms can be useful. There are several commercially available assays for mature miRNAs; however, their high cost may deter researchers from this profiling technique. Here, we discuss a cost-effective, reliable, SYBR-based qPCR method of profiling pre-miRNAs. Changes in pre-miRNA levels often reflect mature miRNA changes and can be a useful indicator of mature miRNA expression. However, simultaneous profiling of both pre-miRNAs and mature miRNAs may be optimal as they can contribute nonredundant information and provide insight into microRNA processing. Furthermore, the technique described here can be expanded to encompass the profiling of other library sets for specific pathways or pathogens.
Biochemistry, Issue 46, pre-microRNAs, qPCR, profiling, Tecan Freedom Evo, robot
Identifying the Effects of BRCA1 Mutations on Homologous Recombination using Cells that Express Endogenous Wild-type BRCA1
Institutions: The Ohio State University, Tohoku University.
The functional analysis of missense mutations can be complicated by the presence in the cell of the endogenous protein. Structure-function analyses of BRCA1 have been complicated by the lack of a robust assay for the full length BRCA1 protein and the difficulties inherent in working with cell lines that express hypomorphic BRCA1 protein1,2,3,4,5. We developed a system whereby the endogenous BRCA1 protein in a cell was acutely depleted by RNAi targeting the 3'-UTR of the BRCA1 mRNA and replaced by co-transfecting a plasmid expressing a BRCA1 variant. One advantage of this procedure is that the acute silencing of BRCA1 and simultaneous replacement allow the cells to grow without secondary mutations or adaptations that might arise over time to compensate for the loss of BRCA1 function. This depletion and add-back procedure was done in a HeLa-derived cell line that was readily assayed for homologous recombination activity. The homologous recombination assay is based on a previously published method whereby a recombination substrate is integrated into the genome (Figure 1)6,7,8,9. This recombination substrate carries the rare-cutting I-SceI restriction enzyme site inside an inactive GFP allele, with a second inactive GFP allele downstream. Transfection of a plasmid expressing I-SceI results in a double-stranded break, which may be repaired by homologous recombination; if homologous recombination does repair the break, it creates an active GFP allele that is readily scored by flow cytometry for GFP protein expression. Depletion of endogenous BRCA1 resulted in an 8-10-fold reduction in homologous recombination activity, and add-back of the wild-type plasmid fully restored homologous recombination function. When specific point mutants of full-length BRCA1 were expressed from co-transfected plasmids, the effect of each missense mutant could be scored. As an example, when BRCA1(M18T), a variant of unknown clinical significance10, was expressed in these cells, it failed to restore BRCA1-dependent homologous recombination. By contrast, expression of another variant of unknown significance, BRCA1(I21V), fully restored BRCA1-dependent homologous recombination function. This strategy of testing the function of BRCA1 missense mutations has been applied to another biological system assaying for centrosome function (Kais et al., unpublished observations). Overall, this approach is suitable for the analysis of missense mutants in any gene that must be analyzed recessively.
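Scoring a variant in this assay reduces to normalizing the fraction of GFP-positive cells to the wild-type add-back; a minimal sketch, with the function name and percentages as illustrative assumptions consistent with the reported 8-10-fold reduction:

```python
def relative_hr_activity(pct_gfp_sample, pct_gfp_wildtype):
    """Homologous recombination activity of a BRCA1 variant add-back,
    expressed as a percentage of the wild-type add-back (100%)."""
    return 100.0 * pct_gfp_sample / pct_gfp_wildtype

# Illustrative flow cytometry read-outs (% GFP-positive cells):
print(relative_hr_activity(0.5, 4.0))  # BRCA1-depleted: 12.5 (~8-fold down)
print(relative_hr_activity(4.0, 4.0))  # wild-type add-back: 100.0
```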
Cell Biology, Issue 48, BRCA1, homologous recombination, breast cancer, RNA interference, DNA repair
Isolation of Fidelity Variants of RNA Viruses and Characterization of Virus Mutation Frequency
Institutions: Institut Pasteur.
RNA viruses use RNA-dependent RNA polymerases to replicate their genomes. The intrinsically high error rate of these enzymes is a large contributor to the generation of the extreme population diversity that facilitates virus adaptation and evolution. Increasing evidence shows that the intrinsic error rates, and the resulting mutation frequencies, of RNA viruses can be modulated by subtle amino acid changes to the viral polymerase. Although biochemical assays exist for some viral RNA polymerases that permit quantitative measurement of incorporation fidelity, here we describe a simple method of measuring mutation frequencies of RNA viruses that has proven to be as accurate as biochemical approaches in identifying fidelity-altering mutations. The approach uses conventional virological and sequencing techniques that can be performed in most biology laboratories. Based on our experience with a number of different viruses, we have identified the key steps that must be optimized to increase the likelihood of isolating fidelity variants and generating data of statistical significance. The isolation and characterization of fidelity-altering mutations can provide new insights into polymerase structure and function1-3. Furthermore, these fidelity variants can be useful tools in characterizing mechanisms of virus adaptation and evolution4-7.
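Mutation frequencies from such sequencing data are commonly reported per 10,000 nucleotides sequenced; a minimal sketch of the calculation, with the counts as illustrative assumptions rather than data from the article:

```python
def mutation_frequency(n_mutations, n_nucleotides, per=10_000):
    """Mutation frequency as mutations per `per` nucleotides
    sequenced across all clones."""
    return n_mutations * per / n_nucleotides

# Illustrative: 36 mutations found across 120 clones x 800 nt each
print(mutation_frequency(36, 120 * 800))  # 3.75 (per 10,000 nt)
```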
Immunology, Issue 52, Polymerase fidelity, RNA virus, mutation frequency, mutagen, RNA polymerase, viral evolution
Quantitative, Real-time Analysis of Base Excision Repair Activity in Cell Lysates Utilizing Lesion-specific Molecular Beacons
Institutions: University of Pittsburgh School of Medicine, University of Pittsburgh Cancer Institute, The Netherlands Cancer Institute, University of Pittsburgh School of Public Health.
We describe a method for the quantitative, real-time measurement of DNA glycosylase and AP endonuclease activities in cell nuclear lysates using base excision repair (BER) molecular beacons. The substrate (beacon) comprises a deoxyoligonucleotide containing a single base lesion, with a 6-carboxyfluorescein (6-FAM) moiety conjugated to the 5' end and a Dabcyl moiety conjugated to the 3' end of the oligonucleotide. The BER molecular beacon is 43 bases in length, and the sequence is designed to promote the formation of a stem-loop structure with 13 nucleotides in the loop and 15 base pairs in the stem1,2. When folded in this configuration, the 6-FAM moiety is quenched by Dabcyl in a non-fluorescent manner via Förster resonance energy transfer (FRET)3,4. The lesion is positioned such that, following base lesion removal and strand scission, the remaining 5-base oligonucleotide containing the 6-FAM moiety is released from the stem. Release and detachment from the quencher (Dabcyl) result in an increase in fluorescence that is proportional to the level of DNA repair. By collecting multiple reads of the fluorescence values, real-time assessment of BER activity is possible. The use of standard quantitative real-time PCR instruments allows the simultaneous analysis of numerous samples. The design of these BER molecular beacons, with a single base lesion, is amenable to kinetic analyses, BER quantification and inhibitor validation, and is adaptable for quantification of DNA repair activity in tissue and tumor cell lysates or with purified proteins. The analysis of BER activity in tumor lysates or tissue aspirates using these molecular beacons may be applicable to functional biomarker measurements. Further, the analysis of BER activity with purified proteins using this quantitative assay provides a rapid, high-throughput method for the discovery and validation of BER inhibitors.
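Because fluorescence rises in proportion to repair, the initial slope of the trace serves as a rate estimate for kinetic analysis; a minimal pure-Python sketch of a least-squares slope over the early linear phase, with the data values as illustrative assumptions:

```python
def initial_rate(times, fluorescence):
    """Least-squares slope of fluorescence vs. time over the early,
    linear phase of the trace, as a proxy for the initial BER rate."""
    n = len(times)
    mean_t = sum(times) / n
    mean_f = sum(fluorescence) / n
    num = sum((t - mean_t) * (f - mean_f)
              for t, f in zip(times, fluorescence))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Illustrative read-out: fluorescence rising at roughly 50 RFU/min
t = [0, 1, 2, 3, 4]            # minutes
f = [100, 152, 198, 251, 300]  # relative fluorescence units
print(round(initial_rate(t, f), 1))  # 49.9
```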
Molecular Biology, Issue 66, Genetics, Cancer Biology, Base excision repair, DNA glycosylase, AP endonuclease, fluorescent, real-time, activity assay, molecular beacon, biomarker, DNA Damage, base lesion
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. This is not what one would predict for diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, there is the effect of paternal age: the risk of disease increases significantly with paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases such as autism and schizophrenia, and why diseases have been linked to a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
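The targeted approach of sequencing parents and affected subjects reduces, at each variant site, to asking whether the child carries an allele absent from both parents; a minimal sketch of that trio logic, with the genotype encoding as an assumption rather than anything specified in the article:

```python
def is_de_novo(child_gt, mother_gt, father_gt):
    """Flag a candidate de novo mutation: the child carries an
    allele found in neither parental genotype (genotypes encoded
    as strings of allele characters, e.g. "AG")."""
    parental_alleles = set(mother_gt) | set(father_gt)
    return bool(set(child_gt) - parental_alleles)

print(is_de_novo("AG", "AA", "AA"))  # True  (G arose de novo)
print(is_de_novo("AG", "AG", "AA"))  # False (G inherited from mother)
```

In practice such candidates must still be validated (e.g. by Sanger sequencing) to exclude sequencing errors and sample mix-ups.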
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimate of the ideal midline is first obtained from the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for identifying the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. Evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend invasive ICP monitoring.
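The midline shift itself reduces to the displacement between the ideal midline (from skull symmetry) and the actual midline (from ventricle shape matching); a minimal sketch of that arithmetic, with the pixel coordinates and pixel spacing as assumptions, not values from the paper:

```python
def midline_shift_mm(ideal_x_px, actual_x_px, mm_per_px):
    """Midline shift: displacement of the actual midline from the
    ideal midline along the image x-axis, converted from pixels
    to millimetres using the CT pixel spacing."""
    return abs(actual_x_px - ideal_x_px) * mm_per_px

# Illustrative: a 12-pixel displacement at 0.5 mm/pixel spacing
print(midline_shift_mm(256, 268, 0.5))  # 6.0 mm
```

A shift on this order (> 5 mm) is the kind of finding that would typically prompt closer clinical attention, which is why a fast automated estimate is useful for pre-screening.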
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Quantitative Real-Time PCR using the Thermo Scientific Solaris qPCR Assay
Institutions: Thermo Scientific Solaris qPCR Products.
The Solaris qPCR Gene Expression Assay is a novel type of primer/probe set, designed to simplify the qPCR process while maintaining the sensitivity and accuracy of the assay. These primer/probe sets are pre-designed to cover >98% of the human and mouse genomes and feature significant improvements over previously available technologies. These improvements were made possible by a novel design algorithm developed by Thermo Scientific bioinformatics experts.
Several convenient features have been incorporated into the Solaris qPCR Assay to streamline the process of performing quantitative real-time PCR. First, the protocol is similar to commonly employed alternatives, so the methods used during qPCR are likely to be familiar. Second, the master mix is blue, which makes qPCR reaction setup easier to track. Third, the thermal cycling conditions are the same for all assays (genes), making it possible to run many samples at a time and reducing the potential for error. Finally, the probe and primer sequence information are provided, simplifying the publication process.
Here, we demonstrate how to obtain the appropriate Solaris reagents using the GENEius product search feature found on the ordering web site (www.thermo.com/solaris) and how to use the Solaris reagents for performing qPCR using the standard curve method.
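In the standard curve method, amplification efficiency is derived from the slope of Ct versus log10(template amount); a minimal sketch of that calculation, with the slope value as an illustrative assumption:

```python
def pcr_efficiency(slope):
    """Amplification efficiency from a standard curve of Ct vs.
    log10(template amount): E = 10^(-1/slope) - 1, so a slope of
    about -3.32 corresponds to ~100% efficiency (doubling per cycle)."""
    return 10 ** (-1.0 / slope) - 1.0

print(round(pcr_efficiency(-3.3219), 3))  # 1.0, i.e. ~100% efficient
```

Efficiencies well below ~90% or above ~110% usually indicate a problem with the assay or the dilution series rather than a biological effect.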
Cellular Biology, Issue 40, qPCR, probe, real-time PCR, molecular biology, Solaris, primer, gene expression assays
Rapid Genotyping of Mouse Tissue Using Sigma's Extract-N-Amp Tissue PCR Kit
Institutions: University of California, Irvine (UCI).
Genomic detection of DNA via PCR amplification and detection on an electrophoretic gel is a standard way to determine the genotype of a tissue sample. Conventional preparation of tissues for PCR-ready DNA often takes several hours to days, depending on the tissue sample. The genotype of the sample may thus be delayed for several days, which is not an option for many types of experiments. Here we demonstrate the complete genotyping of a mouse tail sample, including tissue digestion and PCR readout, in one and a half hours using Sigma's SYBR Green Extract-N-Amp Tissue PCR Kit. First, we demonstrate the fifteen-minute extraction of DNA from the tissue sample. Then, we demonstrate the real-time readout of the PCR amplification of the sample, which allows a positive sample to be identified as it is being amplified. Together, the rapid extraction and real-time readout allow for prompt genotype identification of a variety of tissue types through the reliable method of PCR.
Basic Protocols, Issue 11, genotyping, PCR, DNA extraction, Mice
Pyrosequencing: A Simple Method for Accurate Genotyping
Institutions: Washington University in St. Louis.
Pharmacogenetic research benefits first-hand from the abundance of information provided by the completion of the Human Genome Project. With such a tremendous amount of data available comes an explosion of genotyping methods. Pyrosequencing(R) is one of the most thorough yet simple methods to date for analyzing polymorphisms. It can also identify tri-allelic and indel polymorphisms and short repeats, as well as determine allele percentages for methylation or pooled sample assessment. In addition, there is a standardized control sequence that provides internal quality control. This method has enabled rapid and efficient evaluation of single-nucleotide polymorphisms, including many clinically relevant polymorphisms. The technique and methodology of Pyrosequencing are explained.
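Allele-percentage determination for pooled or methylation assays reduces to normalizing the two allele-specific peak heights at a position; a minimal sketch, with the peak values as illustrative assumptions rather than instrument output:

```python
def allele_percentages(peak_a, peak_b):
    """Allele percentages at a SNP from the two allele-specific
    peak heights measured in a pooled or methylation assay."""
    total = peak_a + peak_b
    return 100.0 * peak_a / total, 100.0 * peak_b / total

# Illustrative peak heights for the two alleles at one SNP
print(allele_percentages(30.0, 10.0))  # (75.0, 25.0)
```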
Cellular Biology, Issue 11, Springer Protocols, Pyrosequencing, genotype, polymorphism, SNP, pharmacogenetics, pharmacogenomics, PCR