JoVE Visualize
Related JoVE Video
 
Pubmed Article
Classification of the adenylation and acyl-transferase activity of NRPS and PKS systems using ensembles of substrate specific hidden Markov models.
PLoS ONE
PUBLISHED: 01-01-2013
There is a growing interest in the non-ribosomal peptide synthetases (NRPSs) and polyketide synthases (PKSs) of microbes, fungi and plants because they can produce bioactive peptides such as antibiotics. The ability to identify the substrate specificity of the enzymes' adenylation (A) and acyl-transferase (AT) domains is essential to rationally deduce or engineer new products. We here report on a Hidden Markov Model (HMM)-based ensemble method to predict substrate specificity with high quality. We collected a new reference set of experimentally validated sequences. An initial classification based on alignment and Neighbor Joining was performed in line with most of the previously published prediction methods. We then created and tested single substrate-specific HMMs and found that their use significantly improved correct identification for both A and AT domains. A major advantage of using HMMs is that it removes the dependence on multiple sequence alignment and residue selection that hampers alignment-based clustering methods. Using our models, we obtained a prediction quality for A-domain substrate specificity similar to that of two recently published tools that make use of HMMs or Support Vector Machines (NRPSsp and NRPS predictor2, respectively). Moreover, replacing the single substrate-specific HMMs with ensembles of models clearly increased prediction quality. We argue that the superiority of the ensemble over the single model is caused by the way substrate specificity evolves in the studied systems. It is likely that this also holds true for other protein domains. The ensemble predictor has been implemented in a simple web-based tool that is available at http://www.cmbi.ru.nl/NRPS-PKS-substrate-predictor/.
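The ensemble strategy described above (pooling several substrate-specific profile models and letting their combined scores make the call) can be sketched in a few lines of Python. The snippet below is a minimal illustration rather than the published tool: it assumes per-substrate profile HMMs have already been built with HMMER's hmmbuild and concatenated into one database whose model names carry the substrate as a prefix (e.g. Ala_1, Ala_2, Val_1), a naming convention invented here for the example.

```python
# Minimal sketch of ensemble-based substrate prediction with HMMER (assumptions as stated above).
import subprocess
from collections import defaultdict

def predict_substrate(domain_fasta, hmm_db="substrate_models.hmm"):
    """Score one extracted A or AT domain against all substrate-specific HMMs and
    return the substrate whose ensemble of models scores highest on average."""
    # hmmscan writes a whitespace-delimited per-target table when given --tblout.
    subprocess.run(["hmmscan", "--tblout", "hits.tbl", hmm_db, domain_fasta], check=True)
    scores = defaultdict(list)
    with open("hits.tbl") as fh:
        for line in fh:
            if line.startswith("#"):
                continue
            fields = line.split()
            model_name = fields[0]        # e.g. "Ala_2" (hypothetical naming scheme)
            bit_score = float(fields[5])  # full-sequence bit score
            scores[model_name.split("_")[0]].append(bit_score)
    # Ensemble decision: average the bit scores of all models belonging to one substrate.
    averaged = {sub: sum(v) / len(v) for sub, v in scores.items()}
    return max(averaged, key=averaged.get) if averaged else None

if __name__ == "__main__":
    print(predict_substrate("a_domain.fasta"))
```

Averaging the scores of several models per substrate, rather than taking the single best hit, is the ensemble step the abstract credits with the clear gain in prediction quality.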
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Published: 07-25-2013
ABSTRACT
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability by minimizing potential energy over the sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
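The staged workflow just described (sequence selection for stability, followed by fold-specificity and binding-affinity stages, each returning a rank-ordered list) can be illustrated with a generic pipeline skeleton. The sketch below is purely illustrative: the scoring functions are hypothetical stand-ins and do not reproduce Protein WISDOM's energy models or algorithms.

```python
# Skeleton of a staged design pipeline: score, rank, and carry the best candidates forward.
# The scoring functions are hypothetical stand-ins, not Protein WISDOM's models.
from typing import List

KD = {"A": 1.8, "V": 4.2, "L": 3.8, "I": 4.5, "F": 2.8, "G": -0.4,
      "S": -0.8, "T": -0.7, "D": -3.5, "E": -3.5, "K": -3.9, "R": -4.5}

def toy_stability(seq: str) -> float:
    """Stand-in 'potential energy': more hydrophobic sequences score lower (better)."""
    return -sum(KD.get(aa, 0.0) for aa in seq)

def toy_fold_specificity(seq: str) -> float:
    """Stand-in for a fold-specificity score (here: penalize glycine-rich sequences)."""
    return seq.count("G") / max(len(seq), 1)

def toy_binding_affinity(seq: str) -> float:
    """Stand-in for a binding-affinity score (here: reward charged residues)."""
    return -sum(seq.count(aa) for aa in "DEKR") / max(len(seq), 1)

def staged_design(candidates: List[str], keep: int = 5) -> List[str]:
    """Rank candidates at each stage and carry the top `keep` on to the next stage."""
    for score in (toy_stability, toy_fold_specificity, toy_binding_affinity):
        candidates = sorted(candidates, key=score)[:keep]
    return candidates

if __name__ == "__main__":
    pool = ["AVLIFGST", "GGGGGGGG", "DEKRDEKR", "AVLDEKRG", "FILVAVST"]
    print(staged_design(pool, keep=3))
```

Each stage simply re-ranks the surviving candidates, which mirrors the rank-ordered lists the server returns at every step.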
21 Related JoVE Articles!
Specificity Analysis of Protein Lysine Methyltransferases Using SPOT Peptide Arrays
Authors: Srikanth Kudithipudi, Denis Kusevic, Sara Weirich, Albert Jeltsch.
Institutions: Stuttgart University.
Lysine methylation is an emerging post-translational modification that has been identified on several histone and non-histone proteins, where it plays crucial roles in cell development and many diseases. Approximately 5,000 lysine methylation sites have been identified on different proteins, and these are set by a few dozen protein lysine methyltransferases (PKMTs). This suggests that each PKMT methylates multiple proteins; however, until now only one or two substrates have been identified for several of these enzymes. To approach this problem, we have introduced peptide-array-based substrate specificity analyses of PKMTs. Peptide arrays are powerful tools to characterize the specificity of PKMTs because methylation of several substrates with different sequences can be tested on one array. We synthesized peptide arrays on cellulose membrane using an Intavis SPOT synthesizer and analyzed the specificity of various PKMTs. Based on the results, novel substrates could be identified for several of these enzymes. For example, using peptide arrays we showed that NSD1 methylates K44 of H4 instead of the reported H4K20, and that H1.5K168 is the highly preferred substrate over the previously known H3K36. Hence, peptide arrays are powerful tools to biochemically characterize the PKMTs.
Biochemistry, Issue 93, Peptide arrays, solid phase peptide synthesis, SPOT synthesis, protein lysine methyltransferases, substrate specificity profile analysis, lysine methylation
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Cooper Medical School of Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages in comparison to traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
Mouse Genome Engineering Using Designer Nucleases
Authors: Mario Hermann, Tomas Cermak, Daniel F. Voytas, Pawel Pelczar.
Institutions: University of Zurich, University of Minnesota.
Transgenic mice carrying site-specific genome modifications (knockout, knock-in) are of vital importance for dissecting complex biological systems as well as for modeling human diseases and testing therapeutic strategies. Recent advances in the use of designer nucleases such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) 9 system for site-specific genome engineering make it possible to perform rapid, targeted genome modification in virtually any laboratory species without the need to rely on embryonic stem (ES) cell technology. A genome editing experiment typically starts with identification of designer nuclease target sites within a gene of interest, followed by construction of custom DNA-binding domains to direct nuclease activity to the investigator-defined genomic locus. Designer nuclease plasmids are transcribed in vitro to generate mRNA for microinjection into fertilized mouse oocytes. Here, we provide a protocol for achieving targeted genome modification by direct injection of TALEN mRNA into fertilized mouse oocytes.
Genetics, Issue 86, Oocyte microinjection, Designer nucleases, ZFN, TALEN, Genome Engineering
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically 50 mM sodium acetate or 50 mM Tris) is chosen for its acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thereby detecting potential enzyme activity rates as a function of differences in enzyme concentration between samples. Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting data from cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
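The conversion from raw fluorescence to a potential activity rate is a simple back-calculation through a standard curve. The sketch below shows one common way to do it (net fluorescence corrected by soil and substrate controls, converted with a 4-methylumbelliferone standard-curve slope, then normalized to incubation time and dry soil mass); the variable names, correction terms, and example values are illustrative rather than prescribed by the protocol above.

```python
# Illustrative potential-activity calculation for a fluorometric soil enzyme assay.
# Correction terms and units are typical choices, not prescribed by the source protocol.

def potential_activity(sample_fluor, soil_control, substrate_control,
                       std_slope_fluor_per_nmol, slurry_per_well_ml,
                       total_slurry_ml, incubation_h, dry_soil_g):
    """Return potential enzyme activity in nmol substrate cleaved g^-1 dry soil h^-1."""
    # Subtract background fluorescence from soil alone and from substrate alone.
    net_fluor = sample_fluor - soil_control - substrate_control
    # Convert fluorescence units to nmol of released fluorophore (e.g., MUB) per well,
    # using a standard curve prepared in the same soil-slurry matrix.
    nmol_in_well = net_fluor / std_slope_fluor_per_nmol
    # Scale what was measured in one well up to the whole soil slurry.
    nmol_total = nmol_in_well * (total_slurry_ml / slurry_per_well_ml)
    # Normalize to incubation time and dry soil mass.
    return nmol_total / (incubation_h * dry_soil_g)

if __name__ == "__main__":
    rate = potential_activity(sample_fluor=12500, soil_control=800, substrate_control=400,
                              std_slope_fluor_per_nmol=950.0, slurry_per_well_ml=0.2,
                              total_slurry_ml=91.0, incubation_h=3.0, dry_soil_g=1.0)
    print(f"potential activity: {rate:.1f} nmol g^-1 h^-1")
```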
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
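As a rough illustration of the design-and-analysis loop mentioned above, the sketch below builds a two-level full-factorial design for three hypothetical factors (promoter variant, plant age, incubation temperature), simulates a response, and estimates the main effects by ordinary least squares. It stands in for, but does not reproduce, the DoE software and the augmented, knowledge-based designs used in the study.

```python
# Toy design-of-experiments workflow: build a 2-level full factorial design and
# estimate main effects by least squares. Factors and responses are hypothetical.
import itertools
import numpy as np

factors = ["promoter", "plant_age", "incubation_temp"]  # coded as -1 / +1 levels

# Full factorial design matrix in coded units (8 runs for 3 factors).
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

# Simulated responses (e.g., fluorescence of a DsRed reporter), for illustration only.
rng = np.random.default_rng(0)
true_effects = np.array([5.0, -2.0, 1.0])
response = 20.0 + design @ true_effects + rng.normal(0, 0.5, size=len(design))

# Fit intercept + main effects with ordinary least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

print("intercept:", round(coef[0], 2))
for name, effect in zip(factors, coef[1:]):
    print(f"main effect of {name}: {round(effect, 2)}")
```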
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
DNA-affinity-purified Chip (DAP-chip) Method to Determine Gene Targets for Bacterial Two component Regulatory Systems
Authors: Lara Rajeev, Eric G. Luning, Aindrila Mukhopadhyay.
Institutions: Lawrence Berkeley National Laboratory.
In vivo methods such as ChIP-chip are well-established techniques used to determine global gene targets for transcription factors. However, they are of limited use in exploring bacterial two component regulatory systems with uncharacterized activation conditions. Such systems regulate transcription only when activated in the presence of unique signals. Since these signals are often unknown, the in vitro microarray based method described in this video article can be used to determine gene targets and binding sites for response regulators. This DNA-affinity-purified-chip method may be used for any purified regulator in any organism with a sequenced genome. The protocol involves allowing the purified tagged protein to bind to sheared genomic DNA and then affinity purifying the protein-bound DNA, followed by fluorescent labeling of the DNA and hybridization to a custom tiling array. Preceding steps that may be used to optimize the assay for specific regulators are also described. The peaks generated by the array data analysis are used to predict binding site motifs, which are then experimentally validated. The motif predictions can be further used to determine gene targets of orthologous response regulators in closely related species. We demonstrate the applicability of this method by determining the gene targets and binding site motifs and thus predicting the function for a sigma54-dependent response regulator DVU3023 in the environmental bacterium Desulfovibrio vulgaris Hildenborough.
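The array-analysis step described above, in which probe-level enrichment is turned into candidate binding regions whose peaks feed the motif prediction, can be illustrated with a simple segmentation pass: probes above an enrichment threshold are merged into regions when they lie close together on the genome. This is a generic sketch with invented thresholds, not the peak-calling software used in the study.

```python
# Generic sketch: merge enriched tiling-array probes into candidate binding regions.
# Probe positions, log-ratios, and thresholds below are illustrative.

def call_regions(probes, threshold=1.0, max_gap=300):
    """probes: list of (genomic_position, log2_ratio) tuples, sorted by position.
    Returns a list of (start, end, peak_log2_ratio) regions."""
    regions = []
    current = None
    for pos, ratio in probes:
        if ratio < threshold:
            continue
        if current and pos - current[1] <= max_gap:
            current[1] = pos                       # extend the open region
            current[2] = max(current[2], ratio)    # track its peak enrichment
        else:
            if current:
                regions.append(tuple(current))
            current = [pos, pos, ratio]            # open a new region
    if current:
        regions.append(tuple(current))
    return regions

if __name__ == "__main__":
    demo = [(100, 0.2), (400, 1.4), (650, 2.1), (900, 1.2), (5000, 0.3), (7000, 1.8)]
    print(call_regions(demo))
```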
Genetics, Issue 89, DNA-Affinity-Purified-chip, response regulator, transcription factor binding site, two component system, signal transduction, Desulfovibrio, lactate utilization regulator, ChIP-chip
Simultaneous Long-term Recordings at Two Neuronal Processing Stages in Behaving Honeybees
Authors: Martin Fritz Brill, Maren Reuter, Wolfgang Rössler, Martin Fritz Strube-Bloss.
Institutions: University of Würzburg.
In both mammals and insects neuronal information is processed in different higher and lower order brain centers. These centers are coupled via convergent and divergent anatomical connections including feed-forward and feedback wiring. Furthermore, information of the same origin is partially sent via parallel pathways to different and sometimes into the same brain areas. To understand the evolutionary benefits as well as the computational advantages of these wiring strategies, and especially their temporal dependencies on each other, it is necessary to have simultaneous access to single neurons of different tracts or neuropiles in the same preparation at high temporal resolution. Here we concentrate on honeybees by demonstrating a unique extracellular long-term access to record multi-unit activity at two subsequent neuropiles1, the antennal lobe (AL), the first olfactory processing stage, and the mushroom body (MB), a higher order integration center involved in learning and memory formation, or at two parallel neuronal tracts2 connecting the AL with the MB. The latter was chosen as an example and will be described in full. In the supporting video the construction and permanent insertion of flexible multi-channel wire electrodes are demonstrated. Pairwise differential amplification of the micro-wire electrode channels drastically reduces the noise and verifies that the source of the signal is closely related to the position of the electrode tip. The mechanical flexibility of the used wire electrodes allows stable invasive long-term recordings over many hours up to days, which is a clear advantage compared to conventional extra- and intracellular in vivo recording techniques.
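The pairwise differential amplification mentioned above has a simple signal-processing analogue: subtracting two neighboring wire channels cancels noise that is common to both and leaves activity local to the electrode tips. The numpy sketch below demonstrates the idea on synthetic traces; it is not the recording hardware or the spike sorting used in the study.

```python
# Demonstration of common-mode noise rejection by pairwise channel subtraction.
# Synthetic signals only; real data would come from the multi-channel wire electrodes.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
common_noise = 50.0 * np.sin(2 * np.pi * 50 * np.arange(n) / 20000.0)  # e.g. line hum

spikes_a = np.zeros(n)
spikes_a[rng.choice(n, 20, replace=False)] = 80.0        # units near channel A only

channel_a = common_noise + spikes_a + rng.normal(0, 2.0, n)
channel_b = common_noise + rng.normal(0, 2.0, n)

differential = channel_a - channel_b                      # common-mode noise cancels

# Simple threshold detection on the differential trace.
threshold = 5 * np.median(np.abs(differential)) / 0.6745  # robust noise estimate
print("detected events:", int(np.sum(differential > threshold)))
```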
Neuroscience, Issue 89, honeybee brain, olfaction, extracellular long term recordings, double recordings, differential wire electrodes, single unit, multi-unit recordings
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
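The image-processing core of the workflow above (binarizing the stained TATS membranes and skeletonizing the network before quantification) can be sketched with open-source Python tools. The file name, pixel size, thresholding choice, and the two summary metrics below are illustrative assumptions, not the exact steps of the published workflow.

```python
# Minimal TATS-style network quantification: threshold, binarize, skeletonize, measure.
# Requires scikit-image; the file name and pixel size are hypothetical.
import numpy as np
from skimage import io, filters, morphology

image = io.imread("myocyte_tats.tif", as_gray=True)            # membrane-stained cell image
threshold = filters.threshold_otsu(image)                       # one possible global threshold
binary = image > threshold

binary = morphology.remove_small_objects(binary, min_size=20)   # drop isolated noise pixels
skeleton = morphology.skeletonize(binary)

pixel_size_um = 0.1                                             # assumed pixel size
network_length_um = skeleton.sum() * pixel_size_um
density = skeleton.sum() / binary.sum() if binary.sum() else 0.0

print(f"approximate network length: {network_length_um:.1f} µm")
print(f"skeleton-to-membrane pixel ratio: {density:.3f}")
```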
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Highly Efficient Ligation of Small RNA Molecules for MicroRNA Quantitation by High-Throughput Sequencing
Authors: Jerome E. Lee, Rui Yi.
Institutions: University of Colorado, Boulder, University of Colorado, Denver.
MiRNA cloning and high-throughput sequencing, termed miR-Seq, stands alone as a transcriptome-wide approach to quantify miRNAs with single nucleotide resolution. This technique captures miRNAs by attaching 3' and 5' oligonucleotide adapters to miRNA molecules and allows de novo miRNA discovery. Coupled with powerful next-generation sequencing platforms, miR-Seq has been instrumental in the study of miRNA biology. However, significant biases introduced by oligonucleotide ligation steps have prevented miR-Seq from being employed as an accurate quantitation tool. Previous studies demonstrate that biases in current miR-Seq methods often lead to inaccurate miRNA quantification, with errors up to 1,000-fold for some miRNAs1,2. To resolve these biases imparted by RNA ligation, we have developed a small RNA ligation method that results in ligation efficiencies of over 95% for both the 3' and 5' ligation steps. Benchmarking this improved library construction method with equimolar or differentially mixed synthetic miRNAs consistently yields read numbers that deviate less than two-fold from the expected values. Furthermore, this high-efficiency miR-Seq method permits accurate genome-wide miRNA profiling from in vivo total RNA samples2.
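The benchmarking statistic quoted above (read numbers deviating less than two-fold from the value expected for an equimolar pool) is straightforward to compute. The sketch below shows one way to do so for a hypothetical pool of synthetic miRNAs; the counts are invented for illustration.

```python
# Compute fold deviation of observed miRNA read counts from the expected equimolar share.

def fold_deviations(counts):
    """counts: dict of miRNA name -> read count from an equimolar synthetic pool.
    Returns dict of miRNA name -> fold deviation from the expected uniform count."""
    total = sum(counts.values())
    expected = total / len(counts)
    return {name: (c / expected) if c >= expected else (expected / max(c, 1))
            for name, c in counts.items()}

if __name__ == "__main__":
    demo = {"miR-1": 9800, "miR-21": 11500, "miR-122": 8700, "miR-155": 10400}
    devs = fold_deviations(demo)
    for name, d in devs.items():
        print(f"{name}: {d:.2f}-fold from expected")
    print("all within two-fold:", all(d < 2 for d in devs.values()))
```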
Molecular Biology, Issue 93, RNA, ligation, miRNA, miR-Seq, linker, oligonucleotide, high-throughput sequencing
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
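The first analysis step above, estimating the local orientation of breast tissue patterns with a bank of Gabor filters, can be sketched with scikit-image. The code builds a small filter bank, applies it to an image, and keeps the per-pixel orientation of maximum response; the number of orientations, the filter frequency, and the input file are illustrative choices, and the phase-portrait and node analysis of the paper is not reproduced.

```python
# Sketch of oriented-texture analysis: Gabor filter bank + per-pixel dominant orientation.
# Filter parameters and the input file are illustrative.
import numpy as np
from skimage import io, filters

image = io.imread("mammogram_roi.png", as_gray=True)

n_orientations = 8
thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)

responses = []
for theta in thetas:
    real, imag = filters.gabor(image, frequency=0.1, theta=theta)
    responses.append(np.hypot(real, imag))       # magnitude of the complex response
responses = np.stack(responses)                  # shape: (orientations, H, W)

dominant = thetas[np.argmax(responses, axis=0)]  # per-pixel dominant orientation (radians)
strength = responses.max(axis=0)                 # response magnitude at that orientation

print("orientation field shape:", dominant.shape)
print("mean response strength:", float(strength.mean()))
```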
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Detection of Protein Palmitoylation in Cultured Hippocampal Neurons by Immunoprecipitation and Acyl-Biotin Exchange (ABE)
Authors: G. Stefano Brigidi, Shernaz X Bamji.
Institutions: University of British Columbia.
Palmitoylation is a post-translational lipid modification involving the attachment of a 16-carbon saturated fatty acid, palmitate, to cysteine residues of substrate proteins through a labile thioester bond [reviewed in1]. Palmitoylation of a substrate protein increases its hydrophobicity, and typically facilitates its trafficking toward cellular membranes. Recent studies have shown palmitoylation to be one of the most common lipid modifications in neurons1,2, suggesting that palmitate turnover is an important mechanism by which these cells regulate the targeting and trafficking of proteins. The identification and detection of palmitoylated substrates can therefore better our understanding of protein trafficking in neurons. Detection of protein palmitoylation has in the past been technically hindered by the lack of a consensus sequence among substrate proteins and by the reliance on metabolic labeling of palmitoyl-proteins with 3H-palmitate, a time-consuming biochemical assay with low sensitivity. Development of the Acyl-Biotin Exchange (ABE) assay enables more rapid and more sensitive detection of palmitoylated proteins2-4, and is optimal for measuring the dynamic turnover of palmitate on neuronal proteins. The ABE assay comprises three biochemical steps (Figure 1): 1) irreversible blockade of unmodified cysteine thiol groups using N-ethylmaleimide (NEM), 2) specific cleavage and unmasking of the palmitoylated cysteine's thiol group by hydroxylamine (HAM), and 3) selective labeling of the palmitoylated cysteine using a thiol-reactive biotinylation reagent, biotin-BMCC. Purification of the thiol-biotinylated proteins following the ABE steps has differed, depending on the overall goal of the experiment. Here, we describe a method to purify a palmitoylated protein of interest in primary hippocampal neurons by an initial immunoprecipitation (IP) step using an antibody directed against the protein, followed by the ABE assay and western blotting to directly measure palmitoylation levels of that protein, which is termed the IP-ABE assay. Low-density cultures of embryonic rat hippocampal neurons have been widely used to study the localization, function, and trafficking of neuronal proteins, making them ideally suited for studying neuronal protein palmitoylation using the IP-ABE assay. The IP-ABE assay mainly requires standard IP and western blotting reagents, and is only limited by the availability of antibodies against the target substrate. This assay can easily be adapted for the purification and detection of transfected palmitoylated proteins in heterologous cell cultures, primary neuronal cultures derived from various brain tissues of both mouse and rat, and even primary brain tissue itself.
Neuroscience, Issue 72, Biochemistry, Neurobiology, Molecular Biology, Cellular Biology, Physiology, Proteins, synapse, cultured hippocampal neurons, palmitoylation, lipid, immunoprecipitation, western blotting, biotin, Acyl-Biotin Exchange, ABE, neuron, brain, cell culture, rat, mouse, animal model
High Throughput Screening of Fungal Endoglucanase Activity in Escherichia coli
Authors: Mary F. Farrow, Frances H. Arnold.
Institutions: California Institute of Technology.
Cellulase enzymes (endoglucanases, cellobiohydrolases, and β-glucosidases) hydrolyze cellulose into component sugars, which in turn can be converted into fuel alcohols1. The potential for enzymatic hydrolysis of cellulosic biomass to provide renewable energy has intensified efforts to engineer cellulases for economical fuel production2. Of particular interest are fungal cellulases3-8, which are already being used industrially for foods and textiles processing. Identifying active variants among a library of mutant cellulases is critical to the engineering process; active mutants can be further tested for improved properties and/or subjected to additional mutagenesis. Efficient engineering of fungal cellulases has been hampered by a lack of genetic tools for native organisms and by difficulties in expressing the enzymes in heterologous hosts. Recently, Morikawa and coworkers developed a method for expressing in E. coli the catalytic domains of endoglucanases from H. jecorina3,9, an important industrial fungus with the capacity to secrete cellulases in large quantities. Functional E. coli expression has also been reported for cellulases from other fungi, including Macrophomina phaseolina10 and Phanerochaete chrysosporium11-12. We present a method for high-throughput screening of fungal endoglucanase activity in E. coli (Figure 1). This method uses the common microbial dye Congo Red (CR) to visualize enzymatic degradation of carboxymethyl cellulose (CMC) by cells growing on solid medium. The activity assay requires inexpensive reagents, minimal manipulation, and gives unambiguous results as zones of degradation (“halos”) at the colony site. Although a quantitative measure of enzymatic activity cannot be determined by this method, we have found that halo size correlates with total enzymatic activity in the cell. Further characterization of individual positive clones will determine relative protein fitness. Traditional bacterial whole-cell CMC/CR activity assays13 involve pouring agar containing CMC onto colonies, which is subject to cross-contamination, or incubating cultures in CMC agar wells, which is less amenable to large-scale experimentation. Here we report an improved protocol that modifies existing wash methods14 for cellulase activity: cells grown on CMC agar plates are removed prior to CR staining. Our protocol significantly reduces cross-contamination and is highly scalable, allowing the rapid screening of thousands of clones. In addition to H. jecorina enzymes, we have expressed and screened endoglucanase variants from Thermoascus aurantiacus and Penicillium decumbens (shown in Figure 2), suggesting that this protocol is applicable to enzymes from a range of organisms.
Molecular Biology, Issue 54, cellulase, endoglucanase, CMC, Congo Red
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan, University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve our understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules which are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which indicates how accurate the predictions are expected to be without knowing the experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information can be collected by the users based on experimental evidence or biological insights with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
The ITS2 Database
Authors: Benjamin Merget, Christian Koetschan, Thomas Hackl, Frank Förster, Thomas Dandekar, Tobias Müller, Jörg Schultz, Matthias Wolf.
Institutions: University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the very variable ITS2 sequence, it confined this marker to low-level phylogenetics only. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8. The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11, accurately reannotated10. Following an annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold12 (direct fold) results in a correct, four-helix conformation. If this is not the case, the structure is predicted by homology modeling13. In homology modeling, an already known secondary structure is transferred to another ITS2 sequence whose secondary structure could not be folded correctly by a direct fold. The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14 search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16 and ProfDistS17 for multiple sequence-structure alignment calculation and Neighbor Joining18 tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure. In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
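The "direct fold" test described above (predict a minimum-free-energy structure and check whether it adopts the expected four-helix ITS2 conformation) can be approximated with the ViennaRNA Python bindings. The helix-counting heuristic below simply counts branches off the exterior loop of the dot-bracket string and, together with the placeholder sequence, is an illustrative stand-in for the database's actual test.

```python
# Sketch: MFE folding of an ITS2 sequence and a crude check for a four-helix layout.
# Requires the ViennaRNA Python bindings (the `RNA` module); the sequence is a placeholder.
import RNA

def exterior_branches(dot_bracket: str) -> int:
    """Count helices emerging from the exterior loop (depth 0 -> 1 transitions)."""
    depth, branches = 0, 0
    for ch in dot_bracket:
        if ch == "(":
            if depth == 0:
                branches += 1
            depth += 1
        elif ch == ")":
            depth -= 1
    return branches

sequence = "GGGAUCGCUAGCUAGCUAGCUAGCGGCUAGCUAGCUAGCAUCGAUCGAUCGAUCGGCGC"  # placeholder
structure, mfe = RNA.fold(sequence)

print(structure)
print(f"minimum free energy: {mfe:.2f} kcal/mol")
print("helices off the exterior loop:", exterior_branches(structure))
print("looks like a four-helix fold:", exterior_branches(structure) == 4)
```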
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
Avidity-based Extracellular Interaction Screening (AVEXIS) for the Scalable Detection of Low-affinity Extracellular Receptor-Ligand Interactions
Authors: Jason S. Kerr, Gavin J. Wright.
Institutions: Wellcome Trust Sanger Institute.
Extracellular protein:protein interactions between secreted or membrane-tethered proteins are critical for both initiating intercellular communication and ensuring cohesion within multicellular organisms. Proteins predicted to form extracellular interactions are encoded by approximately a quarter of human genes1, but despite their importance and abundance, the majority of these proteins have no documented binding partner. Primarily, this is due to their biochemical intractability: membrane-embedded proteins are difficult to solubilise in their native conformation and contain structurally-important posttranslational modifications. Also, the interaction affinities between receptor proteins are often characterised by extremely low interaction strengths (half-lives < 1 second) precluding their detection with many commonly-used high throughput methods2. Here, we describe an assay, AVEXIS (AVidity-based EXtracellular Interaction Screen) that overcomes these technical challenges enabling the detection of very weak protein interactions (t1/2 ≤ 0.1 sec) with a low false positive rate3. The assay is usually implemented in a high throughput format to enable the systematic screening of many thousands of interactions in a convenient microtitre plate format (Fig. 1). It relies on the production of soluble recombinant protein libraries that contain the ectodomain fragments of cell surface receptors or secreted proteins within which to screen for interactions; therefore, this approach is suitable for type I, type II, GPI-linked cell surface receptors and secreted proteins but not for multipass membrane proteins such as ion channels or transporters. The recombinant protein libraries are produced using a convenient and high-level mammalian expression system4, to ensure that important posttranslational modifications such as glycosylation and disulphide bonds are added. Expressed recombinant proteins are secreted into the medium and produced in two forms: a biotinylated bait which can be captured on a streptavidin-coated solid phase suitable for screening, and a pentamerised enzyme-tagged (β-lactamase) prey. The bait and prey proteins are presented to each other in a binary fashion to detect direct interactions between them, similar to a conventional ELISA (Fig. 1). The pentamerisation of the proteins in the prey is achieved through a peptide sequence from the cartilage oligomeric matrix protein (COMP) and increases the local concentration of the ectodomains thereby providing significant avidity gains to enable even very transient interactions to be detected. By normalising the activities of both the bait and prey to predetermined levels prior to screening, we have shown that interactions having monomeric half-lives of 0.1 sec can be detected with low false positive rates3.
Molecular Biology, Issue 61, Receptor-ligand pairs, Extracellular protein interactions, AVEXIS, Adhesion receptors, Transient/weak interactions, High throughput screening
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Authors: Haipeng Xing, Willey Liao, Yifan Mo, Michael Q. Zhang.
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein1. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment2. Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics3-5 to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)6-8. We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and the simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by subsequent application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model which can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs9, which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment. Our analysis revealed how our Bayesian Change Point (BCP) algorithm had reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters. This illustrated both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor10,11 and epigenetic data12 to illustrate its usefulness.
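Change-point detection on a binned read-density track is the core operation described above. The sketch below uses plain binary segmentation on segment means as a generic stand-in: it shows how a density profile can be cut into background and enriched segments, but it is not an implementation of the Bayesian BCP model, and the simulated track and split penalty are invented.

```python
# Generic change-point sketch for a binned ChIP-seq read-density track.
# Simple binary segmentation on segment means; a stand-in for, not an implementation of, BCP.
import numpy as np

def split_cost(x):
    """Return (best_split_index, cost_reduction) for one mean-shift split of x."""
    n = len(x)
    total_cost = np.sum((x - x.mean()) ** 2)
    best_i, best_gain = None, 0.0
    csum, csum2 = np.cumsum(x), np.cumsum(x ** 2)
    for i in range(2, n - 1):
        left = csum2[i - 1] - csum[i - 1] ** 2 / i
        right = (csum2[-1] - csum2[i - 1]) - (csum[-1] - csum[i - 1]) ** 2 / (n - i)
        gain = total_cost - (left + right)
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i, best_gain

def binary_segmentation(x, min_gain=50.0, offset=0, breaks=None):
    """Recursively split wherever the cost reduction exceeds `min_gain`."""
    if breaks is None:
        breaks = []
    i, gain = split_cost(x)
    if i is not None and gain > min_gain:
        binary_segmentation(x[:i], min_gain, offset, breaks)
        breaks.append(offset + i)
        binary_segmentation(x[i:], min_gain, offset + i, breaks)
    return sorted(breaks)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Simulated binned densities: background, a broad island, background, a sharp peak, background.
    track = np.concatenate([rng.poisson(2, 200), rng.poisson(8, 150),
                            rng.poisson(2, 200), rng.poisson(25, 20),
                            rng.poisson(2, 100)]).astype(float)
    print("change points (bin indices):", binary_segmentation(track))
```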
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
The Logic, Experimental Steps, and Potential of Heterologous Natural Product Biosynthesis Featuring the Complex Antibiotic Erythromycin A Produced Through E. coli
Authors: Ming Jiang, Haoran Zhang, Blaine A. Pfeifer.
Institutions: State University of New York at Buffalo, Massachusetts Institute of Technology.
The heterologous production of complex natural products is an approach designed to address current limitations and future possibilities. It is particularly useful for those compounds which possess therapeutic value but cannot be sufficiently produced or would benefit from an improved form of production. The experimental procedures involved can be subdivided into three components: 1) genetic transfer; 2) heterologous reconstitution; and 3) product analysis. Each experimental component is under continual optimization to meet the challenges and anticipate the opportunities associated with this emerging approach. Heterologous biosynthesis begins with the identification of a genetic sequence responsible for a valuable natural product. Transferring this sequence to a heterologous host is complicated by the biosynthetic pathway complexity responsible for product formation. The antibiotic erythromycin A is a good example. Twenty genes (totaling >50 kb) are required for eventual biosynthesis. In addition, three of these genes encode megasynthases, multi-domain enzymes each ~300 kDa in size. This genetic material must be designed and transferred to E. coli for reconstituted biosynthesis. The use of PCR isolation, operon construction, multi-cistronic plasmids, and electro-transformation will be described in transferring the erythromycin A genetic cluster to E. coli. Once transferred, the E. coli cell must support eventual biosynthesis. This process is also challenging given the substantial differences between E. coli and most original hosts responsible for complex natural product formation. The cell must provide necessary substrates to support biosynthesis and coordinately express the transferred genetic cluster to produce active enzymes. In the case of erythromycin A, the E. coli cell had to be engineered to provide the two precursors (propionyl-CoA and (2S)-methylmalonyl-CoA) required for biosynthesis. In addition, gene sequence modifications, plasmid copy number, chaperonin co-expression, post-translational enzymatic modification, and process temperature were also required to allow final erythromycin A formation. Finally, successful production must be assessed. For the erythromycin A case, we will present two methods. The first is liquid chromatography-mass spectrometry (LC-MS) to confirm and quantify production. The bioactivity of erythromycin A will also be confirmed through use of a bioassay in which the antibiotic activity is tested against Bacillus subtilis. The assessment assays establish erythromycin A biosynthesis from E. coli and set the stage for future engineering efforts to improve or diversify production and for the production of new complex natural compounds using this approach.
Biomedical Engineering, Issue 71, Chemical Engineering, Bioengineering, Molecular Biology, Cellular Biology, Microbiology, Basic Protocols, Biochemistry, Biotechnology, Heterologous biosynthesis, natural products, antibiotics, erythromycin A, metabolic engineering, E. coli
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Applying Microfluidics to Electrophysiology
Authors: David T. Eddington.
Institutions: University of Illinois, Chicago.
Microfluidics can be integrated with standard electrophysiology techniques to allow new experimental modalities. Specifically, the motivation for the microfluidic brain slice device is discussed, including how the device docks to standard perfusion chambers and the technique of passive pumping, which is used to deliver boluses of neuromodulators to the brain slice. By simplifying the device design, we are able to achieve a practical solution to the current unmet electrophysiology need of applying multiple neuromodulators across multiple regions of the brain slice. This is achieved by substituting the standard coverglass substrate of the perfusion chamber with a thin microfluidic device bonded to the coverglass substrate. This is then attached to the perfusion chamber, and small holes connect the open well of the perfusion chamber to the microfluidic channels buried within the microfluidic substrate. These microfluidic channels are interfaced with ports drilled into the edge of the perfusion chamber to access and deliver stimulants. This project represents how the field of microfluidics is transitioning away from proof-of-concept device demonstrations and into practical solutions for unmet experimental and clinical needs.
Neuroscience, Issue 8, Biomedical Engineering, Microfluidics, Slice Recording, Electrophysiology, Neurotransmitter, Bioengineering
BioMEMS and Cellular Biology: Perspectives and Applications
Authors: Albert Folch.
Institutions: University of Washington.
The ability to culture cells has revolutionized hypothesis testing in basic cell and molecular biology research. It has become a standard methodology in drug screening, toxicology, and clinical assays, and is increasingly used in regenerative medicine. However, the traditional cell culture methodology, essentially consisting of the immersion of a large population of cells in a homogeneous fluid medium and on a homogeneous flat substrate, has become increasingly limiting both from a fundamental and a practical perspective. Microfabrication technologies have enabled researchers to design, with micrometer control, the biochemical composition and topology of the substrate and the medium composition, as well as the neighboring cell type in the surrounding cellular microenvironment. Additionally, microtechnology is conceptually well-suited for the development of fast, low-cost in vitro systems that allow for high-throughput culturing and analysis of cells under large numbers of conditions. In this interview, Albert Folch explains these limitations, how they can be overcome with soft lithography and microfluidics, and describes some relevant examples of research in his lab and future directions.
Biomedical Engineering, Issue 8, BioMEMS, Soft Lithography, Microfluidics, Agrin, Axon Guidance, Olfaction, Interview
MALDI Sample Preparation: the Ultra Thin Layer Method
Authors: David Fenyo, Qingjun Wang, Jeffrey A. DeGrasse, Julio C. Padovan, Martine Cadene, Brian T. Chait.
Institutions: Rockefeller University.
This video demonstrates the preparation of an ultra-thin matrix/analyte layer for analyzing peptides and proteins by Matrix-Assisted Laser Desorption Ionization Mass Spectrometry (MALDI-MS)1,2. The ultra-thin layer method involves the production of a substrate layer of matrix crystals (alpha-cyano-4-hydroxycinnamic acid) on the sample plate, which serves as a seeding ground for subsequent crystallization of a matrix/analyte mixture. Advantages of the ultra-thin layer method over other sample deposition approaches (e.g. dried droplet) are that it provides (i) greater tolerance to impurities such as salts and detergents, (ii) better resolution, and (iii) higher spatial uniformity. This method is especially useful for the accurate mass determination of proteins. The protocol was initially developed and optimized for the analysis of membrane proteins and used to successfully analyze ion channels, metabolite transporters, and receptors containing between 2 and 12 transmembrane domains2. Since the original publication, it has also been shown to be equally useful for the analysis of soluble proteins. Indeed, we have used it for a large number of proteins having a wide range of properties, including those with molecular masses as high as 380 kDa3. It is currently our method of choice for the molecular mass analysis of all proteins. The described procedure consistently produces high-quality spectra, and it is sensitive, robust, and easy to implement.
Cellular Biology, Issue 3, mass-spectrometry, ultra-thin layer, MALDI, MS, proteins
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.
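One simple way such abstract-to-video matching could be implemented is TF-IDF text similarity between a PubMed abstract and the text describing each video in the library. The sketch below uses scikit-learn and invented video descriptions; it illustrates the idea only and is not JoVE's actual matching algorithm.

```python
# Illustrative abstract-to-video matching via TF-IDF cosine similarity (not JoVE's algorithm).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented, abbreviated video descriptions standing in for a real library.
video_descriptions = {
    "Peptide arrays for methyltransferase specificity": "SPOT synthesis peptide arrays lysine methylation",
    "I-TASSER protein structure prediction": "protein structure and function prediction threading server",
    "Soil enzyme fluorometric assays": "extracellular enzyme activity fluorescence soil substrates",
}

def top_matches(abstract: str, n: int = 2):
    """Return the n video titles whose descriptions are most similar to the abstract."""
    titles = list(video_descriptions)
    corpus = [abstract] + [video_descriptions[t] for t in titles]
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(corpus)
    scores = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()
    return sorted(zip(titles, scores), key=lambda x: -x[1])[:n]

if __name__ == "__main__":
    query = "We predict protein structure and function from sequence using threading."
    for title, score in top_matches(query):
        print(f"{score:.2f}  {title}")
```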