Robots designed to track chemical leaks in hazardous industrial facilities [1] or explosive traces in landmine fields [2] face the same problem as insects foraging for food or searching for mates [3]: the olfactory search is constrained by the physics of turbulent transport [4]. The concentration landscape of wind-borne odors is discontinuous and consists of sporadically located patches. A prerequisite to olfactory search is therefore the detection of intermittent odor patches. Because of its high speed and sensitivity [5,6], the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones [7] but also chemicals relevant to humans, e.g., volatile compounds emanating from cancer cells [8] or toxic and illicit substances [9-11]. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole-insect preparation, allows stable recordings within a working day; by comparison, EAGs on excised antennae have a lifetime of about 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches at up to 10 Hz, a temporal resolution beyond that of artificial chemical sensors [12]. The efficiency of EAG sensors for olfactory searches is further demonstrated by driving the robot toward a source of pheromone. By using the same olfactory stimuli and sensors as real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies [13]. It may also prove beneficial for detecting other odorants of interest by combining EAGs from different insect species in a bioelectronic nose configuration [14] or by using nanostructured gas sensors that mimic insect antennae [15].
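The patch-detection step described above reduces, in its simplest form, to counting upward threshold crossings in the sampled EAG voltage. The sketch below is illustrative only: the function names, the threshold value, and the synthetic trace are assumptions, not part of the published protocol.

```python
# Sketch: counting discrete odor-patch encounters in a sampled EAG trace.
# All names and numbers here are illustrative, not from the published protocol.

def detect_patches(signal, threshold, fs):
    """Return (onset_times_s, count) of upward threshold crossings.

    signal    -- EAG amplitude samples (arbitrary units, baseline near 0)
    threshold -- detection level separating patches from baseline noise
    fs        -- sampling rate in Hz
    """
    onsets = []
    above = False
    for i, v in enumerate(signal):
        if v >= threshold and not above:
            onsets.append(i / fs)   # convert sample index to seconds
            above = True
        elif v < threshold:
            above = False
    return onsets, len(onsets)

# Two synthetic 50 ms patches starting at 0.1 s and 0.3 s, sampled at 100 Hz.
fs = 100
trace = [0.0] * 100
for i in range(10, 15):
    trace[i] = 1.0
for i in range(30, 35):
    trace[i] = 0.8
onsets, n = detect_patches(trace, 0.5, fs)
```

A real implementation would add baseline subtraction and debouncing, but the threshold-crossing pattern is the core of resolving individual patches at the 10 Hz scale mentioned above.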
Measuring Attentional Biases for Threat in Children and Adults
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, largely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm, similar to the standard adult paradigm, that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data from participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
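The latency-based bias measure lends itself to a very small analysis sketch. Everything below (the trial structure, field names, and latency values) is invented for illustration; the paradigm itself only specifies that latency to touch the target is recorded for each stimulus category.

```python
# Sketch: summarizing touch latencies (ms) by stimulus category to quantify
# an attentional bias for threat. Data values are made up for illustration.

def mean_latency(trials, category):
    vals = [t["latency_ms"] for t in trials if t["category"] == category]
    return sum(vals) / len(vals)

trials = [
    {"category": "threat",  "latency_ms": 900},
    {"category": "threat",  "latency_ms": 1100},
    {"category": "neutral", "latency_ms": 1300},
    {"category": "neutral", "latency_ms": 1500},
]
threat_rt = mean_latency(trials, "threat")
neutral_rt = mean_latency(trials, "neutral")
bias_ms = neutral_rt - threat_rt   # positive => faster detection of threat
```

A positive `bias_ms` corresponds to the faster-detection-of-threat result described above.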
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
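The rank-ordering idea behind the sequence-selection stage can be illustrated with a toy scorer. The `mock_energy` function below is a deliberately crude stand-in for the potential-energy minimization performed by Protein WISDOM; only the score-and-sort pattern that produces a rank-ordered candidate list carries over.

```python
# Sketch: rank-ordering candidate sequences by a stand-in "energy" score
# (lower = more stable). The scoring function is purely illustrative and is
# NOT the potential-energy model used by Protein WISDOM.

HYDROPHOBIC = set("AILMFVWY")

def mock_energy(seq):
    # Toy stand-in: pretend each hydrophobic residue stabilizes the fold.
    return -sum(1 for aa in seq if aa in HYDROPHOBIC)

def rank_sequences(candidates):
    # Best-first (lowest energy) ordering, as in a rank-ordered design report.
    return sorted(candidates, key=mock_energy)

ranked = rank_sequences(["GSGS", "AILV", "AGSV"])
```

In the real workbench, sequences surviving this stage would then be passed to the fold-specificity and binding-affinity stages described above.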
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Drug Design, Optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, In silico sequence selection, Fold specificity, Binding affinity, sequencing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super-resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. With this approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. The data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization that have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we describe here the optical setup for FPALM. An additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging systems. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We describe here the use of PAFP and PSFP expression to image two protein species in fixed cells; extension of the technique to living cells is also described.
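The localization step at the heart of FPALM can be sketched as an intensity-weighted centroid plus the textbook precision estimate s/sqrt(N), where s is the PSF standard deviation and N the number of detected photons. The pixel values and PSF width below are illustrative, not calibrated to any instrument described here.

```python
# Sketch: localizing a single fluorophore by intensity-weighted centroid and
# estimating localization precision as roughly s / sqrt(N). Numbers are
# illustrative only.
import math

def localize(pixels):
    """pixels: list of (x_nm, y_nm, counts). Returns (x, y, total_counts)."""
    n = sum(c for _, _, c in pixels)
    x = sum(px * c for px, _, c in pixels) / n
    y = sum(py * c for _, py, c in pixels) / n
    return x, y, n

def precision_nm(psf_sigma_nm, photons):
    # Simplified estimate; real analyses add pixelation and background terms.
    return psf_sigma_nm / math.sqrt(photons)

spot = [(0, 0, 100), (100, 0, 300), (200, 0, 100)]
x, y, n = localize(spot)
sigma = precision_nm(250, n)   # falls inside the ~10-30 nm range cited above
```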
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research questions. However, many of them are new to the topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article could serve as a practical training tool for researchers embarking on phylogenetic studies, and also as an educational resource that could be incorporated into a classroom or teaching lab.
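As a minimal illustration of the tree-building end of such a pipeline, the sketch below computes p-distances between pre-aligned toy sequences and clusters them by UPGMA (average linkage). Real analyses would use the maximum likelihood and Bayesian tools discussed above; the sequences and taxon names here are invented.

```python
# Sketch: p-distances on a toy alignment, then UPGMA (average-linkage)
# clustering into a nested-tuple tree. Illustrative only -- not a substitute
# for the ML/Bayesian reconstruction described in the protocol.

def p_distance(a, b):
    return sum(1 for x, y in zip(a, b) if x != y) / len(a)

def upgma(names, seqs):
    clusters = [(name, [i]) for i, name in enumerate(names)]
    def avg_dist(mi, mj):
        total = sum(p_distance(seqs[a], seqs[b]) for a in mi for b in mj)
        return total / (len(mi) * len(mj))
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = avg_dist(clusters[i][1], clusters[j][1])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        (ti, mi), (tj, mj) = clusters[i], clusters[j]
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(((ti, tj), mi + mj))
    return clusters[0][0]

seqs = ["ACGTACGT", "ACGTACGA", "ACGAATGA"]   # toy aligned sequences
tree = upgma(["human", "chimp", "mouse"], seqs)
```

The nested tuple groups the two most similar taxa first, mirroring how a Newick tree would be read.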
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Barnes Maze Testing Strategies with Small and Large Rodent Models
Institutions: University of Missouri, Food and Drug Administration.
Spatial learning and memory of laboratory rodents is often assessed via navigational ability in mazes, the most popular of which are the water and dry-land (Barnes) mazes. Improved performance over sessions or trials is thought to reflect learning and memory of the escape cage/platform location. Considered less stressful than water mazes, the Barnes maze is a relatively simple design: a circular platform top with several holes equally spaced around the perimeter edge. All but one of the holes are false-bottomed or blind-ending, while one leads to an escape cage. Mildly aversive stimuli (e.g., bright overhead lights) provide motivation to locate the escape cage. Latency to locate the escape cage can be measured during the session; however, additional endpoints typically require video recording. From those video recordings, automated tracking software can generate a variety of endpoints similar to those produced in water mazes (e.g., distance traveled, velocity/speed, time spent in the correct quadrant, time spent moving/resting, and confirmation of latency). The type of search strategy (i.e., random, serial, or direct) can be categorized as well. Barnes maze construction and testing methodologies can differ for small rodents, such as mice, and large rodents, such as rats. For example, while extra-maze cues are effective for rats, smaller wild rodents may require intra-maze cues with a visual barrier around the maze. Appropriate stimuli must be identified that motivate the rodent to locate the escape cage. Both Barnes and water mazes can be time consuming, as 4-7 test trials are typically required to detect improved learning and memory performance (e.g., shorter latencies or path lengths to locate the escape platform or cage) and/or differences between experimental groups. Even so, the Barnes maze is a widely employed behavioral assessment measuring spatial navigational abilities and their potential disruption by genetic or neurobehavioral manipulations or drug/toxicant exposure.
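The video-derived endpoints mentioned above reduce to simple geometry on a tracked (time, x, y) trajectory. The following sketch assumes a hypothetical track and escape-zone radius; it is not tied to any particular tracking package.

```python
# Sketch: computing common Barnes-maze endpoints (latency, path length, mean
# speed) from a tracked trajectory of (time_s, x_cm, y_cm) points. The sample
# track and escape-zone radius are invented for illustration.
import math

def endpoints(track, escape_xy, radius_cm):
    path = 0.0
    latency = None
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        path += math.hypot(x1 - x0, y1 - y0)
        near = math.hypot(x1 - escape_xy[0], y1 - escape_xy[1]) <= radius_cm
        if latency is None and near:
            latency = t1   # first time the animal reaches the escape zone
    duration = track[-1][0] - track[0][0]
    return {"latency_s": latency, "path_cm": path, "speed_cm_s": path / duration}

track = [(0, 0, 0), (1, 30, 40), (2, 30, 100)]
res = endpoints(track, escape_xy=(30, 100), radius_cm=5)
```

Quadrant occupancy and search-strategy classification would be layered on top of the same coordinate stream.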
Behavior, Issue 84, spatial navigation, rats, Peromyscus, mice, intra- and extra-maze cues, learning, memory, latency, search strategy, escape motivation
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals, including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors, such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations, and step-wise design augmentation. The methodology is therefore useful not only for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
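Before software-guided reduction to an optimal subset of runs, a DoE study starts from the space of factor-level combinations. A minimal full-factorial enumeration might look as follows; the factor names and levels below are invented for illustration and are not the factors used in the study.

```python
# Sketch: enumerating a full-factorial design as a starting point for DoE.
# Factor names and levels are hypothetical examples, not the study's factors.
from itertools import product

factors = {
    "promoter":        ["35S", "nos"],
    "incubation_temp": [22, 25],
    "plant_age_days":  [35, 42, 49],
}

names = list(factors)
design = [dict(zip(names, combo)) for combo in product(*factors.values())]
n_runs = len(design)   # 2 * 2 * 3 combinations
```

DoE software would then select a fraction of these runs that still allows the factor effects and interactions of interest to be estimated.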
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Helical Organization of Blood Coagulation Factor VIII on Lipid Nanotubes
Institutions: University of Texas Medical Branch, University of Texas Medical Branch, University of Texas Medical Branch.
Cryo-electron microscopy (Cryo-EM) [1] is a powerful approach to investigate the functional structure of proteins and complexes in a hydrated state and membrane environment [2]. Coagulation Factor VIII (FVIII) [3] is a multi-domain blood plasma glycoprotein. A defect or deficiency of FVIII causes hemophilia A, a severe bleeding disorder. Upon proteolytic activation, FVIII binds to the serine protease Factor IXa on the negatively charged platelet membrane, which is critical for normal blood clotting [4]. Despite the pivotal role FVIII plays in coagulation, structural information for its membrane-bound state is incomplete [5]. Recombinant FVIII concentrate is the most effective drug against hemophilia A, and commercially available FVIII can be expressed as human or porcine, both forming functional complexes with human Factor IXa [6,7]. In this study we present a combination of Cryo-EM, lipid nanotechnology and structure analysis applied to resolve the membrane-bound structure of two highly homologous FVIII forms: human and porcine. The methodology developed in our laboratory to helically organize the two functional recombinant FVIII forms on negatively charged lipid nanotubes (LNT) is described. The representative results demonstrate that our approach is sufficiently sensitive to define the differences in helical organization between the two proteins, which are highly homologous in sequence (86% sequence identity). Detailed protocols for the helical organization, Cryo-EM and electron tomography (ET) data acquisition are given. The two-dimensional (2D) and three-dimensional (3D) structure analysis applied to obtain the 3D reconstructions of human and porcine FVIII-LNT is discussed. The presented human and porcine FVIII-LNT structures show the potential of the proposed methodology to calculate the functional, membrane-bound organization of blood coagulation Factor VIII at high resolution.
Bioengineering, Issue 88, Cryo-electron microscopy, Lipid nanotubes, Helical assembly, Membrane-bound organization, Coagulation factor VIII
A Strategy for Sensitive, Large Scale Quantitative Metabolomics
Institutions: Cornell University, Cornell University.
Metabolite profiling has been a valuable asset in the study of metabolism in health and disease. However, current platforms have different limiting factors, such as labor-intensive sample preparation, poor detection limits, slow scan speeds, intensive method optimization for each metabolite, and the inability to measure positively and negatively charged ions in a single experiment. A novel metabolomics protocol could therefore advance metabolomics studies. Amide-based hydrophilic interaction chromatography enables polar metabolite analysis without any chemical derivatization. High-resolution MS using the Q-Exactive (QE-MS) has improved ion optics, increased scan speeds (256 msec at resolution 70,000), and the capability of positive/negative switching. Using a cold methanol extraction strategy and coupling an amide column with QE-MS enables robust detection of 168 targeted polar metabolites and thousands of additional features simultaneously. Data processing is carried out with commercially available software in a highly efficient way, and unknown features extracted from the mass spectra can be queried in databases.
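Annotating detected features against a target list within a ppm tolerance is a routine step after high-resolution acquisition, and can be sketched as below. The target masses and the 5 ppm window are illustrative assumptions, not values from this protocol.

```python
# Sketch: matching observed m/z features to a target metabolite list within a
# ppm tolerance. Target masses and tolerance are illustrative assumptions.

def ppm_error(observed, theoretical):
    return (observed - theoretical) / theoretical * 1e6

def match_features(features, targets, tol_ppm=5.0):
    hits = []
    for mz in features:
        for name, theo in targets.items():
            if abs(ppm_error(mz, theo)) <= tol_ppm:
                hits.append((mz, name))
    return hits

# Hypothetical [M-H]- target masses, for illustration only.
targets = {"glutamate": 146.0459, "citrate": 191.0197}
features = [146.0461, 150.1000, 191.0199]
hits = match_features(features, targets)
```

At 70,000 resolution, sub-5-ppm mass accuracy is what makes such narrow matching windows practical.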
Chemistry, Issue 87, high-resolution mass spectrometry, metabolomics, positive/negative switching, low mass calibration, Orbitrap
High-resolution Spatiotemporal Analysis of Receptor Dynamics by Single-molecule Fluorescence Microscopy
Institutions: University of Würzburg, Germany.
Single-molecule microscopy is emerging as a powerful approach to analyze the behavior of signaling molecules, in particular those aspects (e.g., kinetics, coexistence of different states and populations, transient interactions) that are typically hidden in ensemble measurements, such as those obtained with standard biochemical or microscopy methods. Thus, dynamic events, such as receptor-receptor interactions, can be followed in real time in a living cell with high spatiotemporal resolution. This protocol describes a method based on labeling with small, bright organic fluorophores and total internal reflection fluorescence (TIRF) microscopy to directly visualize single receptors on the surface of living cells. This approach allows one to precisely localize receptors, measure the size of receptor complexes, and capture dynamic events such as transient receptor-receptor interactions. The protocol provides a detailed description of how to perform a single-molecule experiment, including sample preparation, image acquisition and image analysis. As an example, the application of this method to analyze two G-protein-coupled receptors, an adrenergic receptor and the γ-aminobutyric acid type B (GABAB) receptor, is reported. The protocol can be adapted to other membrane proteins and to different cell models, transfection methods and labeling strategies.
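The tracking portion of the image analysis can be reduced, in its simplest form, to nearest-neighbor linking of localizations between consecutive frames with a maximum allowed displacement. The coordinates and cutoff below are invented; real tracking software uses more robust assignment and gap handling.

```python
# Sketch: greedy nearest-neighbor linking of single-molecule localizations
# between two consecutive frames. Coordinates (e.g. in um) and the cutoff are
# illustrative assumptions.
import math

def link(frame_a, frame_b, max_disp):
    """Return (i, j) pairs linking points of frame_a to points of frame_b."""
    links = []
    taken = set()
    for i, (xa, ya) in enumerate(frame_a):
        best = None
        for j, (xb, yb) in enumerate(frame_b):
            if j in taken:
                continue
            d = math.hypot(xb - xa, yb - ya)
            if d <= max_disp and (best is None or d < best[0]):
                best = (d, j)
        if best is not None:
            links.append((i, best[1]))
            taken.add(best[1])   # each detection is used at most once
    return links

frame1 = [(0.0, 0.0), (5.0, 5.0)]
frame2 = [(5.2, 5.1), (0.1, -0.1), (9.0, 9.0)]
links = link(frame1, frame2, max_disp=1.0)
```

Unlinked detections (like the third point in `frame2`) would start new trajectories in a full tracker.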
Bioengineering, Issue 89, pharmacology, microscopy, receptor, live-cell imaging, single-molecule, total internal reflection fluorescence, tracking, dimerization, protein-protein interactions
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
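The recasting recipe above (acceptance and efficiency tables plus observed event counts) boils down to comparing the expected signal yield, sigma x L x A x epsilon, to the upper limit on the number of signal events in a signal region. The sketch below handles only the zero-observed-events case with no background uncertainty; all numbers are illustrative and this is not the statistical treatment used by the experiments.

```python
# Sketch: turning a counting-experiment limit into a cross-section limit.
# sigma_limit = N95 / (L * A * eps). Numbers are illustrative only.
import math

def poisson_95cl_upper(n_obs):
    # For n_obs = 0, the exact 95% CL Poisson upper limit on the signal mean
    # is -ln(0.05) ~= 3.0 events. Larger counts (and background subtraction)
    # need a real statistics package, e.g. a CLs implementation.
    if n_obs == 0:
        return -math.log(0.05)
    raise NotImplementedError("use a proper CLs tool for n_obs > 0")

def sigma_limit_pb(n_limit, lumi_ipb, acceptance, efficiency):
    return n_limit / (lumi_ipb * acceptance * efficiency)

n95 = poisson_95cl_upper(n_obs=0)
limit = sigma_limit_pb(n95, lumi_ipb=1000.0, acceptance=0.30, efficiency=0.50)
```

The per-model acceptance A is what the simplified-model tables supply, which is why they make recasting to arbitrary theories possible.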
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
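One simple quantity in CP analyses along a morph continuum is the category boundary, i.e., the morph level at which identification responses cross 50%. The sketch below finds it by linear interpolation; the morph levels and response proportions are invented for illustration.

```python
# Sketch: locating the category boundary along a morph continuum as the 50%
# crossover of "human" identification responses, via linear interpolation.
# The data values are hypothetical.

def boundary(levels, p_human):
    pairs = list(zip(levels, p_human))
    for (l0, p0), (l1, p1) in zip(pairs, pairs[1:]):
        if p0 < 0.5 <= p1:
            return l0 + (0.5 - p0) / (p1 - p0) * (l1 - l0)
    return None   # no crossover found

levels = [0, 25, 50, 75, 100]              # % human likeness along the DHL
p_human = [0.02, 0.10, 0.40, 0.90, 0.98]   # proportion "human" responses
b = boundary(levels, p_human)
```

A sharpened identification function around `b`, together with better within-category than between-category discrimination, is the classic CP signature.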
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Assessment of Immunologically Relevant Dynamic Tertiary Structural Features of the HIV-1 V3 Loop Crown R2 Sequence by ab initio Folding
Institutions: School of Medicine, New York University.
The antigenic diversity of HIV-1 has long been an obstacle to vaccine design, and this variability is especially pronounced in the V3 loop of the virus' surface envelope glycoprotein. We previously proposed that the crown of the V3 loop, although dynamic and sequence-variable, is constrained throughout the population of HIV-1 viruses to an immunologically relevant β-hairpin tertiary structure. Importantly, there are thousands of different V3 loop crown sequences in circulating HIV-1 viruses, making 3D structural characterization of trends across the diversity of viruses difficult or impossible by crystallography or NMR. Our previous successful studies with folding of the V3 crown [1,2] used the ab initio folding procedure accessible in the ICM-Pro molecular modeling software package (Molsoft LLC, La Jolla, CA) and suggested that the crown of the V3 loop, specifically from positions 10 to 22, benefits sufficiently from the flexibility and length of its flanking stems to behave to a large degree as if it were an unconstrained peptide freely folding in solution. As such, rapid ab initio folding of just this portion of the V3 loop of any individual strain of the 60,000+ circulating HIV-1 strains can be informative. Here, we folded the V3 loop of the R2 strain to gain insight into the structural basis of its unique properties. R2 bears a rare V3 loop sequence thought to be responsible for the exquisite sensitivity of this strain to neutralization by patient sera and monoclonal antibodies [4,5]. The strain mediates CD4-independent infection and appears to elicit broadly neutralizing antibodies. We demonstrate how evaluation of the folding results can be informative for associating the observed structures with the immunological activities observed for R2.
Infection, Issue 43, HIV-1, structure-activity relationships, ab initio simulations, antibody-mediated neutralization, vaccine design
Profiling of Methyltransferases and Other S-adenosyl-L-homocysteine-binding Proteins by Capture Compound Mass Spectrometry (CCMS)
Institutions: caprotec bioanalytics GmbH, RWTH Aachen University.
There is a variety of approaches to reduce the complexity of the proteome on the basis of functional small molecule-protein interactions, such as affinity chromatography [1] or Activity-Based Protein Profiling [2]. Trifunctional Capture Compounds (CCs, Figure 1A) [3] are the basis for a generic approach in which the initial equilibrium-driven interaction between a small-molecule probe (the selectivity function, here S-adenosyl-L-homocysteine, SAH, Figure 1A) and target proteins is irreversibly fixed upon photo-crosslinking between an independent photo-activatable reactivity function (here a phenylazide) of the CC and the surface of the target proteins. The sorting function (here biotin) serves to isolate the CC-protein conjugates from complex biological mixtures with the help of a solid phase (here streptavidin magnetic beads). Two configurations of the experiment are possible: "off-bead" [4] or the presently described "on-bead" configuration (Figure 1B). The selectivity function may be virtually any small molecule of interest (substrates, inhibitors, drug molecules).
S-adenosyl-L-methionine (SAM, Figure 1A) is probably, second to ATP, the most widely used cofactor in nature [5,6]. It is the major methyl-group donor in all living organisms, with the chemical reaction being catalyzed by SAM-dependent methyltransferases (MTases), which methylate DNA [7], RNA [8], proteins [9], or small molecules [10]. Given the crucial role of methylation reactions in diverse physiological scenarios (gene regulation, epigenetics, metabolism), the profiling of MTases can be expected to become of similar importance in functional proteomics as the profiling of kinases. Analytical tools for their profiling, however, have not been available. We recently introduced a CC with SAH as the selectivity group to fill this technological gap (Figure 1A).
SAH, the product of SAM after methyl transfer, is a known general MTase product inhibitor [11]. For this reason, and because the natural cofactor SAM is used by further enzymes that transfer other parts of the cofactor or initiate radical reactions, as well as because of its chemical instability [12], SAH is an ideal selectivity function for a CC targeting MTases. Here, we report the utility of the SAH-CC and CCMS by profiling MTases and other SAH-binding proteins from Escherichia coli strain DH5α, one of the best-characterized prokaryotes, which has served as the preferred model organism in countless biochemical, biological, and biotechnological studies. Photo-activated crosslinking enhances the yield and sensitivity of the experiment, and specificity can readily be tested in competition experiments using an excess of free SAH.
Biochemistry, Issue 46, Capture Compound, photo-crosslink, small molecule-protein interaction, methyltransferase, S-adenosyl-l-homocysteine, SAH, S-adenosyl-l-methionine, SAM, functional proteomics, LC-MS/MS
A Protocol for Computer-Based Protein Structure and Function Prediction
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, and understanding the biological roles of these proteins requires knowledge of their structure and function. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All predictions are tagged with a confidence score, which indicates how accurate the predictions are expected to be in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distances and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be collected by users from experimental evidence or biological insight, with the purpose of improving the quality of I-TASSER predictions. The server was ranked among the best programs for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
The ITS2 Database
Institutions: University of Würzburg, University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the highly variable ITS2 sequence, the marker was long confined to low-level phylogenetics. However, combining the ITS2 sequence with its highly conserved secondary structure improves the phylogenetic resolution [1] and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation [2-8].
The ITS2 Database [9] presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank [11]. Following annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold [12] (direct fold) results in a correct, four-helix conformation. If this is not the case, the structure is predicted by homology modeling [13], in which an already known secondary structure is transferred to another ITS2 sequence whose secondary structure could not be folded correctly by a direct fold.
The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST [14] search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE [15,16] for multiple sequence-structure alignment calculation and Neighbor Joining [18] for tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure.
In a nutshell, this workbench simplifies a first phylogenetic analysis to only a few mouse clicks, while additionally providing tools and data for comprehensive large-scale analyses.
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
Use of a Robot for High-throughput Crystallization of Membrane Proteins in Lipidic Mesophases
Institutions: Trinity College Dublin .
Structure-function studies of membrane proteins greatly benefit from having available high-resolution 3-D structures of the type provided through macromolecular X-ray crystallography (MX). An essential ingredient of MX is a steady supply of ideally diffraction-quality crystals. The in meso or lipidic cubic phase (LCP) method is one of several available for crystallizing membrane proteins. It makes use of a bicontinuous mesophase in which to grow crystals. As a method, it has had some spectacular successes of late and has attracted much attention, with many research groups now interested in using it. One of the challenges associated with the method is that the hosting mesophase is extremely viscous and sticky, reminiscent of a thick toothpaste. Thus, dispensing it manually in a reproducible manner in small volumes into crystallization wells requires skill, patience and a steady hand. A protocol for doing just that was developed in the Membrane Structural & Functional Biology (MS&FB) Group1-3. JoVE video articles describing the method are available1,4.
The manual approach for setting up in meso trials has distinct advantages for specialty applications, such as crystal optimization and derivatization. It does, however, suffer from being a low-throughput method. Here, we demonstrate a protocol for performing in meso crystallization trials robotically. A robot offers the advantages of speed, accuracy, precision, miniaturization and the ability to work continuously for extended periods under what could be regarded as hostile conditions, such as in the dark, in a reducing atmosphere, or at low or high temperatures. An in meso robot, when used properly, can greatly improve the productivity of membrane protein structure and function research by facilitating crystallization, one of the slow steps in the overall structure determination pipeline.
In this video article, we demonstrate the use of three commercially available robots that can dispense the viscous and sticky mesophase integral to in meso crystallogenesis. The first robot was developed in the MS&FB Group5,6. The other two have recently become available and are included here for completeness.
An overview of the protocol covered in this article is presented in Figure 1.
All manipulations were performed at room temperature (~20 °C) under ambient conditions.
Materials Science, Issue 67, automation, crystallization, glass sandwich plates, high-throughput, in meso, lipidic cubic phase, lipidic mesophases, macromolecular X-ray crystallography, membrane protein, receptor, robot
Identification of Protein Complexes in Escherichia coli using Sequential Peptide Affinity Purification in Combination with Tandem Mass Spectrometry
Institutions: University of Toronto, University of Regina, University of Toronto.
Since most cellular processes are mediated by macromolecular assemblies, the systematic identification of protein-protein interactions (PPI) and of the subunit composition of multi-protein complexes can provide insight into gene function and enhance understanding of biological systems1,2. Physical interactions can be mapped with high confidence via large-scale isolation and characterization of endogenous protein complexes under near-physiological conditions, based on affinity purification of chromosomally-tagged proteins in combination with mass spectrometry (APMS). This approach has been successfully applied in evolutionarily diverse organisms, including yeast, flies, worms, mammalian cells, and bacteria1-6. In particular, we have generated a carboxy-terminal Sequential Peptide Affinity (SPA) dual tagging system for affinity-purifying native protein complexes from cultured gram-negative Escherichia coli, using genetically-tractable host laboratory strains that are well-suited for genome-wide investigations of the fundamental biology and conserved processes of prokaryotes1,2,7. Our SPA-tagging system is analogous to the tandem affinity purification method developed originally for yeast8,9, and consists of a calmodulin binding peptide (CBP) followed by the cleavage site for the highly specific tobacco etch virus (TEV) protease and three copies of the FLAG epitope (3X FLAG), allowing for two consecutive rounds of affinity enrichment. After cassette amplification, sequence-specific linear PCR products encoding the SPA-tag and a selectable marker are integrated and expressed in frame as carboxy-terminal fusions in a DY330 background that is induced to transiently express a highly efficient heterologous bacteriophage lambda recombination system10. Subsequent dual-step purification using calmodulin and anti-FLAG affinity beads enables the highly selective and efficient recovery of even low-abundance protein complexes from large-scale cultures. Tandem mass spectrometry is then used to identify the stably co-purifying proteins with high sensitivity (low nanogram detection limits).
Here, we describe detailed step-by-step procedures we commonly use for systematic protein tagging, purification and mass spectrometry-based analysis of soluble protein complexes from E. coli, which can be scaled up and potentially tailored to other bacterial species, including certain opportunistic pathogens that are amenable to recombineering. The resulting physical interactions can often reveal unexpected components and connections, suggesting novel mechanistic links. Integration of the PPI data with alternate molecular association data, such as genetic (gene-gene) interactions and genomic-context (GC) predictions, can facilitate elucidation of the global molecular organization of multi-protein complexes within biological pathways. The networks generated for E. coli can be used to gain insight into the functional architecture of orthologous gene products in other microbes for which functional annotations are currently lacking.
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, affinity purification, Escherichia coli, gram-negative bacteria, cytosolic proteins, SPA-tagging, homologous recombination, mass spectrometry, protein interaction, protein complex
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimate of the ideal midline is first obtained from the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for identifying the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans, and are combined with other recorded features, such as age and injury severity score, to estimate the ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. Evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians make decisions, for example to recommend for or against invasive ICP monitoring.
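The symmetry-based ideal-midline step can be sketched minimally: scan candidate vertical lines across a 2-D image and keep the one that best mirrors the left half onto the right. This toy version, on hypothetical intensity data rather than real CT values, scores mirror symmetry only; the actual system additionally uses anatomical features and ventricle shape matching.

```python
# Sketch: estimate an ideal midline as the column maximizing left-right
# symmetry. Input is a list of rows of pixel intensities (toy data).

def symmetry_score(image, col):
    """Mean absolute difference between columns mirrored about `col`
    (lower = more symmetric)."""
    width = len(image[0])
    span = min(col, width - 1 - col)   # mirrorable distance on both sides
    if span == 0:
        return float('inf')            # edge columns cannot be midlines
    score = 0.0
    for row in image:
        for d in range(1, span + 1):
            score += abs(row[col - d] - row[col + d])
    return score / span                # normalize so short spans aren't favored

def estimate_midline(image):
    """Return the column index with the lowest (best) symmetry score."""
    width = len(image[0])
    return min(range(1, width - 1), key=lambda c: symmetry_score(image, c))

# A symmetric toy "scan": bright skull ring at the edges, midline at column 3
scan = [
    [9, 1, 0, 0, 0, 1, 9],
    [9, 2, 1, 0, 1, 2, 9],
    [9, 1, 0, 0, 0, 1, 9],
]
print(estimate_midline(scan))   # → 3
```

Normalizing by the mirrorable span matters: without it, columns near the image edge would win simply because fewer pixel pairs contribute to their score.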
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of the optimization.
Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226, together with a user-specified set of conservation practices and their costs, to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
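The selection-recombination-mutation loop described above can be sketched with a toy stand-in for the watershed model: each field either receives a conservation practice (1) or not (0), and two objectives (cost, pollution) are minimized simultaneously. All numbers are hypothetical; the real program evaluates pollution with SWAT and uses SPEA2's fitness assignment rather than this naive dominated-member removal.

```python
# Sketch of a bi-objective evolutionary search over conservation-practice
# placements. COSTS/LOADS/REDUCTION are made-up illustrative values.
import random

COSTS = [4.0, 2.0, 5.0, 1.0, 3.0]        # per-field practice cost
LOADS = [10.0, 6.0, 12.0, 3.0, 8.0]      # per-field pollutant load
REDUCTION = 0.7                           # fraction of load a practice removes

def evaluate(solution):
    """Return the (cost, pollution) objective vector for one allocation."""
    cost = sum(c for c, on in zip(COSTS, solution) if on)
    pollution = sum(l * ((1 - REDUCTION) if on else 1.0)
                    for l, on in zip(LOADS, solution))
    return cost, pollution

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective, better in one."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def evolve(generations=200, pop_size=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in COSTS] for _ in range(pop_size)]
    for _ in range(generations):
        a, b = rng.sample(pop, 2)                 # recombination
        cut = rng.randrange(1, len(COSTS))
        child = a[:cut] + b[cut:]
        if rng.random() < 0.3:                    # mutation
            i = rng.randrange(len(COSTS))
            child[i] = 1 - child[i]
        pop.append(child)
        scored = [(evaluate(s), s) for s in pop]  # selection: drop one
        for obj, s in scored:                     # dominated member, if any
            if any(dominates(o, obj) for o, _ in scored):
                pop.remove(s)
                break
    front = sorted({evaluate(s) for s in pop})    # nondominated tradeoff set
    return [f for f in front if not any(dominates(g, f) for g in front)]

print(evolve())   # points trace a cost-vs-pollution tradeoff frontier
```

The returned list is the analogue of the article's tradeoff frontier: each point is a cost paired with the best pollution level the search found at that cost.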
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Using SCOPE to Identify Potential Regulatory Motifs in Coregulated Genes
Institutions: Dartmouth College.
SCOPE is an ensemble motif finder that uses three component algorithms in parallel to identify potential regulatory motifs by over-representation and motif position preference1
. Each component algorithm is optimized to find a different kind of motif. By taking the best of these three approaches, SCOPE performs better than any single algorithm, even in the presence of noisy data1
In this article, we utilize a web version of SCOPE2 to examine genes that are involved in telomere maintenance. SCOPE has been incorporated into at least two other motif finding programs3,4 and has been used in other studies5-8.
The three algorithms that comprise SCOPE are BEAM9, which finds non-degenerate motifs (ACCGGT); PRISM10, which finds degenerate motifs (ASCGWT); and SPACER11, which finds longer bipartite motifs (ACCnnnnnnnnGGT). These three algorithms have been optimized to find their corresponding types of motif. Together, they allow SCOPE to perform extremely well.
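The over-representation idea behind a non-degenerate search like BEAM's can be sketched simply: count every k-mer across a promoter set and rank by how far the observed count exceeds the count expected under a uniform background. The sequences and scoring below are hypothetical simplifications; SCOPE's actual components also model motif position preference and degeneracy.

```python
# Sketch: rank k-mers in a promoter set by over-representation relative
# to a uniform-background expectation. Toy sequences for illustration.
from collections import Counter

def overrepresented_kmers(seqs, k, top=3):
    """Return the `top` k-mers with the largest observed/expected ratio."""
    counts = Counter()
    positions = 0
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
            positions += 1
    expected = positions / (4 ** k)      # uniform background over 4 bases
    ranked = sorted(counts.items(),
                    key=lambda kv: kv[1] / expected, reverse=True)
    return [(kmer, n / expected) for kmer, n in ranked[:top]]

# ACCGGT is planted in all three toy promoters, so it ranks first:
promoters = ["TTACCGGTAA", "GGACCGGTCC", "ATACCGGTAT"]
print(overrepresented_kmers(promoters, 6)[0][0])   # → ACCGGT
```

A real scorer would use the genome's base composition (not a uniform background) and a significance test rather than a raw ratio, but the ranking principle is the same.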
Once a gene set has been analyzed and candidate motifs identified, SCOPE can look for other genes that contain the motif which, when added to the original set, will improve the motif score. This can occur through over-representation or motif position preference. Working with partial gene sets that have biologically verified transcription factor binding sites, SCOPE was able to identify most of the rest of the genes also regulated by the given transcription factor.
Output from SCOPE shows candidate motifs, their significance, and other information both as a table and as a graphical motif map. FAQs and video tutorials are available at the SCOPE web site which also includes a "Sample Search" button that allows the user to perform a trial run.
SCOPE has a very friendly user interface that enables novice users to access the algorithm's full power without having to become experts in the bioinformatics of motif finding. As input, SCOPE can take a list of genes or FASTA sequences. These can be entered in browser text fields or read from a file. The output from SCOPE contains a list of all identified motifs with their scores, number of occurrences, fraction of genes containing the motif, and the algorithm used to identify the motif. For each motif, result details include a consensus representation of the motif, a sequence logo, a position weight matrix, and a list of instances for every motif occurrence (with exact positions and "strand" indicated). Results are returned in a browser window and also, optionally, by email. Previous papers describe the SCOPE algorithms in detail1,2,9-11.
Genetics, Issue 51, gene regulation, computational biology, algorithm, promoter sequence motif
Designing and Implementing Nervous System Simulations on LEGO Robots
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American Lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus as described by Braitenberg and are particularly well suited for investigation using the NXT platform.1
The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
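The decussating (crossed) reflex wiring described above follows the classic Braitenberg scheme: each sensor excites the contralateral motor, so the robot turns toward the stronger stimulus. A minimal sketch of that mapping is below; the sensor and motor interface is hypothetical, since on the NXT the equivalent logic runs in LabVIEW against real sensor and motor ports.

```python
# Sketch of a crossed (decussating) reflex: each sensor drives the
# opposite-side motor, producing taxis toward the stronger stimulus.

def crossed_reflex(left_sensor, right_sensor, gain=1.0, base_speed=10.0):
    """Map two sensor intensities to (left_motor, right_motor) speeds."""
    left_motor = base_speed + gain * right_sensor    # crossed connection
    right_motor = base_speed + gain * left_sensor    # crossed connection
    return left_motor, right_motor

# Stimulus stronger on the right: the left motor spins faster, so the
# robot veers right, i.e. toward the source (positive taxis).
left, right = crossed_reflex(left_sensor=2.0, right_sensor=8.0)
print(left > right)   # → True
```

Wiring the sensors ipsilaterally instead (each sensor driving its own side's motor) would invert the behavior, steering the robot away from the stimulus, which is exactly the kind of hypothesis the robot-vs-animal comparison can test.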
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Quantitative Real-Time PCR using the Thermo Scientific Solaris qPCR Assay
Institutions: Thermo Scientific Solaris qPCR Products.
The Solaris qPCR Gene Expression Assay is a novel type of primer/probe set, designed to simplify the qPCR process while maintaining the sensitivity and accuracy of the assay. These primer/probe sets are pre-designed to >98% of the human and mouse genomes and feature significant improvements from previously available technologies. These improvements were made possible by virtue of a novel design algorithm, developed by Thermo Scientific bioinformatics experts.
Several convenient features have been incorporated into the Solaris qPCR Assay to streamline the process of performing quantitative real-time PCR. First, the protocol is similar to commonly employed alternatives, so the methods used during qPCR are likely to be familiar. Second, the master mix is blue, which makes qPCR reaction setup easier to track. Third, the thermal cycling conditions are the same for all assays (genes), making it possible to run many samples at a time and reducing the potential for error. Finally, the probe and primer sequence information are provided, simplifying the publication process.
Here, we demonstrate how to obtain the appropriate Solaris reagents using the GENEius product search feature found on the ordering web site (www.thermo.com/solaris) and how to use the Solaris reagents for performing qPCR using the standard curve method.
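The standard curve method mentioned above amounts to fitting Ct against log10(template amount) for a dilution series, then reading unknowns off the fitted line. A minimal sketch follows; the Ct values are made up for illustration and describe an ideal assay whose slope of about -3.32 corresponds to ~100% amplification efficiency.

```python
# Sketch of the qPCR standard curve method: fit Ct = slope*log10(amount)
# + intercept on a dilution series, then invert the line for unknowns.
import math

def fit_standard_curve(amounts, cts):
    """Least-squares line through (log10(amount), Ct) points."""
    xs = [math.log10(a) for a in amounts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Estimate starting template amount from an observed Ct."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency; a slope of ~-3.32 gives ~1.0 (100%)."""
    return 10 ** (-1 / slope) - 1

# Ten-fold dilution series: each 10x dilution shifts Ct by ~3.32 cycles
amounts = [1e4, 1e3, 1e2, 1e1]
cts = [18.0, 21.32, 24.64, 27.96]
slope, intercept = fit_standard_curve(amounts, cts)
print(round(efficiency(slope), 2))   # → 1.0 (i.e. ~100%)
```

In practice the curve is fit from the standards run alongside the samples, and the efficiency value doubles as a quality check on the assay before unknowns are quantified.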
Cellular Biology, Issue 40, qPCR, probe, real-time PCR, molecular biology, Solaris, primer, gene expression assays