The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
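As a minimal illustration of the kind of search performed in the first design stage, the sketch below runs a Metropolis Monte Carlo walk over sequence space against a toy scoring function. The energy function, starting sequence, and parameters here are hypothetical stand-ins for illustration only, not Protein WISDOM's actual sequence-selection model.

```python
import math
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def toy_energy(seq):
    """Hypothetical stand-in for a real potential-energy model: rewards
    hydrophobic residues at even positions and polar residues at odd ones."""
    hydrophobic = set("AVLIMFWC")
    return sum(-1.0 if (aa in hydrophobic) == (i % 2 == 0) else 1.0
               for i, aa in enumerate(seq))

def metropolis_design(seq, steps=2000, temperature=1.0, seed=0):
    """Search sequence space by single-point mutations, accepting moves
    with the Metropolis criterion and tracking the best sequence seen."""
    rng = random.Random(seed)
    best = current = list(seq)
    e_cur = e_best = toy_energy(seq)
    for _ in range(steps):
        cand = current[:]
        cand[rng.randrange(len(cand))] = rng.choice(AMINO_ACIDS)
        e_new = toy_energy(cand)
        if e_new <= e_cur or rng.random() < math.exp((e_cur - e_new) / temperature):
            current, e_cur = cand, e_new
            if e_cur < e_best:
                best, e_best = current, e_cur
    return "".join(best), e_best

designed, energy = metropolis_design("MKTAYIAKQR")
```

A real design run replaces the toy score with a physics-based potential and follows the selected sequences with the fold-specificity and binding-affinity stages described above.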
Inhibitory Synapse Formation in a Co-culture Model Incorporating GABAergic Medium Spiny Neurons and HEK293 Cells Stably Expressing GABAA Receptors
Institutions: University College London.
Inhibitory neurons act in the central nervous system to regulate the dynamics and spatio-temporal co-ordination of neuronal networks. GABA (γ-aminobutyric acid) is the predominant inhibitory neurotransmitter in the brain. It is released from the presynaptic terminals of inhibitory neurons within highly specialized intercellular junctions known as synapses, where it binds to GABAA receptors (GABAARs) present at the plasma membrane of the synapse-receiving, postsynaptic neurons. Activation of these GABA-gated ion channels leads to influx of chloride resulting in postsynaptic potential changes that decrease the probability that these neurons will generate action potentials.
During development, diverse types of inhibitory neurons with distinct morphological, electrophysiological and neurochemical characteristics have the ability to recognize their target neurons and form synapses which incorporate specific GABAAR subtypes. This principle of selective innervation of neuronal targets raises the question as to how the appropriate synaptic partners identify each other.
To elucidate the underlying molecular mechanisms, a novel in vitro co-culture model system was established, in which medium spiny GABAergic neurons, a highly homogenous population of neurons isolated from the embryonic striatum, were cultured with stably transfected HEK293 cell lines that express different GABAAR subtypes. Synapses form rapidly, efficiently and selectively in this system, and are easily accessible for quantification. Our results indicate that various GABAAR subtypes differ in their ability to promote synapse formation, suggesting that this reduced in vitro model system can be used to reproduce, at least in part, the in vivo conditions required for the recognition of the appropriate synaptic partners and formation of specific synapses. Here the protocols for culturing the medium spiny neurons and generating HEK293 cell lines expressing GABAARs are first described, followed by detailed instructions on how to combine these two cell types in co-culture and analyze the formation of synaptic contacts.
Neuroscience, Issue 93, Developmental neuroscience, synaptogenesis, synaptic inhibition, co-culture, stable cell lines, GABAergic, medium spiny neurons, HEK 293 cell line
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research question(s). However, many researchers are new to this topic, which presents inherent challenges. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. The vision for this article is that it could serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
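As a self-contained taste of the computations that sit beneath such a pipeline, the sketch below computes a pairwise p-distance from a toy alignment and applies the Jukes-Cantor correction, one of the simplest models of evolution (the article's pipeline uses full maximum-likelihood and Bayesian software; the sequences here are invented):

```python
import math

def p_distance(seq1, seq2):
    """Proportion of differing sites between two aligned sequences (gap sites ignored)."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    return sum(a != b for a, b in pairs) / len(pairs)

def jukes_cantor(p):
    """Jukes-Cantor corrected distance: d = -(3/4) * ln(1 - 4p/3)."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

p = p_distance("ACGTACGTAC", "ACGTACGAAC")  # 1 difference over 10 sites
d = jukes_cantor(p)                          # slightly larger than p
```

The correction accounts for unobserved multiple substitutions at the same site, which is why d exceeds the raw proportion p.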
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides.
We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no scores to more continuous variables such as the latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
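The two kinds of response measure mentioned above reduce to simple summaries per subject or group. A minimal sketch, using invented trial data (each trial records whether the bee extended its proboscis and, if so, the latency in seconds):

```python
from statistics import mean

# Hypothetical PER trials: (responded, latency_s); latency is None when no response.
trials = [(True, 1.2), (False, None), (True, 0.8), (True, 2.0), (False, None)]

def response_rate(trials):
    """Binary yes/no measure: fraction of trials with a proboscis extension."""
    return sum(r for r, _ in trials) / len(trials)

def mean_latency(trials):
    """Continuous measure: mean latency over responding trials only."""
    lats = [t for r, t in trials if r]
    return mean(lats) if lats else None

rate = response_rate(trials)   # 3 of 5 trials
lat = mean_latency(trials)
```

In practice these summaries would be computed per acquisition trial across a group of bees and compared between treatment conditions.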
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
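The "software-guided setup of optimal experiment combinations" mentioned above typically starts from a factorial design. A minimal sketch, with hypothetical factor names, generating a two-level full factorial and the half-fraction defined by the relation I = ABC (which halves the number of runs while keeping main effects estimable):

```python
from itertools import product

factors = ["promoter", "incubation_temp", "leaf_age"]  # hypothetical factor names

# Full 2^3 factorial: every combination of low (-1) and high (+1) levels.
full = list(product([-1, 1], repeat=len(factors)))

# Half-fraction 2^(3-1) with defining relation I = ABC:
# keep only runs whose product of levels is +1, halving the experimental effort.
half = [run for run in full if run[0] * run[1] * run[2] == 1]
```

DoE software additionally supports step-wise design augmentation, i.e. adding the complementary fraction later if the first set of runs suggests aliased effects matter.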
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Drug-induced Sensitization of Adenylyl Cyclase: Assay Streamlining and Miniaturization for Small Molecule and siRNA Screening Applications
Institutions: Purdue University, Eli Lilly and Company.
Sensitization of adenylyl cyclase (AC) signaling has been implicated in a variety of neuropsychiatric and neurologic disorders including substance abuse and Parkinson's disease. Acute activation of Gαi/o-linked receptors inhibits AC activity, whereas persistent activation of these receptors results in heterologous sensitization of AC and increased levels of intracellular cAMP. Previous studies have demonstrated that this enhancement of AC responsiveness is observed both in vitro and in vivo following the chronic activation of several types of Gαi/o-linked receptors including D2 dopamine and μ opioid receptors. Although heterologous sensitization of AC was first reported four decades ago, the mechanism(s) that underlie this phenomenon remain largely unknown. The lack of mechanistic data presumably reflects the complexity involved with this adaptive response, suggesting that nonbiased approaches could aid in identifying the molecular pathways involved in heterologous sensitization of AC. Previous studies have implicated kinase and Gβγ signaling as overlapping components that regulate the heterologous sensitization of AC. To identify unique and additional overlapping targets associated with sensitization of AC, the development and validation of a scalable cAMP sensitization assay is required for greater throughput. Previous approaches to study sensitization are generally cumbersome, involving continuous cell culture maintenance as well as a complex methodology for measuring cAMP accumulation that involves multiple wash steps. Thus, the development of a robust cell-based assay that can be used for high throughput screening (HTS) in a 384-well format would facilitate future studies. Using two D2 dopamine receptor cellular models, we have converted our 48-well sensitization assay (>20 steps, 4-5 days) to a five-step, single-day assay in 384-well format. This new format is amenable to small molecule screening, and we demonstrate that this assay design can also be readily used for reverse transfection of siRNA in anticipation of targeted siRNA library screening.
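When an assay is scaled to a 384-well HTS format, its suitability for screening is commonly judged with the Z'-factor of Zhang et al. (Z' = 1 - 3(σp + σn)/|μp - μn|, with Z' > 0.5 regarded as an excellent assay). A minimal sketch with invented control-well readings (this metric is standard HTS practice, not a result reported in this article):

```python
from statistics import mean, stdev

# Hypothetical cAMP readings from positive (sensitized) and negative control wells.
positive = [980, 1010, 995, 1005, 990]
negative = [110, 95, 105, 100, 90]

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    return 1.0 - 3.0 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

zp = z_prime(positive, negative)  # well above the 0.5 threshold for this toy data
```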
Bioengineering, Issue 83, adenylyl cyclase, cAMP, heterologous sensitization, superactivation, D2 dopamine, μ opioid, siRNA
High-throughput Image Analysis of Tumor Spheroids: A User-friendly Software Application to Measure the Size of Spheroids Automatically and Accurately
Institutions: Raymond and Beverly Sackler Foundation, New Jersey, Rutgers University, Rutgers University, Institute for Advanced Study, New Jersey.
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application – SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images, providing a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary “Manual Initialize” and “Hand Draw” tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
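Once the major and minor axial lengths are measured, volume follows from a closed-form model. The abstract does not state SpheroidSizer's exact formula, so the sketch below uses one common convention, treating the spheroid as a prolate ellipsoid rotated about its major axis:

```python
import math

def spheroid_volume(major_um, minor_um):
    """Approximate tumor spheroid volume in µm^3, modeling the spheroid as a
    prolate ellipsoid rotated about its major axis: V = (pi/6) * L * W^2.
    (One common convention; the software's own formula may differ.)"""
    return math.pi / 6.0 * major_um * minor_um ** 2

v = spheroid_volume(400.0, 300.0)  # example axes in µm
```

The alternative convention V = L * W^2 / 2, widely used for caliper-measured tumors, differs from this one only by a constant factor.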
Cancer Biology, Issue 89, computer programming, high-throughput, image analysis, tumor spheroids, 3D, software application, cancer therapy, drug screen, neuroendocrine tumor cell line, BON-1, cancer research
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues change dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
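The source reconstruction referred to here rests on minimum-norm estimation: given a leadfield L mapping source amplitudes to sensor readings, the regularized estimate is x = L^T (L L^T + λI)^-1 y. As a deliberately tiny, dependency-free illustration, the sketch below solves the one-sensor case, where L is a row vector and the matrix inverse collapses to a scalar division (the gains and measurement are invented):

```python
# Toy minimum-norm estimate: one EEG sensor, three candidate cortical sources.
L = [0.5, 1.0, 0.25]   # hypothetical leadfield gains (source -> sensor)
y = 2.0                # measured sensor value
lam = 0.1              # Tikhonov regularization parameter

gram = sum(g * g for g in L)               # L L^T, a scalar for one sensor
x_hat = [g * y / (gram + lam) for g in L]  # estimated source amplitudes
```

Note the characteristic minimum-norm behavior: the estimate distributes the measurement across sources in proportion to their gains, and the forward projection L·x̂ recovers y only up to the shrinkage factor gram/(gram + λ).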
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
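The simplest of the models mentioned above treats the melting-temperature shift as a single-site saturation curve, Tm([L]) = Tm0 + ΔTm,max·[L]/(Kd + [L]). A minimal sketch fitting Kd by least-squares grid search over hypothetical DSF data (real DSF analysis, including the thermodynamic corrections discussed in the article, is more involved; Tm0 and ΔTm,max are fixed here for brevity):

```python
def tm_model(conc, tm0, dtm_max, kd):
    """Single-site saturation model for the ligand-induced Tm shift."""
    return tm0 + dtm_max * conc / (kd + conc)

# Hypothetical DSF data: ligand concentration (µM) -> observed Tm (°C).
concs = [0.0, 1.0, 5.0, 20.0, 100.0]
tms   = [50.0, 51.7, 53.3, 54.4, 54.9]

def fit_kd(concs, tms, tm0=50.0, dtm_max=5.0):
    """Grid-search the Kd (µM) minimizing the sum of squared residuals."""
    def sse(kd):
        return sum((tm_model(c, tm0, dtm_max, kd) - t) ** 2
                   for c, t in zip(concs, tms))
    grid = [k / 10.0 for k in range(1, 501)]  # 0.1 .. 50.0 µM
    return min(grid, key=sse)

kd = fit_kd(concs, tms)  # landed a couple of µM for this toy data
```

A gradient-based fitter (and simultaneous fitting of Tm0 and ΔTm,max) would replace the grid search in real use; the grid keeps the example dependency-free.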
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism to investigate the mechanisms that underlie developmental biology and to study human disease pathology due to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are specifically appealing to use for chemical genetics because of their ability to produce large clutches of transparent embryos, which are externally fertilized. Furthermore, zebrafish embryos can be easily drug treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, using chemical genetics and WISH, the zebrafish becomes a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are not accessible or remain cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires basic resources and can be accomplished by a single individual or small team in an efficient period of time. Thus, this protocol provides a feasible strategy that can be implemented by research groups to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways that have medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Flying Insect Detection and Classification with Inexpensive Sensors
Institutions: University of California, Riverside, University of California, Riverside, University of São Paulo - USP, ISCA Technologies.
An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research, and allow for the development of many useful applications in vector and pest control for both medical and agricultural entomology. Given this, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach allows classification models that are very robust to over-fitting to be learned efficiently; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate these findings with large-scale experiments that dwarf all previous works combined, as measured by the number of insects and the number of species considered.
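The core of such a Bayesian classifier is Bayes' rule over class-conditional feature densities: posterior ∝ likelihood × prior. A minimal single-feature sketch using Gaussian wingbeat-frequency models (the species names, means, and standard deviations below are illustrative values, not the article's measured distributions, and the real framework combines many more features such as time-of-day priors):

```python
import math

# Hypothetical wingbeat-frequency statistics (Hz) for two insect classes.
classes = {
    "species_A": {"mean": 465.0, "sd": 40.0, "prior": 0.5},
    "species_B": {"mean": 350.0, "sd": 35.0, "prior": 0.5},
}

def gaussian_pdf(x, mean, sd):
    return math.exp(-((x - mean) ** 2) / (2.0 * sd ** 2)) / (sd * math.sqrt(2.0 * math.pi))

def classify(freq_hz):
    """Posterior over classes: likelihood x prior, normalized to sum to 1."""
    scores = {name: gaussian_pdf(freq_hz, p["mean"], p["sd"]) * p["prior"]
              for name, p in classes.items()}
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

posterior = classify(450.0)  # strongly favors species_A for this toy model
```

Additional features enter the framework by multiplying in further (assumed-independent) likelihood terms, which is what makes incorporating an arbitrary number of features straightforward.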
Bioengineering, Issue 92, flying insect detection, automatic insect classification, pseudo-acoustic optical sensors, Bayesian classification framework, flight sound, circadian rhythm
A Novel Three-dimensional Flow Chamber Device to Study Chemokine-directed Extravasation of Cells Circulating under Physiological Flow Conditions
Institutions: Torrey Pines Institute for Molecular Studies, Cascade LifeSciences Inc.
Extravasation of circulating cells from the bloodstream plays a central role in many physiological and pathophysiological processes, including stem cell homing and tumor metastasis. The three-dimensional flow chamber device (hereafter the 3D device) is a novel in vitro technology that recreates physiological shear stress and allows each step of the cell extravasation cascade to be quantified. The 3D device consists of an upper compartment in which the cells of interest circulate under shear stress, and a lower compartment of static wells that contain the chemoattractants of interest. The two compartments are separated by porous inserts coated with a monolayer of endothelial cells (EC). An optional second insert with microenvironmental cells of interest can be placed immediately beneath the EC layer. A gas exchange unit allows the optimal CO2 tension to be maintained and provides an access point to add or withdraw cells or compounds during the experiment. The test cells circulate in the upper compartment at the desired shear stress (flow rate) controlled by a peristaltic pump. At the end of the experiment, the circulating and migrated cells are collected for further analyses. The 3D device can be used to examine cell rolling on and adhesion to EC under shear stress, transmigration in response to chemokine gradients, resistance to shear stress, cluster formation, and cell survival. In addition, the optional second insert allows the effects of crosstalk between EC and microenvironmental cells to be examined. The translational applications of the 3D device include testing of drug candidates that target cell migration and predicting the in vivo behavior of cells after intravenous injection. Thus, the novel 3D device is a versatile and inexpensive tool to study the molecular mechanisms that mediate cellular extravasation.
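Setting the pump flow rate to hit a target shear stress requires a geometry-specific conversion. The sketch below uses the standard parallel-plate relation τ = 6μQ/(w·h²); this is an assumption for illustration, since the 3D device's actual channel geometry (and hence its calibration) may differ, and the dimensions and viscosity below are invented:

```python
def wall_shear_stress(flow_rate_ml_min, viscosity_poise, width_cm, height_cm):
    """Wall shear stress (dyn/cm^2) for laminar flow between parallel plates:
    tau = 6 * mu * Q / (w * h^2), with Q converted from ml/min to cm^3/s."""
    q_cm3_s = flow_rate_ml_min / 60.0
    return 6.0 * viscosity_poise * q_cm3_s / (width_cm * height_cm ** 2)

# Example: water-like medium (~0.01 P) at 1 ml/min through a
# hypothetical 1 cm wide x 0.025 cm high channel.
tau = wall_shear_stress(1.0, 0.01, 1.0, 0.025)  # ~1.6 dyn/cm^2, venous range
```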
Bioengineering, Issue 77, Cellular Biology, Biophysics, Physiology, Molecular Biology, Biomedical Engineering, Immunology, Cells, Biological Factors, Equipment and Supplies, Cell Physiological Phenomena, Natural Science Disciplines, Life Sciences (General), circulating cells, extravasation, physiological shear stress, endothelial cells, microenvironment, chemokine gradient, flow, chamber, cell culture, assay
A Guided Materials Screening Approach for Developing Quantitative Sol-gel Derived Protein Microarrays
Institutions: McMaster University .
Microarrays have found use in the development of high-throughput assays for new materials and discovery of small-molecule drug leads. Herein we describe a guided material screening approach to identify sol-gel based materials that are suitable for producing three-dimensional protein microarrays. The approach first identifies materials that can be printed as microarrays, narrows down the number of materials by identifying those that are compatible with a given enzyme assay, and then hones in on optimal materials based on retention of maximum enzyme activity. This approach is applied to develop microarrays suitable for two different enzyme assays, one using acetylcholinesterase and the other using a set of four key kinases involved in cancer. In each case, it was possible to produce microarrays that could be used for quantitative small-molecule screening assays and production of dose-dependent inhibitor response curves. Importantly, the ability to screen many materials produced information on the types of materials that best suited both microarray production and retention of enzyme activity. The materials data provide insight into basic material requirements necessary for tailoring optimal, high-density sol-gel derived microarrays.
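From the dose-dependent inhibitor response curves mentioned above, a screening campaign typically reports an IC50. Full analyses fit a four-parameter logistic; the sketch below shows a simpler log-linear interpolation between the two doses bracketing 50% residual activity, using invented data:

```python
import math

# Hypothetical dose-response data: inhibitor concentration (µM) -> % activity remaining.
doses    = [0.01, 0.1, 1.0, 10.0, 100.0]
activity = [98.0, 90.0, 65.0, 30.0, 8.0]

def ic50_loglinear(doses, activity):
    """Interpolate, on a log-dose scale, the concentration giving 50% activity.
    Returns None if no adjacent pair of doses brackets the 50% level."""
    points = list(zip(doses, activity))
    for (d1, a1), (d2, a2) in zip(points, points[1:]):
        if a1 >= 50.0 >= a2:
            frac = (a1 - 50.0) / (a1 - a2)
            return 10 ** (math.log10(d1) + frac * (math.log10(d2) - math.log10(d1)))
    return None

ic50 = ic50_loglinear(doses, activity)  # between 1 and 10 µM for this toy data
```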
Chemistry, Issue 78, Biochemistry, Chemical Engineering, Molecular Biology, Genetics, Bioengineering, Biomedical Engineering, Chemical Biology, Biocompatible Materials, Siloxanes, Enzymes, Immobilized, chemical analysis techniques, chemistry (general), materials (general), spectroscopic analysis (chemistry), polymer matrix composites, testing of materials (composite materials), Sol-gel, microarray, high-throughput screening, acetylcholinesterase, kinase, drug discovery, assay
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
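The ~10-30 nm precision quoted above can be estimated from acquisition parameters with the widely used Thompson-Larson-Webb expression, σ² = s²/N + a²/(12N) + 8πs⁴b²/(a²N²), where s is the PSF standard deviation, N the number of detected photons, a the pixel size, and b the background noise. The numbers in the sketch are illustrative, not values from this article:

```python
import math

def localization_precision_nm(psf_sd_nm, photons, pixel_nm, bg_sd_photons):
    """Thompson-Larson-Webb estimate of single-molecule localization precision:
    sigma^2 = s^2/N + a^2/(12 N) + 8 pi s^4 b^2 / (a^2 N^2)."""
    s, n, a, b = psf_sd_nm, photons, pixel_nm, bg_sd_photons
    var = s ** 2 / n + a ** 2 / (12.0 * n) + 8.0 * math.pi * s ** 4 * b ** 2 / (a ** 2 * n ** 2)
    return math.sqrt(var)

# e.g. ~100 nm PSF standard deviation, 200 detected photons, 100 nm pixels,
# background standard deviation of 3 photons/pixel (illustrative values).
sigma = localization_precision_nm(100.0, 200.0, 100.0, 3.0)  # ~10 nm
```

The first term (photon shot noise) dominates for bright emitters; the background term explains why precision degrades quickly for dim molecules.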
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Psychophysiological Stress Assessment Using Biofeedback
Institutions: Cambridge Health Alliance, Harvard Medical School.
In the last half century, research in biofeedback has shown the extent to which the human mind can influence the functioning of the autonomic nervous system, previously thought to be outside of conscious control. By letting people observe signals from their own bodies, biofeedback enables them to develop greater awareness of their physiological and psychological reactions, such as stress, and to learn to modify these reactions. Biofeedback practitioners can facilitate this process by assessing people's reactions to mildly stressful events and formulating a biofeedback-based treatment plan. During stress assessment the practitioner first records a baseline for physiological readings, and then presents the client with several mild stressors, such as a cognitive, physical, and emotional stressor. A variety of stressors is presented in order to determine a person's stimulus-response specificity, or differences in each person's reaction to qualitatively different stimuli. This video will demonstrate the process of psychophysiological stress assessment using biofeedback and present general guidelines for treatment planning.
Neuroscience, Issue 29, Stress, biofeedback, psychophysiological, assessment
Collecting Variable-concentration Isothermal Titration Calorimetry Datasets in Order to Determine Binding Mechanisms
Institutions: McGill University.
Isothermal titration calorimetry (ITC) is commonly used to determine the thermodynamic parameters associated with the binding of a ligand to a host macromolecule. ITC has some advantages over common spectroscopic approaches for studying host/ligand interactions. For example, the heat released or absorbed when the two components interact is directly measured and does not require any exogenous reporters. Thus the binding enthalpy and the association constant (Ka) are directly obtained from ITC data, and can be used to compute the entropic contribution. Moreover, the shape of the isotherm is dependent on the c-value and the mechanistic model involved. The c-value is defined as c = n[P]tKa, where [P]t is the protein concentration, and n is the number of ligand binding sites within the host. In many cases, multiple binding sites for a given ligand are non-equivalent, and ITC allows the characterization of the thermodynamic binding parameters for each individual binding site. This, however, requires that the correct binding model be used. This choice can be problematic if different models can fit the same experimental data. We have previously shown that this problem can be circumvented by performing experiments at several c-values. The multiple isotherms obtained at different c-values are fit simultaneously to separate models. The correct model is then identified based on the goodness of fit across the entire variable-c dataset. This process is applied here to the aminoglycoside resistance-causing enzyme aminoglycoside N-6'-acetyltransferase-Ii (AAC(6')-Ii). Although our methodology is applicable to any system, the necessity of this strategy is best demonstrated with a macromolecule-ligand system showing allostery or cooperativity, and when different binding models provide essentially identical fits to the same data. To our knowledge, no such systems are commercially available.
AAC(6')-Ii is a homodimer containing two active sites, with cooperativity between the two subunits. However, ITC data obtained at a single c-value can be fit equally well to at least two different models: a two-sets-of-sites independent model and a two-site sequential (cooperative) model. By varying the c-value as explained above, it was established that the correct binding model for AAC(6')-Ii is a two-site sequential binding model. Herein, we describe the steps that must be taken when performing ITC experiments in order to obtain datasets suitable for variable-c analyses.
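Because the isotherm shape hinges on c = n[P]tKa, it is convenient to compute the protein concentrations needed to span a range of c-values before any titrations are run. A minimal sketch (the Ka value is an illustrative assumption; the two-site stoichiometry mirrors a homodimeric enzyme such as AAC(6')-Ii):

```python
def c_value(n_sites, protein_conc_M, Ka_per_M):
    """Wiseman parameter: c = n * [P]t * Ka."""
    return n_sites * protein_conc_M * Ka_per_M

def protein_conc_for_c(target_c, n_sites, Ka_per_M):
    """Protein concentration (M) needed in the cell to reach a target c-value."""
    return target_c / (n_sites * Ka_per_M)

Ka = 1e6  # hypothetical association constant, M^-1
for c in (1, 10, 100):  # span low, intermediate, and high c regimes
    print(f"c = {c}: [P]t = {protein_conc_for_c(c, 2, Ka) * 1e6:.1f} uM")
```

Collecting isotherms across such a spread of c-values is what gives the global fit the power to discriminate between otherwise equally plausible binding models.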
Biochemistry, Issue 50, ITC, global fitting, cooperativity, binding model, ligand
Aseptic Laboratory Techniques: Plating Methods
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification, safety precautions, and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
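The enumeration objective above reduces to simple arithmetic once colonies are counted on a countable plate (conventionally 30-300 colonies). A minimal sketch (the count, dilution, and plated volume below are illustrative, not values from this protocol):

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    """Viable titer: CFU/mL = colonies / (dilution plated * volume plated)."""
    return colonies / (dilution_factor * volume_plated_ml)

# 127 colonies on a plate spread with 0.1 mL of a 10^-6 dilution:
print(f"{cfu_per_ml(127, 1e-6, 0.1):.2e} CFU/mL")  # 1.27e+09 CFU/mL
```

The same formula applies to phage titers from soft agar overlays, with plaques counted in place of colonies (PFU/mL rather than CFU/mL).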
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, from which the enzyme takes the name Taq.
PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, PCR has pitfalls that can complicate the reaction, producing spurious results. When PCR fails, it can yield many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels; sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
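Primer melting temperature (Tm) is one of the parameters most often adjusted when optimizing annealing conditions. Two common first-pass estimates can be computed directly (a minimal sketch; the example primer is an arbitrary sequence, and nearest-neighbor thermodynamic methods give more accurate values):

```python
def tm_wallace(seq):
    """Wallace rule: Tm ≈ 2(A+T) + 4(G+C); reasonable for primers under ~14 nt."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def tm_long(seq):
    """Common GC-content approximation for primers of ~14 nt and longer."""
    seq = seq.upper()
    gc = seq.count("G") + seq.count("C")
    return 64.9 + 41 * (gc - 16.4) / len(seq)

primer = "AGCGGATAACAATTTCACACAGGA"  # arbitrary 24-nt example sequence
print(round(tm_long(primer), 1))  # ≈54.0 °C
```

In practice, annealing temperatures are typically set a few degrees below the lower Tm of the primer pair, then refined empirically (e.g., by gradient PCR).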
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
High-throughput Screening for Small-molecule Modulators of Inward Rectifier Potassium Channels
Institutions: Vanderbilt University School of Medicine, Vanderbilt University School of Medicine, Vanderbilt University School of Medicine.
Specific members of the inward rectifier potassium (Kir) channel family are postulated drug targets for a variety of disorders, including hypertension, atrial fibrillation, and pain1,2. For the most part, however, progress toward understanding their therapeutic potential or even basic physiological functions has been slowed by the lack of good pharmacological tools. Indeed, the molecular pharmacology of the inward rectifier family has lagged far behind that of the S4 superfamily of voltage-gated potassium (Kv) channels, for which a number of nanomolar-affinity and highly selective peptide toxin modulators have been discovered3. The bee venom toxin tertiapin and its derivatives are potent inhibitors of Kir1.1 and Kir3 channels4,5, but peptides are of limited use therapeutically as well as experimentally due to their antigenic properties and poor bioavailability, metabolic stability and tissue penetrance. The development of potent and selective small-molecule probes with improved pharmacological properties will be a key to fully understanding the physiology and therapeutic potential of Kir channels.
The Molecular Libraries Probes Production Center Network (MLPCN) supported by the National Institutes of Health (NIH) Common Fund has created opportunities for academic scientists to initiate probe discovery campaigns for molecular targets and signaling pathways in need of better pharmacology6. The MLPCN provides researchers access to industry-scale screening centers and medicinal chemistry and informatics support to develop small-molecule probes to elucidate the function of genes and gene networks. The critical step in gaining entry to the MLPCN is the development of a robust target- or pathway-specific assay that is amenable for high-throughput screening (HTS).
Here, we describe how to develop a fluorescence-based thallium (Tl+) flux assay of Kir channel function for high-throughput compound screening7,8,9,10. The assay is based on the permeability of the K+ channel pore to the K+ congener Tl+. A commercially available fluorescent Tl+ reporter dye is used to detect transmembrane flux of Tl+ through the pore. There are at least three commercially available dyes that are suitable for Tl+ flux assays: BTC, FluoZin-2, and FluxOR7,8. This protocol describes assay development using FluoZin-2. Although originally developed and marketed as a zinc indicator, FluoZin-2 exhibits a robust and dose-dependent increase in fluorescence emission upon Tl+ binding. We began working with FluoZin-2 before FluxOR was available7,8 and have continued to do so9,10. However, the steps in assay development are essentially identical for all three dyes, and users should determine which dye is most appropriate for their specific needs. We also discuss the assay's performance benchmarks that must be reached to be considered for entry to the MLPCN. Since Tl+ readily permeates most K+ channels, the assay should be adaptable to most K+ channels.
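A standard performance benchmark for HTS readiness is the Z'-factor of Zhang et al. (1999), computed from positive- and negative-control wells; values above 0.5 are generally considered screen-ready. A minimal sketch (the control readings below are illustrative, not data from this assay):

```python
import statistics

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg| (Zhang et al. 1999)."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

# Hypothetical per-well fluorescence slopes (arbitrary units):
pos = [98, 102, 101, 99, 100, 103]   # channel fully active (e.g., vehicle)
neg = [10, 12, 9, 11, 10, 12]        # channel blocked or untransfected cells
print(round(z_prime(pos, neg), 2))   # > 0.5 indicates a robust assay window
```

Because Z' penalizes both a small signal window and noisy controls, it captures in one number most of what dye choice, cell density, and Tl+ stimulus optimization are trying to improve.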
Biochemistry, Issue 71, Molecular Biology, Chemistry, Cellular Biology, Chemical Biology, Pharmacology, Molecular Pharmacology, Potassium channels, drug discovery, drug screening, high throughput, small molecules, fluorescence, thallium flux, checkerboard analysis, DMSO, cell lines, screen, assay, assay development
Cerenkov Luminescence Imaging (CLI) for Cancer Therapy Monitoring
Institutions: Stanford University .
In molecular imaging, positron emission tomography (PET) and optical imaging (OI) are two of the most important and thus most widely used modalities1-3. PET is characterized by its excellent sensitivity and quantification ability, while OI is notable for non-radiation, relatively low cost, short scanning time, high throughput, and wide availability to basic researchers. However, both modalities have their shortcomings as well. PET suffers from poor spatial resolution and high cost, while OI is mostly limited to preclinical applications because of its limited tissue penetration along with prominent scattering of optical signals through the thickness of living tissues.
Recently a bridge between PET and OI has emerged with the discovery of Cerenkov Luminescence Imaging (CLI)4-6. CLI is a new imaging modality that harnesses Cerenkov Radiation (CR) to image radionuclides with OI instruments. Russian Nobel laureate Pavel Alekseyevich Cerenkov and his colleagues originally discovered CR in 1934. It is a form of electromagnetic radiation emitted when a charged particle travels at a superluminal speed in a dielectric medium7,8. The charged particle, whether positron or electron, perturbs the electromagnetic field of the medium by displacing the electrons in its atoms. After the disruption passes, photons are emitted as the displaced electrons return to the ground state. For instance, one 18F decay was estimated to produce an average of 3 photons in water5.
Since its emergence, CLI has been investigated for its use in a variety of preclinical applications including in vivo tumor imaging, reporter gene imaging, radiotracer development, and multimodality imaging, among others4,5,9,10,11. The most important reason why CLI has enjoyed much success so far is that this new technology takes advantage of the low cost and wide availability of OI to image radionuclides, which used to be imaged only by more expensive and less available nuclear imaging modalities such as PET.
Here, we present the method of using CLI to monitor cancer drug therapy. Our group has recently investigated this new application and validated its feasibility by a proof-of-concept study12. We demonstrated that CLI and PET exhibited excellent correlations across different tumor xenografts and imaging probes. This is consistent with the overarching principle of CR that CLI essentially visualizes the same radionuclides as PET. We selected Bevacizumab (Avastin; Genentech/Roche) as our therapeutic agent because it is a well-known angiogenesis inhibitor13,14. Maturation of this technology in the near future can be envisioned to have a significant impact on preclinical drug development, screening, as well as therapy monitoring of patients receiving treatments.
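The superluminal condition described above (particle speed exceeding c/n in the medium) implies a minimum kinetic energy below which a charged particle produces no Cerenkov photons. This threshold can be computed from the refractive index alone (a minimal sketch using standard relativistic kinematics; it explains why only radionuclides emitting sufficiently energetic positrons or electrons are suitable for CLI):

```python
import math

ME_KEV = 511.0  # electron/positron rest energy, keV

def cerenkov_threshold_kev(n_refractive):
    """Minimum kinetic energy (keV) for an e+/e- to exceed c/n in a medium."""
    beta = 1.0 / n_refractive                 # threshold speed as a fraction of c
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)  # Lorentz factor at threshold
    return ME_KEV * (gamma - 1.0)             # kinetic energy = (gamma - 1) m c^2

print(round(cerenkov_threshold_kev(1.33)))  # water (n = 1.33): ≈264 keV
```

Positrons from 18F (mean energy ~250 keV, endpoint ~634 keV) spend only part of their track above this threshold, consistent with the low photon yield per decay quoted above.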
Cancer Biology, Issue 69, Medicine, Molecular Biology, Cerenkov Luminescence Imaging, CLI, cancer therapy monitoring, optical imaging, PET, radionuclides, Avastin, imaging
Mass Cytometry: Protocol for Daily Tuning and Running Cell Samples on a CyTOF Mass Cytometer
Institutions: Stanford University .
In recent years, the rapid analysis of single cells has commonly been performed using flow cytometry and fluorescently-labeled antibodies. However, the issue of spectral overlap of fluorophore emissions has limited the number of simultaneous probes. In contrast, the new CyTOF mass cytometer by DVS Sciences couples a liquid single-cell introduction system to an ICP-MS.1 Rather than fluorophores, chelating polymers containing highly-enriched metal isotopes are coupled to antibodies or other specific probes.2-5 Because of the metal purity and mass resolution of the mass cytometer, there is no "spectral overlap" from neighboring isotopes, and therefore no need for compensation matrices. Additionally, due to the use of lanthanide metals, there is no biological background and therefore no equivalent of autofluorescence. With a mass window spanning atomic mass 103-203, theoretically up to 100 labels could be distinguished simultaneously. Currently, more than 35 channels are available using the chelating reagents available from DVS Sciences, allowing unprecedented dissection of the immunological profile of samples.6-7 Disadvantages to mass cytometry include the strict requirement for a separate metal isotope per probe (no equivalent of forward or side scatter), and the fact that it is a destructive technique (no possibility of sorting recovery). The current configuration of the mass cytometer also has a cell transmission rate of only ~25%, thus requiring a higher input number of cells.
Optimal daily performance of the mass cytometer requires several steps. The basic goal of the optimization is to maximize the measured signal intensity of the desired metal isotopes (M) while minimizing the formation of oxides (M+16) that decrease the M signal intensity and interfere with any desired signal at M+16. The first step is to warm up the machine so that a hot, stable ICP plasma is established. Second, the settings for current and make-up gas flow rate must be optimized on a daily basis. During sample collection, the maximum cell event rate is limited by detector efficiency and processing speed to 1000 cells/sec. However, depending on the sample quality, a slower cell event rate (300-500 cells/sec) is usually desirable to allow better resolution between cell events and thus maximize intact singlets over doublets and debris. Finally, adequate cleaning of the machine at the end of the day helps minimize background signal due to free metal.
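The tuning goal above, maximizing M while minimizing M+16, is usually tracked as an oxide ratio recomputed after each adjustment of current and make-up gas flow. A minimal sketch of the bookkeeping (the intensities and the <3% target are illustrative assumptions; consult the instrument documentation for the actual acceptance criterion):

```python
def oxide_ratio(i_metal, i_oxide):
    """Fraction of a tuning isotope detected as MO+ (mass M+16) rather than M+."""
    return i_oxide / i_metal

# Hypothetical tuning-solution intensities at M and M+16 (counts):
m, m16 = 250000.0, 6000.0
ratio = oxide_ratio(m, m16)
print(f"oxide ratio: {ratio:.1%}")  # aim for a low value before running samples
assert ratio < 0.03, "re-optimize make-up gas flow rate and current"
```

Logging this ratio alongside the raw M intensity each morning makes day-to-day instrument drift easy to spot.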
Bioengineering, Issue 69, CyTOF, mass cytometry, PBMCs, ICP-MS, multiparametric
Test Samples for Optimizing STORM Super-Resolution Microscopy
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way than normal, by building up an image molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of between 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
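Lateral drift of the kind mentioned above is commonly corrected in post-processing by tracking fiducial markers and subtracting their apparent motion from every localization. A minimal sketch of the idea (the coordinates are illustrative; rainSTORM and similar packages implement more sophisticated, interpolated versions):

```python
def drift_correct(localizations, fiducial_track):
    """Subtract the fiducial's displacement (relative to frame 0) per frame.

    localizations  : list of (frame, x, y) molecule positions in nm
    fiducial_track : dict mapping frame -> (x, y) fiducial position in nm
    """
    x0, y0 = fiducial_track[0]
    corrected = []
    for frame, x, y in localizations:
        fx, fy = fiducial_track[frame]
        corrected.append((frame, x - (fx - x0), y - (fy - y0)))
    return corrected

# A bead that has drifted +20 nm in x by frame 100 shifts every localization with it:
track = {0: (500.0, 500.0), 100: (520.0, 500.0)}
print(drift_correct([(100, 1020.0, 800.0)], track))  # x restored to 1000.0
```

Uncorrected drift blurs structures along the drift direction and is one of the first artifacts to rule out when measured resolution falls short of the 30-50 nm expected here.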
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization.
Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods and integrates water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226, and a user-specified set of conservation practices and their costs, to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
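The tradeoff-frontier idea described above can be illustrated in miniature. The sketch below substitutes plain random sampling for SPEA2's selection/recombination/mutation loop and a toy linear cost/abatement model for SWAT, purely to show how a nondominated (Pareto) set of cost-pollution outcomes is extracted from a population of candidate practice placements (all numbers are invented):

```python
import random

random.seed(1)
N_FIELDS = 12
cost = [random.uniform(1, 5) for _ in range(N_FIELDS)]     # practice cost per field
abate = [random.uniform(0.5, 3) for _ in range(N_FIELDS)]  # pollution reduction per field
BASELINE = 30.0                                            # watershed load with no practices

def evaluate(alloc):
    """Two objectives to minimize: (total cost, remaining pollution)."""
    c = sum(ci for ci, a in zip(cost, alloc) if a)
    p = BASELINE - sum(ri for ri, a in zip(abate, alloc) if a)
    return c, p

def nondominated(points):
    """Keep points not dominated (<= in both objectives, != overall) by any other."""
    front = [p for p in points
             if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)]
    return sorted(set(front))

# Random search stands in for the evolutionary loop here:
pop = [[random.random() < 0.5 for _ in range(N_FIELDS)] for _ in range(400)]
frontier = nondominated([evaluate(a) for a in pop])
print(len(frontier), "tradeoff points; cheapest:", frontier[0])
```

In the real program each `evaluate` call is a full SWAT simulation, which is exactly why population-based methods that reuse and recombine good allocations are preferred over blind sampling.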
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Institutions: University of British Columbia, University of British Columbia, University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores the power of analysis to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Interview: Protein Folding and Studies of Neurodegenerative Diseases
Institutions: MIT - Massachusetts Institute of Technology.
In this interview, Dr. Lindquist describes relationships between protein folding, prion diseases and neurodegenerative disorders. The problem of protein folding is at the core of modern biology. In addition to their traditional biochemical functions, proteins can mediate transfer of biological information and therefore can be considered a genetic material. This recently discovered function of proteins has important implications for studies of human disorders. Dr. Lindquist also describes current experimental approaches to investigating the mechanism of neurodegenerative diseases based on genetic studies in model organisms.
Neuroscience, issue 17, protein folding, brain, neuron, prion, neurodegenerative disease, yeast, screen, Translational Research