Genome sequencing projects have deciphered millions of protein sequences, whose structures and functions must be determined to understand their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority, which remain experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility for each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. Every prediction is tagged with a confidence score that estimates its accuracy in the absence of experimental data. To accommodate special requests from end users, the server provides channels to accept user-specified inter-residue distances and contact maps that interactively steer the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. Users can supply such structural information, based on experimental evidence or biological insight, to improve the quality of I-TASSER predictions. The server was ranked as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. More than 20,000 registered scientists from over 100 countries currently use the on-line I-TASSER server.
Drug-induced Sensitization of Adenylyl Cyclase: Assay Streamlining and Miniaturization for Small Molecule and siRNA Screening Applications
Institutions: Purdue University, Eli Lilly and Company.
Sensitization of adenylyl cyclase (AC) signaling has been implicated in a variety of neuropsychiatric and neurologic disorders including substance abuse and Parkinson's disease. Acute activation of Gαi/o-linked receptors inhibits AC activity, whereas persistent activation of these receptors results in heterologous sensitization of AC and increased levels of intracellular cAMP. Previous studies have demonstrated that this enhancement of AC responsiveness is observed both in vitro and in vivo following the chronic activation of several types of Gαi/o-linked receptors, including D2 dopamine and μ opioid receptors. Although heterologous sensitization of AC was first reported four decades ago, the mechanism(s) that underlie this phenomenon remain largely unknown. The lack of mechanistic data presumably reflects the complexity of this adaptive response and suggests that unbiased approaches could help identify the molecular pathways involved in heterologous sensitization of AC. Previous studies have implicated kinase and Gβγ signaling as overlapping components that regulate the heterologous sensitization of AC. To identify unique and additional overlapping targets associated with sensitization of AC, the development and validation of a scalable cAMP sensitization assay is required for greater throughput. Previous approaches to studying sensitization are generally cumbersome, involving continuous cell culture maintenance as well as a complex methodology for measuring cAMP accumulation that requires multiple wash steps. Thus, the development of a robust cell-based assay that can be used for high throughput screening (HTS) in a 384-well format would facilitate future studies. Using two D2 dopamine receptor cellular models, we have converted our 48-well sensitization assay (>20 steps, 4-5 days) to a five-step, single-day assay in 384-well format. This new format is amenable to small molecule screening, and we demonstrate that this assay design can also be readily used for reverse transfection of siRNA in anticipation of targeted siRNA library screening.
Bioengineering, Issue 83, adenylyl cyclase, cAMP, heterologous sensitization, superactivation, D2 dopamine, μ opioid, siRNA
High-throughput Screening for Broad-spectrum Chemical Inhibitors of RNA Viruses
Institutions: Institut Pasteur, CNRS UMR3569, Institut Pasteur, CNRS UMR3523, Institut Pasteur.
RNA viruses are responsible for major human diseases such as flu, bronchitis, dengue, hepatitis C, and measles. They also represent an emerging threat because of increased worldwide exchanges and human populations penetrating ever deeper into natural ecosystems. A good example of such an emerging situation is the chikungunya virus epidemic of 2005-2006 in the Indian Ocean. Recent progress in our understanding of the cellular pathways controlling viral replication suggests that compounds targeting host cell functions, rather than the virus itself, could inhibit a large panel of RNA viruses. Some broad-spectrum antiviral compounds have been identified with host target-oriented assays. However, measuring the inhibition of viral replication in cell cultures, using the reduction of cytopathic effects as a readout, still represents a paramount screening strategy. Such functional screens have been greatly improved by the development of recombinant viruses expressing reporter enzymes capable of bioluminescence, such as luciferase. In the present report, we detail a high-throughput screening pipeline, which combines recombinant measles and chikungunya viruses with cellular viability assays, to identify compounds with a broad-spectrum antiviral profile.
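Raw luminescence readouts from such reporter-virus screens are conventionally normalized to on-plate controls before hits are called; a minimal sketch of the standard percent-inhibition calculation (variable names hypothetical, not the authors' analysis code):

```python
def percent_inhibition(signal, neg_ctrl, pos_ctrl):
    """Convert a raw luminescence reading into percent inhibition.

    neg_ctrl: mean signal of infected, untreated wells (0% inhibition)
    pos_ctrl: mean signal of uninfected or fully inhibited wells (100%)
    """
    return 100.0 * (neg_ctrl - signal) / (neg_ctrl - pos_ctrl)
```

A well reading halfway between the two controls thus scores 50% inhibition; viability assays run in parallel distinguish genuine antivirals from cytotoxic compounds.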
Immunology, Issue 87, Viral infections, high-throughput screening assays, broad-spectrum antivirals, chikungunya virus, measles virus, luciferase reporter, chemical libraries
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against an HIV-1 drug resistance testing method approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include its lack of automation and its failure to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
A Step Beyond BRET: Fluorescence by Unbound Excitation from Luminescence (FUEL)
Institutions: Institut Pasteur, Stanford School of Medicine, Institut d'Imagerie Biomédicale, Vanderbilt School of Medicine, The Walter & Eliza Hall Institute of Medical Research.
Fluorescence by Unbound Excitation from Luminescence (FUEL) is a radiative excitation-emission process that produces increased signal and contrast enhancement in vitro and in vivo. FUEL shares many of the same underlying principles as Bioluminescence Resonance Energy Transfer (BRET), yet differs greatly in the acceptable working distance between the luminescent source and the fluorescent entity. While BRET is effectively limited to a maximum of twice the Förster radius, commonly less than 14 nm, FUEL can occur at distances on the order of micrometers or even centimeters in the absence of an optical absorber. Here we expand upon the foundation and applicability of FUEL by reviewing the relevant principles behind the phenomenon and demonstrating its compatibility with a wide variety of fluorophores and fluorescent nanoparticles. Further, the utility of antibody-targeted FUEL is explored. The examples shown here provide evidence that FUEL can be utilized for applications where BRET is not possible, filling the spatial void that exists between BRET and traditional whole animal imaging.
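The sharp distance cutoff that separates BRET from FUEL follows from the standard Förster relation, in which the resonance transfer efficiency falls off with the sixth power of the donor-acceptor distance r:

```latex
E = \frac{R_0^6}{R_0^6 + r^6} = \frac{1}{1 + \left(r/R_0\right)^6}
```

At r = 2R_0 the efficiency has already dropped to 1/65 (about 1.5%), which is why roughly two Förster radii (commonly <14 nm) is the practical BRET limit, whereas the radiative FUEL mechanism carries no such constraint.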
Bioengineering, Issue 87, Biochemical Phenomena, Biochemical Processes, Energy Transfer, Fluorescence Resonance Energy Transfer (FRET), FUEL, BRET, CRET, Förster, bioluminescence, In vivo
MISSION esiRNA for RNAi Screening in Mammalian Cells
Institutions: Max Planck Institute of Molecular Cell Biology and Genetics.
RNA interference (RNAi) is a basic cellular mechanism for the control of gene expression. RNAi is induced by short double-stranded RNAs, also known as small interfering RNAs (siRNAs). These short double-stranded RNAs originate from longer double-stranded precursors through the activity of Dicer, a protein of the RNase III family of endonucleases. The resulting fragments are components of the RNA-induced silencing complex (RISC), directing it to the cognate target mRNA. RISC cleaves the target mRNA, thereby reducing the expression of the encoded protein1,2,3. RNAi has become a powerful and widely used experimental method for loss-of-function studies in mammalian cells.
Currently, two main methods are available for the production of small interfering RNAs. One involves chemical synthesis, whereas the alternative employs endonucleolytic cleavage of target-specific long double-stranded RNAs by RNase III in vitro. This produces a diverse pool of siRNA-like oligonucleotides, also known as endoribonuclease-prepared siRNA (esiRNA). A comparison of the efficacy of chemically synthesized siRNAs and esiRNAs shows that both triggers are potent in target-gene silencing. Differences can, however, be seen in specificity: many individual chemically synthesized siRNAs produce prominent off-target effects, whereas the complex mixture inherent in esiRNAs leads to a more specific knockdown10.
In this study, we present the design of genome-scale MISSION esiRNA libraries and their utilization for RNAi screening, exemplified by a DNA-content screen for the identification of genes involved in cell cycle progression. We show how to optimize the transfection protocol and the assay for screening in high throughput. We also demonstrate how large datasets can be evaluated statistically and present methods to validate primary hits. Finally, we give potential starting points for further functional characterization of validated hits.
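For the statistical evaluation of primary screening data mentioned above, a common approach (shown here as a generic sketch, not the authors' exact analysis) is the robust z-score, which uses the median and the median absolute deviation (MAD) so that strong hits do not distort the scale they are judged against:

```python
import statistics

def robust_z(values):
    """Robust z-scores: (x - median) / (1.4826 * MAD).

    The 1.4826 factor makes the MAD a consistent estimator of the
    standard deviation for normally distributed data.
    """
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    scale = 1.4826 * mad
    return [(v - med) / scale for v in values]

def hits(values, cutoff=3.0):
    """Indices of wells whose |robust z| exceeds the hit cutoff."""
    return [i for i, z in enumerate(robust_z(values)) if abs(z) >= cutoff]
```

For example, `hits([10, 11, 9, 10, 12, 8, 10, 50])` flags only the last well; a cutoff of 2-3 robust standard deviations is a typical starting point for hit calling.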
Cellular Biology, Issue 39, MISSION, esiRNA, RNAi, cell cycle, high throughput screening
Comprehensive Analysis of Transcription Dynamics from Brain Samples Following Behavioral Experience
Institutions: The Hebrew University of Jerusalem.
The encoding of experiences in the brain and the consolidation of long-term memories depend on gene transcription. Identifying the function of specific genes in encoding experience is one of the main objectives of molecular neuroscience. Furthermore, the functional association of defined genes with specific behaviors has implications for understanding the basis of neuropsychiatric disorders. Induction of robust transcription programs has been observed in the brains of mice following various behavioral manipulations. While some genetic elements are utilized recurrently following different behavioral manipulations and in different brain nuclei, transcriptional programs are overall unique to the inducing stimuli and the structure in which they are studied1,2.
In this publication, a protocol is described for robust and comprehensive transcriptional profiling from brain nuclei of mice in response to behavioral manipulation. The protocol is demonstrated in the context of analyzing gene expression dynamics in the nucleus accumbens following acute cocaine experience. Subsequent to a defined in vivo experience, the target neural tissue is dissected; this is followed by RNA purification, reverse transcription, and the use of microfluidic arrays for comprehensive qPCR analysis of multiple target genes. This protocol is geared toward comprehensive analysis (addressing 50-500 genes) of limiting quantities of starting material, such as small brain samples or even single cells.
The protocol is most advantageous for parallel analysis of multiple samples (e.g. single cells, or dynamic analysis following pharmaceutical, viral or behavioral perturbations). However, the protocol could also serve for the characterization and quality assurance of samples prior to whole-genome studies by microarrays or RNAseq, as well as validation of data obtained from whole-genome studies.
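Relative expression values from the qPCR step are conventionally derived with the 2^-ΔΔCt method, which assumes near-100% amplification efficiency for both target and reference genes; a minimal sketch with hypothetical Ct values:

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """2^-ΔΔCt fold change of a target gene, normalized to a
    reference (housekeeping) gene and a control condition."""
    d_treated = ct_target_treated - ct_ref_treated    # ΔCt, treated
    d_control = ct_target_control - ct_ref_control    # ΔCt, control
    return 2.0 ** -(d_treated - d_control)            # 2^-ΔΔCt

# e.g. the target Ct drops from 24 to 22 cycles after treatment while
# the reference gene stays at 18: a 4-fold induction
fold = ddct_fold_change(22, 18, 24, 18)
```

Each PCR cycle represents a doubling, so a ΔΔCt of -2 corresponds to a 2² = 4-fold increase in transcript abundance.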
Behavior, Issue 90, Brain, behavior, RNA, transcription, nucleus accumbens, cocaine, high-throughput qPCR, experience-dependent plasticity, gene regulatory networks, microdissection
Transcranial Direct Current Stimulation and Simultaneous Functional Magnetic Resonance Imaging
Institutions: University of Queensland, Charité Universitätsmedizin.
Transcranial direct current stimulation (tDCS) is a noninvasive brain stimulation technique that uses weak electrical currents administered to the scalp to manipulate cortical excitability and, consequently, behavior and brain function. In the last decade, numerous studies have addressed short-term and long-term effects of tDCS on different measures of behavioral performance during motor and cognitive tasks, both in healthy individuals and in a number of different patient populations. So far, however, little is known about the neural underpinnings of tDCS action in humans with regard to large-scale brain networks. This issue can be addressed by combining tDCS with functional brain imaging techniques like functional magnetic resonance imaging (fMRI) or electroencephalography (EEG).
In particular, fMRI is the most widely used brain imaging technique to investigate the neural mechanisms underlying cognition and motor functions. Application of tDCS during fMRI allows analysis of the neural mechanisms underlying behavioral tDCS effects with high spatial resolution across the entire brain. Recent studies using this technique identified stimulation induced changes in task-related functional brain activity at the stimulation site and also in more distant brain regions, which were associated with behavioral improvement. In addition, tDCS administered during resting-state fMRI allowed identification of widespread changes in whole brain functional connectivity.
Future studies using this combined protocol should yield new insights into the mechanisms of tDCS action in health and disease and new options for more targeted application of tDCS in research and clinical settings. The present manuscript describes this novel technique in a step-by-step fashion, with a focus on technical aspects of tDCS administered during fMRI.
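The resting-state functional-connectivity changes described above are typically quantified by correlating regional BOLD time series; a schematic sketch with synthetic data (illustrating only the correlation step, not a full fMRI preprocessing pipeline):

```python
import numpy as np

def connectivity_matrix(timeseries):
    """Pearson correlation between every pair of regional time series.

    timeseries: array of shape (n_regions, n_timepoints)
    Returns an (n_regions, n_regions) symmetric correlation matrix,
    whose off-diagonal entries index functional connectivity.
    """
    return np.corrcoef(timeseries)

# three hypothetical regions: two coupled, one independent
t = np.linspace(0.0, 10.0, 200)
regions = np.vstack([
    np.sin(t),            # region A
    np.sin(t) * 2.0,      # region B, perfectly coupled to A
    np.cos(3.0 * t),      # region C, largely independent
])
conn = connectivity_matrix(regions)
```

Comparing such matrices between stimulation and sham sessions is one common way to test whether tDCS altered whole-brain connectivity.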
Behavior, Issue 86, noninvasive brain stimulation, transcranial direct current stimulation (tDCS), anodal stimulation (atDCS), cathodal stimulation (ctDCS), neuromodulation, task-related fMRI, resting-state fMRI, functional magnetic resonance imaging (fMRI), electroencephalography (EEG), inferior frontal gyrus (IFG)
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) are significantly different and much less understood. Quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is therefore an essential prerequisite for a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes, with high throughput and open-access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
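The binarization and component-analysis steps of such a workflow can be illustrated schematically. The sketch below uses a crude global threshold and connected-component labeling as a stand-in for the optimized confocal/superresolution image-processing workflow described in the protocol (which additionally skeletonizes the binary mask):

```python
import numpy as np
from scipy import ndimage

def tats_components(img):
    """Binarize a fluorescence image with a simple global threshold and
    return (n_components, pixel count per component) as a crude proxy
    for tubule-network content."""
    binary = img > img.mean() + img.std()        # simple global threshold
    labels, n = ndimage.label(binary)            # 4-connected components in 2D
    sizes = np.bincount(labels.ravel())[1:]      # drop background (label 0)
    return n, sizes.tolist()
```

Per-component sizes, orientations, and skeleton lengths extracted from such labeled masks are the kinds of quantities the protocol's open-access tools report for TATS networks.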
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism to investigate the mechanisms that underlie developmental biology and to study human disease pathology, due to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are particularly appealing for chemical genetics because of their ability to produce large clutches of transparent, externally fertilized embryos. Furthermore, zebrafish embryos can be easily drug treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, using chemical genetics and WISH, the zebrafish becomes a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are not accessible or remain cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires basic resources and can be accomplished by a single individual or small team in an efficient period of time. Thus, this protocol provides a feasible strategy that can be implemented by research groups to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways with medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
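The software-guided setup of experiment combinations mentioned above can be illustrated with a simple full-factorial enumeration (factor names and levels below are hypothetical; real DoE software would typically reduce this to an optimal fractional design and support step-wise augmentation):

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels.

    factors: dict mapping factor name -> list of levels
    Returns a list of dicts, one per experimental run.
    """
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

runs = full_factorial({
    "promoter": ["35S", "nos"],        # regulatory element in the construct
    "leaf_age": ["young", "old"],      # plant growth/development parameter
    "temperature_C": [22, 25, 28],     # incubation condition
})
# 2 x 2 x 3 = 12 runs in the full design
```

Fractional designs sample a structured subset of such a grid, which is what makes DoE tractable when the number of factors grows.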
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy
Institutions: The Molecular Foundry.
Structural determination is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass in this range1,2, a robust and high-throughput method with nanometer-resolution capability is needed. Negative-staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach that has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts, especially with lipoproteins, which usually present rouleaux artifacts. Using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol3. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution, high-contrast images are even favorable for 3D reconstruction of an individual protein (a single object, with no averaging), such as a 160 kDa antibody, through electron tomography4,5. Moreover, OpNS can serve as a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples6. Considering that cryo-EM rarely succeeds in imaging proteins smaller than 200 kDa, and that no cryo-EM study to date has involved screening over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. We hope the OpNS protocol presented here will be a useful tool to push the boundaries of EM and accelerate EM studies of small protein structure, dynamics and mechanisms.
Environmental Sciences, Issue 90, small and asymmetric protein structure, electron microscopy, optimized negative staining
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research question(s). However, many researchers are new to this topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. We envision this article serving as a practical training tool for researchers embarking on phylogenetic studies, and also as an educational resource that could be incorporated into a classroom or teaching lab.
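As a small companion to the tree-handling steps above, leaf names can be extracted from a standard Newick-format tree with a few lines of code (a deliberately minimal parser for illustration; the dedicated tools covered in the article also handle quoted labels and comments):

```python
def newick_leaves(newick):
    """Return leaf names, in order, from a simple Newick string.

    A token between delimiters is a leaf name unless it follows ')'
    (in which case it is an internal-node label). Branch lengths after
    ':' are stripped.
    """
    leaves, token, prev = [], "", ""
    for ch in newick:
        if ch in "(),;":
            name = token.split(":")[0].strip()
            if name and prev != ")":
                leaves.append(name)
            token, prev = "", ch
        else:
            token += ch
    return leaves
```

For example, `newick_leaves("((A:0.1,B:0.2)AB:0.3,C:0.4);")` yields `["A", "B", "C"]`, ignoring the internal label `AB` and all branch lengths.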
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Institutions: University of Chicago, Northshore University Health Systems.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data, available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined, and their descriptions were written in a standard operating manual to ensure consistency of data annotation. Using one protocol for prospective tissue banking and another for retrospective banking, tumor and normal tissue samples were collected from patients who consented to these protocols. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and linked to the clinical information for patients described within the database. The data tables were linked using the relationships function in Microsoft Access, allowing the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
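The relational linking described above (implemented with Microsoft Access relationships in the original workflow) can be sketched with any relational engine; a minimal, self-contained illustration using SQLite, with hypothetical table and field names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE patient (
        patient_id INTEGER PRIMARY KEY,
        diagnosis  TEXT,
        stage      TEXT
    );
    CREATE TABLE proteomics (
        sample_id  INTEGER PRIMARY KEY,
        patient_id INTEGER REFERENCES patient(patient_id),
        protein    TEXT,
        abundance  REAL
    );
""")
cur.execute("INSERT INTO patient VALUES (1, 'NSCLC', 'II')")
cur.executemany("INSERT INTO proteomics VALUES (?, ?, ?, ?)",
                [(10, 1, 'EGFR', 3.2), (11, 1, 'KRAS', 1.1)])

# a join connects clinical and laboratory information in one query,
# analogous to querying across Access relationships
rows = cur.execute("""
    SELECT p.diagnosis, m.protein, m.abundance
    FROM patient p JOIN proteomics m USING (patient_id)
    ORDER BY m.protein
""").fetchall()
```

The joined result can then be exported (e.g. to CSV) for statistical analysis, mirroring the export step described in the abstract.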
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
One Dimensional Turing-Like Handshake Test for Motor Intelligence
Institutions: Ben-Gurion University.
In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator holds a robotic stylus and interacts with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced-choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to human handshake motion, which we name the "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG: (i) by calculating the proportion of trials in which the interrogator judged the model to be more human-like than the human; (ii) by comparing two weighted sums of human and model handshakes, fitting a psychometric curve, and extracting the point of subjective equality (PSE); (iii) by comparing a given model with a weighted sum of human and random signals, fitting a psychometric curve to the interrogator's answers, and extracting the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake.
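The PSE extraction in methods (ii) and (iii) amounts to fitting a logistic psychometric function to the forced-choice responses and reading off its 50% point; a schematic sketch with synthetic, noiseless data (not the authors' fitting code):

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, pse, slope):
    """Logistic psychometric curve: probability that the interrogator
    answers 'more human-like' as a function of stimulus weight x.
    The PSE is the x at which this probability equals 0.5."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# synthetic data: x = weight of the human component in the weighted sum
x = np.linspace(0.0, 1.0, 11)
p_obs = psychometric(x, 0.4, 12.0)               # true PSE at 0.4
(pse_fit, slope_fit), _ = curve_fit(psychometric, x, p_obs, p0=[0.5, 5.0])
```

With real (noisy) choice proportions the same fit applies; the recovered PSE is the weight at which the mixed handshake is subjectively indistinguishable from the reference.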
Neuroscience, Issue 46, Turing test, Human Machine Interface, Haptics, Teleoperation, Motor Control, Motor Behavior, Diagnostics, Perception, handshake, telepresence
Investigating the Effects of Antipsychotics and Schizotypy on the N400 Using Event-Related Potentials and Semantic Categorization
Institutions: McGill University.
Within the field of cognitive neuroscience, functional magnetic resonance imaging (fMRI) is a popular method of visualizing brain function. This is in part because of its excellent spatial resolution, which allows researchers to identify brain areas associated with specific cognitive processes. However, in the quest to localize brain functions, it is relevant to note that many cognitive, sensory, and motor processes have temporal distinctions that are imperative to capture, an aspect that is left unfulfilled by fMRI’s suboptimal temporal resolution. To better understand cognitive processes, it is thus advantageous to utilize event-related potential (ERP) recording as a method of gathering information about the brain. Some of its advantages include its fantastic temporal resolution, which gives researchers the ability to follow the activity of the brain down to the millisecond. It also directly indexes both excitatory and inhibitory post-synaptic potentials by which most brain computations are performed. This sits in contrast to fMRI, which captures an index of metabolic activity. Further, the non-invasive ERP method does not require a contrast condition: raw ERPs can be examined for just one experimental condition, a distinction from fMRI where control conditions must be subtracted from the experimental condition, leading to uncertainty in associating observations with experimental or contrast conditions. While it is limited by its poor spatial and subcortical activity resolution, ERP recordings’ utility, relative cost-effectiveness, and associated advantages offer strong rationale for its use in cognitive neuroscience to track rapid temporal changes in neural activity. 
To foster increased use of this research imaging method, and to ensure proper and accurate data collection, the present article outlines the procedure and key aspects of ERP data acquisition in the framework of a paradigm that uses semantic categorization to examine the effects of antipsychotics and schizotypy on the N400.
Behavior, Issue 93, Electrical brain activity, Semantic categorization, Event-related brain potentials, Neuroscience, Cognition, Psychiatry, Antipsychotic medication, N400, Schizotypy, Schizophrenia
Cerenkov Luminescence Imaging (CLI) for Cancer Therapy Monitoring
Institutions: Stanford University .
In molecular imaging, positron emission tomography (PET) and optical imaging (OI) are two of the most important and thus most widely used modalities1-3. PET is characterized by its excellent sensitivity and quantification ability, while OI is notable for its freedom from ionizing radiation, relatively low cost, short scanning time, high throughput, and wide availability to basic researchers. However, both modalities have shortcomings. PET suffers from poor spatial resolution and high cost, while OI is mostly limited to preclinical applications because of its limited tissue penetration and the pronounced scattering of optical signals through living tissue.
Recently a bridge between PET and OI has emerged with the discovery of Cerenkov Luminescence Imaging (CLI)4-6. CLI is a new imaging modality that harnesses Cerenkov Radiation (CR) to image radionuclides with OI instruments. CR, originally discovered by the Russian Nobel laureate Pavel Alekseyevich Cerenkov and his colleagues in 1934, is a form of electromagnetic radiation emitted when a charged particle travels faster than the phase velocity of light in a dielectric medium7,8. The charged particle, whether positron or electron, perturbs the electromagnetic field of the medium by displacing the electrons in its atoms. After the disruption passes, photons are emitted as the displaced electrons return to the ground state. For instance, one 18F decay was estimated to produce an average of 3 photons in water5.
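The emission condition described above, a particle speed exceeding the phase velocity of light c/n in the medium, translates directly into a threshold kinetic energy. A minimal Python sketch of that arithmetic follows; the refractive index for water (n ≈ 1.33) is a textbook value assumed here, not a parameter from this article:

```python
import math

def cerenkov_threshold_kev(n, rest_mass_kev=511.0):
    """Minimum kinetic energy (keV) for a charged particle of the given
    rest mass (default: electron/positron) to emit Cerenkov radiation
    in a medium of refractive index n (condition: beta > 1/n)."""
    beta = 1.0 / n                          # threshold speed as a fraction of c
    gamma = 1.0 / math.sqrt(1.0 - beta**2)  # relativistic factor at threshold
    return (gamma - 1.0) * rest_mass_kev    # kinetic energy = (gamma - 1) m c^2

# Electrons/positrons in water (n ~ 1.33): threshold ~264 keV.
# Only the more energetic 18F positrons exceed this, which is
# consistent with the low photon yield per decay noted above.
print(round(cerenkov_threshold_kev(1.33)))  # → 264
```

A denser medium (larger n) lowers the threshold, which is why Cerenkov output depends on the surrounding tissue as well as the radionuclide.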
Since its emergence, CLI has been investigated for use in a variety of preclinical applications, including in vivo tumor imaging, reporter gene imaging, radiotracer development, and multimodality imaging, among others4,5,9-11. The most important reason CLI has enjoyed such success so far is that this new technology takes advantage of the low cost and wide availability of OI to image radionuclides, which could previously be imaged only by more expensive and less widely available nuclear imaging modalities such as PET.
Here, we present the method of using CLI to monitor cancer drug therapy. Our group recently investigated this new application and validated its feasibility in a proof-of-concept study12. We demonstrated that CLI and PET exhibited excellent correlations across different tumor xenografts and imaging probes. This is consistent with the underlying principle of CR: CLI essentially visualizes the same radionuclides as PET. We selected bevacizumab (Avastin; Genentech/Roche) as our therapeutic agent because it is a well-known angiogenesis inhibitor13,14. Maturation of this technology in the near future can be envisioned to have a significant impact on preclinical drug development and screening, as well as on therapy monitoring of patients receiving treatment.
Cancer Biology, Issue 69, Medicine, Molecular Biology, Cerenkov Luminescence Imaging, CLI, cancer therapy monitoring, optical imaging, PET, radionuclides, Avastin, imaging
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
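The sequence selection idea, ranking candidate sequences by an energy evaluated over a fixed backbone, can be illustrated with a deliberately tiny sketch. The two-letter hydrophobic/polar alphabet, the pairwise contact energies, and the contact list below are all invented for illustration and are not the Protein WISDOM energy function:

```python
from itertools import product

# Hypothetical pairwise contact energies (favorable = negative).
PAIR_E = {
    ("H", "H"): -2.0,  # hydrophobic-hydrophobic packing
    ("H", "P"): 0.5,
    ("P", "H"): 0.5,
    ("P", "P"): -0.5,  # polar-polar interaction
}

# Residue-residue contacts fixed by the backbone template.
CONTACTS = [(0, 1), (1, 2), (0, 3)]

def energy(seq):
    """Sum pairwise contact energies for a candidate sequence."""
    return sum(PAIR_E[(seq[i], seq[j])] for i, j in CONTACTS)

def best_sequences(n_positions=4, alphabet="HP", top=3):
    """Exhaustively rank all sequences by energy (lowest = most stable)."""
    ranked = sorted(
        ("".join(s) for s in product(alphabet, repeat=n_positions)),
        key=energy,
    )
    return ranked[:top]

print(best_sequences())  # lowest-energy sequences first
```

Real design problems replace the exhaustive enumeration with optimization (the sequence space of a 100-residue protein over 20 amino acids is far too large to enumerate), but the objective, minimizing an energy over sequence space for a fixed template, is the same.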
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
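The ~10-30 nm localization precision quoted above depends chiefly on the number of photons collected per molecule. A common way to estimate it, assumed here, is the Thompson-Larson-Webb formula; the default PSF width, pixel size, and background values below are plausible placeholders, not parameters from this article:

```python
import math

def localization_precision_nm(n_photons, psf_sigma_nm=130.0,
                              pixel_nm=100.0, bg_noise=1.0):
    """Thompson-Larson-Webb estimate of single-molecule localization
    precision (standard deviation, nm): photon shot noise, pixelation,
    and background noise terms, respectively."""
    s2 = psf_sigma_nm**2
    variance = (s2 / n_photons
                + (pixel_nm**2 / 12.0) / n_photons
                + 8.0 * math.pi * s2**2 * bg_noise**2
                / (pixel_nm**2 * n_photons**2))
    return math.sqrt(variance)

# More collected photons -> sharper localization:
for n in (100, 500, 2000):
    print(n, round(localization_precision_nm(n), 1))
```

The scaling is roughly 1/sqrt(N) once background is negligible, which is why bright, photostable PAFPs/PSFPs and careful labeling matter so much in practice.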
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Institutions: University of British Columbia, University of British Columbia, University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and flexibility of a VA approach in the analysis of biological datasets. Examples of such datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA puts analytical power back in the hands of the analyst by allowing real-time data exploration. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Using SCOPE to Identify Potential Regulatory Motifs in Coregulated Genes
Institutions: Dartmouth College.
SCOPE is an ensemble motif finder that uses three component algorithms in parallel to identify potential regulatory motifs by over-representation and motif position preference1. Each component algorithm is optimized to find a different kind of motif. By taking the best of these three approaches, SCOPE performs better than any single algorithm, even in the presence of noisy data1. In this article, we utilize a web version of SCOPE2 to examine genes that are involved in telomere maintenance. SCOPE has been incorporated into at least two other motif finding programs3,4 and has been used in other studies5-8.
The three algorithms that comprise SCOPE are BEAM9, which finds non-degenerate motifs (ACCGGT); PRISM10, which finds degenerate motifs (ASCGWT); and SPACER11, which finds longer bipartite motifs (ACCnnnnnnnnGGT). These three algorithms have been optimized to find their corresponding type of motif. Together, they allow SCOPE to perform extremely well.
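The three motif classes can be made concrete with a small sketch that expands IUPAC degeneracy codes (S = G/C, W = A/T) and spacer runs into a regular expression and scans a sequence. This is only an illustration of the motif notation; it is not SCOPE's search algorithm, which scores over-representation and position preference rather than simply matching:

```python
import re

# IUPAC nucleotide degeneracy codes (subset sufficient for the examples).
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "[AG]", "Y": "[CT]", "S": "[GC]", "W": "[AT]",
         "N": "[ACGT]", "n": "[ACGT]"}

def motif_to_regex(motif):
    """Expand a motif into a regex: plain bases match exactly (BEAM-style),
    degeneracy codes match alternatives (PRISM-style), and 'n' runs act as
    the spacer of a bipartite motif (SPACER-style)."""
    return "".join(IUPAC[ch] for ch in motif)

def scan(motif, sequence):
    """Return start positions of all (non-overlapping) motif occurrences."""
    return [m.start() for m in re.finditer(motif_to_regex(motif), sequence)]

seq = "TTACCGGTAAACCGTTCCACCTTTTTTTTGGTAA"  # made-up test sequence
print(scan("ACCGGT", seq))          # non-degenerate → [2]
print(scan("ASCGWT", seq))          # degenerate     → [10]
print(scan("ACCnnnnnnnnGGT", seq))  # bipartite      → [18]
```

Motif finders then ask the statistical question this sketch omits: whether such matches occur in the promoter set more often, or at more consistent positions, than chance would predict.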
Once a gene set has been analyzed and candidate motifs identified, SCOPE can look for other genes that contain the motif which, when added to the original set, will improve the motif score. This can occur through over-representation or motif position preference. Working with partial gene sets that have biologically verified transcription factor binding sites, SCOPE was able to identify most of the rest of the genes also regulated by the given transcription factor.
Output from SCOPE shows candidate motifs, their significance, and other information both as a table and as a graphical motif map. FAQs and video tutorials are available at the SCOPE web site which also includes a "Sample Search" button that allows the user to perform a trial run.
SCOPE has a very friendly user interface that enables novice users to access the algorithm's full power without having to become experts in the bioinformatics of motif finding. As input, SCOPE can take a list of genes or FASTA sequences. These can be entered in browser text fields or read from a file. The output from SCOPE contains a list of all identified motifs with their scores, number of occurrences, fraction of genes containing the motif, and the algorithm used to identify the motif. For each motif, result details include a consensus representation of the motif, a sequence logo, a position weight matrix, and a list of instances for every motif occurrence (with exact positions and "strand" indicated). Results are returned in a browser window and also optionally by email. Previous papers describe the SCOPE algorithms in detail1,2,9-11.
Genetics, Issue 51, gene regulation, computational biology, algorithm, promoter sequence motif