Communication between animals is diverse and complex. Animals may communicate using auditory, seismic, chemosensory, electrical, or visual signals. Understanding the constraints on visual signal design for communication has been of particular interest. Traditional methods for investigating animal interactions have relied on basic observational techniques, staged encounters, or physical manipulation of morphology. Less intrusive methods have tried to simulate conspecifics using crude playback tools, such as mirrors, still images, or models. As technology has advanced, video playback has emerged as another tool with which to examine visual communication (Rosenthal, 2000). To move one step further, computer animation now allows researchers to isolate the critical components necessary to elicit social responses from conspecifics, and to manipulate these features to control interactions. Here, I provide detail on how to create an animation using the Jacky dragon as a model, but this process should be adaptable to other species. In building the animation, I elected to use Lightwave 3D to alter object morphology, add texture, install bones, and provide comparable weight shading that prevents exaggerated movement. The animation is then matched to selected motor patterns to replicate critical movement features. Finally, the sequence must be rendered into an individual clip for presentation. Although there are other adaptable techniques, this particular method has been demonstrated to be effective in eliciting both conspicuous and social responses in staged interactions.
In Situ SIMS and IR Spectroscopy of Well-defined Surfaces Prepared by Soft Landing of Mass-selected Ions
Institutions: Pacific Northwest National Laboratory.
Soft landing of mass-selected ions onto surfaces is a powerful approach for the highly controlled preparation of materials that are inaccessible using conventional synthesis techniques. Coupling soft landing with in situ characterization using secondary ion mass spectrometry (SIMS) and infrared reflection absorption spectroscopy (IRRAS) enables analysis of well-defined surfaces under clean vacuum conditions. The capabilities of three soft-landing instruments constructed in our laboratory are illustrated for the representative system of surface-bound organometallics prepared by soft landing of mass-selected ruthenium tris(bipyridine) dications, [Ru(bpy)3]2+ (bpy = bipyridine), onto carboxylic acid terminated self-assembled monolayer surfaces on gold (COOH-SAMs). In situ time-of-flight (TOF)-SIMS provides insight into the reactivity of the soft-landed ions. In addition, the kinetics of charge reduction, neutralization, and desorption occurring on the COOH-SAM both during and after ion soft landing are studied using in situ Fourier transform ion cyclotron resonance (FT-ICR)-SIMS measurements. In situ IRRAS experiments provide insight into how the structure of the organic ligands surrounding the metal centers is perturbed through immobilization of organometallic ions on COOH-SAM surfaces by soft landing. Collectively, the three instruments provide complementary information about the chemical composition, reactivity, and structure of well-defined species supported on surfaces.
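The charge-reduction kinetics mentioned above are typically summarized by fitting the decay of the intact dication signal to a first-order rate law. As a hedged sketch (the function name and synthetic counts are mine, not output from the instruments described), the rate constant can be recovered with a log-linear least-squares fit:

```python
import math

def fit_first_order_decay(times, signals):
    """Fit signal(t) = n0 * exp(-k * t) by least squares on ln(signal).

    Returns (n0, k). Assumes all signal values are positive.
    """
    logs = [math.log(s) for s in signals]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    cov = sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
    var = sum((t - t_mean) ** 2 for t in times)
    slope = cov / var  # slope of ln(signal) vs t is -k
    return math.exp(l_mean - slope * t_mean), -slope

# Synthetic dication survival curve: n0 = 1000 counts, k = 0.05 per second
times = [0, 20, 40, 60, 80, 100]
signals = [1000 * math.exp(-0.05 * t) for t in times]
n0, k = fit_first_order_decay(times, signals)
```

In practice the FT-ICR-SIMS time course would supply the measured signal intensities in place of the synthetic curve.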
Chemistry, Issue 88, soft landing, mass selected ions, electrospray, secondary ion mass spectrometry, infrared spectroscopy, organometallic, catalysis
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Institutions: Trier University, Trier University.
Selection tasks in which simple stimuli (e.g., letters) are presented and a target stimulus has to be selected against one or more distractor stimuli are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation across the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. The repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters of response retrieval effects on human action.
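The orthogonal 2 x 2 design lends itself to a simple summary analysis: the response retrieval effect is the interaction of response relation and distractor relation, and the distractor inhibition effect is the main effect of distractor repetition. A minimal sketch, assuming condition-labeled reaction-time lists (the labels and the numbers in the example are illustrative, not data from the study):

```python
from statistics import mean

def binding_effects(rts):
    """Condition means plus the two indices of the paradigm.

    rts maps (response_relation, distractor_relation), each 'rep' or 'chg',
    to a list of reaction times in ms.
    """
    m = {cond: mean(v) for cond, v in rts.items()}
    # Response retrieval: interaction of response relation x distractor relation
    retrieval = (m[('chg', 'rep')] - m[('chg', 'chg')]) \
              - (m[('rep', 'rep')] - m[('rep', 'chg')])
    # Distractor inhibition: main effect of repeating the distractor
    inhibition = (m[('rep', 'chg')] + m[('chg', 'chg')]) / 2 \
               - (m[('rep', 'rep')] + m[('chg', 'rep')]) / 2
    return m, retrieval, inhibition

# Illustrative condition means (ms)
rts = {('rep', 'rep'): [490], ('rep', 'chg'): [520],
       ('chg', 'rep'): [560], ('chg', 'chg'): [540]}
means, retrieval, inhibition = binding_effects(rts)
```

A positive retrieval index indicates that a repeated distractor retrieves the prime response, speeding response repetitions and slowing response changes.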
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All of these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
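For approach (3), the semi-automated route often begins with a global intensity threshold followed by connected-component labeling. The toy sketch below (pure Python on a 2D grid; real data are 3D volumes handled by dedicated image-analysis packages) illustrates that first step:

```python
from collections import deque

def segment(image, threshold):
    """Binarize a 2D grid and count 4-connected components above threshold.

    Returns (mask, n_components). A toy stand-in for the 'threshold then
    label' step of semi-automated segmentation.
    """
    h, w = len(image), len(image[0])
    mask = [[1 if image[y][x] >= threshold else 0 for x in range(w)]
            for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    n = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                n += 1  # new component; flood-fill its extent
                q = deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and \
                                mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return mask, n

# Two bright blobs on a dark background
img = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8]]
mask, n = segment(img, threshold=5)
```

Whether such a simple global threshold suffices depends exactly on the data-set characteristics listed above (signal-to-noise ratio, crispness, crowdedness); noisy or heterogeneous volumes push the workflow toward manual tracing or custom algorithms.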
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
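At the heart of the source reconstruction step is the minimum-norm estimate, x̂ = Lᵀ(LLᵀ + λ²I)⁻¹y, where L is the leadfield derived from the head model and y the channel data. The sketch below implements that formula for a toy leadfield; production pipelines use dedicated EEG packages and realistic (here, age-appropriate) head models rather than hand-rolled linear algebra:

```python
def solve(a, b):
    """Solve the linear system a x = b by Gauss-Jordan elimination
    with partial pivoting (fine for small systems)."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[r][col]:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def minimum_norm(leadfield, data, lam=0.0):
    """x_hat = L^T (L L^T + lam^2 I)^-1 y, the classic minimum-norm inverse.

    leadfield: n_sensors x n_sources matrix (list of lists);
    data: n_sensors channel measurements; lam: regularization.
    """
    n_sens, n_src = len(leadfield), len(leadfield[0])
    gram = [[sum(leadfield[i][k] * leadfield[j][k] for k in range(n_src))
             + (lam * lam if i == j else 0.0)
             for j in range(n_sens)] for i in range(n_sens)]
    w = solve(gram, data)
    return [sum(leadfield[i][j] * w[i] for i in range(n_sens))
            for j in range(n_src)]

# Underdetermined toy case: one sensor, two sources
x_hat = minimum_norm([[1.0, 1.0]], [2.0])
```

With λ = 0 the estimate is the smallest-norm source distribution that exactly explains the data; a nonzero λ trades data fit for stability against channel noise.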
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open-access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
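The binarization step that precedes skeletonization can be illustrated with a crude intensity cutoff a few standard deviations above the image mean. This is only a stand-in for the optimized workflow referenced above, whose actual thresholding is not reproduced here:

```python
from statistics import mean, pstdev

def tats_density(pixels, k=2.0):
    """Fraction of pixels brighter than mean + k*sd, plus the cutoff used.

    A crude stand-in for the binarization that precedes skeletonization;
    the k = 2 default is an illustrative choice, not the protocol's value.
    """
    flat = [p for row in pixels for p in row]
    cut = mean(flat) + k * pstdev(flat)
    bright = sum(1 for p in flat if p > cut)
    return bright / len(flat), cut

# Synthetic frame: dim cytosol (10) with four bright tubule pixels (100)
pixels = [[10] * 10 for _ in range(10)]
for y, x in [(2, 3), (5, 5), (7, 1), (9, 9)]:
    pixels[y][x] = 100
density, cutoff = tats_density(pixels)
```

The resulting binary mask would then be skeletonized to measure tubule lengths, orientations, and branching, the network components quantified in the protocol.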
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Measuring Neural and Behavioral Activity During Ongoing Computerized Social Interactions: An Examination of Event-Related Brain Potentials
Institutions: Illinois Wesleyan University.
Social exclusion is a complex social phenomenon with powerful negative consequences. Given the impact of social exclusion on mental and emotional health, an understanding of how perceptions of social exclusion develop over the course of a social interaction is important for advancing treatments aimed at lessening the harmful costs of being excluded. To date, most scientific examinations of social exclusion have looked at exclusion after a social interaction has been completed. While this has been very helpful in developing an understanding of what happens to a person following exclusion, it has not helped to clarify the moment-to-moment dynamics of the process of social exclusion. Accordingly, the current protocol was developed to obtain an improved understanding of social exclusion by examining the patterns of event-related brain activation that are present during social interactions. This protocol allows greater precision and sensitivity in detailing the social processes that lead people to feel as though they have been excluded from a social interaction. Importantly, the current protocol can be adapted to include research projects that vary the nature of exclusionary social interactions by altering how frequently participants are included, how long the periods of exclusion will last in each interaction, and when exclusion will take place during the social interactions. Further, the current protocol can be used to examine variables and constructs beyond those related to social exclusion. This capability to address a variety of applications across psychology by obtaining both neural and behavioral data during ongoing social interactions suggests the present protocol could be at the core of a developing area of scientific inquiry related to social interactions.
Behavior, Issue 93, Event-related brain potentials (ERPs), Social Exclusion, Neuroscience, N2, P3, Cognitive Control
Internalization and Observation of Fluorescent Biomolecules in Living Microorganisms via Electroporation
Institutions: University of Oxford, Genome Center.
The ability to study biomolecules in vivo is crucial for understanding their function in a biological context. One powerful approach involves fusing molecules of interest to fluorescent proteins such as GFP to study their expression, localization and function. However, GFP and its derivatives are significantly larger and less photostable than the organic fluorophores generally used for in vitro experiments, and this can limit the scope of investigation. We recently introduced a straightforward, versatile and high-throughput method based on electroporation, allowing the internalization of biomolecules labeled with organic fluorophores into living microorganisms. Here we describe how to use electroporation to internalize labeled DNA fragments or proteins into Escherichia coli and Saccharomyces cerevisiae, how to quantify the number of internalized molecules using fluorescence microscopy, and how to quantify the viability of electroporated cells. Data can be acquired at the single-cell or single-molecule level using fluorescence or FRET. The possibility of internalizing non-labeled molecules that trigger a physiological response observable in vivo is also presented. Finally, strategies for optimizing the protocol for specific biological systems are discussed.
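Quantifying the number of internalized molecules typically reduces to dividing background-corrected cell fluorescence by the calibrated intensity of a single fluorophore. A hedged sketch (the calibration value would come, e.g., from single-molecule photobleaching steps; all numbers are illustrative):

```python
def molecules_per_cell(cell_intensity, bg_per_pixel, n_pixels,
                       single_molecule_intensity):
    """Estimate internalized copies from integrated fluorescence.

    Subtracts the expected background over the cell area, then divides by
    the mean intensity of one fluorophore. Clamped at zero for dim cells.
    """
    corrected = cell_intensity - bg_per_pixel * n_pixels
    return max(0.0, corrected / single_molecule_intensity)

# Illustrative values: integrated counts, background per pixel, cell area,
# and a single-fluorophore calibration of 200 counts
copies = molecules_per_cell(10500, bg_per_pixel=5, n_pixels=100,
                            single_molecule_intensity=200)
```

At low copy numbers the same estimate can be cross-checked by directly counting discrete photobleaching steps in the cell's intensity trace.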
Microbiology, Issue 96, Electroporation, fluorescence, FRET, in vivo, single-molecule imaging, bacteria, Escherichia coli, yeast, internalization, labeled DNA, labeled proteins
Genome-wide Protein-protein Interaction Screening by Protein-fragment Complementation Assay (PCA) in Living Cells
Institutions: Université Laval.
Proteins are the building blocks, effectors and signal mediators of cellular processes. A protein’s function, regulation and localization often depend on its interactions with other proteins. Here, we describe a protocol for the yeast protein-fragment complementation assay (PCA), a powerful method to detect direct and proximal associations between proteins in living cells. The interaction between two proteins, each fused to a dihydrofolate reductase (DHFR) protein fragment, translates into growth of yeast strains in the presence of the drug methotrexate (MTX). Differential fitness, resulting from different amounts of reconstituted DHFR enzyme, can be quantified on high-density colony arrays, making it possible to differentiate interacting from non-interacting bait-prey pairs. The high-throughput protocol presented here is performed using a robotic platform that parallelizes mating of bait and prey strains carrying complementary DHFR-fragment fusion proteins and the survival assay on MTX. This protocol allows systematic testing of thousands of protein-protein interactions (PPIs) involving bait proteins of interest and offers several advantages over other PPI detection assays, including the study of proteins expressed from their endogenous promoters without the need for modifying protein localization or for the assembly of complex reporter constructs.
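Hit-calling on the high-density colony arrays can be sketched as a z-score test of log-transformed colony sizes against a non-interacting control distribution. This is an illustrative simplification, not the authors' scoring pipeline:

```python
import math
from statistics import mean, pstdev

def call_ppi_hits(colony_sizes, control_sizes, z_cut=3.0):
    """Flag bait-prey pairs growing better on MTX than expected by chance.

    colony_sizes: dict of (bait, prey) -> colony size on MTX;
    control_sizes: sizes for known non-interacting pairs.
    Returns dict of hits -> z-score of log2 colony size. The z_cut value
    is an illustrative default.
    """
    ctrl = [math.log2(s) for s in control_sizes]
    mu, sd = mean(ctrl), pstdev(ctrl)
    z = {pair: (math.log2(s) - mu) / sd for pair, s in colony_sizes.items()}
    return {pair: score for pair, score in z.items() if score > z_cut}

# Illustrative array: one strong grower, one background-level colony
hits = call_ppi_hits(
    {('bait1', 'preyA'): 1024, ('bait1', 'preyB'): 16},
    control_sizes=[8, 8, 16, 16])
```

Real analyses additionally correct for plate-position effects and replicate colonies before thresholding.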
Cellular Biology, Issue 97, Protein-protein interaction (PPI); high-throughput screening; yeast; protein-fragment complementation assay (PCA); dihydrofolate reductase (DHFR); high-density arrays; systems biology; biological networks
Closed-loop Neuro-robotic Experiments to Test Computational Properties of Neuronal Networks
Institutions: Istituto Italiano di Tecnologia.
Information coding in the Central Nervous System (CNS) remains largely unexplored. There is mounting evidence that, even at a very low level, the representation of a given stimulus might depend on context and history. If this is actually the case, bi-directional interactions between the brain (or, if need be, a reduced model of it) and the sensory-motor system can shed light on how the encoding and decoding of information are performed. Here an experimental system is introduced and described in which the activity of a neuronal element (i.e., a network of neurons extracted from embryonic mammalian hippocampi) is given context and used to control the movement of an artificial agent, while environmental information is fed back to the culture as a sequence of electrical stimuli. This architecture allows quick selection among diverse encoding, decoding, and learning algorithms to test different hypotheses on the computational properties of neuronal networks.
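The encoding and decoding stages of such a closed loop can be sketched as two interchangeable mapping functions; the linear forms and parameter values below are placeholders for whichever algorithms are under test, not the experiment's actual mappings:

```python
def decode_speed(spike_count, window_s, gain=0.01):
    """Decoding: map population firing in the last window to agent speed.

    gain is an arbitrary placeholder calibration (speed units per Hz).
    """
    return gain * spike_count / window_s

def encode_stimulation(obstacle_distance, d_max=1.0, f_max=40.0):
    """Encoding: map obstacle proximity to stimulation frequency (Hz).

    Closer obstacles yield faster stimulation, up to f_max; beyond d_max
    no stimulus is delivered. Both limits are illustrative.
    """
    proximity = max(0.0, min(1.0, 1.0 - obstacle_distance / d_max))
    return f_max * proximity

# One loop iteration: culture fired 200 spikes in 1 s, obstacle at 0.25 m
speed = decode_speed(200, window_s=1.0)
stim_hz = encode_stimulation(0.25)
```

Because both stages are plain functions, swapping in a different coding hypothesis (e.g., burst-based decoding, spatially patterned stimulation) changes only these two mappings, not the loop around them.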
Neuroscience, Issue 97, Micro Electrode Arrays (MEA), in vitro cultures, coding, decoding, tetanic stimulation, spike, burst
Scalable 96-well Plate Based iPSC Culture and Production Using a Robotic Liquid Handling System
Institutions: InvivoSciences, Inc., Gilson, Inc..
Continued advancement in pluripotent stem cell culture is closing the gap between bench and bedside for using these cells in regenerative medicine, drug discovery and safety testing. In order to produce stem cell derived biopharmaceutics and cells for tissue engineering and transplantation, a cost-effective cell-manufacturing technology is essential. Maintenance of pluripotency and stable performance of cells in downstream applications (e.g., cell differentiation) over time is paramount to large-scale cell production, yet can be difficult to achieve, especially if cells are cultured manually, where the operator can introduce significant variability and scale-up is prohibitively expensive. To enable high-throughput, large-scale stem cell production and remove operator influence, novel stem cell culture protocols were developed using a bench-top multi-channel liquid-handling robot that require minimal technician involvement or experience. With these protocols, human induced pluripotent stem cells (iPSCs) were cultured in feeder-free conditions directly from a frozen stock and maintained in 96-well plates. Depending on the cell line and desired scale-up rate, the operator can easily determine when to passage based on a series of images showing the optimal colony densities for splitting. The necessary reagents are then prepared to perform a colony split to new plates without a centrifugation step. After 20 passages (~3 months), two iPSC lines maintained stable karyotypes, expressed stem cell markers, and differentiated into cardiomyocytes with high efficiency. The system can perform subsequent high-throughput screening of new differentiation protocols or genetic manipulations designed for 96-well plates. This technology will reduce the labor and technical burden of producing large numbers of identical stem cells for a myriad of applications.
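The image-based passage decision can be reduced to a threshold rule: wells whose colonies reach a reference density are split at a fixed ratio. The threshold and ratio below are placeholders; in the protocol the operator matches wells against reference images rather than a numeric cutoff:

```python
def passage_decision(confluences, split_threshold=0.7, split_ratio=6):
    """Decide which wells to split and how many daughter wells result.

    confluences: dict of well id -> estimated colony density (0-1).
    split_threshold and split_ratio are illustrative placeholders.
    """
    to_split = [w for w, c in confluences.items() if c >= split_threshold]
    return to_split, len(to_split) * split_ratio

# Illustrative plate snapshot
to_split, daughter_wells = passage_decision(
    {'A1': 0.8, 'A2': 0.5, 'A3': 0.9})
```

The daughter-well count is what sets the achievable scale-up rate per passage cycle.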
Developmental Biology, Issue 99, iPSC, high-throughput, robotic, liquid-handling, scalable, stem cell, automated stem cell culture, 96-well
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
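The software-guided setup of optimal experiment combinations starts from the full factorial of candidate factor levels, which step-wise design augmentation then prunes. A minimal sketch of the enumeration (factor names and levels are illustrative, echoing the factor classes discussed, not the study's actual design table):

```python
from itertools import product

def full_factorial(factors):
    """All level combinations of named factors, as run dictionaries.

    factors: dict mapping factor name -> list of levels.
    """
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Illustrative factors: a regulatory element, an incubation condition,
# and a plant development parameter
runs = full_factorial({
    'promoter': ['35S', 'nos'],
    'temperature_C': [22, 25],
    'leaf_age': ['young', 'old'],
})
```

A full factorial grows multiplicatively with the number of factors, which is precisely why the DoE approach splits the problem into modules and uses fractional, software-selected subsets of these runs.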
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
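The authors' analysis code is MATLAB-based; the Python sketch below merely illustrates the kind of reduction applied to time-stamped event records, such as daily event counts and inter-event intervals (the event tuples and codes are invented for illustration):

```python
from collections import Counter

SECONDS_PER_DAY = 86400

def daily_counts(events, code):
    """Count occurrences of one event code per day.

    events: iterable of (time_s, code) records, time in seconds from start.
    """
    return Counter(int(t // SECONDS_PER_DAY) for t, c in events if c == code)

def inter_event_intervals(events, code):
    """Sorted successive intervals (s) between events of one code."""
    times = sorted(t for t, c in events if c == code)
    return [b - a for a, b in zip(times, times[1:])]

# Illustrative record: three hopper pokes and one feeder event
events = [(10, 'poke'), (70, 'poke'), (86410, 'poke'), (50, 'feed')]
per_day = daily_counts(events, 'poke')
intervals = inter_event_intervals(events, 'poke')
```

Keeping the raw time-stamped records and deriving such summaries on demand is what preserves the full data trail from raw events through to published statistics.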
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Generation of Shear Adhesion Map Using SynVivo Synthetic Microvascular Networks
Institutions: CFD Research Corporation.
Cell/particle adhesion assays are critical to understanding the biochemical interactions involved in disease pathophysiology and have important applications in the quest for the development of novel therapeutics. Assays using static conditions fail to capture the dependence of adhesion on shear, limiting their correlation with the in vivo environment. Parallel plate flow chambers that quantify adhesion under physiological fluid flow need multiple experiments for the generation of a shear adhesion map. In addition, they do not represent the in vivo scale and morphology and require large volumes (~ml) of reagents for experiments. In this study, we demonstrate the generation of a shear adhesion map from a single experiment using a microvascular network based microfluidic device, SynVivo-SMN. This device recreates the complex in vivo vasculature including geometric scale, morphological elements, flow features and cellular interactions in an in vitro format, thereby providing a biologically realistic environment for basic and applied research in cellular behavior, drug delivery, and drug discovery. The assay was demonstrated by studying the interaction of 2 µm biotin-coated particles with the avidin-coated surfaces of the microchip. The entire range of shear observed in the microvasculature is obtained in a single assay, enabling an adhesion vs. shear map for the particles under physiological conditions.
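For a shallow rectangular channel segment (width much greater than height), the wall shear stress follows the parallel-plate approximation τ = 6μQ/(wh²), so adhesion counts can be binned by the local shear computed for each segment. A hedged sketch of that bookkeeping (segment geometries and counts are illustrative):

```python
def wall_shear_stress(mu_pa_s, q_m3_s, width_m, height_m):
    """Wall shear stress (Pa) in a shallow rectangular channel.

    tau = 6 * mu * Q / (w * h^2); valid in the parallel-plate limit w >> h.
    """
    return 6.0 * mu_pa_s * q_m3_s / (width_m * height_m ** 2)

def shear_adhesion_map(observations, mu_pa_s, bins):
    """Bin adhered-particle counts by local shear.

    observations: list of (q_m3_s, width_m, height_m, adhered_count) per
    channel segment; bins: list of (low_pa, high_pa) intervals.
    """
    counts = {b: 0 for b in bins}
    for q, w, h, n in observations:
        tau = wall_shear_stress(mu_pa_s, q, w, h)
        for lo, hi in bins:
            if lo <= tau < hi:
                counts[(lo, hi)] += n
    return counts

# One illustrative segment: water-like fluid, 1 ul/s through a
# 1 mm x 100 um channel, 5 adhered particles observed
tau = wall_shear_stress(1e-3, 1e-9, 1e-3, 1e-4)
shear_map = shear_adhesion_map([(1e-9, 1e-3, 1e-4, 5)], 1e-3,
                               bins=[(0.0, 1.0), (1.0, 10.0)])
```

Because the network geometry already spans the physiological shear range, every bin of the map is populated from the same perfusion run rather than from repeated experiments at different flow rates.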
Bioengineering, Issue 87, particle, adhesion, shear, microfluidics, vasculature, networks
In Vivo 2-Photon Calcium Imaging in Layer 2/3 of Mice
Institutions: University of California, Los Angeles.
To understand the network dynamics of microcircuits in the neocortex, it is essential to simultaneously record the activity of a large number of neurons. In vivo two-photon calcium imaging is the only method that allows one to record the activity of a dense neuronal population with single-cell resolution. The method consists of implanting a cranial imaging window, injecting a fluorescent calcium indicator dye that can be taken up by large numbers of neurons, and finally recording the activity of neurons with time-lapse calcium imaging using an in vivo two-photon microscope. Co-injection of astrocyte-specific dyes allows one to differentiate neurons from astrocytes. The technique can be performed in mice expressing fluorescent molecules in specific subpopulations of neurons to better understand the network interactions of different groups of cells.
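The recorded time-lapse movies are conventionally reported as ΔF/F traces, normalizing each cell's fluorescence to a baseline window. A minimal sketch of that normalization (the baseline choice is illustrative; in practice it must avoid frames containing transients):

```python
def delta_f_over_f(trace, baseline_frames):
    """Convert a raw fluorescence trace to dF/F.

    F0 is the mean of the first baseline_frames samples; each point is
    then (F - F0) / F0, so a calcium transient appears as a positive peak.
    """
    f0 = sum(trace[:baseline_frames]) / baseline_frames
    return [(f - f0) / f0 for f in trace]

# Illustrative trace: baseline around 100 counts, one transient to 150
dff = delta_f_over_f([100, 100, 100, 150, 100], baseline_frames=3)
```

The same per-cell traces, computed for every region of interest in the field of view, are what allow population activity to be compared across neurons and across the astrocyte-excluded subset.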
Neuroscience, Issue 13, 2-photon, two-photon, GFP mice, craniotomy, spine dynamics, cranial window
Brain Imaging Investigation of the Neural Correlates of Observing Virtual Social Interactions
Institutions: University of Alberta, University of Illinois, University of Alberta, University of Alberta, University of Alberta, University of Illinois at Urbana-Champaign, University of Illinois at Urbana-Champaign.
The ability to gauge social interactions is crucial in the assessment of others’ intentions. Factors such as facial expressions and body language affect our decisions in personal and professional life alike [1]. These "friend or foe" judgements are often based on first impressions, which in turn may affect our decisions to "approach or avoid". Previous studies investigating the neural correlates of social cognition tended to use static facial stimuli [2]. Here, we illustrate an experimental design in which whole-body animated characters were used in conjunction with functional magnetic resonance imaging (fMRI) recordings. Fifteen participants were presented with short movie-clips of guest-host interactions in a business setting while fMRI data were recorded; at the end of each movie, participants also provided ratings of the host behaviour. This design more closely mimics real-life situations, and hence may contribute to a better understanding of the neural mechanisms of social interactions in healthy behaviour, and to gaining insight into possible causes of deficits in social behaviour in clinical conditions such as social anxiety and autism [3].
Neuroscience, Issue 53, Social Perception, Social Knowledge, Social Cognition Network, Non-Verbal Communication, Decision-Making, Event-Related fMRI
Plasma Lithography Surface Patterning for Creation of Cell Networks
Institutions: University of Arizona , University of Arizona .
Systematic manipulation of a cell microenvironment with micro- and nanoscale resolution is often required for deciphering various cellular and molecular phenomena. To address this requirement, we have developed a plasma lithography technique to manipulate the cellular microenvironment by creating a patterned surface with feature sizes ranging from 100 nm to millimeters. The goal of this technique is to be able to study, in a controlled way, the behaviors of individual cells as well as groups of cells and their interactions.
This plasma lithography method is based on selectively modifying the surface chemistry of a substrate by shielding the contact of low-temperature plasma with a physical mold. This selective shielding leaves a chemical pattern which can guide cell attachment and movement. This pattern, or surface template, can then be used to create networks of cells whose structure can mimic that found in nature and produces a controllable environment for experimental investigations. The technique is well suited to studying biological phenomena as it produces stable surface patterns on transparent polymeric substrates in a biocompatible manner. The surface patterns last for weeks to months and can thus guide interactions with cells over long periods, which facilitates the study of long-term cellular processes, such as differentiation and adaptation. The modification to the surface is primarily chemical in nature and thus does not introduce topographical or physical interference into the interpretation of results. It also does not involve any harsh or toxic substances to achieve patterning and is compatible with tissue culture. Furthermore, it can be applied to modify various types of polymeric substrates, which, due to the ability to tune their properties, are ideal for and widely used in biological applications. The achievable resolution is also beneficial, as isolation of specific processes such as migration, adhesion, or binding allows for discrete, clear observations at the single- to multicell level.
This method has been employed to form diverse networks of different cell types for investigations involving migration, signaling, tissue formation, and the behavior and interactions of neurons arranged in a network.
Bioengineering, Issue 52, Cell Network, Surface Patterning, Self-Organization, Developmental Biology, Tissue Engineering, Nanopattern, Micropattern, Self-Assembly, Cell Guidance, Neuron
A Protocol for Computer-Based Protein Structure and Function Prediction
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, and knowledge of their structure and function is required to understand their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that remain experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All predictions are tagged with a confidence score that indicates how accurate they are likely to be in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. Such structural information can be collected by users from experimental evidence or biological insight with the aim of improving the quality of the I-TASSER predictions. The server was ranked among the best programs for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Institutions: University of Toronto, University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9,10.
Here, we present the key steps required to perform a quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9,12,13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics. After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14.
GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2, as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
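The comparison described above (double-mutant fitness versus what the two single mutants predict) is commonly formalized as a multiplicative model. The sketch below is a generic illustration of that scoring idea, not the authors' in-house image processing system; the fitness values are hypothetical numbers normalized to wild type.

```python
import numpy as np

def gi_scores(w_double, w_single_a, w_single_b):
    """Multiplicative-model genetic interaction score:
    epsilon = W_ab - W_a * W_b.
    epsilon << 0 : aggravating (synthetic sick/lethal)
    epsilon >> 0 : alleviating (e.g. same pathway or complex)."""
    w_ab = np.asarray(w_double, dtype=float)
    w_a = np.asarray(w_single_a, dtype=float)
    w_b = np.asarray(w_single_b, dtype=float)
    return w_ab - w_a * w_b

# Hypothetical colony-size-derived fitness values (wild type = 1.0)
eps = gi_scores([0.20, 0.85], [0.9, 0.9], [0.8, 0.9])
# eps[0] = 0.20 - 0.72 = -0.52 -> aggravating
# eps[1] = 0.85 - 0.81 =  0.04 -> near neutral / weakly alleviating
```

In practice, significance thresholds on epsilon are calibrated from the distribution of scores across the whole array rather than from single values.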
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Applications of EEG Neuroimaging Data: Event-related Potentials, Spectral Power, and Multiscale Entropy
When considering human neuroimaging data, an appreciation of signal variability represents a fundamental innovation in the way we think about brain signals. Typically, researchers represent the brain's response as the mean across repeated experimental trials and disregard signal fluctuations over time as "noise". However, it is becoming clear that brain signal variability conveys meaningful functional information about neural network dynamics. This article describes the novel method of multiscale entropy (MSE) for quantifying brain signal variability. MSE may be particularly informative of neural network dynamics because it shows timescale dependence and sensitivity to both linear and nonlinear dynamics in the data.
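The coarse-graining and entropy computation behind MSE can be sketched in a few lines of Python. This is a generic Costa-style implementation, not the authors' analysis code; the parameters m = 2 and r = 0.15 × SD are conventional defaults assumed here, not values taken from the article.

```python
import numpy as np

def sample_entropy(x, m, r_abs):
    """SampEn(m, r): -ln(A/B), where B counts template pairs of length m
    within tolerance r_abs (Chebyshev distance) and A does the same for
    length m + 1; self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    def pair_count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d < r_abs))
        return c
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def mse(x, scales=range(1, 6), m=2, r=0.15):
    """Multiscale entropy: SampEn of non-overlapping coarse-grained
    averages, with the tolerance fixed from the original series' SD."""
    x = np.asarray(x, dtype=float)
    r_abs = r * x.std()
    out = []
    for tau in scales:
        n = len(x) // tau
        cg = x[:n * tau].reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(cg, m, r_abs))
    return out

# White noise: entropy falls with scale, because coarse-graining
# averages away variance relative to the fixed tolerance
np.random.seed(0)
vals = mse(np.random.randn(1200), scales=[1, 4])
```

For structured signals such as EEG, the shape of the resulting entropy-versus-scale curve, rather than any single value, carries the information about network dynamics.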
Neuroscience, Issue 76, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Electroencephalography, EEG, electroencephalogram, Multiscale entropy, sample entropy, MEG, neuroimaging, variability, noise, timescale, non-linear, brain signal, information theory, brain, imaging
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
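The preprocessing chain named above (logarithmic conversion, removal of global mean scalar effects, PCA into covariance patterns and subject scores) can be expressed compactly in NumPy. This is a schematic sketch of the general SSM idea, not the authors' in-house software; the data here are random placeholders standing in for voxel intensities.

```python
import numpy as np

def ssm_patterns(images, n_components=5):
    """Schematic SSM: rows of `images` are subjects, columns are voxels.
    Log-transform, remove each subject's mean (global scalar effect) and
    the group mean voxel profile, then PCA via SVD."""
    logd = np.log(images)
    logd = logd - logd.mean(axis=1, keepdims=True)  # subject-wise global mean
    srp = logd - logd.mean(axis=0, keepdims=True)   # group mean voxel profile
    u, s, vt = np.linalg.svd(srp, full_matrices=False)
    patterns = vt[:n_components]        # GIS-like voxel covariance patterns
    scores = srp @ patterns.T           # subject expression scores
    var_explained = (s ** 2 / np.sum(s ** 2))[:n_components]
    return patterns, scores, var_explained

# Placeholder "image" data: 10 subjects x 60 voxels of positive values
rng = np.random.default_rng(1)
patterns, scores, ve = ssm_patterns(rng.uniform(0.5, 2.0, (10, 60)), 3)
```

In the actual methodology, the discriminating combination of components and the subsequent logistic regression on subject scores are additional steps beyond this decomposition.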
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired three-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design, including the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Institutions: University of Rochester, University of Rochester, University of Rochester Medical Center.
One of the main benefits of using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young's modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from the resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. the primary amine of lysine, the primary alcohol of serine, the secondary alcohol of threonine, and the phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton nuclear magnetic resonance spectroscopy (1H-NMR) and matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique that can be used to further tune macromer functionality and the resulting hydrogel's physical and chemical properties. Use of the synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Many researchers, across incredibly diverse fields, are applying phylogenetics to their research questions. However, many of these researchers are new to the topic, and this presents inherent challenges. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it provides the reader with practical starting information on key software applications commonly utilized by phylogeneticists. The vision for this article is that it could serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
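As a toy companion to the pipeline above, the sketch below infers a tree from a hypothetical alignment using Jukes-Cantor distances and UPGMA (average-linkage clustering via SciPy). The maximum likelihood and Bayesian approaches covered in the article require dedicated software; this distance-based example only illustrates the underlying idea of building trees from sequence divergence. The taxon names and sequences are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import average, to_tree
from scipy.spatial.distance import squareform

def jc_distance(seq1, seq2):
    """Jukes-Cantor distance between two aligned DNA sequences:
    d = -(3/4) * ln(1 - 4p/3), p = proportion of differing sites."""
    assert len(seq1) == len(seq2)
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    return -0.75 * np.log(1 - 4 * p / 3)

# Hypothetical pre-aligned sequences for four made-up taxa
aln = {"taxA": "ACGTACGTACGTACGTACGT",
       "taxB": "ACGTACGAACGTACGTACGT",
       "taxC": "ACGAACGAACGTACCTACGT",
       "taxD": "TCGAACGAACTTACCTACGA"}
names = list(aln)
n = len(names)
dm = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dm[i, j] = dm[j, i] = jc_distance(aln[names[i]], aln[names[j]])

# UPGMA is average-linkage hierarchical clustering on the distances
tree = to_tree(average(squareform(dm)))
```

Real analyses replace each piece with dedicated tools: a similarity search to gather sequences, an alignment program, model selection, and an ML or Bayesian tree-building package.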
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Assessment of Social Cognition in Non-human Primates Using a Network of Computerized Automated Learning Device (ALDM) Test Systems
Institutions: Aix-Marseille University.
Fagot & Paleressompoulle1 and Fagot & Bonte2 have published an automated learning device (ALDM) for the study of the cognitive abilities of monkeys maintained in semi-free-ranging conditions. Data accumulated during the last five years have consistently demonstrated the efficiency of this protocol for investigating individual/physical cognition in monkeys, and have further shown that this procedure reduces stress levels during animal testing3. This paper demonstrates that networks of ALDM can also be used to investigate different facets of social cognition and in-group expressed behaviors in monkeys, and describes three illustrative protocols developed for that purpose. The first study demonstrates how ethological assessments of social behavior and computerized assessments of cognitive performance can be integrated to investigate the effects of socially exhibited moods on the cognitive performance of individuals. The second study shows that batteries of ALDM running in parallel can provide unique information on the influence of the presence of others on task performance. Finally, the last study shows that networks of ALDM test units can also be used to study issues related to social transmission and cultural evolution. Together, these three studies clearly demonstrate that ALDM testing is a highly promising experimental tool for bridging the gap in the animal literature between research on individual cognition and research on social cognition.
Behavior, Issue 99, Baboon, automated learning device, cultural transmission, emotion, social facilitation, cognition, operant conditioning.