Researchers across remarkably diverse fields are applying phylogenetics to their research questions. However, many are new to the topic, which presents inherent difficulties. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools, via both online interfaces and local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it provides the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article could serve both as a practical training tool for researchers embarking on phylogenetic studies and as an educational resource that could be incorporated into a classroom or teaching lab.
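As a toy illustration of the tree-building end of such a pipeline (the article itself recommends maximum likelihood and Bayesian criteria; the far simpler distance-based UPGMA is used here only as a sketch), the following computes pairwise p-distances from a tiny alignment and clusters them. The sequence names and sequences are invented for the example.

```python
# Toy distance-based phylogeny: p-distances from an alignment, then UPGMA.
# Sequences and taxon names are made up for illustration.

def p_distance(a, b):
    """Fraction of aligned positions that differ (gap positions skipped)."""
    pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
    return sum(x != y for x, y in pairs) / len(pairs)

def upgma(names, dist):
    """Return a Newick string built by naive UPGMA clustering."""
    clusters = {n: (n, 1) for n in names}          # label -> (newick, size)
    d = {frozenset(p): dist[p] for p in dist}
    while len(clusters) > 1:
        a, b = min(((x, y) for x in clusters for y in clusters if x < y),
                   key=lambda p: d[frozenset(p)])
        na, sa = clusters.pop(a)
        nb, sb = clusters.pop(b)
        new = a + b
        for c in clusters:                          # size-weighted average
            d[frozenset((new, c))] = (sa * d[frozenset((a, c))]
                                      + sb * d[frozenset((b, c))]) / (sa + sb)
        clusters[new] = ('(%s,%s)' % (na, nb), sa + sb)
    return next(iter(clusters.values()))[0] + ';'

seqs = {'A': 'ACGTACGT', 'B': 'ACGTACGA', 'C': 'TCGAACGA'}
names = sorted(seqs)
dist = {(x, y): p_distance(seqs[x], seqs[y])
        for x in names for y in names if x < y}
tree = upgma(names, dist)
print(tree)  # A and B, differing at one site, cluster first
```

Real analyses would replace the p-distance with a model-corrected distance (or skip distances entirely in favor of likelihood-based inference), but the clustering logic above is the same shape.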
Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method
Institutions: Yale University.
Single particle electron microscopy (EM) reconstruction has recently become a popular tool for obtaining the three-dimensional (3D) structure of large macromolecular complexes. Compared to X-ray crystallography, it has some unique advantages. First, single particle EM reconstruction does not require crystallization of the protein sample, which is the bottleneck in X-ray crystallography, especially for large macromolecular complexes. Second, it does not need large amounts of protein. Compared with the milligrams of protein necessary for crystallization, single particle EM reconstruction needs only several microliters of protein solution at nanomolar concentrations, using the negative staining EM method. However, with the exception of a few macromolecular assemblies with high symmetry, single particle EM is limited to relatively low resolution (worse than 1 nm) for many specimens, especially those without symmetry. The technique is also limited by the size of the molecules under study, generally a minimum of about 100 kDa for negatively stained specimens and 300 kDa for frozen-hydrated specimens.
For a new sample of unknown structure, we generally use a heavy metal solution to embed the molecules by negative staining. The specimen is then examined in a transmission electron microscope to take two-dimensional (2D) micrographs of the molecules. Ideally, the protein molecules have a homogeneous 3D structure but exhibit different orientations in the micrographs. These micrographs are digitized and processed in computers as "single particles". Using two-dimensional alignment and classification techniques, homogeneous molecules in the same views are clustered into classes. Their averages enhance the signal of the molecule's 2D shapes. After assigning the proper relative orientations (Euler angles) to the particles, the 2D particle images can be reconstructed into a 3D virtual volume.
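The signal gain from class averaging can be illustrated with a toy one-dimensional example: averaging many noisy copies of the same "projection" suppresses noise roughly as 1/√N. The signal values and noise level below are invented (real class averages are 2D images, not 1D traces).

```python
import random

random.seed(0)

# A made-up 1D "projection" of a particle.
signal = [0, 0, 3, 5, 3, 0, 0]

def noisy_copy(sig, sigma=2.0):
    """One simulated micrograph view: signal plus Gaussian noise."""
    return [s + random.gauss(0, sigma) for s in sig]

def class_average(copies):
    """Pixel-wise mean of aligned copies, as in a 2D class average."""
    n = len(copies)
    return [sum(vals) / n for vals in zip(*copies)]

def rms_error(est, truth):
    return (sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)) ** 0.5

one = noisy_copy(signal)
avg = class_average([noisy_copy(signal) for _ in range(400)])

# The average is far closer to the true signal than any single copy.
print(rms_error(one, signal), rms_error(avg, signal))
```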
In single particle 3D reconstruction, an essential step is to correctly assign the orientation of each single particle. There are several methods to assign the view for each particle, including the angular reconstitution1 and random conical tilt (RCT)2 methods. In this protocol, we describe our practice for obtaining the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our protocol for electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample in a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs, which are used later for processing and for obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method3.
Structural Biology, Issue 49, Electron microscopy, single particle three-dimensional reconstruction, exosome complex, negative staining
Using plusTipTracker Software to Measure Microtubule Dynamics in Xenopus laevis Growth Cones
Institutions: Boston College.
Microtubule (MT) plus-end-tracking proteins (+TIPs) localize to the growing plus-ends of MTs and regulate MT dynamics1,2. One of the best-known and most widely utilized +TIPs for analyzing MT dynamics is the end-binding protein EB1, which binds all growing MT plus-ends and thus serves as a marker for MT polymerization1. Many studies of EB1 behavior within growth cones have used time-consuming and biased computer-assisted, hand-tracking methods to analyze individual MTs1-3. Our approach is to quantify global parameters of MT dynamics using the software package plusTipTracker4, following the acquisition of high-resolution, live images of tagged EB1 in cultured embryonic growth cones5. This software is a MATLAB-based, open-source, user-friendly package that combines automated detection, tracking, visualization, and analysis for movies of fluorescently labeled +TIPs. Here, we present the protocol for using plusTipTracker for the analysis of fluorescently labeled +TIP comets in cultured Xenopus laevis growth cones. However, this software can also be used to characterize MT dynamics in various cell types6-8.
Molecular Biology, Issue 91, plusTipTracker, microtubule plus-end-tracking proteins, EB1, growth cone, Xenopus laevis, live cell imaging analysis, microtubule dynamics
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, is the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and their degree of collaboration greatly affects the quality of the data4.
Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, is not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analyses of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, that could be useful in public health studies such as infectious disease transmission modeling.
The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5 involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius. Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping6 and density volume rendering7. We also include other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical and visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
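Two of the steps above, activity radius and movement segmentation, reduce to simple geometry on timestamped fixes. The sketch below uses a hypothetical (t, x, y) track in seconds and meters and a speed threshold for a stop/move split; the actual package performs indoor/outdoor segmentation on GPS trajectories inside ArcGIS, which this does not reproduce.

```python
import math

# Hypothetical (t, x, y) fixes: seconds, meters.
track = [(0, 0.0, 0.0), (10, 5.0, 0.0), (20, 10.0, 0.0),
         (30, 10.5, 0.2), (40, 10.4, 0.1), (50, 15.0, 3.0)]

def activity_radius(track):
    """Maximum distance of any fix from the track centroid."""
    xs = [p[1] for p in track]
    ys = [p[2] for p in track]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return max(math.hypot(x - cx, y - cy) for _, x, y in track)

def segment_by_speed(track, stop_speed=0.1):
    """Label each step 'move' or 'stop' by its average speed (m/s)."""
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        v = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        labels.append('move' if v > stop_speed else 'stop')
    return labels

print(round(activity_radius(track), 2))
print(segment_by_speed(track))  # the middle fixes form a stationary episode
```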
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
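The behavioral switch cost described above is simply the difference in mean reaction time between switch and repeat trials. A minimal sketch, with invented reaction times:

```python
# Hypothetical reaction times (ms) from a card-sort session.
trials = [
    {'type': 'repeat', 'rt': 620}, {'type': 'repeat', 'rt': 600},
    {'type': 'switch', 'rt': 710}, {'type': 'repeat', 'rt': 590},
    {'type': 'switch', 'rt': 690}, {'type': 'repeat', 'rt': 610},
]

def mean_rt(trials, kind):
    rts = [t['rt'] for t in trials if t['type'] == kind]
    return sum(rts) / len(rts)

# Positive cost: responses are slower when the sorting rule changes.
switch_cost = mean_rt(trials, 'switch') - mean_rt(trials, 'repeat')
print(switch_cost)  # → 95.0
```

A developmental analysis would compute this per participant and test whether the cost shrinks with age; error-rate costs are computed the same way.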
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
Rapid Analysis and Exploration of Fluorescence Microscopy Images
Institutions: UT Southwestern Medical Center, Princeton University.
Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard.
Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens.
Basic Protocol, Issue 85, PhenoRipper, fluorescence microscopy, image analysis, High-content analysis, high-throughput screening, Open-source, Phenotype
Using an Automated 3D-tracking System to Record Individual and Shoals of Adult Zebrafish
Like many aquatic animals, zebrafish (Danio rerio) move in 3D space. It is thus preferable to use a 3D recording system to study their behavior. The automated video tracking system presented here accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced by the transition of light from water to air. With this system it is possible to record both individual adult zebrafish and groups. Before use, the system has to be calibrated. The system consists of three modules: Recording, Path Reconstruction, and Data Processing. Step-by-step protocols for calibration and for using the three modules are presented. Depending on the experimental setup, the system can be used for testing neophobia, white aversion, social cohesion, motor impairments, novel object exploration, etc. It is especially promising as a first-step tool to study the effects of drugs or mutations on basic behavioral patterns. The system provides information about the vertical and horizontal distribution of the zebrafish and about the xyz-components of kinematic parameters (such as locomotion, velocity, acceleration, and turning angle), and it provides the data necessary to calculate parameters of social cohesion when testing shoals.
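The xyz kinematic parameters mentioned above follow from the reconstructed path by finite differences. A sketch with an invented (t, x, y, z) path; the system's own Data Processing module is not reproduced here.

```python
import math

# Hypothetical reconstructed 3D path: (t, x, y, z) in seconds and cm.
path = [(0.0, 0, 0, 0), (0.5, 1, 0, 0), (1.0, 2, 1, 0), (1.5, 2, 2, 1)]

def velocities(path):
    """Per-step 3D velocity vectors (cm/s) by finite differences."""
    out = []
    for (t0, *p0), (t1, *p1) in zip(path, path[1:]):
        dt = t1 - t0
        out.append(tuple((b - a) / dt for a, b in zip(p0, p1)))
    return out

def speed(v):
    return math.sqrt(sum(c * c for c in v))

def turning_angle(v0, v1):
    """Angle (degrees) between successive velocity vectors."""
    dot = sum(a * b for a, b in zip(v0, v1))
    cosang = dot / (speed(v0) * speed(v1))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

vs = velocities(path)
print([round(speed(v), 2) for v in vs])
print([round(turning_angle(a, b), 1) for a, b in zip(vs, vs[1:])])
```

Acceleration is the same difference applied once more, to the velocity vectors; shoal-cohesion measures would add inter-individual distances across simultaneously tracked fish.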
Behavior, Issue 82, neuroscience, Zebrafish, Danio rerio, anxiety, Shoaling, Pharmacology, 3D-tracking, MK801
Network Analysis of the Default Mode Network Using Functional Connectivity MRI in Temporal Lobe Epilepsy
Institutions: Baylor College of Medicine, Michael E. DeBakey VA Medical Center, University of California, Los Angeles.
Functional connectivity MRI (fcMRI) is an fMRI method that examines the connectivity of different brain areas based on the correlation of BOLD signal fluctuations over time. Temporal Lobe Epilepsy (TLE) is the most common type of adult epilepsy and involves multiple brain networks. The default mode network (DMN) is involved in conscious, resting state cognition and is thought to be affected in TLE where seizures cause impairment of consciousness. The DMN in epilepsy was examined using seed based fcMRI. The anterior and posterior hubs of the DMN were used as seeds in this analysis. The results show a disconnection between the anterior and posterior hubs of the DMN in TLE during the basal state. In addition, increased DMN connectivity to other brain regions in left TLE along with decreased connectivity in right TLE is revealed. The analysis demonstrates how seed-based fcMRI can be used to probe cerebral networks in brain disorders such as TLE.
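At its core, seed-based fcMRI correlates the seed region's BOLD time series with that of every other region or voxel. A toy sketch with invented time series (real analyses add preprocessing, nuisance regression, and statistical thresholding):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy)

# Hypothetical BOLD time series: a seed and two target regions.
seed = [1.0, 2.0, 1.5, 3.0, 2.5, 3.5]
coupled = [1.1, 2.2, 1.4, 2.9, 2.6, 3.4]     # tracks the seed
uncoupled = [2.0, 1.0, 2.0, 1.0, 2.0, 1.0]   # unrelated oscillation

# High r marks functional connectivity with the seed; low r, disconnection.
print(round(pearson(seed, coupled), 2))
print(round(pearson(seed, uncoupled), 2))
```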
Medicine, Issue 90, Default Mode Network (DMN), Temporal Lobe Epilepsy (TLE), fMRI, MRI, functional connectivity MRI (fcMRI), blood oxygenation level dependent (BOLD)
A Strategy for Sensitive, Large Scale Quantitative Metabolomics
Institutions: Cornell University.
Metabolite profiling has been a valuable asset in the study of metabolism in health and disease. However, current platforms have different limiting factors, such as labor intensive sample preparations, low detection limits, slow scan speeds, intensive method optimization for each metabolite, and the inability to measure both positively and negatively charged ions in single experiments. Therefore, a novel metabolomics protocol could advance metabolomics studies. Amide-based hydrophilic chromatography enables polar metabolite analysis without any chemical derivatization. High resolution MS using the Q-Exactive (QE-MS) has improved ion optics, increased scan speeds (256 msec at resolution 70,000), and has the capability of carrying out positive/negative switching. Using a cold methanol extraction strategy, and coupling an amide column with QE-MS enables robust detection of 168 targeted polar metabolites and thousands of additional features simultaneously. Data processing is carried out with commercially available software in a highly efficient way, and unknown features extracted from the mass spectra can be queried in databases.
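Targeted detection in such a workflow amounts to matching each observed high-resolution m/z value against a reference list within a tight tolerance (parts per million). A sketch with a tiny hypothetical target list; the [M+H]+ values are approximate and only illustrative.

```python
# Approximate [M+H]+ m/z values for a few polar metabolites (illustrative).
targets = {'glutamate': 148.0604, 'citrate': 193.0343, 'ATP': 508.0030}

def match_peak(mz, targets, ppm=5.0):
    """Names of targets whose reference m/z lies within `ppm` of `mz`."""
    hits = []
    for name, ref in targets.items():
        if abs(mz - ref) / ref * 1e6 <= ppm:
            hits.append(name)
    return hits

print(match_peak(148.0606, targets))   # ~1.4 ppm from glutamate: a hit
print(match_peak(150.0000, targets))   # no target within tolerance
```

Real pipelines do this per retention-time window across both polarities; the positive/negative switching mentioned above simply doubles the target lists consulted per cycle.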
Chemistry, Issue 87, high-resolution mass spectrometry, metabolomics, positive/negative switching, low mass calibration, Orbitrap
Genomic MRI - a Public Resource for Studying Sequence Patterns within Genomic DNA
Institutions: University of Toledo Health Science Campus.
Non-coding genomic regions in complex eukaryotes, including intergenic areas, introns, and untranslated segments of exons, are profoundly non-random in their nucleotide composition and consist of a complex mosaic of sequence patterns. These patterns include so-called Mid-Range Inhomogeneity (MRI) regions -- sequences 30-10,000 nucleotides in length that are enriched in a particular base or combination of bases (e.g. (G+T)-rich, purine-rich, etc.). MRI regions are associated with unusual (non-B-form) DNA structures that are often involved in the regulation of gene expression, recombination, and other genetic processes (Fedorova & Fedorov 2010). The existence of a strong fixation bias within MRI regions against mutations that tend to reduce their sequence inhomogeneity further supports the functionality and importance of these genomic sequences (Prakash et al.).
Here we demonstrate a freely available Internet resource -- the Genomic MRI program package -- designed for computational analysis of genomic sequences in order to find and characterize various MRI patterns within them (Bechtel et al. 2008). This package also allows the generation of randomized sequences with various properties and levels of correspondence to the natural input DNA sequences. The main goal of this resource is to facilitate examination of the vast regions of non-coding DNA that are still scarcely investigated and await thorough exploration and recognition.
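Detecting an MRI region is essentially a sliding-window scan for base-composition enrichment. A toy sketch for (G+T)-rich windows; the window size and threshold here are chosen for illustration and are not the Genomic MRI defaults.

```python
# Sliding-window scan for (G+T)-rich Mid-Range Inhomogeneity candidates.

def gt_rich_windows(seq, win=30, threshold=0.75):
    """Return (start, fraction) for windows whose G+T content >= threshold."""
    hits = []
    for i in range(len(seq) - win + 1):
        window = seq[i:i + win]
        frac = sum(base in 'GT' for base in window) / win
        if frac >= threshold:
            hits.append((i, round(frac, 2)))
    return hits

# A synthetic sequence: a (G+T)-rich core flanked by (A+C)-rich runs.
seq = 'ACAC' * 10 + 'GTGTGTTGGT' * 3 + 'ACAC' * 10
hits = gt_rich_windows(seq)
print(hits[0], hits[-1])  # first and last windows crossing the threshold
```

Adjacent qualifying windows would then be merged into a single candidate region and compared against randomized sequences to assess significance.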
Genetics, Issue 51, bioinformatics, computational biology, genomics, non-randomness, signals, gene regulation, DNA conformation
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
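The ~10-30 nm localization precision quoted above comes from pooling many photons per molecule: the error of a centroid estimate shrinks roughly as the PSF width divided by the square root of the photon count. A 1D toy sketch with invented numbers (real localization uses 2D Gaussian fitting with background and pixelation corrections):

```python
import random

random.seed(1)

# Photons from one fluorophore, blurred by a PSF of width ~125 nm.
def photon_positions(x_true, n_photons, sigma=125.0):
    return [random.gauss(x_true, sigma) for _ in range(n_photons)]

def localize(photons):
    """Estimate the emitter position as the photon centroid (1D sketch)."""
    return sum(photons) / len(photons)

x_true = 500.0  # nm
errs = {}
for n in (10, 1000):
    trials = [abs(localize(photon_positions(x_true, n)) - x_true)
              for _ in range(200)]
    errs[n] = sum(trials) / len(trials)

# Mean error falls roughly as sigma / sqrt(n_photons).
print(errs)
```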
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Determination of Mammalian Cell Counts, Cell Size and Cell Health Using the Moxi Z Mini Automated Cell Counter
Institutions: Orflo Technologies, University of Utah.
Particle and cell counting is used for a variety of applications, including routine cell culture, hematological analysis, and industrial controls1-5. A critical breakthrough in cell/particle counting technologies was the development of the Coulter technique by Wallace Coulter over 50 years ago. The technique involves applying an electric field across a micron-sized aperture and hydrodynamically focusing single particles through the aperture. The resulting occlusion of the aperture by a particle yields a measurable change in electric impedance that can be directly and precisely correlated to cell size/volume. The recognition of the approach as the benchmark in cell/particle counting stems from the extraordinary precision and accuracy of its particle sizing and counts, particularly as compared to manual and imaging-based technologies (accuracies on the order of 98% for Coulter counters versus 75-80% for manual and vision-based systems). This can be attributed to the fact that, unlike imaging-based approaches to cell counting, the Coulter technique makes a true three-dimensional (3D) measurement of cells/particles, which dramatically reduces count interference from debris and clustering by calculating precise volumetric information about the cells/particles. Overall, this provides a means of enumerating and sizing cells that is more accurate, less tedious, less time-consuming, and less subjective than other counting techniques6.
Despite the prominence of the Coulter technique in cell counting, its widespread use in routine biological studies has been hindered by the cost and size of traditional instruments. Although a less expensive Coulter-based instrument has been produced, it has limitations compared to its more expensive counterparts in the correction for "coincidence events," in which two or more cells pass through the aperture and are measured simultaneously. Another limitation of existing Coulter technologies is the lack of metrics on the overall health of cell samples. Consequently, additional techniques must often be used in conjunction with Coulter counting to assess cell viability. This extends experimental setup time and cost, since traditional methods of viability assessment require cell staining and/or the use of expensive and cumbersome equipment such as a flow cytometer.
The Moxi Z mini automated cell counter, described here, is an ultra-small benchtop instrument that combines the accuracy of the Coulter principle with a thin-film sensor technology to enable precise sizing and counting of particles ranging from 3-25 microns, depending on the cell counting cassette used. The Type M cassette can be used to count particles with average diameters of 4-25 microns (dynamic range 2-34 microns), and the Type S cassette can be used to count particles with average diameters of 3-20 microns (dynamic range 2-26 microns). Since the system uses a volumetric measurement method, the 4-25 micron range corresponds to a cell volume range of 34-8,180 fL and the 3-20 micron range corresponds to a cell volume range of 14-4,200 fL, which is relevant when non-spherical particles are being measured. To perform mammalian cell counts using the Moxi Z, the cells to be counted are first diluted with ORFLO or a similar diluent. A cell counting cassette is inserted into the instrument, and the sample is loaded into the port of the cassette. Thousands of cells are pulled single-file through a "Cell Sensing Zone" (CSZ) in the thin-film membrane over 8-15 seconds. Following the run, the instrument uses proprietary curve-fitting in conjunction with a proprietary software algorithm to provide coincidence-event correction along with an assessment of overall culture health, determined as the ratio of the number of cells in the population of interest to the total number of particles. The total particle counts include shrunken and broken-down dead cells, as well as other debris and contaminants. The results are presented in histogram format with an automatic curve fit, with gates that can be adjusted manually as needed.
Ultimately, the Moxi Z enables counting with a precision and accuracy comparable to a Coulter Z2, the current gold standard, while providing additional culture health information. Furthermore, it achieves these results in less time, with a smaller footprint, with significantly easier operation and maintenance, and at a fraction of the cost of comparable technologies.
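The diameter and volume ranges quoted for the cassettes are related by simple sphere geometry, and since 1 µm³ equals 1 fL no extra unit factor is needed. A quick check against the numbers above:

```python
import math

def volume_fl_from_diameter(d_um):
    """Sphere volume in femtoliters for a diameter in microns (1 um^3 == 1 fL)."""
    return math.pi / 6 * d_um ** 3

def diameter_um_from_volume(v_fl):
    """Equivalent spherical diameter in microns for a volume in fL."""
    return (6 * v_fl / math.pi) ** (1 / 3)

# The 4-25 micron Type M range should reproduce roughly 34-8,180 fL.
print(round(volume_fl_from_diameter(4)), round(volume_fl_from_diameter(25)))
```

This is also why the volume (not the diameter) is the honest readout for non-spherical particles: the impedance change measures displaced volume directly, and the diameter is only its spherical-equivalent restatement.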
Cellular Biology, Issue 64, Molecular Biology, cell counting, coulter counting, cell culture health assessment, particle sizing, mammalian cells, Moxi Z
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
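The sequence selection stage can be caricatured as energy minimization over sequence space. The toy below greedily lowers an invented additive per-site energy; the workbench itself solves a far richer integer-optimization problem with a detailed physical energy function, which this sketch does not attempt to reproduce.

```python
import random

AMINO = 'ACDEFGHIKLMNPQRSTVWY'

# Hypothetical per-position preference energies (lower = more stabilizing).
random.seed(42)
site_energy = [{aa: random.uniform(0, 1) for aa in AMINO} for _ in range(8)]

def energy(seq):
    """Additive 'potential energy' of a sequence under the toy model."""
    return sum(site_energy[i][aa] for i, aa in enumerate(seq))

def greedy_design(seq):
    """Lower the energy by single-residue substitutions until no move helps."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            best = min(AMINO, key=lambda aa: site_energy[i][aa])
            if site_energy[i][best] < site_energy[i][seq[i]]:
                seq[i] = best
                improved = True
    return ''.join(seq)

native = 'ACDEFGHI'
designed = greedy_design(native)
print(energy(native), energy(designed))  # designed sequence scores lower
```

Because this toy energy is independent per site, greedy substitution reaches the global minimum; with real pairwise interaction energies it would not, which is precisely why the full problem is posed as a global optimization.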
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD).
Although specific DMN functions remain under investigation, the network is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits its greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks such as working memory. The network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices, and medial temporal regions.
Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
The ITS2 Database
Institutions: University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. Because ITS2 research mainly focused on the highly variable ITS2 sequence, the marker was long confined to low-level phylogenetics. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8.
The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11. Following annotation by profile hidden Markov models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold12 (direct fold) results in a correct, four-helix conformation. If this is not the case, the structure is predicted by homology modeling13. In homology modeling, an already known secondary structure is transferred to another ITS2 sequence whose secondary structure could not be folded correctly by a direct fold.
The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14
search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16
for multiple sequence-structure alignment calculation and Neighbor Joining18
tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure.
In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
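The pipeline's final step is Neighbor Joining tree reconstruction. As an illustration of what that step does (this is our own minimal sketch, not the database's implementation; function name and Newick formatting are ours), the algorithm takes a pairwise distance matrix, such as one derived from a sequence-structure alignment, repeatedly joins the pair minimizing the Q criterion, and returns an unrooted tree in Newick format:

```python
def neighbor_joining(labels, dist):
    """Reconstruct an unrooted tree (Newick string) from a distance matrix.

    labels: list of taxon names; dist: symmetric matrix as list of lists.
    """
    nodes = list(labels)               # each active node is a Newick subtree
    D = [row[:] for row in dist]       # work on a copy
    while len(nodes) > 2:
        n = len(nodes)
        r = [sum(row) for row in D]    # net divergence of each node
        # Find the pair (i, j) minimizing the Q criterion.
        best, bi, bj = None, 0, 1
        for i in range(n):
            for j in range(i + 1, n):
                q = (n - 2) * D[i][j] - r[i] - r[j]
                if best is None or q < best:
                    best, bi, bj = q, i, j
        # Branch lengths from the joined pair to the new internal node.
        li = D[bi][bj] / 2 + (r[bi] - r[bj]) / (2 * (n - 2))
        lj = D[bi][bj] - li
        new = "(%s:%.3f,%s:%.3f)" % (nodes[bi], li, nodes[bj], lj)
        # Distances from the new node to every remaining node.
        du = [(D[bi][k] + D[bj][k] - D[bi][bj]) / 2
              for k in range(n) if k not in (bi, bj)]
        # Remove the joined pair (larger index first) and append the new node.
        for idx in sorted((bi, bj), reverse=True):
            nodes.pop(idx)
            D.pop(idx)
            for row in D:
                row.pop(idx)
        nodes.append(new)
        for k, row in enumerate(D):
            row.append(du[k])
        D.append(du + [0.0])
    # Join the last two nodes with the remaining distance.
    return "(%s:%.3f,%s);" % (nodes[0], D[0][1], nodes[1])
```

On a distance matrix that is additive on a tree with two cherries, the sketch recovers that topology exactly; on real ITS2 data the distances come from the sequence-structure alignment step upstream.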
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
Characterization of Electrode Materials for Lithium Ion and Sodium Ion Batteries Using Synchrotron Radiation Techniques
Institutions: Lawrence Berkeley National Laboratory, University of Illinois at Chicago, Stanford Synchrotron Radiation Lightsource, Haldor Topsøe A/S, PolyPlus Battery Company.
Intercalation compounds such as transition metal oxides or phosphates are the most commonly used electrode materials in Li-ion and Na-ion batteries. During insertion or removal of alkali metal ions, the redox states of transition metals in the compounds change and structural transformations such as phase transitions and/or lattice parameter increases or decreases occur. These behaviors in turn determine important characteristics of the batteries such as the potential profiles, rate capabilities, and cycle lives. The extremely bright and tunable X-rays produced by synchrotron radiation allow rapid acquisition of high-resolution data that provide information about these processes. Transformations in the bulk materials, such as phase transitions, can be directly observed using X-ray diffraction (XRD), while X-ray absorption spectroscopy (XAS) gives information about the local electronic and geometric structures (e.g. changes in redox states and bond lengths). In situ experiments carried out on operating cells are particularly useful because they allow direct correlation between the electrochemical and structural properties of the materials. These experiments are time-consuming and can be challenging to design due to the reactivity and air-sensitivity of the alkali metal anodes used in the half-cell configurations, and/or the possibility of signal interference from other cell components and hardware. For these reasons, it is appropriate in some cases to carry out ex situ experiments (e.g. on electrodes harvested from partially charged or cycled cells). Here, we present detailed protocols for the preparation of both ex situ and in situ samples for experiments involving synchrotron radiation and demonstrate how these experiments are done.
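The lattice-parameter changes tracked by XRD come from peak positions via Bragg's law, nλ = 2d sin θ. The snippet below is a generic illustration, not part of the authors' protocol; the wavelength used is the common laboratory Cu Kα line, whereas a synchrotron beamline would be tuned to its own wavelength:

```python
import math

def d_spacing(two_theta_deg, wavelength_angstrom, order=1):
    """Bragg's law, n*lambda = 2*d*sin(theta); returns d in angstroms."""
    theta = math.radians(two_theta_deg / 2)  # detectors scan 2-theta
    return order * wavelength_angstrom / (2 * math.sin(theta))

# Example: a peak at 2-theta = 26.5 deg with Cu K-alpha radiation (1.5406 A)
d = d_spacing(26.5, 1.5406)  # about 3.36 A
```

A shift of such a peak during charging or discharging maps directly onto a lattice expansion or contraction of the electrode material.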
Physics, Issue 81, X-Ray Absorption Spectroscopy, X-Ray Diffraction, inorganic chemistry, electric batteries (applications), energy storage, Electrode materials, Li-ion battery, Na-ion battery, X-ray Absorption Spectroscopy (XAS), in situ X-ray diffraction (XRD)
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
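The recasting procedure ultimately reduces each signal region to a counting experiment: given the acceptance, efficiency, luminosity, background expectation, and observed event count, one excludes signal yields that would have produced too many events. As a toy illustration only (real LHC analyses use the CLs method with systematic uncertainties, not this bare Poisson limit), a 95% CL upper limit on the signal yield can be sketched as:

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return math.exp(-mu) * sum(mu ** i / math.factorial(i) for i in range(k + 1))

def signal_upper_limit(n_obs, background, cl=0.95, step=0.001):
    """Smallest signal yield s with P(X <= n_obs | s + b) <= 1 - cl."""
    s = 0.0
    while poisson_cdf(n_obs, s + background) > 1 - cl:
        s += step
    return s

# With 0 events observed and no background, the classic limit is ln(20) ~ 3 events.
# Dividing the yield limit by (luminosity * acceptance * efficiency) turns it
# into a cross-section limit, which is where the published A*eps tables enter.
```

This is the arithmetic behind using the published acceptance and efficiency tables to recast a simplified-model limit into a limit on a concrete theory such as minimal supergravity.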
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Electric Cell-substrate Impedance Sensing for the Quantification of Endothelial Proliferation, Barrier Function, and Motility
Institutions: Institute for Cardiovascular Research, VU University Medical Center, Institute for Cardiovascular Research, VU University Medical Center.
Electric Cell-substrate Impedance Sensing (ECIS) is an in vitro impedance measuring system to quantify the behavior of cells within adherent cell layers. To this end, cells are grown in special culture chambers on top of opposing, circular gold electrodes. A constant small alternating current is applied between the electrodes and the potential across them is measured. The insulating properties of the cell membrane create a resistance to the electrical current flow, resulting in an increased electrical potential between the electrodes. Measuring cellular impedance in this manner allows the automated study of cell attachment, growth, morphology, function, and motility. Although the ECIS measurement itself is straightforward and easy to learn, the underlying theory is complex, and selection of the right settings and correct analysis and interpretation of the data are not self-evident. Yet a clear protocol describing the individual steps from experimental design to preparation, realization, and analysis of the experiment is not available. In this article the basic measurement principle as well as possible applications, experimental considerations, advantages, and limitations of the ECIS system are discussed. A guide is provided for the study of cell attachment, spreading, and proliferation; quantification of cell behavior in a confluent layer with regard to barrier function, cell motility, and quality of cell-cell and cell-substrate adhesions; and quantification of wound healing and cellular responses to vasoactive stimuli. Representative results are discussed based on human microvascular endothelial cells (MVEC) and human umbilical vein endothelial cells (HUVEC), but are applicable to all adherently growing cells.
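The quantity ECIS records is a complex impedance: the cell layer adds resistive (paracellular) and capacitive (membrane) contributions on top of the bare-electrode impedance. As a deliberately simplified, hypothetical illustration (real ECIS modelling uses the more detailed Giaever-Keese model with several fitted parameters), the frequency dependence of a series resistance-capacitance electrode model can be computed as:

```python
import math

def series_rc_impedance(freq_hz, r_ohm, c_farad):
    """Complex impedance Z = R - j/(omega*C) of a series RC element,
    a crude stand-in for solution resistance plus electrode/cell capacitance."""
    omega = 2 * math.pi * freq_hz
    return complex(r_ohm, -1.0 / (omega * c_farad))

# A confluent, tight monolayer shows up as a larger effective resistance,
# and hence a larger |Z| at the measurement frequency, than a bare electrode.
bare = abs(series_rc_impedance(4000, 500, 1e-8))
covered = abs(series_rc_impedance(4000, 2000, 1e-8))
```

At high frequencies the capacitive term vanishes and |Z| approaches R; choosing the measurement frequency therefore determines whether barrier resistance or membrane capacitance dominates the signal.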
Bioengineering, Issue 85, ECIS, Impedance Spectroscopy, Resistance, TEER, Endothelial Barrier, Cell Adhesions, Focal Adhesions, Proliferation, Migration, Motility, Wound Healing
Surface Renewal: An Advanced Micrometeorological Method for Measuring and Processing Field-Scale Energy Flux Density Data
Institutions: United States Department of Agriculture-Agricultural Research Service, University of California, Davis, University of Chile, University of California, Davis, URS Corporation Australia Pty. Ltd.
Advanced micrometeorological methods have become increasingly important in soil, crop, and environmental sciences. For many scientists without formal training in atmospheric science, these techniques are relatively inaccessible. Surface renewal and other flux measurement methods require an understanding of boundary layer meteorology and extensive training in instrumentation and multiple data management programs. To improve accessibility of these techniques, we describe the underlying theory of surface renewal measurements, demonstrate how to set up a field station for surface renewal with eddy covariance calibration, and utilize our open-source turnkey data logger program to perform flux data acquisition and processing. The new turnkey program returns to the user a simple data table with the corrected fluxes and quality control parameters, and eliminates the need for researchers to shuttle between multiple processing programs to obtain the final flux data. An example of data generated from these measurements demonstrates how crop water use is measured with this technique. The output information is useful to growers for making irrigation decisions in a variety of agricultural ecosystems. These stations are currently deployed in numerous field experiments by researchers in our group and the California Department of Water Resources in the following crops: rice, wine and raisin grape vineyards, alfalfa, almond, walnut, peach, lemon, avocado, and corn.
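Surface renewal stations report the sensible heat flux H; crop water use then follows from the surface energy balance, in which the latent heat flux LE = Rn − G − H is the residual of net radiation, ground heat flux, and sensible heat flux. A back-of-the-envelope version of this conversion (constants approximate; the authors' turnkey program applies calibrated, quality-controlled versions of these steps) from a daily mean LE to an evapotranspiration depth is:

```python
LAMBDA_V = 2.45e6          # latent heat of vaporization of water, J/kg (~20 C)
SECONDS_PER_DAY = 86400

def latent_heat_flux(rn, g, h):
    """Energy-balance residual LE = Rn - G - H, all fluxes in W/m^2."""
    return rn - g - h

def et_mm_per_day(le_w_m2):
    """Daily-mean latent heat flux (W/m^2) to evapotranspiration (mm/day);
    1 kg of evaporated water per m^2 corresponds to a 1 mm depth."""
    return le_w_m2 * SECONDS_PER_DAY / LAMBDA_V

le = latent_heat_flux(rn=180, g=20, h=60)   # daily means -> 100 W/m^2
et = et_mm_per_day(le)                      # about 3.5 mm/day
```

Numbers like these are what make the flux output directly usable for irrigation scheduling decisions.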
Environmental Sciences, Issue 82, Conservation of Natural Resources, Engineering, Agriculture, plants, energy balance, irrigated agriculture, flux data, evapotranspiration, agrometeorology
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
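The DoE workflow starts from an enumeration of candidate factor combinations, which dedicated software then prunes to an optimal subset. As a minimal illustration of that starting point (factor names and levels below are invented for the example, not taken from the study), a full-factorial design can be generated in a few lines:

```python
from itertools import product

# Hypothetical factors for a transient-expression run (not the study's actual levels)
factors = {
    "promoter": ["35S", "nos"],
    "leaf_age": ["young", "old"],
    "incubation_temp_C": [22, 25, 28],
}

def full_factorial(factors):
    """Every combination of factor levels, one dict per experimental run."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

runs = full_factorial(factors)   # 2 x 2 x 3 = 12 runs
```

DoE software replaces this exhaustive grid with a fractional or optimal subset and augments it step-wise, which is what keeps the experimental effort manageable as the number of factors grows.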
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Specimen Preparation, Imaging, and Analysis Protocols for Knife-edge Scanning Microscopy
Institutions: Texas A&M University, University of Illinois, Kettering University, 3Scan, Texas A&M University.
Major advances in high-throughput, high-resolution, 3D microscopy techniques have enabled the acquisition of large volumes of neuroanatomical data at submicrometer resolution. One of the first such instruments producing whole-brain-scale data is the Knife-Edge Scanning Microscope (KESM)7, 5, 9, developed and hosted in the authors' lab. KESM has been used to section and image whole mouse brains at submicrometer resolution, revealing the intricate details of the neuronal networks (Golgi)1, 4, 8, vascular networks (India ink)1, 4, and cell body distribution (Nissl)3. The use of KESM is not restricted to the mouse or to the brain: we have successfully imaged the octopus brain6, mouse lung, and rat brain, and we are currently working on whole zebrafish embryos. Data like these can greatly contribute to connectomics research10; to microcirculation and hemodynamic research; and to stereology research by providing an exact ground truth.
In this article, we will describe the pipeline, including specimen preparation (fixing, staining, and embedding), KESM configuration and setup, sectioning and imaging with the KESM, image processing, data preparation, and data visualization and analysis. The emphasis will be on specimen preparation and visualization/analysis of the obtained KESM data. We expect the detailed protocol presented in this article to help broaden access to KESM and increase its utilization.
Bioengineering, Issue 58, Physical sectioning, serial sectioning, light microscopy, brain imaging, microtome
Combining Transcranial Magnetic Stimulation and fMRI to Examine the Default Mode Network
Institutions: Beth Israel Deaconess Medical Center.
The default mode network is a group of brain regions that are active when an individual is not focused on the outside world and the brain is at "wakeful rest".1,2,3 It is thought that the default mode network corresponds to self-referential or "internal mentation".2,3 It has been hypothesized that, in humans, activity within the default mode network is correlated with certain pathologies: for instance, hyper-activation has been linked to schizophrenia4,5,6 and autism spectrum disorders7, whilst hypo-activation of the network has been linked to Alzheimer's and other neurodegenerative diseases8. As such, noninvasive modulation of this network may represent a potential therapeutic intervention for a number of neurological and psychiatric pathologies linked to abnormal network activation. One possible tool to effect this modulation is Transcranial Magnetic Stimulation (TMS): a noninvasive neurostimulatory and neuromodulatory technique that can transiently or lastingly modulate cortical excitability (either increasing or decreasing it) via the application of localized magnetic field pulses.9
In order to explore the default mode network's propensity towards and tolerance of modulation, we will be combining TMS (to the left inferior parietal lobe) with functional magnetic resonance imaging (fMRI). Through this article, we will examine the protocol and considerations necessary to successfully combine these two neuroscientific tools.
Neuroscience, Issue 46, Transcranial Magnetic Stimulation, rTMS, fMRI, Default Mode Network, functional connectivity, resting state
Using SCOPE to Identify Potential Regulatory Motifs in Coregulated Genes
Institutions: Dartmouth College.
SCOPE is an ensemble motif finder that uses three component algorithms in parallel to identify potential regulatory motifs by over-representation and motif position preference1. Each component algorithm is optimized to find a different kind of motif. By taking the best of these three approaches, SCOPE performs better than any single algorithm, even in the presence of noisy data1. In this article, we utilize a web version of SCOPE2 to examine genes that are involved in telomere maintenance. SCOPE has been incorporated into at least two other motif finding programs3,4 and has been used in other studies5-8.
The three algorithms that comprise SCOPE are BEAM9, which finds non-degenerate motifs (ACCGGT); PRISM10, which finds degenerate motifs (ASCGWT); and SPACER11, which finds longer bipartite motifs (ACCnnnnnnnnGGT). Each algorithm has been optimized to find its corresponding type of motif; together, they allow SCOPE to perform extremely well.
Once a gene set has been analyzed and candidate motifs identified, SCOPE can look for other genes that contain the motif which, when added to the original set, will improve the motif score. This can occur through over-representation or motif position preference. Working with partial gene sets that have biologically verified transcription factor binding sites, SCOPE was able to identify most of the rest of the genes also regulated by the given transcription factor.
Output from SCOPE shows candidate motifs, their significance, and other information both as a table and as a graphical motif map. FAQs and video tutorials are available at the SCOPE web site which also includes a "Sample Search" button that allows the user to perform a trial run.
SCOPE has a very friendly user interface that enables novice users to access the algorithm's full power without having to become an expert in the bioinformatics of motif finding. As input, SCOPE can take a list of genes or FASTA sequences. These can be entered in browser text fields or read from a file. The output from SCOPE contains a list of all identified motifs with their scores, number of occurrences, fraction of genes containing the motif, and the algorithm used to identify the motif. For each motif, result details include a consensus representation of the motif, a sequence logo, a position weight matrix, and a list of instances for every motif occurrence (with exact positions and "strand" indicated). Results are returned in a browser window and, optionally, by email. Previous papers describe the SCOPE algorithms in detail1,2,9-11.
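The core signal SCOPE exploits, over-representation, can be illustrated with a toy computation (our own simplification; SCOPE's actual scoring also models motif position preference, degenerate alphabets, and bipartite motifs): count how often an exact motif occurs in the gene set and ask how surprising that count is under a background rate.

```python
import math

def motif_count(seqs, motif):
    """Total (overlap-allowed) occurrences of an exact motif in a sequence set."""
    return sum(sum(1 for i in range(len(s) - len(motif) + 1)
                   if s[i:i + len(motif)] == motif)
               for s in seqs)

def overrepresentation_pvalue(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of at least k hits in
    n candidate positions if the motif occurs at background rate p."""
    return sum(math.comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
               for i in range(k, n + 1))
```

A low p-value flags a candidate regulatory motif; this is also the logic behind SCOPE's ability to grow a gene set, since adding a gene that contains the motif increases k and can sharpen the score.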
Genetics, Issue 51, gene regulation, computational biology, algorithm, promoter sequence motif
Layers of Symbiosis - Visualizing the Termite Hindgut Microbial Community
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter takes us for a nature walk through the diversity of life resident in the termite hindgut - a microenvironment containing 250 different species found nowhere else on Earth. Jared reveals that the symbiosis exhibited by this system is multi-layered and involves not only a relationship between the termite and its gut inhabitants, but also involves a complex web of symbiosis among the gut microbes themselves.
Microbiology, Issue 4, microbial community, symbiosis, hindgut