Genome sequencing projects have deciphered millions of protein sequences, whose structures and functions must be determined to improve our understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that remain experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All predictions are tagged with a confidence score that indicates how accurate they are expected to be in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distances and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be supplied by users on the basis of experimental evidence or biological insight, with the purpose of improving the quality of I-TASSER predictions. The server was ranked as one of the best programs for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Comprehensive Analysis of Transcription Dynamics from Brain Samples Following Behavioral Experience
Institutions: The Hebrew University of Jerusalem.
The encoding of experiences in the brain and the consolidation of long-term memories depend on gene transcription. Identifying the function of specific genes in encoding experience is one of the main objectives of molecular neuroscience. Furthermore, the functional association of defined genes with specific behaviors has implications for understanding the basis of neuropsychiatric disorders. Induction of robust transcription programs has been observed in the brains of mice following various behavioral manipulations. While some genetic elements are utilized recurrently following different behavioral manipulations and in different brain nuclei, transcriptional programs are overall unique to the inducing stimuli and the structure in which they are studied1,2.
In this publication, a protocol is described for robust and comprehensive transcriptional profiling from brain nuclei of mice in response to behavioral manipulation. The protocol is demonstrated in the context of an analysis of gene expression dynamics in the nucleus accumbens following acute cocaine experience. Subsequent to a defined in vivo experience, the target neural tissue is dissected; this is followed by RNA purification, reverse transcription, and the use of microfluidic arrays for comprehensive qPCR analysis of multiple target genes. This protocol is geared toward comprehensive analysis (addressing 50-500 genes) of limiting quantities of starting material, such as small brain samples or even single cells.
The protocol is most advantageous for parallel analysis of multiple samples (e.g.
single cells, dynamic analysis following pharmaceutical, viral or behavioral perturbations). However, the protocol could also serve for the characterization and quality assurance of samples prior to whole-genome studies by microarrays or RNAseq, as well as validation of data obtained from whole-genome studies.
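Once Ct values are exported from the microfluidic qPCR arrays, a common way to convert them into relative expression is the comparative ΔΔCt method. The sketch below is generic, not the authors' analysis code, and the gene roles and Ct values in the example are hypothetical:

```python
def delta_delta_ct(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Fold change of a target gene in a treated sample versus a control
    sample, normalized to a reference (housekeeping) gene, assuming ~100%
    amplification efficiency (the standard 2^-ddCt approximation)."""
    delta_ct_sample = ct_target - ct_target if False else ct_target - ct_reference  # normalize sample
    delta_ct_control = ct_target_ctrl - ct_reference_ctrl                           # normalize control
    delta_delta = delta_ct_sample - delta_ct_control
    return 2.0 ** (-delta_delta)

# Hypothetical Ct values: a gene induced after cocaine vs. saline control
fold_change = delta_delta_ct(ct_target=22.0, ct_reference=18.0,
                             ct_target_ctrl=25.0, ct_reference_ctrl=18.0)
```

Here a lower Ct in the treated sample (relative to the reference gene) yields a fold change greater than 1.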
Behavior, Issue 90,
Brain, behavior, RNA, transcription, nucleus accumbens, cocaine, high-throughput qPCR, experience-dependent plasticity, gene regulatory networks, microdissection
Magnetic Resonance Imaging Quantification of Pulmonary Perfusion using Calibrated Arterial Spin Labeling
Institutions: University of California San Diego - UCSD, University of California San Diego - UCSD, University of California San Diego - UCSD.
This article demonstrates an MR imaging method to measure the spatial distribution of pulmonary blood flow in healthy subjects during normoxia (inspired O2 fraction, FIO2 = 0.21), hypoxia (FIO2 = 0.125), and hyperoxia (FIO2 = 1.00). In addition, the physiological responses of the subject are monitored in the MR scan environment. MR images
were obtained on a 1.5 T GE MRI scanner during a breath hold from a sagittal slice in the right lung at functional residual capacity. An arterial spin labeling sequence (ASL-FAIRER) was used to measure the spatial distribution of pulmonary blood flow1,2 and a multi-echo fast gradient echo (mGRE) sequence3 was used to quantify the regional proton (i.e., H2O) density, allowing the quantification of density-normalized perfusion for each voxel (milliliters of blood per minute per gram of lung tissue).
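The voxelwise normalization described above (ASL perfusion divided by mGRE-derived water density) can be sketched as follows; the toy maps and the zero-density background handling are illustrative assumptions, not the authors' processing code:

```python
def density_normalized_perfusion(perfusion, density):
    """Voxelwise ratio of ASL perfusion to mGRE-derived water density,
    yielding perfusion per gram of lung tissue (ml/min/g).
    Background voxels (zero density) are mapped to 0.0 to avoid
    division by zero outside the lung."""
    return [[p / d if d > 0 else 0.0 for p, d in zip(prow, drow)]
            for prow, drow in zip(perfusion, density)]

# Toy 2x2 maps (arbitrary units); second row contains a background voxel
dnp = density_normalized_perfusion([[4.0, 2.0], [0.0, 3.0]],
                                   [[0.5, 0.25], [0.0, 0.5]])
```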
With a pneumatic switching valve and facemask equipped with a 2-way non-rebreathing valve, different oxygen concentrations
were introduced to the subject in the MR scanner through the inspired gas tubing. A metabolic cart collected expiratory gas via expiratory tubing. Mixed expiratory O2 concentration, oxygen consumption, carbon dioxide production, respiratory exchange ratio, respiratory frequency, and tidal volume were measured. Heart rate and oxygen saturation were monitored using pulse oximetry.
Data obtained from a normal subject showed that, as expected, heart rate was higher during hypoxia (60 bpm) than during normoxia (51 bpm) or hyperoxia (50 bpm), and arterial oxygen saturation (SpO2) was reduced to 86% during hypoxia. Mean ventilation was 8.31 L/min (BTPS) during hypoxia, 7.04 L/min during normoxia, and 6.64 L/min during hyperoxia. Tidal volume was 0.76 L during hypoxia, 0.69 L during normoxia, and 0.67 L during hyperoxia.
Representative quantified ASL data showed mean density-normalized perfusion of 8.86 ml/min/g during hypoxia, 8.26 ml/min/g during normoxia, and 8.46 ml/min/g during hyperoxia. In this subject, the relative dispersion4, an index of global heterogeneity, was increased in hypoxia (1.07 during hypoxia vs. 0.85 during normoxia and 0.87 during hyperoxia), while the fractal dimension (Ds), another index of heterogeneity reflecting vascular branching structure, was unchanged (1.24 during hypoxia, 1.26 during normoxia, and 1.26 during hyperoxia).
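The relative dispersion used as a heterogeneity index here is the standard coefficient of variation (standard deviation divided by mean) of the voxel perfusion values; a minimal sketch:

```python
from statistics import mean, pstdev

def relative_dispersion(perfusion_values):
    """Relative dispersion (coefficient of variation) of voxelwise
    perfusion: population standard deviation divided by the mean.
    Higher values indicate greater global perfusion heterogeneity."""
    return pstdev(perfusion_values) / mean(perfusion_values)
```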
Overview. This protocol will demonstrate the acquisition of data to measure the distribution of pulmonary perfusion noninvasively under conditions of normoxia, hypoxia, and hyperoxia using a magnetic resonance imaging technique known as arterial spin labeling (ASL).
Rationale: Measurement of pulmonary blood flow and lung proton density with MR techniques offers high-spatial-resolution images that can be quantified, along with the ability to perform repeated measurements under several different physiological conditions. In human studies, PET, SPECT, and CT are the commonly used alternative techniques; however, these involve exposure to ionizing radiation and are therefore not suitable for repeated measurements in human subjects.
Medicine, Issue 51, arterial spin labeling, lung proton density, functional lung imaging, hypoxic pulmonary vasoconstriction, oxygen consumption, ventilation, magnetic resonance imaging
Assessment of Immunologically Relevant Dynamic Tertiary Structural Features of the HIV-1 V3 Loop Crown R2 Sequence by ab initio Folding
Institutions: School of Medicine, New York University.
The antigenic diversity of HIV-1 has long been an obstacle to vaccine design, and this variability is especially pronounced in the V3 loop of the virus' surface envelope glycoprotein. We previously proposed that the crown of the V3 loop, although dynamic and sequence variable, is constrained throughout the population of HIV-1 viruses to an immunologically relevant β-hairpin tertiary structure. Importantly, there are thousands of different V3 loop crown sequences in circulating HIV-1 viruses, making 3D structural characterization of trends across the diversity of viruses difficult or impossible by crystallography or NMR. Our previous successful studies on folding of the V3 crown1,2 used the ab initio folding algorithm accessible in the ICM-Pro molecular modeling software package (Molsoft LLC, La Jolla, CA) and suggested that the crown of the V3 loop, specifically from positions 10 to 22, benefits sufficiently from the flexibility and length of its flanking stems to behave, to a large degree, as if it were an unconstrained peptide freely folding in solution. As such, rapid ab initio folding of just this portion of the V3 loop of any individual strain of the 60,000+ circulating HIV-1 strains can be informative. Here, we folded the V3 loop of the R2 strain to gain insight into the structural basis of its unique properties. R2 bears a rare V3 loop sequence thought to be responsible for the exquisite sensitivity of this strain to neutralization by patient sera and monoclonal antibodies4,5. The strain mediates CD4-independent infection and appears to elicit broadly neutralizing antibodies. We demonstrate how evaluation of the folding results can be informative for associating the structures observed in folding with the immunological activities observed for R2.
Infection, Issue 43, HIV-1, structure-activity relationships, ab initio simulations, antibody-mediated neutralization, vaccine design
Automated Interactive Video Playback for Studies of Animal Communication
Institutions: Texas A&M University (TAMU), Texas A&M University (TAMU).
Video playback is a widely used technique for the controlled manipulation and presentation of visual signals in animal communication. In particular, parameter-based computer animation offers the opportunity to independently manipulate any number of behavioral, morphological, or spectral characteristics in the context of realistic, moving images of animals on screen. A major limitation of conventional playback, however, is that the visual stimulus cannot interact with the live animal. Borrowing from video-game technology, we have created an automated, interactive system for video playback that controls animations in response to real-time signals from a video tracking system. We demonstrated this method by conducting mate-choice trials on female swordtail fish, Xiphophorus birchmanni. Females were given a simultaneous choice between a courting male conspecific and a courting male heterospecific (X. malinche) on opposite sides of an aquarium. The virtual male stimulus was programmed to track the horizontal position of the female, as courting males do in the wild. Mate-choice trials on wild-caught X. birchmanni females were used to validate the prototype's ability to effectively generate a realistic visual stimulus.
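A closed loop of this kind — read the tracked female position each frame, then move the animated male toward it — might be sketched as below. The gain and per-frame step cap are hypothetical tuning parameters, not values from the actual system:

```python
def update_stimulus_position(female_x, male_x, gain=0.2, max_step=15.0):
    """One frame of the tracking loop: move the animated male a fraction
    (gain) of the way toward the female's tracked horizontal position,
    capped at max_step pixels per frame so the motion stays naturalistic
    rather than snapping instantly to the female's location."""
    step = gain * (female_x - male_x)
    step = max(-max_step, min(max_step, step))
    return male_x + step
```

Each video frame, the tracker supplies `female_x` and the renderer repositions the virtual male at the returned coordinate.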
Neuroscience, Issue 48, Computer animation, visual communication, mate choice, Xiphophorus birchmanni, tracking
Label-free in situ Imaging of Lignification in Plant Cell Walls
Institutions: University of California, Berkeley, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Meeting growing energy demands safely and efficiently is a pressing global challenge. Therefore, research into biofuels production that seeks to find cost-effective and sustainable solutions has become a topical and critical task. Lignocellulosic biomass is poised to become the primary source of biomass for the conversion to liquid biofuels1-6
. However, the recalcitrance of these plant cell wall materials to cost-effective and efficient degradation presents a major impediment for their use in the production of biofuels and chemicals4
. In particular, lignin, a complex and irregular poly-phenylpropanoid heteropolymer, becomes problematic to the postharvest deconstruction of lignocellulosic biomass. For example in biomass conversion for biofuels, it inhibits saccharification in processes aimed at producing simple sugars for fermentation7
. The effective use of plant biomass for industrial purposes is in fact largely dependent on the extent to which the plant cell wall is lignified. The removal of lignin is a costly and limiting factor8
and lignin has therefore become a key plant breeding and genetic engineering target in order to improve cell wall conversion.
Analytical tools that permit the accurate rapid characterization of lignification of plant cell walls become increasingly important for evaluating a large number of breeding populations. Extractive procedures for the isolation of native components such as lignin are inevitably destructive, bringing about significant chemical and structural modifications9-11
. Analytical chemical in situ
methods are thus invaluable tools for the compositional and structural characterization of lignocellulosic materials. Raman microscopy is a technique that relies on inelastic or Raman scattering of monochromatic light, like that from a laser, where the shift in energy of the laser photons is related to molecular vibrations and presents an intrinsic label-free molecular "fingerprint" of the sample. Raman microscopy can afford non-destructive and comparatively inexpensive measurements with minimal sample preparation, giving insights into chemical composition and molecular structure in a close to native state. Chemical imaging by confocal Raman microscopy has been previously used for the visualization of the spatial distribution of cellulose and lignin in wood cell walls12-14
. Based on these earlier results, we have recently adopted this method to compare lignification in wild type and lignin-deficient transgenic Populus trichocarpa
(black cottonwood) stem wood15
. Analyzing the lignin Raman bands16,17
in the spectral region between 1,600 and 1,700 cm-1
, lignin signal intensity and localization were mapped in situ
. Our approach visualized differences in lignin content, localization, and chemical composition. Most recently, we demonstrated Raman imaging of cell wall polymers in Arabidopsis thaliana
with sub-μm lateral resolution18. Here, this method is presented, affording visualization of lignin in plant cell walls and comparison of lignification in different tissues, samples, or species without staining or labeling of the tissues.
Plant Biology, Issue 45, Raman microscopy, lignin, poplar wood, Arabidopsis thaliana
Patterned Photostimulation with Digital Micromirror Devices to Investigate Dendritic Integration Across Branch Points
Institutions: University of Maryland School of Medicine.
Light is a versatile and precise means to control neuronal excitability. The recent introduction of light-sensitive effectors such as channelrhodopsin and caged neurotransmitters has generated interest in developing better ways to control patterns of light in space and time for experimental neuroscience. One conventional strategy, employed in confocal and 2-photon microscopy, is to focus light to a diffraction-limited spot and then scan that single spot sequentially over the region of interest. This approach becomes problematic if large areas have to be stimulated within a brief time window, a problem more applicable to photostimulation than to imaging. An alternate strategy is to project the complete spatial pattern onto the target with the aid of a digital micromirror device (DMD). The DMD approach is appealing because the hardware components are relatively inexpensive and are supported commercially. Because such a system is not available for upright microscopes, we discuss the critical issues in the construction and operation of such a DMD system. Although we primarily describe the construction of the system for UV photolysis, modifications for building the much simpler visible-light system for optogenetic experiments are also provided. The UV photolysis system was used to carry out experiments addressing a fundamental question in neuroscience: how are spatially distributed inputs integrated across distal dendritic branch points? The results suggest that integration can be non-linear across branch points and that the supralinearity is largely mediated by NMDA receptors.
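The comparison behind the supralinearity claim — the measured response to combined stimulation versus the arithmetic sum of the single-site responses — can be expressed as a simple index. This is a sketch; the response values in the example are hypothetical, not data from the study:

```python
def supralinearity_index(individual_responses, combined_response):
    """Ratio of the measured response to simultaneous stimulation of
    several dendritic input sites to the arithmetic sum of their
    individually measured responses. Values > 1 indicate supralinear
    integration (e.g. NMDA-receptor-mediated boosting); ~1 indicates
    linear summation; < 1 indicates sublinear summation."""
    expected_linear_sum = sum(individual_responses)
    return combined_response / expected_linear_sum

# Hypothetical EPSP amplitudes (mV): two sites evoke 1.0 and 2.0 alone,
# but 4.5 when stimulated together -> supralinear
index = supralinearity_index([1.0, 2.0], 4.5)
```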
Bioengineering, Issue 49, DMD, photolysis, dendrite, photostimulation, DLP, optogenetics
Development of an Audio-based Virtual Gaming Environment to Assist with Navigation Skills in the Blind
Institutions: Massachusetts Eye and Ear Infirmary, Harvard Medical School, University of Chile .
Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.
Medicine, Issue 73, Behavior, Neuroscience, Anatomy, Physiology, Neurobiology, Ophthalmology, Psychology, Behavior and Behavior Mechanisms, Technology, Industry, virtual environments, action video games, blind, audio, rehabilitation, indoor navigation, spatial cognitive map, Audio-based Environment Simulator, virtual reality, cognitive psychology, clinical techniques
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary with feedstock source and production conditions, making it possible to engineer biochars with specific functions (e.g., carbon sequestration, soil quality improvement, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available its Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars, including: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described are analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results for blanks, duplicates, standards, and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there was little difference among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was found to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvement, or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
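Localization precision of the magnitude quoted above (~10-30 nm) is often estimated with the Thompson-Larson-Webb formula for a fitted single-molecule image. This is a generic sketch, not the FPALM analysis code, and the parameter values in the example are illustrative:

```python
from math import pi, sqrt

def localization_precision(s, a, N, b):
    """Approximate lateral localization precision (nm) of a single-molecule
    Gaussian fit, after Thompson, Larson & Webb:
      s : std dev of the point spread function (nm)
      a : effective pixel size at the sample (nm)
      N : number of detected photons
      b : background noise (photons per pixel)
    Precision improves (shrinks) as the photon count N grows."""
    variance = (s**2 + a**2 / 12.0) / N + (8.0 * pi * s**4 * b**2) / (a**2 * N**2)
    return sqrt(variance)
```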
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which segmentation approach to take.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
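A semi-automated approach like category (3) often begins with simple gray-value thresholding to produce a first-pass feature mask, which is then refined manually. The sketch below illustrates only that first step; the toy volume and threshold value are made up for illustration:

```python
def threshold_segment(volume, threshold):
    """First-pass segmentation of a 3D gray-value volume: label each
    voxel as feature (1) if its intensity meets the threshold, else
    background (0). Real pipelines follow this with size filtering,
    morphological cleanup, and manual correction before rendering."""
    return [[[1 if voxel >= threshold else 0 for voxel in row]
             for row in plane] for plane in volume]

# Toy 1x2x2 volume of gray values; threshold chosen arbitrarily
mask = threshold_segment([[[10, 200], [50, 120]]], threshold=100)
```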
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ
measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration.
Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release.
The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
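The release rate itself is computed from the change in water-column P over the incubation, normalized to core cross-sectional area and elapsed time. The sketch below makes the simplifying assumption that replacement water adds negligible P; the example values are hypothetical:

```python
def p_release_rate(c0_mg_per_l, ct_mg_per_l, water_volume_l,
                   core_area_m2, days):
    """Areal sediment P release rate from a core incubation, in
    mg P m^-2 d^-1: the change in water-column P mass divided by
    sediment surface area and incubation time. Negative values
    indicate net P uptake by the sediment."""
    mass_change_mg = (ct_mg_per_l - c0_mg_per_l) * water_volume_l
    return mass_change_mg / (core_area_m2 * days)

# Hypothetical core: P rises from 0.01 to 0.05 mg/L in 1 L of overlying
# water above a 0.005 m^2 core over 4 days
rate = p_release_rate(0.01, 0.05, 1.0, 0.005, 4)
```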
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired three-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design; however, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design, including the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
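The rank-ordered output at the end of such a pipeline can be thought of as a multi-key sort over the computed scores. The field names below are illustrative only, not the actual Protein WISDOM output format:

```python
def rank_designs(designs):
    """Rank candidate sequences: lowest potential energy first, then
    highest fold specificity, then highest binding affinity as
    tie-breakers. Each design is a dict of (hypothetical) scores."""
    return sorted(designs, key=lambda d: (d["energy"],
                                          -d["fold_specificity"],
                                          -d["binding_affinity"]))

# Two hypothetical candidates; the lower-energy sequence ranks first
ranked = rank_designs([
    {"energy": -5.0, "fold_specificity": 0.9, "binding_affinity": 0.4},
    {"energy": -9.0, "fold_specificity": 0.7, "binding_affinity": 0.6},
])
```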
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a range of scientific disciplines, comprising well-established protocols as well as some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement, which has led to a variety of sampling and sample-processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. Easy-to-follow guidelines are given for conducting comprehensive surveys, and critical steps and potential caveats pertinent to each technique are discussed.
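For the direct viral and microbial counts, field counts on a stained filter are typically scaled to abundance per ml of seawater. This is a generic sketch of that conversion; the filter geometry and count values in the example are hypothetical:

```python
def direct_count_per_ml(mean_count_per_field, filterable_area_mm2,
                        field_area_mm2, volume_filtered_ml):
    """Convert epifluorescence microscopy counts (e.g. DAPI- or
    SYBR-stained cells or virus-like particles) to abundance per ml:
    scale the mean count per microscope field by the number of fields
    covering the filterable area, then divide by the volume filtered."""
    fields_per_filter = filterable_area_mm2 / field_area_mm2
    return mean_count_per_field * fields_per_filter / volume_filtered_ml

# Hypothetical: 20 particles/field, 200 mm^2 filterable area,
# 0.01 mm^2 field of view, 1 ml of seawater filtered
abundance = direct_count_per_ml(20, 200.0, 0.01, 1.0)
```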
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Functional Interrogation of Adult Hypothalamic Neurogenesis with Focal Radiological Inhibition
Institutions: California Institute of Technology, Johns Hopkins University School of Medicine, Johns Hopkins University School of Medicine, University Of Washington Medical Center, Johns Hopkins University School of Medicine.
The functional characterization of adult-born neurons remains a significant challenge. Approaches to inhibit adult neurogenesis via invasive viral delivery or transgenic animals have potential confounds that make results from these studies difficult to interpret. New radiological tools are emerging, however, that allow one to noninvasively investigate the function of select groups of adult-born neurons through accurate and precise anatomical targeting in small animals. Focal ionizing radiation inhibits the birth and differentiation of new neurons and allows targeting of specific neural progenitor regions. To illuminate the potential functional role that adult hypothalamic neurogenesis plays in the regulation of physiological processes, we developed a noninvasive focal irradiation technique to selectively inhibit the birth of adult-born neurons in the hypothalamic median eminence. We describe a method for computer tomography-guided focal irradiation (CFIR) delivery to enable precise and accurate anatomical targeting in small animals. CFIR uses three-dimensional volumetric image guidance for localization and targeting of the radiation dose, minimizes radiation exposure to nontargeted brain regions, and allows for conformal dose distribution with sharp beam boundaries. This protocol not only allows one to ask questions regarding the function of adult-born neurons, but also opens avenues of inquiry in radiobiology, tumor biology, and immunology. These radiological tools will facilitate the translation of discoveries at the bench to the bedside.
Neuroscience, Issue 81, Neural Stem Cells (NSCs), Body Weight, Radiotherapy, Image-Guided, Metabolism, Energy Metabolism, Neurogenesis, Cell Proliferation, Neurosciences, Irradiation, Radiological treatment, Computer-tomography (CT) imaging, Hypothalamus, Hypothalamic Proliferative Zone (HPZ), Median Eminence (ME), Small Animal Radiation Research Platform (SARRP)
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects them to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting individual subjects to advance automatically from protocol to protocol. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
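The core of the analysis described above is operating on large time-stamped event records. The study's own routines are written in a custom MATLAB-based language; the sketch below is an illustrative Python analogue only, and the event codes and timestamps are hypothetical, not taken from the study.

```python
from collections import defaultdict
from statistics import median

# Hypothetical time-stamped event records: (timestamp_sec, event_code).
# Real records from the test boxes would be far longer and richer.
events = [
    (0.0, "hopper1_entry"), (2.5, "hopper2_entry"),
    (5.0, "hopper1_entry"), (9.0, "hopper1_entry"),
    (12.0, "hopper2_entry"),
]

def entry_counts(records):
    """Count head entries per hopper (per event code)."""
    counts = defaultdict(int)
    for _, code in records:
        counts[code] += 1
    return dict(counts)

def inter_event_intervals(records, code):
    """Intervals between successive occurrences of one event type."""
    times = [t for t, c in records if c == code]
    return [b - a for a, b in zip(times, times[1:])]

counts = entry_counts(events)                          # per-hopper entry totals
ivis = inter_event_intervals(events, "hopper1_entry")  # e.g., for timing analyses
```

Summary statistics such as `median(ivis)` could then feed the kind of daily statistical and graphical progress reports the abstract describes.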
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
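A minimal sketch of the DoE idea, assuming coded two-level factors and a main-effects model; the factor names, levels, and simulated responses below are hypothetical illustrations, not the factors or data used in the study (which used software-guided optimal designs rather than a full factorial).

```python
import itertools
import numpy as np

# Hypothetical two-level factors (coded -1/+1) for a transient-expression run.
factors = {"incubation_temp": (-1, 1), "plant_age": (-1, 1), "promoter": (-1, 1)}

# Full 2^3 factorial design matrix: every combination of factor levels.
design = np.array(list(itertools.product(*factors.values())), dtype=float)

# Simulated responses (e.g., expression level); in practice these are measured.
rng = np.random.default_rng(0)
true_effects = np.array([2.0, 0.5, 1.0])
y = 10.0 + design @ true_effects + rng.normal(0.0, 0.1, len(design))

# Fit a main-effects model y = b0 + sum_i b_i * x_i by least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef[1:] estimates each factor's main effect on the response.
```

Because the coded design columns are orthogonal, each estimated coefficient isolates one factor's main effect, which is the property that lets DoE attribute batch-to-batch variability to specific parameters.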
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Characterization of Electrode Materials for Lithium Ion and Sodium Ion Batteries Using Synchrotron Radiation Techniques
Institutions: Lawrence Berkeley National Laboratory, University of Illinois at Chicago, Stanford Synchrotron Radiation Lightsource, Haldor Topsøe A/S, PolyPlus Battery Company.
Intercalation compounds such as transition metal oxides or phosphates are the most commonly used electrode materials in Li-ion and Na-ion batteries. During insertion or removal of alkali metal ions, the redox states of transition metals in the compounds change and structural transformations such as phase transitions and/or lattice parameter increases or decreases occur. These behaviors in turn determine important characteristics of the batteries such as the potential profiles, rate capabilities, and cycle lives. The extremely bright and tunable x-rays produced by synchrotron radiation allow rapid acquisition of high-resolution data that provide information about these processes. Transformations in the bulk materials, such as phase transitions, can be directly observed using X-ray diffraction (XRD), while X-ray absorption spectroscopy (XAS) gives information about the local electronic and geometric structures (e.g. changes in redox states and bond lengths). In situ experiments carried out on operating cells are particularly useful because they allow direct correlation between the electrochemical and structural properties of the materials. These experiments are time-consuming and can be challenging to design due to the reactivity and air-sensitivity of the alkali metal anodes used in the half-cell configurations, and/or the possibility of signal interference from other cell components and hardware. For these reasons, it is appropriate in some cases to carry out ex situ experiments (e.g. on electrodes harvested from partially charged or cycled cells). Here, we present detailed protocols for the preparation of both ex situ and in situ samples for experiments involving synchrotron radiation and demonstrate how these experiments are done.
Physics, Issue 81, X-Ray Absorption Spectroscopy, X-Ray Diffraction, inorganic chemistry, electric batteries (applications), energy storage, Electrode materials, Li-ion battery, Na-ion battery, X-ray Absorption Spectroscopy (XAS), in situ X-ray diffraction (XRD)
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and the quality of the data depends greatly on their collaboration4.
Modern technologies such as GPS and mobile communications have made the automatic collection of trajectory data possible. The data collected, however, are not ideal for modeling human space-time activities, limited as they are by the accuracy of existing devices. There is also no readily available tool for efficiently processing the data for human behavior studies. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, in public health studies such as infectious disease transmission modeling.
The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data; we introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5 involves the identification of indoor and outdoor parts from pre-processed space-time tracks; again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius. Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping6 and density volume rendering7. We also include a couple of other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical and visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
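Two of the steps above — segmenting a track and characterizing an activity space — can be sketched in a few lines. This is an illustrative Python toy, not the ArcGIS-based tool the abstract describes: the track coordinates are hypothetical, and a simple speed threshold stands in for the indoor/outdoor segmentation actually performed.

```python
import math

# Hypothetical GPS track: (time_sec, x_m, y_m) fixes in a local planar projection.
track = [(0, 0.0, 0.0), (10, 2.0, 1.0), (20, 50.0, 40.0),
         (30, 95.0, 80.0), (40, 97.0, 81.0)]

def speeds(points):
    """Speed (m/s) of each segment between consecutive fixes."""
    return [math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
            for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:])]

def label_segments(points, stop_speed=0.5):
    """Label each inter-fix segment 'stop' or 'move' by a speed threshold."""
    return ["stop" if s < stop_speed else "move" for s in speeds(points)]

def activity_radius(points):
    """Maximum distance of any fix from the track centroid."""
    cx = sum(p[1] for p in points) / len(points)
    cy = sum(p[2] for p in points) / len(points)
    return max(math.hypot(x - cx, y - cy) for _, x, y in points)

labels = label_segments(track)   # stop/move label per segment
radius = activity_radius(track)  # one simple activity-space characteristic
```

Stop/move labels of this kind are the starting point for activity-space characterization, and binned fix locations feed the density surface and volume estimates mentioned above.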
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
Morris Water Maze Experiment
Institutions: Michigan State University (MSU).
The Morris water maze is widely used to study spatial memory and learning. Animals are placed in a pool of water that is colored opaque with powdered non-fat milk or non-toxic tempera paint, where they must swim to a hidden escape platform. Because they are in opaque water, the animals cannot see the platform, and cannot rely on scent to find the escape route. Instead, they must rely on external/extra-maze cues. As the animals become more familiar with the task, they are able to find the platform more quickly. Developed by Richard G. Morris in 1984, this paradigm has become one of the "gold standards" of behavioral neuroscience.
Behavior, Issue 19, Declarative, Hippocampus, Memory, Procedural, Rodent, Spatial Learning
Laser Capture Microdissection of Mammalian Tissue
Institutions: University of California, Irvine (UCI).
Laser capture microscopy, also known as laser microdissection (LMD), enables the user to isolate small numbers of cells or tissues from frozen or formalin-fixed, paraffin-embedded tissue sections. LMD techniques rely on a thermolabile membrane placed either on top of, or underneath, the tissue section. In one method, focused laser energy is used to melt the membrane onto the underlying cells, which can then be lifted out of the tissue section. In the other, the laser energy vaporizes the foil along a path "drawn" on the tissue, allowing the selected cells to fall into a collection device. Each technique allows the selection of cells with a minimum resolution of several microns. DNA, RNA, protein, and lipid samples may be isolated and analyzed from microdissected samples. In this video, we demonstrate the use of the Leica AS-LMD laser microdissection instrument in seven segments, including an introduction to the principles of LMD, initializing the instrument for use, general considerations for sample preparation, mounting the specimen and setting up capture tubes, aligning the microscope, adjusting the capture controls, and capturing tissue specimens. Laser capture microdissection enables the investigator to isolate samples of pure cell populations as small as a few cell-equivalents. This allows the analysis of cells of interest that are free of neighboring contaminants, which may confound experimental results.
Issue 8, Basic Protocols, Laser Capture Microdissection, Microdissection Techniques, Leica