Pubmed Article
Determination of the optimal training principle and input variables in artificial neural network model for the biweekly chlorophyll-a prediction: a case study of the Yuqiao Reservoir, China.
PUBLISHED: 03-15-2015
Predicting the levels of chlorophyll-a (Chl-a) is a vital component of water quality management, which ensures that urban drinking water is safe from harmful algal blooms. This study developed a model to predict Chl-a levels in the Yuqiao Reservoir (Tianjin, China) biweekly using water quality and meteorological data from 1999-2012. First, six artificial neural networks (ANNs) and two non-ANN methods (principal component analysis and the support vector regression model) were compared to determine the appropriate training principle. Subsequently, three predictors with different input variables were developed to examine the feasibility of incorporating meteorological factors into Chl-a prediction, which usually only uses water quality data. Finally, a sensitivity analysis was performed to examine how the Chl-a predictor reacts to changes in input variables. The results were as follows: first, ANN is a powerful predictive alternative to the traditional modeling techniques used for Chl-a prediction. The back propagation (BP) model yields slightly better results than all other ANNs, with the normalized mean square error (NMSE), the correlation coefficient (Corr), and the Nash-Sutcliffe coefficient of efficiency (NSE) at 0.003 mg/l, 0.880 and 0.754, respectively, in the testing period. Second, the incorporation of meteorological data greatly improved Chl-a prediction compared to models solely using water quality factors or meteorological data; the correlation coefficient increased from 0.574-0.686 to 0.880 when meteorological data were included. Finally, the Chl-a predictor is more sensitive to air pressure and pH compared to other water quality and meteorological variables.
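For readers who want to reproduce the reported evaluation, the three metrics can be computed directly from paired observed and predicted Chl-a series. The short Python sketch below uses common textbook definitions (the abstract does not state which NMSE normalization the authors used, so that choice is an assumption), and the data are synthetic.

# Illustrative sketch (not the authors' code): computing the three evaluation
# metrics reported above. NMSE here is the mean squared error normalized by the
# product of the means of observations and predictions, one common convention.
import numpy as np

def evaluate(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    nmse = np.mean((obs - pred) ** 2) / (np.mean(obs) * np.mean(pred))
    corr = np.corrcoef(obs, pred)[0, 1]                     # Pearson correlation
    nse = 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - np.mean(obs)) ** 2)
    return nmse, corr, nse

if __name__ == "__main__":
    # Synthetic biweekly Chl-a series (mg/l), purely for demonstration.
    rng = np.random.default_rng(0)
    observed = np.abs(rng.normal(0.02, 0.01, 50))
    predicted = observed + rng.normal(0, 0.005, 50)
    print("NMSE=%.4f  Corr=%.3f  NSE=%.3f" % evaluate(observed, predicted))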
Related JoVE Video
Authors: Alberto Natali, Laura M. Roy, Roberta Croce.
Published: 10-10-2014
In plants and green algae, light is captured by the light-harvesting complexes (LHCs), a family of integral membrane proteins that coordinate chlorophylls and carotenoids. In vivo, these proteins are folded with pigments to form complexes which are inserted in the thylakoid membrane of the chloroplast. The high similarity in the chemical and physical properties of the members of the family, together with the fact that they can easily lose pigments during isolation, makes their purification in a native state challenging. An alternative approach to obtain homogeneous preparations of LHCs was developed by Plumley and Schmidt in 1987, who showed that it was possible to reconstitute these complexes in vitro starting from purified pigments and unfolded apoproteins, resulting in complexes with properties very similar to those of native complexes. This opened the way to the use of bacterially expressed recombinant proteins for in vitro reconstitution. The reconstitution method is powerful for various reasons: (1) pure preparations of individual complexes can be obtained, (2) pigment composition can be controlled to assess their contribution to structure and function, (3) recombinant proteins can be mutated to study the functional role of the individual residues (e.g., pigment binding sites) or protein domains (e.g., protein-protein interaction, folding). This method has been optimized in several laboratories and applied to most of the light-harvesting complexes. The protocol described here details the method of reconstituting light-harvesting complexes in vitro currently used in our laboratory, and examples describing applications of the method are provided.
23 Related JoVE Articles!
Measuring Diffusion Coefficients via Two-photon Fluorescence Recovery After Photobleaching
Authors: Kelley D. Sullivan, Edward B. Brown.
Institutions: University of Rochester, University of Rochester.
Multiphoton fluorescence recovery after photobleaching (MP-FRAP) is a microscopy technique used to measure the diffusion coefficient (or analogous transport parameters) of macromolecules, and can be applied to both in vitro and in vivo biological systems. MP-FRAP is performed by photobleaching a region of interest within a fluorescent sample using an intense laser flash, then attenuating the beam and monitoring the fluorescence as still-fluorescent molecules from outside the region of interest diffuse in to replace the photobleached molecules. We will begin our demonstration by aligning the laser beam through the Pockels Cell (laser modulator) and along the optical path through the laser scan box and objective lens to the sample. For simplicity, we will use a sample of aqueous fluorescent dye. We will then determine the proper experimental parameters for our sample including monitor and bleaching powers, bleach duration, bin widths (for photon counting), and fluorescence recovery time. Next, we will describe the procedure for taking recovery curves, a process that can be largely automated via LabVIEW (National Instruments, Austin, TX) for enhanced throughput. Finally, the diffusion coefficient is determined by fitting the recovery data to the appropriate mathematical model using a least-squares fitting algorithm, readily programmable using software such as MATLAB (The Mathworks, Natick, MA).
Cellular Biology, Issue 36, Diffusion, fluorescence recovery after photobleaching, MP-FRAP, FPR, multi-photon
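As a rough illustration of the final fitting step, the Python sketch below fits a recovery curve by least squares. The single-exponential recovery model, the synthetic data and the beam radius are stand-in assumptions; quantitative MP-FRAP work would substitute the full multiphoton recovery model referenced in the protocol.

# Minimal least-squares fit of a fluorescence recovery curve, analogous to the
# MATLAB fitting step described above. The model and parameters are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, f_inf, f0, tau_d):
    # Single-exponential recovery with characteristic time tau_d (placeholder model).
    return f_inf + (f0 - f_inf) * np.exp(-t / tau_d)

# Synthetic recovery data (arbitrary fluorescence units).
t = np.linspace(0, 0.5, 200)                       # seconds after the bleach
true = recovery(t, f_inf=1.0, f0=0.4, tau_d=0.05)
data = true + np.random.default_rng(1).normal(0, 0.02, t.size)

(f_inf, f0, tau_d), _ = curve_fit(recovery, t, data, p0=[1.0, 0.5, 0.1])

w = 0.5e-4                 # 1/e^2 focal radius in cm (hypothetical instrument value)
D = w**2 / (8 * tau_d)     # rough two-photon estimate D ~ w^2 / (8 * tau_D)
print(f"tau_D = {tau_d:.3f} s, D ~ {D:.2e} cm^2/s")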
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
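To make the dissociation-constant estimation concrete, the sketch below fits a simple single-site saturation model of melting temperature versus ligand concentration. This model, the titration values and the tooling (Python/SciPy rather than the authors' workflow) are all illustrative assumptions, not the paper's own analysis.

# Hedged sketch: estimating an apparent Kd from DSF melting temperatures using
#   Tm([L]) = Tm0 + dTm_max * [L] / (Kd_app + [L])
import numpy as np
from scipy.optimize import curve_fit

def tm_model(ligand, tm0, dtm_max, kd_app):
    return tm0 + dtm_max * ligand / (kd_app + ligand)

# Hypothetical ligand titration (µM) and measured melting temperatures (°C).
ligand = np.array([0, 5, 10, 25, 50, 100, 250, 500, 1000.0])
tm     = np.array([52.1, 52.9, 53.6, 55.0, 56.2, 57.4, 58.5, 58.9, 59.2])

(tm0, dtm_max, kd_app), cov = curve_fit(tm_model, ligand, tm, p0=[52.0, 7.0, 50.0])
print(f"Tm0 = {tm0:.1f} C, dTm_max = {dtm_max:.1f} C, apparent Kd ~ {kd_app:.0f} uM")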
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
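The binarization and skeletonization steps can be illustrated with open-source tools; the following Python/scikit-image sketch is a generic stand-in, not the published pipeline, and the input file name is hypothetical.

# Illustrative binarization and skeletonization of a membrane-stained myocyte
# image for TATS network quantification (assumes a 2D grayscale TIFF).
import numpy as np
from skimage import io, filters, morphology

img = io.imread("myocyte_tats.tif")                   # hypothetical input image
smoothed = filters.gaussian(img, sigma=1)             # suppress shot noise
binary = smoothed > filters.threshold_otsu(smoothed)  # global Otsu threshold
binary = morphology.remove_small_objects(binary, min_size=20)
skeleton = morphology.skeletonize(binary)             # 1-pixel-wide network

# Simple descriptors of the tubule network.
area_fraction = binary.mean()                         # fraction of image covered by membrane
skeleton_density = skeleton.sum() / binary.sum()      # skeleton pixels per membrane pixel
print(f"TATS area fraction: {area_fraction:.3f}, skeleton density: {skeleton_density:.3f}")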
Fundus Photography as a Convenient Tool to Study Microvascular Responses to Cardiovascular Disease Risk Factors in Epidemiological Studies
Authors: Patrick De Boever, Tijs Louwies, Eline Provost, Luc Int Panis, Tim S. Nawrot.
Institutions: Flemish Institute for Technological Research (VITO), Hasselt University, Hasselt University, Leuven University.
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina is a tissue that lines the interior of the eye and it is the only tissue that allows for a non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive to apply in epidemiological studies and to monitor cardiovascular health from early age up to old age. Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the vessel calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. The vessel widths are derived using image analysis software and the width of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). The latter features have been shown useful to study the impact of modifiable lifestyle and environmental cardiovascular disease risk factors. The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2% and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient and instrumental tool for epidemiological studies to study microvascular responses to cardiovascular disease risk factors.
Medicine, Issue 92, retina, microvasculature, image analysis, Central Retinal Arteriolar Equivalent, Central Retinal Venular Equivalent, air pollution, particulate matter, black carbon
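The abstract does not state which summary formula the image analysis software applies to the six largest vessels; as an assumption, the sketch below uses the widely cited revised Knudtson pairing formulas to illustrate how CRAE and CRVE are derived from individual calibers.

# Sketch of summarizing the six largest vessel calibers into CRAE/CRVE using
# the revised Knudtson et al. pairing formulas (an assumption, not necessarily
# the software used in the protocol).
import math

def summarize(widths, coeff):
    # Iteratively pair the widest with the narrowest vessel until one value remains.
    widths = sorted(widths)
    while len(widths) > 1:
        paired = []
        while len(widths) > 1:
            narrow, wide = widths.pop(0), widths.pop(-1)
            paired.append(coeff * math.sqrt(narrow**2 + wide**2))
        if widths:                        # odd count: carry the middle vessel over
            paired.append(widths.pop())
        widths = sorted(paired)
    return widths[0]

def crae(arteriole_widths):               # central retinal arteriolar equivalent
    return summarize(arteriole_widths, 0.88)

def crve(venule_widths):                  # central retinal venular equivalent
    return summarize(venule_widths, 0.95)

# Hypothetical calibers (µm) of the six largest arterioles and venules.
print(round(crae([98, 105, 110, 115, 120, 130]), 1))
print(round(crve([140, 150, 155, 160, 170, 180]), 1))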
DTI of the Visual Pathway - White Matter Tracts and Cerebral Lesions
Authors: Ardian Hana, Andreas Husch, Vimal Raj Nitish Gunness, Christophe Berthold, Anisa Hana, Georges Dooms, Hans Boecher Schwarz, Frank Hertel.
Institutions: Centre Hospitalier de Luxembourg, University of Applied Sciences Trier, Erasmus Universiteit Rotterdam, Centre Hospitalier de Luxembourg.
DTI is a technique that identifies white matter tracts (WMT) non-invasively in healthy and non-healthy patients using diffusion measurements. Similar to visual pathways (VP), WMT are not visible with classical MRI or intra-operatively under the microscope. DTI will help neurosurgeons to prevent destruction of the VP while removing lesions adjacent to these WMT. We have performed DTI on fifty patients before and after surgery between March 2012 and January 2014. To navigate we used a 3DT1-weighted sequence. Additionally, we performed T2-weighted and DTI sequences. The parameters used were FOV: 200 x 200 mm, slice thickness: 2 mm, and acquisition matrix: 96 x 96 yielding nearly isotropic voxels of 2 x 2 x 2 mm. Axial MRI was carried out using 32 gradient directions and one b0 image. We used Echo-Planar-Imaging (EPI) and ASSET parallel imaging with an acceleration factor of 2 and b-value of 800 s/mm². The scanning time was less than 9 min. The DTI data obtained were processed using an FDA-approved surgical navigation system program which uses a straightforward fiber-tracking approach known as fiber assignment by continuous tracking (FACT). This is based on the propagation of lines between regions of interest (ROI) which are defined by a physician. A maximum angle of 50°, FA start value of 0.10 and ADC stop value of 0.20 mm²/s were the parameters used for tractography. There are some limitations to this technique. The limited acquisition time frame enforces trade-offs in the image quality. Another important point not to be neglected is the brain shift during surgery. As for the latter, intra-operative MRI might be helpful. Furthermore, the risk of false positive or false negative tracts needs to be taken into account, which might compromise the final results.
Medicine, Issue 90, Neurosurgery, brain, visual pathway, white matter tracts, visual cortex, optic chiasm, glioblastoma, meningioma, metastasis
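To clarify the FACT stopping rules mentioned above (maximum turning angle and FA threshold), here is a deliberately simplified streamline-propagation sketch on a synthetic tensor field. It is illustrative only; clinical tractography is performed with the FDA-approved navigation software, and the ADC stopping criterion is omitted here.

# Simplified FACT-style streamline propagation with angle and FA stopping rules.
import numpy as np

MAX_ANGLE_DEG = 50     # maximum turning angle per step
FA_STOP = 0.10         # stop when fractional anisotropy falls below this
STEP = 1.0             # step length in voxels

def track(seed, principal_dir, fa, n_steps=200):
    # Propagate a streamline from `seed` through fields of unit eigenvectors and FA.
    pos = np.array(seed, float)
    prev = principal_dir[tuple(np.round(pos).astype(int))]
    line = [pos.copy()]
    for _ in range(n_steps):
        idx = tuple(np.round(pos).astype(int))
        if not all(0 <= i < s for i, s in zip(idx, fa.shape)) or fa[idx] < FA_STOP:
            break                                      # left the volume or low FA
        d = principal_dir[idx]
        if np.dot(d, prev) < 0:                        # keep a consistent orientation
            d = -d
        angle = np.degrees(np.arccos(np.clip(np.dot(d, prev), -1, 1)))
        if angle > MAX_ANGLE_DEG:
            break                                      # turn too sharp: terminate
        pos, prev = pos + STEP * d, d
        line.append(pos.copy())
    return np.array(line)

# Toy 20x20x20 volume whose fibers all run along the x-axis.
shape = (20, 20, 20)
dirs = np.zeros(shape + (3,)); dirs[..., 0] = 1.0
fa = np.full(shape, 0.6)
print(track((2, 10, 10), dirs, fa).shape)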
Subcloning Plus Insertion (SPI) - A Novel Recombineering Method for the Rapid Construction of Gene Targeting Vectors
Authors: Thimma R. Reddy, Emma J. Kelsall, Léna M.S. Fevat, Sarah E. Munson, Shaun M. Cowley.
Institutions: University of Leicester, Center for Fisheries, Environment and Aquaculture Sciences, University of Leicester.
Gene targeting refers to the precise modification of a genetic locus using homologous recombination. The generation of novel cell lines and transgenic mouse models using this method necessitates the construction of a ‘targeting’ vector, which contains homologous DNA sequences to the target gene, and has for many years been a limiting step in the process. Vector construction can be performed in vivo in Escherichia coli cells using homologous recombination mediated by phage recombinases using a technique termed recombineering. Recombineering is the preferred technique to subclone the long homology sequences (>4kb) and various targeting elements including selection markers that are required to mediate efficient allelic exchange between a targeting vector and its cognate genomic locus. Typical recombineering protocols follow an iterative scheme of step-wise integration of the targeting elements and require intermediate purification and transformation steps. Here, we present a novel recombineering methodology of vector assembly using a multiplex approach. Plasmid gap repair is performed by the simultaneous capture of genomic sequence from mouse Bacterial Artificial Chromosome libraries and the insertion of dual bacterial and mammalian selection markers. This subcloning plus insertion method is highly efficient and yields a majority of correct recombinants. We present data for the construction of different types of conditional gene knockout, or knock-in, vectors and BAC reporter vectors that have been constructed using this method. SPI vector construction greatly extends the repertoire of the recombineering toolbox and provides a simple, rapid and cost-effective method of constructing these highly complex vectors.
Molecular Biology, Issue 95, recombineering, gap-repair, subcloning plus insertion, transgene, knockout, mouse
Closed-loop Neuro-robotic Experiments to Test Computational Properties of Neuronal Networks
Authors: Jacopo Tessadori, Michela Chiappalone.
Institutions: Istituto Italiano di Tecnologia.
Information coding in the Central Nervous System (CNS) remains largely unexplored. There is mounting evidence that, even at a very low level, the representation of a given stimulus might be dependent on context and history. If this is actually the case, bi-directional interactions between the brain (or, if need be, a reduced model of it) and the sensory-motor system can shed light on how encoding and decoding of information are performed. Here an experimental system is introduced and described in which the activity of a neuronal element (i.e., a network of neurons extracted from embryonic mammalian hippocampi) is given context and used to control the movement of an artificial agent, while environmental information is fed back to the culture as a sequence of electrical stimuli. This architecture allows a quick selection of diverse encoding, decoding, and learning algorithms to test different hypotheses on the computational properties of neuronal networks.
Neuroscience, Issue 97, Micro Electrode Arrays (MEA), in vitro cultures, coding, decoding, tetanic stimulation, spike, burst
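The decoding scheme in such a closed loop can take many forms, and the abstract does not specify one. As a purely hypothetical example, the sketch below maps binned firing rates from two recording areas onto the wheel speeds of a simulated two-wheeled agent.

# Hypothetical decoding step (Braitenberg-style rule), shown only to make the
# closed-loop architecture concrete; this is not the authors' algorithm.
import numpy as np

BIN_S = 0.1          # duration of one decoding bin (seconds)
GAIN = 0.02          # wheel speed per spike/s (arbitrary units)

def decode(spike_times_left, spike_times_right, t0):
    # Convert spikes recorded in one bin into left/right wheel speeds.
    rate_l = np.sum((spike_times_left >= t0) & (spike_times_left < t0 + BIN_S)) / BIN_S
    rate_r = np.sum((spike_times_right >= t0) & (spike_times_right < t0 + BIN_S)) / BIN_S
    # Contralateral wiring: activity in the left population drives the right wheel.
    return GAIN * rate_r, GAIN * rate_l

# Synthetic spike trains (seconds) from the two recording areas.
rng = np.random.default_rng(2)
left_spikes = np.sort(rng.uniform(0, 1, 120))
right_spikes = np.sort(rng.uniform(0, 1, 60))
v_left, v_right = decode(left_spikes, right_spikes, t0=0.3)
print(f"wheel speeds: left={v_left:.2f}, right={v_right:.2f}")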
The Double-H Maze: A Robust Behavioral Test for Learning and Memory in Rodents
Authors: Robert D. Kirch, Richard C. Pinnell, Ulrich G. Hofmann, Jean-Christophe Cassel.
Institutions: University Hospital Freiburg, UMR 7364 Université de Strasbourg, CNRS, Neuropôle de Strasbourg.
Spatial cognition research in rodents typically employs the use of maze tasks, whose attributes vary from one maze to the next. These tasks vary by their behavioral flexibility and required memory duration, the number of goals and pathways, and also the overall task complexity. A confounding feature in many of these tasks is the lack of control over the strategy employed by the rodents to reach the goal, e.g., allocentric (declarative-like) or egocentric (procedural) based strategies. The double-H maze is a novel water-escape memory task that addresses this issue, by allowing the experimenter to direct the type of strategy learned during the training period. The double-H maze is a transparent device, which consists of a central alleyway with three arms protruding on both sides, along with an escape platform submerged at the extremity of one of these arms. Rats can be trained using an allocentric strategy by alternating the start position in the maze in an unpredictable manner (see protocol 1; §4.7), thus requiring them to learn the location of the platform based on the available allothetic cues. Alternatively, an egocentric learning strategy (protocol 2; §4.8) can be employed by releasing the rats from the same position during each trial, until they learn the procedural pattern required to reach the goal. This task has been proven to allow for the formation of stable memory traces. Memory can be probed following the training period in a misleading probe trial, in which the starting position for the rats alternates. Following an egocentric learning paradigm, rats typically resort to an allocentric-based strategy, but only when their initial view on the extra-maze cues differs markedly from their original position. This task is ideally suited to explore the effects of drugs/perturbations on allocentric/egocentric memory performance, as well as the interactions between these two memory systems.
Behavior, Issue 101, Double-H maze, spatial memory, procedural memory, consolidation, allocentric, egocentric, habits, rodents, video tracking system
Exploring the Effects of Atmospheric Forcings on Evaporation: Experimental Integration of the Atmospheric Boundary Layer and Shallow Subsurface
Authors: Kathleen Smits, Victoria Eagen, Andrew Trautz.
Institutions: Colorado School of Mines.
Evaporation is directly influenced by the interactions between the atmosphere, land surface and soil subsurface. This work aims to experimentally study evaporation under various surface boundary conditions to improve our current understanding and characterization of this multiphase phenomenon as well as to validate numerical heat and mass transfer theories that couple Navier-Stokes flow in the atmosphere and Darcian flow in the porous media. Experimental data were collected using a unique soil tank apparatus interfaced with a small climate controlled wind tunnel. The experimental apparatus was instrumented with a suite of state of the art sensor technologies for the continuous and autonomous collection of soil moisture, soil thermal properties, soil and air temperature, relative humidity, and wind speed. This experimental apparatus can be used to generate data under well controlled boundary conditions, allowing for better control and gathering of accurate data at scales of interest not feasible in the field. Induced airflow at several distinct wind speeds over the soil surface resulted in unique behavior of heat and mass transfer during the different evaporative stages.
Environmental Sciences, Issue 100, Bare-soil evaporation, Land-atmosphere interactions, Heat and mass flux, Porous media, Wind tunnel, Soil thermal properties, Multiphase flow
Phage Phenomics: Physiological Approaches to Characterize Novel Viral Proteins
Authors: Savannah E. Sanchez, Daniel A. Cuevas, Jason E. Rostron, Tiffany Y. Liang, Cullen G. Pivaroff, Matthew R. Haynes, Jim Nulton, Ben Felts, Barbara A. Bailey, Peter Salamon, Robert A. Edwards, Alex B. Burgin, Anca M. Segall, Forest Rohwer.
Institutions: San Diego State University, San Diego State University, San Diego State University, San Diego State University, San Diego State University, Argonne National Laboratory, Broad Institute.
Current investigations into phage-host interactions are dependent on extrapolating knowledge from (meta)genomes. Interestingly, 60 - 95% of all phage sequences share no homology to currently annotated proteins. As a result, a large proportion of phage genes are annotated as hypothetical. This reality heavily affects the annotation of both structural and auxiliary metabolic genes. Here we present phenomic methods designed to capture the physiological response(s) of a selected host during expression of one of these unknown phage genes. Multi-phenotype Assay Plates (MAPs) are used to monitor the diversity of host substrate utilization and subsequent biomass formation, while metabolomics provides by-product analysis by monitoring metabolite abundance and diversity. Both tools are used simultaneously to provide a phenotypic profile associated with expression of a single putative phage open reading frame (ORF). Representative results for both methods are compared, highlighting the phenotypic profile differences of a host carrying either putative structural or metabolic phage genes. In addition, the visualization techniques and high throughput computational pipelines that facilitated experimental analysis are presented.
Immunology, Issue 100, phenomics, phage, viral metagenome, Multi-phenotype Assay Plates (MAPs), continuous culture, metabolomics
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
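As one concrete (and generic) example of the source analysis step, the MNE-Python sketch below applies minimum-norm estimation to preprocessed epochs. File names are hypothetical, and the actual London Baby Lab workflow with individual or age-specific head models involves additional steps not shown here.

# Generic MNE-Python minimum-norm source reconstruction sketch (illustrative only).
import mne
from mne.minimum_norm import make_inverse_operator, apply_inverse

epochs = mne.read_epochs("child_task-epo.fif")          # preprocessed epochs (hypothetical file)
fwd = mne.read_forward_solution("child_task-fwd.fif")   # forward model from the head model
noise_cov = mne.compute_covariance(epochs, tmax=0.0)    # baseline noise covariance

evoked = epochs.average()
inv = make_inverse_operator(evoked.info, fwd, noise_cov, loose=0.2, depth=0.8)

snr = 3.0
stc = apply_inverse(evoked, inv, lambda2=1.0 / snr**2, method="MNE")
print(stc)                                              # SourceEstimate: cortical time courses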
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Authors: Marc N. Coutanche, Sharon L. Thompson-Schill.
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. FMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions-of-interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
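A stripped-down sketch of the informational connectivity idea is shown below: a per-timepoint MVP discriminability series is computed for each region, and the series are then correlated between regions. Cross-validation, Fisher transforms and other details of the published method are omitted, and the data are synthetic.

# Simplified informational connectivity: correlate discriminability time courses.
import numpy as np

def discriminability(patterns, labels):
    # patterns: (n_timepoints, n_voxels); labels: condition per timepoint.
    series = np.zeros(len(labels))
    for cond in np.unique(labels):
        proto_own = patterns[labels == cond].mean(axis=0)
        proto_other = patterns[labels != cond].mean(axis=0)
        for t in np.where(labels == cond)[0]:
            r_own = np.corrcoef(patterns[t], proto_own)[0, 1]
            r_other = np.corrcoef(patterns[t], proto_other)[0, 1]
            series[t] = r_own - r_other          # higher = more condition-specific
    return series

rng = np.random.default_rng(3)
labels = np.tile([0, 1], 50)                     # 100 volumes, 2 conditions
seed = rng.normal(size=(100, 80)) + labels[:, None]      # synthetic ROI patterns
target = rng.normal(size=(100, 120)) + labels[:, None]

ic = np.corrcoef(discriminability(seed, labels),
                 discriminability(target, labels))[0, 1]
print(f"informational connectivity (seed vs. target): r = {ic:.2f}")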
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers: pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the sole means to-date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein related targets with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules which are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which tells how accurate the predictions are without knowing the experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as template, or to exclude any template proteins during the structure assembly simulations. The structural information could be collected by the users based on experimental evidence or biological insights with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as the best program for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Annotation of Plant Gene Function via Combined Genomics, Metabolomics and Informatics
Authors: Takayuki Tohge, Alisdair R. Fernie.
Institutions: Max-Planck-Institut.
Given the ever-expanding number of model plant species for which complete genome sequences are available and the abundance of bio-resources such as knockout mutants, wild accessions and advanced breeding populations, there is a rising burden for gene functional annotation. In this protocol, annotation of plant gene function using combined co-expression gene analysis, metabolomics and informatics is provided (Figure 1). This approach is based on the theory of using target genes of known function to allow the identification of non-annotated genes likely to be involved in a certain metabolic process, with the identification of target compounds via metabolomics. Strategies are put forward for applying this information to populations generated by both forward and reverse genetics approaches, although none of these is effortless. As a corollary, this approach can also be used to characterise unknown peaks representing new or specific secondary metabolites found only in particular tissues, plant species or stress treatments, which is currently an important challenge in understanding plant metabolism.
Plant Biology, Issue 64, Genetics, Bioinformatics, Metabolomics, Plant metabolism, Transcriptome analysis, Functional annotation, Computational biology, Plant biology, Theoretical biology, Spectroscopy and structural analysis
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system based mainly on computed tomography (CT) images consisting of two main components: the midline shift estimation and the intracranial pressure (ICP) pre-screening system. To estimate the midline shift, first an estimation of the ideal midline is performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process by physicians and have shown promising results in the evaluation. In the second component, additional features related to ICP, such as texture information and blood amount, are extracted from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques including feature selection and classification, such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows potential usefulness of the model. The estimated ideal midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, so as to recommend for or against invasive ICP monitoring.
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
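The machine-learning component was built in RapidMiner; as a loose analogue, the scikit-learn sketch below chains feature selection and an SVM classifier on synthetic patient features to illustrate the pre-screening idea.

# Analogous scikit-learn sketch (not the study's RapidMiner workflow): feature
# selection followed by an SVM to pre-screen for elevated ICP. Data are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# Hypothetical feature matrix: midline shift, texture statistics, blood amount,
# age, injury severity score, ... (columns); one row per patient.
X = rng.normal(size=(120, 12))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 120) > 0).astype(int)  # elevated ICP?

model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=5),   # keep the 5 most informative features
                      SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")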
Establishment of Microbial Eukaryotic Enrichment Cultures from a Chemically Stratified Antarctic Lake and Assessment of Carbon Fixation Potential
Authors: Jenna M. Dolhi, Nicholas Ketchum, Rachael M. Morgan-Kiss.
Institutions: Miami University .
Lake Bonney is one of numerous permanently ice-covered lakes located in the McMurdo Dry Valleys, Antarctica. The perennial ice cover maintains a chemically stratified water column and, unlike other inland bodies of water, largely prevents external input of carbon and nutrients from streams. Biota are exposed to numerous environmental stresses, including year-round severe nutrient deficiency, low temperatures, extreme shade, hypersalinity, and 24-hour darkness during the winter 1. These extreme environmental conditions limit the biota in Lake Bonney almost exclusively to microorganisms 2. Single-celled microbial eukaryotes (called "protists") are important players in global biogeochemical cycling 3 and play important ecological roles in the cycling of carbon in the dry valley lakes, occupying both primary and tertiary roles in the aquatic food web. In the dry valley aquatic food web, protists that fix inorganic carbon (autotrophy) are the major producers of organic carbon for organotrophic organisms 4, 2. Phagotrophic or heterotrophic protists capable of ingesting bacteria and smaller protists act as the top predators in the food web 5. Last, an unknown proportion of the protist population is capable of combined mixotrophic metabolism 6, 7. Mixotrophy in protists involves the ability to combine photosynthetic capability with phagotrophic ingestion of prey microorganisms. This form of mixotrophy differs from mixotrophic metabolism in bacterial species, which generally involves uptake of dissolved carbon molecules. There are currently very few protist isolates from permanently ice-capped polar lakes, and studies of protist diversity and ecology in this extreme environment have been limited 8, 4, 9, 10, 5. A better understanding of protist metabolic versatility in the simple dry valley lake food web will aid in the development of models for the role of protists in the global carbon cycle. We employed an enrichment culture approach to isolate potentially phototrophic and mixotrophic protists from Lake Bonney. Sampling depths in the water column were chosen based on the location of primary production maxima and protist phylogenetic diversity 4, 11, as well as variability in major abiotic factors affecting protist trophic modes: shallow sampling depths are limited for major nutrients, while deeper sampling depths are limited by light availability. In addition, lake water samples were supplemented with multiple types of growth media to promote the growth of a variety of phototrophic organisms. RubisCO catalyzes the rate limiting step in the Calvin Benson Bassham (CBB) cycle, the major pathway by which autotrophic organisms fix inorganic carbon and provide organic carbon for higher trophic levels in aquatic and terrestrial food webs 12. In this study, we applied a radioisotope assay modified for filtered samples 13 to monitor maximum carboxylase activity as a proxy for carbon fixation potential and metabolic versatility in the Lake Bonney enrichment cultures.
Microbiology, Issue 62, Antarctic lake, McMurdo Dry Valleys, Enrichment cultivation, Microbial eukaryotes, RubisCO
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
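One simple way to probe categorical perception along the DHL is to fit a psychometric function to categorization responses across the morph continuum; the sketch below does this with invented data. It illustrates only the boundary-estimation step, not the discrimination tasks or neuroimaging analyses in the protocol.

# Illustrative logistic fit of "human" categorization rates along a morph continuum.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    # Proportion of 'human' responses as a function of morph level x (0-100%).
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph_level = np.linspace(0, 100, 11)                       # avatar -> human (%)
p_human = np.array([0.02, 0.03, 0.05, 0.10, 0.22, 0.55,
                    0.85, 0.94, 0.97, 0.98, 0.99])          # hypothetical responses

(x0, k), _ = curve_fit(logistic, morph_level, p_human, p0=[50, 0.1])
print(f"category boundary ~ {x0:.1f}% human-likeness, slope k = {k:.2f}")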
Isolation of Native Soil Microorganisms with Potential for Breaking Down Biodegradable Plastic Mulch Films Used in Agriculture
Authors: Graham Bailes, Margaret Lind, Andrew Ely, Marianne Powell, Jennifer Moore-Kucera, Carol Miles, Debra Inglis, Marion Brodhagen.
Institutions: Western Washington University, Washington State University Northwestern Research and Extension Center, Texas Tech University.
Fungi native to agricultural soils that colonized commercially available biodegradable mulch (BDM) films were isolated and assessed for potential to degrade plastics. Typically, when formulations of plastics are known and a source of the feedstock is available, powdered plastic can be suspended in agar-based media and degradation determined by visualization of clearing zones. However, this approach poorly mimics in situ degradation of BDMs. First, BDMs are not dispersed as small particles throughout the soil matrix. Secondly, BDMs are not sold commercially as pure polymers, but rather as films containing additives (e.g. fillers, plasticizers and dyes) that may affect microbial growth. The procedures described herein were used for isolates acquired from soil-buried mulch films. Fungal isolates acquired from excavated BDMs were tested individually for growth on pieces of new, disinfested BDMs laid atop defined medium containing no carbon source except agar. Isolates that grew on BDMs were further tested in liquid medium where BDMs were the sole added carbon source. After approximately ten weeks, fungal colonization and BDM degradation were assessed by scanning electron microscopy. Isolates were identified via analysis of ribosomal RNA gene sequences. This report describes methods for fungal isolation, but bacteria also were isolated using these methods by substituting media appropriate for bacteria. Our methodology should prove useful for studies investigating breakdown of intact plastic films or products for which plastic feedstocks are either unknown or not available. However our approach does not provide a quantitative method for comparing rates of BDM degradation.
Microbiology, Issue 75, Plant Biology, Environmental Sciences, Agricultural Sciences, Soil Science, Molecular Biology, Cellular Biology, Genetics, Mycology, Fungi, Bacteria, Microorganisms, Biodegradable plastic, biodegradable mulch, compostable plastic, compostable mulch, plastic degradation, composting, breakdown, soil, 18S ribosomal DNA, isolation, culture
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
A New Approach for the Comparative Analysis of Multiprotein Complexes Based on 15N Metabolic Labeling and Quantitative Mass Spectrometry
Authors: Kerstin Trompelt, Janina Steinbeck, Mia Terashima, Michael Hippler.
Institutions: University of Münster, Carnegie Institution for Science.
The introduced protocol provides a tool for the analysis of multiprotein complexes in the thylakoid membrane, by revealing insights into complex composition under different conditions. In this protocol the approach is demonstrated by comparing the composition of the protein complex responsible for cyclic electron flow (CEF) in Chlamydomonas reinhardtii, isolated from genetically different strains. The procedure comprises the isolation of thylakoid membranes, followed by their separation into multiprotein complexes by sucrose density gradient centrifugation, SDS-PAGE, immunodetection and comparative, quantitative mass spectrometry (MS) based on differential metabolic labeling (14N/15N) of the analyzed strains. Detergent-solubilized thylakoid membranes are loaded on sucrose density gradients at equal chlorophyll concentration. After ultracentrifugation, the gradients are separated into fractions, which are analyzed by mass spectrometry based on equal volume. This approach allows the investigation of the composition within the gradient fractions and, moreover, the analysis of the migration behavior of different proteins, especially focusing on ANR1, CAS, and PGRL1. Furthermore, the method is validated by confirming the results with immunoblotting and by supporting the findings from previous studies (the identification and PSI-dependent migration of proteins that were previously described to be part of the CEF-supercomplex such as PGRL1, FNR, and cyt f). Notably, this approach is applicable to a broad range of questions for which this protocol can be adopted, e.g., for comparative analyses of multiprotein complex composition isolated from distinct environmental conditions.
Microbiology, Issue 85, Sucrose density gradients, Chlamydomonas, multiprotein complexes, 15N metabolic labeling, thylakoids
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
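For readers unfamiliar with the DoE approach, the minimal sketch below builds a two-level full factorial design for three hypothetical factors and fits main and two-way interaction effects to a synthetic response. It illustrates the principle only; the study used software-guided optimal designs and stepwise augmentation rather than this simple full factorial.

# Minimal design-of-experiments sketch: two-level full factorial design and a
# least-squares fit of main and interaction effects (factors and data invented).
import itertools
import numpy as np

factors = ["temperature", "plant_age", "utr_variant"]
levels = [-1, +1]                                      # coded low/high levels
design = np.array(list(itertools.product(levels, repeat=len(factors))))

# Synthetic responses for the 8 runs (e.g., expression level per run).
rng = np.random.default_rng(5)
response = (10 + 3*design[:, 0] + 1.5*design[:, 1] - 2*design[:, 0]*design[:, 2]
            + rng.normal(0, 0.3, len(design)))

# Model matrix: intercept, main effects, and all two-way interactions.
interactions = [design[:, i] * design[:, j]
                for i, j in itertools.combinations(range(len(factors)), 2)]
X = np.column_stack([np.ones(len(design)), design] + interactions)
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

names = (["intercept"] + factors +
         [f"{factors[i]}x{factors[j]}" for i, j in itertools.combinations(range(3), 2)])
for n, c in zip(names, coef):
    print(f"{n:>22s}: {c:+.2f}")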
Quantification of Heavy Metals and Other Inorganic Contaminants on the Productivity of Microalgae
Authors: Katerine Napan, Derek Hess, Brian McNeil, Jason C. Quinn.
Institutions: Utah State University.
Increasing demand for renewable fuels has researchers investigating the feasibility of alternative feedstocks, such as microalgae. Inherent advantages include high potential yield, use of non-arable land and integration with waste streams. The nutrient requirements of a large-scale microalgae production system will require the coupling of cultivation systems with industrial waste resources, such as carbon dioxide from flue gas and nutrients from wastewater. Inorganic contaminants present in these wastes can potentially lead to bioaccumulation in microalgal biomass, negatively impacting productivity and limiting end use. This study focuses on the experimental evaluation of the impact and the fate of 14 inorganic contaminants (As, Cd, Co, Cr, Cu, Hg, Mn, Ni, Pb, Sb, Se, Sn, V and Zn) on Nannochloropsis salina growth. Microalgae were cultivated in photobioreactors illuminated at 984 µmol m-2 sec-1 and maintained at pH 7 in a growth medium polluted with inorganic contaminants at levels expected based on the composition found in commercial coal flue gas systems. Contaminants present in the biomass and the medium at the end of a 7 day growth period were analytically quantified through cold vapor atomic absorption spectrometry for Hg and through inductively coupled plasma mass spectrometry for As, Cd, Co, Cr, Cu, Mn, Ni, Pb, Sb, Se, Sn, V and Zn. Results show N. salina is a strain sensitive to the multi-metal environment, with a statistical decrease in biomass yield with the introduction of these contaminants. The techniques presented here are adequate for quantifying algal growth and determining the fate of inorganic contaminants.
Environmental Sciences, Issue 101, algae, heavy metals, Nannochloropsis salina, photobioreactor, flue gas, inductively coupled plasma mass spectrometry, ICPMS, cold vapor atomic absorption spectrometry, CVAAS
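The statistical comparison implied by the results (a decrease in biomass yield in the multi-metal treatment) can be illustrated with a simple two-sample test; the yields in the sketch below are invented and the choice of test is an assumption.

# Hedged sketch: comparing biomass yields of control vs. contaminated cultures.
import numpy as np
from scipy import stats

control = np.array([1.82, 1.75, 1.90, 1.78, 1.85])          # hypothetical g/L after 7 days
contaminated = np.array([1.41, 1.35, 1.52, 1.46, 1.38])

t, p = stats.ttest_ind(control, contaminated, equal_var=False)   # Welch's t-test
print(f"mean yield: control {control.mean():.2f} g/L vs. "
      f"contaminated {contaminated.mean():.2f} g/L (t = {t:.2f}, p = {p:.4f})")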

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.