JoVE Visualize
PubMed Article
Merging applicability domains for in silico assessment of chemical mutagenicity.
J Chem Inf Model
PUBLISHED: 02-14-2014
Using a benchmark Ames mutagenicity data set, we evaluated the performance of molecular fingerprints as descriptors for developing quantitative structure-activity relationship (QSAR) models and defining applicability domains with two machine-learning methods: random forest (RF) and variable nearest neighbor (v-NN). The two methods focus on complementary aspects of chemical mutagenicity and use different characteristics of the molecular fingerprints to achieve high levels of prediction accuracy. Thus, while RF flags mutagenic compounds using the presence or absence of small molecular fragments akin to structural alerts, the v-NN method uses molecular structural similarity as measured by fingerprint-based Tanimoto distances between molecules. We showed that the extended connectivity fingerprints could intuitively be used to define and quantify an applicability domain for either method. The importance of using applicability domains in QSAR modeling cannot be overstated; compounds that are outside the applicability domain do not have any close representative in the training set, and therefore, we cannot make reliable predictions for them. Using either approach, we developed highly robust models that rival the performance of a state-of-the-art proprietary software package. Importantly, based on the complementary approaches used by the methods, we showed that combining the model predictions raised the applicability domain coverage from roughly 80% to 90%. These results indicated that the proposed QSAR protocol constituted a highly robust chemical mutagenicity prediction model.
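The abstract above describes fingerprint-based QSAR modeling with a Tanimoto-distance applicability domain. As a purely illustrative sketch (not the authors' implementation), the snippet below builds Morgan/ECFP-style fingerprints with RDKit, trains a scikit-learn random forest, and flags query compounds whose nearest training-set Tanimoto similarity falls below an assumed cutoff; the SMILES, labels, and the 0.3 cutoff are all placeholders.

```python
# Illustrative sketch only: Morgan (ECFP-like) fingerprints, a random forest
# classifier, and a simple Tanimoto-similarity applicability-domain check.
# Training data here is a tiny placeholder, not the Ames benchmark set.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier
import numpy as np

train_smiles = ["c1ccccc1N", "CCO", "O=[N+]([O-])c1ccccc1", "CC(=O)O"]  # placeholders
train_labels = [1, 0, 1, 0]  # 1 = mutagenic, 0 = non-mutagenic (hypothetical labels)

def fingerprint(smiles, n_bits=2048, radius=2):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)

train_fps = [fingerprint(s) for s in train_smiles]
X_train = np.array([list(fp) for fp in train_fps])

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, train_labels)

def predict_with_domain(smiles, tanimoto_cutoff=0.3):
    """Return (prediction, in_domain); the query is flagged as outside the
    applicability domain when no training compound is sufficiently similar."""
    fp = fingerprint(smiles)
    similarities = [DataStructs.TanimotoSimilarity(fp, t) for t in train_fps]
    in_domain = max(similarities) >= tanimoto_cutoff
    pred = int(rf.predict(np.array([list(fp)]))[0])
    return pred, in_domain

print(predict_with_domain("c1ccc2ccccc2c1"))  # naphthalene, placeholder query
```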
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Published: 07-25-2013
ABSTRACT
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
23 Related JoVE Articles
The ITS2 Database
Authors: Benjamin Merget, Christian Koetschan, Thomas Hackl, Frank Förster, Thomas Dandekar, Tobias Müller, Jörg Schultz, Matthias Wolf.
Institutions: University of Würzburg, University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the very variable ITS2 sequence, it confined this marker to low-level phylogenetics only. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8. The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11, accurately reannotated10. Following an annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold12 (direct fold) results in a correct four-helix conformation. If this is not the case, the structure is predicted by homology modeling13. In homology modeling, an already known secondary structure is transferred to another ITS2 sequence whose secondary structure could not be folded correctly by a direct fold. The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14 search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16 and ProfDistS17 for multiple sequence-structure alignment calculation and Neighbor Joining18 tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure. In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse clicks, while additionally providing tools and data for comprehensive large-scale analyses.
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
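The ITS2 pipeline above accepts a direct minimum-energy fold only if it yields the canonical four-helix conformation, and otherwise falls back to homology modeling. The snippet below is a minimal illustration of that acceptance test on a dot-bracket secondary-structure string; the example structure is hypothetical, and the actual database relies on dedicated folding and homology-modeling tools.

```python
# Minimal, illustrative check for the "correct four-helix conformation" test
# described above. Real ITS2 structures come from MFE folding or homology
# modeling; the dot-bracket string here is just a hypothetical example.

def count_top_level_helices(dot_bracket: str) -> int:
    """Count helices branching directly off the outermost loop,
    i.e. transitions from pairing depth 0 to depth 1."""
    depth, helices = 0, 0
    for ch in dot_bracket:
        if ch == "(":
            if depth == 0:
                helices += 1
            depth += 1
        elif ch == ")":
            depth -= 1
    return helices

# Hypothetical predicted structure with four helices around a central loop:
example = "..((((...))))..((((....))))..(((...)))..((((......))))."
if count_top_level_helices(example) == 4:
    print("direct fold accepted (four-helix conformation)")
else:
    print("fall back to homology modeling")
```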
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Authors: Marc N. Coutanche, Sharon L. Thompson-Schill.
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. fMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions-of-interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
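Informational connectivity, as described above, correlates the time courses of multi-voxel discriminability between regions. The sketch below illustrates the idea on synthetic data with a simple prototype-correlation discriminability score; it is a conceptual stand-in, not the authors' analysis code, and omits cross-validation and haemodynamic modelling.

```python
# Illustrative informational-connectivity sketch on synthetic data.
# For each region we compute a per-time-point "MVP discriminability" score
# (how much the current pattern resembles its own condition prototype rather
# than the other condition's prototype), then correlate those score time
# courses between regions. Everything below is a simplified stand-in.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 120, 50
conditions = rng.integers(0, 2, n_timepoints)          # condition label per time point

# Synthetic multi-voxel patterns for two regions (placeholders for real fMRI data).
region_a = rng.normal(size=(n_timepoints, n_voxels)) + conditions[:, None] * 0.5
region_b = rng.normal(size=(n_timepoints, n_voxels)) + conditions[:, None] * 0.5

def discriminability(patterns, labels):
    """Per-time-point evidence: correlation with the own-condition prototype
    minus correlation with the other condition's prototype (leave-one-out
    prototypes omitted here for brevity)."""
    protos = {c: patterns[labels == c].mean(axis=0) for c in (0, 1)}
    scores = []
    for t in range(len(labels)):
        own = np.corrcoef(patterns[t], protos[labels[t]])[0, 1]
        other = np.corrcoef(patterns[t], protos[1 - labels[t]])[0, 1]
        scores.append(own - other)
    return np.array(scores)

ic = np.corrcoef(discriminability(region_a, conditions),
                 discriminability(region_b, conditions))[0, 1]
print(f"informational connectivity (illustrative): r = {ic:.2f}")
```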
Assessment of Immunologically Relevant Dynamic Tertiary Structural Features of the HIV-1 V3 Loop Crown R2 Sequence by ab initio Folding
Authors: David Almond, Timothy Cardozo.
Institutions: School of Medicine, New York University.
The antigenic diversity of HIV-1 has long been an obstacle to vaccine design, and this variability is especially pronounced in the V3 loop of the virus' surface envelope glycoprotein. We previously proposed that the crown of the V3 loop, although dynamic and sequence variable, is constrained throughout the population of HIV-1 viruses to an immunologically relevant β-hairpin tertiary structure. Importantly, there are thousands of different V3 loop crown sequences in circulating HIV-1 viruses, making 3D structural characterization of trends across the diversity of viruses difficult or impossible by crystallography or NMR. Our previous successful studies with folding of the V3 crown1, 2 used the ab initio algorithm 3 accessible in the ICM-Pro molecular modeling software package (Molsoft LLC, La Jolla, CA) and suggested that the crown of the V3 loop, specifically from positions 10 to 22, benefits sufficiently from the flexibility and length of its flanking stems to behave to a large degree as if it were an unconstrained peptide freely folding in solution. As such, rapid ab initio folding of just this portion of the V3 loop of any individual strain of the 60,000+ circulating HIV-1 strains can be informative. Here, we folded the V3 loop of the R2 strain to gain insight into the structural basis of its unique properties. R2 bears a rare V3 loop sequence thought to be responsible for the exquisite sensitivity of this strain to neutralization by patient sera and monoclonal antibodies4, 5. The strain mediates CD4-independent infection and appears to elicit broadly neutralizing antibodies. We demonstrate how evaluation of the results of the folding can be informative for associating observed structures in the folding with the immunological activities observed for R2.
Infection, Issue 43, HIV-1, structure-activity relationships, ab initio simulations, antibody-mediated neutralization, vaccine design
Operant Sensation Seeking in the Mouse
Authors: Christopher M. Olsen, Danny G. Winder.
Institutions: Vanderbilt University Medical Center.
Operant methods are powerful behavioral tools for the study of motivated behavior. These 'self-administration' methods have been used extensively in drug addiction research due to their high construct validity. Operant studies provide researchers a tool for preclinical investigation of several aspects of the addiction process. For example, mechanisms of acute reinforcement (both drug and non-drug) can be tested using pharmacological or genetic tools to determine the ability of a molecular target to influence self-administration behavior1-6. Additionally, drug or food seeking behaviors can be studied in the absence of the primary reinforcer, and the ability of pharmacological compounds to disrupt this process is a preclinical model for discovery of molecular targets and compounds that may be useful for the treatment of addiction3,7-9. One problem with performing intravenous drug self-administration studies in the mouse is the technical difficulty of maintaining catheter patency. Attrition rates in these experiments are high and can reach 40% or higher10-15. Another general problem with drug self-administration is discerning which pharmacologically-induced effects of the reinforcer produce specific behaviors. For example, measurement of the reinforcing and neurological effects of psychostimulants can be confounded by their psychomotor effects. Operant methods using food reinforcement can avoid these pitfalls, although their utility in studying drug addiction is limited by the fact that some manipulations that alter drug self-administration have a minimal impact on food self-administration. For example, mesolimbic dopamine lesion or knockout of the D1 dopamine receptor reduce cocaine self-administration without having a significant impact on food self-administration 12,16. Sensory stimuli have been described for their ability to support operant responding as primary reinforcers (i.e. not conditioned reinforcers)17-22. Auditory and visual stimuli are self-administered by several species18,21,23, although surprisingly little is known about the neural mechanisms underlying this reinforcement. The operant sensation seeking (OSS) model is a robust model for obtaining sensory self-administration in the mouse, allowing the study of neural mechanisms important in sensory reinforcement24. An additional advantage of OSS is the ability to screen mutant mice for differences in operant behavior that may be relevant to addiction. We have reported that dopamine D1 receptor knockout mice, previously shown to be deficient in psychostimulant self-administration, also fail to acquire OSS24. This is a unique finding in that these mice are capable of learning an operant task when food is used as a reinforcer. While operant studies using food reinforcement can be useful in the study of general motivated behavior and the mechanisms underlying food reinforcement, as mentioned above, these studies are limited in their application to studying molecular mechanisms of drug addiction. Thus, there may be similar neural substrates mediating sensory and psychostimulant reinforcement that are distinct from food reinforcement, which would make OSS a particularly attractive model for the study of drug addiction processes. The degree of overlap between other molecular targets of OSS and drug reinforcers is unclear, but is a topic that we are currently pursuing. While some aspects of addiction such as resistance to extinction may be observed with OSS, we have found that escalation 25 is not observed in this model24. 
Interestingly, escalation of intake and some other aspects of addiction are observed with self-administration of sucrose26. Thus, when non-drug operant procedures are desired to study addiction-related processes, food or sensory reinforcers can be chosen to best fit the particular question being asked. In conclusion, both food self-administration and OSS in the mouse have the advantage of not requiring an intravenous catheter, which allows a higher throughput means to study the effects of pharmacological or genetic manipulation of neural targets involved in motivation. While operant testing using food as a reinforcer is particularly useful in the study of the regulation of food intake, OSS is particularly apt for studying reinforcement mechanisms of sensory stimuli and may have broad applicability to novelty seeking and addiction.
Neuroscience, Issue 45, novelty seeking, self-administration, addiction, motivation, reinforcement
Transgenic Rodent Assay for Quantifying Male Germ Cell Mutant Frequency
Authors: Jason M. O'Brien, Marc A. Beal, John D. Gingerich, Lynda Soper, George R. Douglas, Carole L. Yauk, Francesco Marchetti.
Institutions: Environmental Health Centre.
De novo mutations arise mostly in the male germline and may contribute to adverse health outcomes in subsequent generations. Traditional methods for assessing the induction of germ cell mutations require the use of large numbers of animals, making them impractical. As such, germ cell mutagenicity is rarely assessed during chemical testing and risk assessment. Herein, we describe an in vivo male germ cell mutation assay using a transgenic rodent model that is based on a recently approved Organisation for Economic Co-operation and Development (OECD) test guideline. This method uses an in vitro positive selection assay to measure in vivo mutations induced in a transgenic λgt10 vector bearing a reporter gene directly in the germ cells of exposed males. We further describe how the detection of mutations in the transgene recovered from germ cells can be used to characterize the stage-specific sensitivity of the various spermatogenic cell types to mutagen exposure by controlling three experimental parameters: the duration of exposure (administration time), the time between exposure and sample collection (sampling time), and the cell population collected for analysis. Because a large number of germ cells can be assayed from a single male, this method has superior sensitivity compared with traditional methods, requires fewer animals and therefore much less time and resources.
Genetics, Issue 90, sperm, spermatogonia, male germ cells, spermatogenesis, de novo mutation, OECD TG 488, transgenic rodent mutation assay, N-ethyl-N-nitrosourea, genetic toxicology
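The assay above ultimately reports a mutant frequency per animal, conventionally computed as mutant plaque-forming units divided by total plaque-forming units screened. The snippet below shows only that arithmetic on fabricated counts.

```python
# Illustrative mutant-frequency calculation; counts are hypothetical examples,
# not data from the assay described above.
samples = {
    # animal_id: (mutant plaque-forming units, total plaque-forming units screened)
    "control_1": (12, 450_000),
    "control_2": (9, 380_000),
    "treated_1": (85, 410_000),
    "treated_2": (102, 520_000),
}

for animal, (mutants, total) in samples.items():
    frequency = mutants / total
    print(f"{animal}: mutant frequency = {frequency:.2e}")
```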
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
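The abstract above computes γ from congener concentrations, fish masses, and the amount of food consumed. The sketch below illustrates one simple mass-balance form of that calculation (congener mass gained by the trout divided by congener mass consumed in prey) using entirely hypothetical numbers; it is not the authors' exact formula.

```python
# Illustrative calculation of net trophic transfer efficiency (gamma) for one
# PCB congener in one tank: congener mass gained by the lake trout divided by
# congener mass consumed in prey. All numbers are hypothetical placeholders.

trout_mass_start_g = 4_000.0          # summed lake trout mass at start
trout_mass_end_g = 5_500.0            # summed lake trout mass at end
conc_start_ng_per_g = 40.0            # congener concentration in trout at start
conc_end_ng_per_g = 95.0              # congener concentration in trout at end
food_eaten_g = 9_000.0                # total bloater mass fed over 135 days
conc_food_ng_per_g = 60.0             # congener concentration in bloater prey

congener_gained_ng = (trout_mass_end_g * conc_end_ng_per_g
                      - trout_mass_start_g * conc_start_ng_per_g)
congener_consumed_ng = food_eaten_g * conc_food_ng_per_g

gamma = congener_gained_ng / congener_consumed_ng
print(f"net trophic transfer efficiency (illustrative): {gamma:.2f}")
```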
Development of a Virtual Reality Assessment of Everyday Living Skills
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia and these impairments predict poor long term psychosocial outcomes.  Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements.  Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real world functioning.  Current data do not support the recommendation of any single instrument for measurement of functional capacity.  The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
Mouse Genome Engineering Using Designer Nucleases
Authors: Mario Hermann, Tomas Cermak, Daniel F. Voytas, Pawel Pelczar.
Institutions: University of Zurich, University of Minnesota.
Transgenic mice carrying site-specific genome modifications (knockout, knock-in) are of vital importance for dissecting complex biological systems as well as for modeling human diseases and testing therapeutic strategies. Recent advances in the use of designer nucleases such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) 9 system for site-specific genome engineering open the possibility to perform rapid targeted genome modification in virtually any laboratory species without the need to rely on embryonic stem (ES) cell technology. A genome editing experiment typically starts with identification of designer nuclease target sites within a gene of interest followed by construction of custom DNA-binding domains to direct nuclease activity to the investigator-defined genomic locus. Designer nuclease plasmids are in vitro transcribed to generate mRNA for microinjection of fertilized mouse oocytes. Here, we provide a protocol for achieving targeted genome modification by direct injection of TALEN mRNA into fertilized mouse oocytes.
Genetics, Issue 86, Oocyte microinjection, Designer nucleases, ZFN, TALEN, Genome Engineering
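A genome-editing experiment, as noted above, begins with identifying designer nuclease target sites in the gene of interest. The snippet below is a deliberately simplified scan for TALEN-style paired half-sites (each preceded by a 5' T, separated by an assumed 14-18 bp spacer) in a hypothetical sequence; real designs should use dedicated design software and scoring rules.

```python
# Simplified, illustrative TALEN target-site scan. A common design convention
# requires a T immediately 5' of each half-site (an A on the sense strand marks
# the T preceding the reverse-strand half-site) and a spacer of roughly
# 14-18 bp between half-sites. This only enumerates candidate geometries in a
# hypothetical sequence and is not a substitute for dedicated design tools.

def find_talen_sites(seq, half_site=16, spacer_range=(14, 18)):
    seq = seq.upper()
    hits = []
    for spacer in range(spacer_range[0], spacer_range[1] + 1):
        window = 1 + half_site + spacer + half_site + 1
        for i in range(len(seq) - window + 1):
            left_t = seq[i] == "T"                     # T preceding left half-site
            right_a = seq[i + window - 1] == "A"       # A marks T on reverse strand
            if left_t and right_a:
                hits.append((i, spacer,
                             seq[i + 1:i + 1 + half_site],                     # left half-site
                             seq[i + 1 + half_site + spacer:i + window - 1]))  # right half-site
    return hits

example = "TACGTGACCTGAAGGTCTAGCATGCATCGATCGTTAGGCTAACCGGTTAACCGGATCCA"  # hypothetical
for start, spacer, left, right in find_talen_sites(example):
    print(f"pos {start}, spacer {spacer} bp: {left} ... {right}")
```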
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions is regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the appropriate purity and amount for the subsequent in-depth analysis by Mass Spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTMs cross-talks
A Multi-Modal Approach to Assessing Recovery in Youth Athletes Following Concussion
Authors: Nick Reed, James Murphy, Talia Dick, Katie Mah, Melissa Paniccia, Lee Verweel, Danielle Dobney, Michelle Keightley.
Institutions: Holland Bloorview Kids Rehabilitation Hospital, University of Toronto, University of Toronto.
Concussion is one of the most commonly reported injuries amongst children and youth involved in sport participation. Following a concussion, youth can experience a range of short and long term neurobehavioral symptoms (somatic, cognitive and emotional/behavioral) that can have a significant impact on one’s participation in daily activities and pursuits of interest (e.g., school, sports, work, family/social life, etc.). Despite this, there remains a paucity of clinically driven research aimed specifically at exploring concussion within the youth sport population, and more specifically, at multi-modal approaches to measuring recovery. This article provides an overview of a novel and multi-modal approach to measuring recovery amongst youth athletes following concussion. The presented approach involves the use of both pre-injury/baseline testing and post-injury/follow-up testing to assess performance across a wide variety of domains (post-concussion symptoms, cognition, balance, strength, agility/motor skills and resting state heart rate variability). The goal of this research is to gain a more objective and accurate understanding of recovery following concussion in youth athletes (ages 10-18 years). Findings from this research can help to inform the development and use of improved approaches to concussion management and rehabilitation specific to the youth sport community.
Medicine, Issue 91, concussion, children, youth, athletes, assessment, management, rehabilitation
Concurrent Quantitative Conductivity and Mechanical Properties Measurements of Organic Photovoltaic Materials using AFM
Authors: Maxim P. Nikiforov, Seth B. Darling.
Institutions: Argonne National Laboratory, University of Chicago.
Organic photovoltaic (OPV) materials are inherently inhomogeneous at the nanometer scale. Nanoscale inhomogeneity of OPV materials affects the performance of photovoltaic devices. Thus, understanding spatial variations in the composition as well as the electrical properties of OPV materials is of paramount importance for moving PV technology forward.1,2 In this paper, we describe a protocol for quantitative measurements of electrical and mechanical properties of OPV materials with sub-100 nm resolution. Currently, materials properties measurements performed using commercially available AFM-based techniques (PeakForce, conductive AFM) generally provide only qualitative information. The values for resistance as well as Young's modulus measured using our method on the prototypical ITO/PEDOT:PSS/P3HT:PC61BM system correspond well with literature data. The P3HT:PC61BM blend separates into PC61BM-rich and P3HT-rich domains. Mechanical properties of PC61BM-rich and P3HT-rich domains are different, which allows for domain attribution on the surface of the film. Importantly, combining mechanical and electrical data allows for correlation of the domain structure on the surface of the film with the variation in electrical properties measured through the thickness of the film.
Materials Science, Issue 71, Nanotechnology, Mechanical Engineering, Electrical Engineering, Computer Science, Physics, electrical transport properties in solids, condensed matter physics, thin films (theory, deposition and growth), conductivity (solid state), AFM, atomic force microscopy, electrical properties, mechanical properties, organic photovoltaics, microengineering, photovoltaics
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
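One simple way to estimate an apparent dissociation constant from DSF data (one of several approaches, and not necessarily the exact models used above) is to fit the melting temperature as a function of ligand concentration with an empirical single-site saturation curve. The sketch below does this with SciPy on fabricated data.

```python
# Illustrative fit of an empirical single-site saturation model to DSF melting
# temperatures: Tm([L]) = Tm0 + dTm_max * [L] / (Kd_app + [L]).
# This is a simplified stand-in for the models discussed above, and the data
# points are fabricated for demonstration only.
import numpy as np
from scipy.optimize import curve_fit

ligand_uM = np.array([0, 5, 10, 25, 50, 100, 250, 500, 1000], dtype=float)
tm_C      = np.array([52.1, 52.9, 53.6, 55.0, 56.2, 57.4, 58.6, 59.1, 59.4])

def single_site(L, tm0, dtm_max, kd_app):
    return tm0 + dtm_max * L / (kd_app + L)

popt, pcov = curve_fit(single_site, ligand_uM, tm_C, p0=(52.0, 8.0, 50.0))
tm0, dtm_max, kd_app = popt
kd_err = np.sqrt(np.diag(pcov))[2]
print(f"apparent Kd = {kd_app:.0f} +/- {kd_err:.0f} uM (illustrative)")
```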
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
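Pipelines like the one described above are often implemented in open-source packages such as MNE-Python. The outline below is a generic minimum-norm source-estimation sketch under assumed file names, subject IDs, and event codes; it is not the London Baby Lab pipeline itself.

```python
# Generic MNE-Python outline for minimum-norm source estimation of EEG data
# using an individual structural MRI, as described above. File names, subject
# IDs, and event codes are hypothetical placeholders.
import mne

subjects_dir = "/data/mri/freesurfer"      # hypothetical FreeSurfer directory
subject = "child_01"                       # hypothetical subject ID

raw = mne.io.read_raw_fif("child_01_eeg_raw.fif", preload=True)  # hypothetical file
raw.filter(0.1, 40.0)
raw.set_eeg_reference("average", projection=True)

events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, event_id={"visual": 1}, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)
evoked = epochs.average()

# Head model from the individual MRI (three-layer BEM for EEG).
src = mne.setup_source_space(subject, spacing="oct6", subjects_dir=subjects_dir)
bem = mne.make_bem_solution(mne.make_bem_model(subject, subjects_dir=subjects_dir))
fwd = mne.make_forward_solution(evoked.info, trans="child_01-trans.fif",
                                src=src, bem=bem, eeg=True, meg=False)

# Noise covariance from the pre-stimulus baseline, then minimum-norm estimate.
noise_cov = mne.compute_covariance(epochs, tmax=0.0)
inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="MNE")
print(stc)
```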
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules, which are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which tells how accurate the predictions are without knowing the experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information could be collected by the users based on experimental evidence or biological insights with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as the best program for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy
Authors: Matthew Rames, Yadong Yu, Gang Ren.
Institutions: The Molecular Foundry.
Structural determination of proteins is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa1,2, a robust and high-throughput method with nanometer-resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach which has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol3. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution and high-contrast images are even favorable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through the method of electron tomography4,5. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples6. Considering that cryo-EM rarely succeeds in imaging proteins of less than 200 kDa, and has yet to be used in any published study involving the screening of over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics and mechanisms.
Environmental Sciences, Issue 90, small and asymmetric protein structure, electron microscopy, optimized negative staining
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
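Dedicated DoE software was used in the study above; as a minimal illustration of the underlying idea, the snippet below builds a coded two-level full-factorial design for three hypothetical factors and fits a main-effects linear model to fabricated expression responses.

```python
# Minimal illustration of a two-level full-factorial design and a linear
# main-effects fit. Factors, levels, and responses are fabricated examples;
# the study above used dedicated DoE software and more elaborate designs.
import itertools
import numpy as np

factors = ["incubation_temp", "plant_age", "promoter_strength"]  # hypothetical
levels = [-1, +1]                      # coded low/high levels

design = np.array(list(itertools.product(levels, repeat=len(factors))), float)
print("design matrix (coded units):\n", design)

# Fabricated responses, e.g. relative DsRed fluorescence for each run.
response = np.array([1.0, 1.4, 0.9, 1.5, 1.8, 2.6, 1.7, 2.9])

# Least-squares fit of intercept + main effects.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, beta in zip(["intercept"] + factors, coef):
    print(f"{name}: {beta:+.2f}")
```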
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
Utilization of Microscale Silicon Cantilevers to Assess Cellular Contractile Function In Vitro
Authors: Alec S.T. Smith, Christopher J. Long, Christopher McAleer, Nathaniel Bobbitt, Balaji Srinivasan, James J. Hickman.
Institutions: University of Central Florida.
The development of more predictive and biologically relevant in vitro assays is predicated on the advancement of versatile cell culture systems which facilitate the functional assessment of the seeded cells. To that end, microscale cantilever technology offers a platform with which to measure the contractile functionality of a range of cell types, including skeletal, cardiac, and smooth muscle cells, through assessment of contraction induced substrate bending. Application of multiplexed cantilever arrays provides the means to develop moderate to high-throughput protocols for assessing drug efficacy and toxicity, disease phenotype and progression, as well as neuromuscular and other cell-cell interactions. This manuscript provides the details for fabricating reliable cantilever arrays for this purpose, and the methods required to successfully culture cells on these surfaces. Further description is provided on the steps necessary to perform functional analysis of contractile cell types maintained on such arrays using a novel laser and photo-detector system. The representative data provided highlights the precision and reproducible nature of the analysis of contractile function possible using this system, as well as the wide range of studies to which such technology can be applied. Successful widespread adoption of this system could provide investigators with the means to perform rapid, low cost functional studies in vitro, leading to more accurate predictions of tissue performance, disease development and response to novel therapeutic treatment.
Bioengineering, Issue 92, cantilever, in vitro, contraction, skeletal muscle, NMJ, cardiomyocytes, functional
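Contractile function on cantilever arrays is inferred from substrate bending. As a hedged illustration only (not the calibration used in this protocol), the snippet below converts a measured tip deflection to an equivalent end-point force using the textbook end-loaded Euler-Bernoulli beam formula with hypothetical cantilever dimensions.

```python
# Illustrative conversion of measured cantilever tip deflection to an
# equivalent end-point force, treating the cantilever as an end-loaded
# Euler-Bernoulli beam: F = 3*E*I*delta / L^3, with I = w*t^3/12.
# Dimensions and deflection are hypothetical; published protocols often apply
# film-stress corrections (e.g. modified Stoney analysis) instead.

E = 170e9            # Young's modulus of silicon, Pa (approximate)
L = 750e-6           # cantilever length, m (hypothetical)
w = 100e-6           # cantilever width, m (hypothetical)
t = 4e-6             # cantilever thickness, m (hypothetical)
delta = 2.5e-6       # measured tip deflection, m (hypothetical)

I = w * t**3 / 12.0                  # second moment of area
force_N = 3.0 * E * I * delta / L**3
print(f"equivalent end-point force: {force_N * 1e6:.2f} uN")
```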
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
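The image-processing workflow above generates binarized and skeletonized representations of the TATS network for quantification. The snippet below sketches that step generically with scikit-image on a synthetic image; it is not the authors' optimized workflow.

```python
# Generic illustration of the binarize-and-skeletonize step described above,
# using scikit-image on a synthetic image; the published workflow uses its own
# optimized preprocessing of confocal/superresolution TATS images.
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.morphology import skeletonize

# Synthetic "tubule" image: a grid of bright lines on a noisy background.
img = np.random.default_rng(0).normal(0.1, 0.05, (128, 128))
img[::16, :] += 1.0      # horizontal tubule-like stripes
img[:, ::16] += 1.0      # vertical tubule-like stripes
img = gaussian(img, sigma=1)

binary = img > threshold_otsu(img)          # binarization
skeleton = skeletonize(binary)              # one-pixel-wide network skeleton

# Simple network metrics: binary area fraction and total skeleton length (pixels).
density = binary.mean()
skeleton_length_px = int(skeleton.sum())
print(f"binary area fraction: {density:.3f}, skeleton length: {skeleton_length_px} px")
```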
Proper Care and Cleaning of the Microscope
Authors: Victoria Centonze Frohlich.
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
Keeping the microscope optics clean is important for high-quality imaging. Dust, fingerprints, excess immersion oil, or mounting medium on or in a microscope causes reduction in contrast and resolution. DIC is especially sensitive to contamination and scratches on the lens surfaces. This protocol details the procedure for keeping the microscope clean.
Basic Protocols, Issue 18, Current Protocols Wiley, Microscopy, Cleaning the Microscope
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and intracranial pressure (ICP) pre-screening. To estimate the midline shift, an estimate of the ideal midline is first obtained based on the symmetry of the skull and anatomical features in the brain CT scan. Then, the ventricles are segmented from the CT scan and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP, such as texture information and blood amount from the CT scans, together with other recorded features, such as age and injury severity score, are extracted and incorporated to estimate ICP. Machine learning techniques, including feature selection and classification with methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend invasive ICP monitoring.
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
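The ICP pre-screening component above feeds CT-derived and clinical features into a machine-learning classifier (implemented by the authors in RapidMiner). The sketch below mirrors that general idea with a scikit-learn SVM on fabricated features and labels; it is not the published model.

```python
# Illustrative ICP pre-screening classifier on fabricated features, mirroring
# the general approach described above (feature extraction followed by an SVM).
# The original work used RapidMiner with texture, blood-volume, midline-shift,
# age, and injury-severity features; the data below are random placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_patients = 80
# Columns: midline shift (mm), estimated blood volume (ml), texture score,
# age (years), injury severity score -- all fabricated.
X = np.column_stack([
    rng.normal(3, 2, n_patients),
    rng.normal(10, 5, n_patients),
    rng.normal(0, 1, n_patients),
    rng.integers(18, 80, n_patients),
    rng.integers(5, 40, n_patients),
])
# Fabricated label: 1 = elevated ICP, loosely tied to shift and blood volume.
y = ((X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 1, n_patients)) > 5).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy (illustrative): {scores.mean():.2f}")
```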
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitative and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators in any level of a learning program. Nursing provides a unique challenge and opportunity as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply does not contain any content relevant to the topic of a given abstract. In these cases, our algorithms still display the most relevant videos they can find, which can sometimes result in matched videos with only a slight relation.
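JoVE has not published the details of this matching algorithm. Purely as a conceptual stand-in, the snippet below ranks placeholder video descriptions against a query abstract by TF-IDF cosine similarity with scikit-learn.

```python
# Conceptual stand-in for abstract-to-video matching: rank video descriptions
# by TF-IDF cosine similarity to a query abstract. This is NOT JoVE's actual
# algorithm; all texts below are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = [
    "Protein structure prediction with an online modeling server.",
    "Operant self-administration methods for studying addiction in mice.",
    "Negative staining electron microscopy of small proteins.",
]
query_abstract = "We predict tertiary protein structure and function from sequence."

vectorizer = TfidfVectorizer(stop_words="english")
video_matrix = vectorizer.fit_transform(video_descriptions)
query_vec = vectorizer.transform([query_abstract])
similarities = cosine_similarity(query_vec, video_matrix).ravel()

for rank, idx in enumerate(similarities.argsort()[::-1], start=1):
    print(f"{rank}. score={similarities[idx]:.2f} :: {video_descriptions[idx]}")
```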