JoVE Visualize
 
PubMed Article
The long and winding road to uncertainty: the link between spatial distance and feelings of uncertainty.
PLoS ONE
PUBLISHED: 03-11-2015
Construal Level Theory (CLT) [1] defines psychological distance in terms of any object, event, or person that cannot be experienced by the self in the here and now. The goal of the present research was to demonstrate that feelings of uncertainty are closely linked to the concept of psychological distance. Two experiments tested the assumption that spatial distance and uncertainty are bidirectionally related. The first experiment shows that perceived spatial distance leads to a feeling of uncertainty; the second reveals that a feeling of uncertainty leads to a perception of greater distance. By demonstrating that distance is closely tied to uncertainty, the present research extends previous work on both constructs and accounts for previously unexplained findings within CLT. Implications of these findings, such as the role of uncertainty within CLT, are discussed.
Related JoVE Video
Authors: Daniel E. Bradford, Katherine P. Magruder, Rachel A. Korhumel, John J. Curtin.
Published: 09-12-2014
ABSTRACT
Fear of certain threat and anxiety about uncertain threat are distinct emotions with unique behavioral, cognitive-attentional, and neuroanatomical components. Both anxiety and fear can be studied in the laboratory by measuring the potentiation of the startle reflex. The startle reflex is a defensive reflex that is potentiated when an organism is threatened and the need for defense is high. The startle reflex is elicited by brief, intense bursts of acoustic white noise (i.e., “startle probes”) and assessed via electromyography (EMG) of the orbicularis oculi muscle. Startle potentiation is calculated as the increase in startle response magnitude during presentation of sets of visual threat cues that signal delivery of mild electric shock relative to sets of matched cues that signal the absence of shock (no-threat cues). In the Threat Probability Task, fear is measured via startle potentiation to high-probability (100% cue-contingent shock; certain) threat cues, whereas anxiety is measured via startle potentiation to low-probability (20% cue-contingent shock; uncertain) threat cues. Measurement of startle potentiation during the Threat Probability Task provides an objective and easily implemented alternative to assessment of negative affect via self-report or other methods (e.g., neuroimaging) that may be inappropriate or impractical for some researchers. Startle potentiation has been studied rigorously in both animals (e.g., rodents, non-human primates) and humans, which facilitates animal-to-human translational research. Startle potentiation during certain and uncertain threat provides an objective measure of negative affect and of the distinct emotional states of fear and anxiety for use in research on psychopathology, substance use/abuse, and affective science broadly. As such, it has been used extensively by clinical scientists interested in the etiology of psychopathology and by affective scientists interested in individual differences in emotion.
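To make the difference-score logic concrete, the sketch below computes startle potentiation as the increase in mean startle EMG magnitude during threat cues relative to matched no-threat cues, separately for certain (100%) and uncertain (20%) threat. The array names and values are hypothetical placeholders; this is a minimal sketch of the calculation described above, not the authors' analysis pipeline.

```python
import numpy as np

# Hypothetical peak-to-peak startle EMG magnitudes (microvolts) per probe,
# grouped by the cue condition during which the probe was delivered.
no_threat        = np.array([42.0, 55.3, 48.1, 50.7, 46.2])
certain_threat   = np.array([88.5, 92.1, 79.4, 95.0, 84.3])   # 100% cue-contingent shock
uncertain_threat = np.array([70.2, 64.8, 75.5, 68.9, 72.1])   # 20% cue-contingent shock

def startle_potentiation(threat, no_threat):
    """Increase in mean startle magnitude during threat vs. no-threat cues."""
    return threat.mean() - no_threat.mean()

fear    = startle_potentiation(certain_threat, no_threat)     # certain threat -> fear
anxiety = startle_potentiation(uncertain_threat, no_threat)   # uncertain threat -> anxiety
print(f"Fear (certain threat) potentiation:      {fear:.1f} uV")
print(f"Anxiety (uncertain threat) potentiation: {anxiety:.1f} uV")
```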
20 Related JoVE Articles!
Measuring the Mechanical Properties of Living Cells Using Atomic Force Microscopy
Authors: Gawain Thomas, Nancy A. Burnham, Terri Anne Camesano, Qi Wen.
Institutions: Worcester Polytechnic Institute.
Mechanical properties of cells and extracellular matrix (ECM) play important roles in many biological processes including stem cell differentiation, tumor formation, and wound healing. Changes in stiffness of cells and ECM are often signs of changes in cell physiology or diseases in tissues. Hence, cell stiffness is an index to evaluate the status of cell cultures. Among the multitude of methods applied to measure the stiffness of cells and tissues, micro-indentation using an Atomic Force Microscope (AFM) provides a way to reliably measure the stiffness of living cells. This method has been widely applied to characterize the micro-scale stiffness for a variety of materials ranging from metal surfaces to soft biological tissues and cells. The basic principle of this method is to indent a cell with an AFM tip of selected geometry and measure the applied force from the bending of the AFM cantilever. Fitting the force-indentation curve to the Hertz model for the corresponding tip geometry can give quantitative measurements of material stiffness. This paper demonstrates the procedure to characterize the stiffness of living cells using AFM. Key steps including the process of AFM calibration, force-curve acquisition, and data analysis using a MATLAB routine are demonstrated. Limitations of this method are also discussed.
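For illustration, the sketch below fits a force-indentation curve to the Hertz model for a spherical indenter, F = (4/3)·E/(1-v^2)·sqrt(R)·d^(3/2), to extract an apparent Young's modulus. The tip radius, Poisson ratio, and data are hypothetical placeholders, and this is not the MATLAB routine demonstrated in the article.

```python
import numpy as np
from scipy.optimize import curve_fit

R  = 2.5e-6    # tip radius (m), hypothetical spherical indenter
nu = 0.5       # Poisson ratio assumed for a nearly incompressible cell

def hertz_sphere(delta, E):
    """Hertz force (N) for indentation depth delta (m) and Young's modulus E (Pa)."""
    return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

# Hypothetical force-indentation data extracted from an AFM approach curve.
delta = np.linspace(0, 500e-9, 50)                                          # indentation (m)
force = hertz_sphere(delta, 1.2e3) + 5e-11 * np.random.randn(delta.size)    # ~1.2 kPa + noise

E_fit, _ = curve_fit(hertz_sphere, delta, force, p0=[1e3])
print(f"Apparent Young's modulus: {E_fit[0]:.0f} Pa")
```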
Biophysics, Issue 76, Bioengineering, Cellular Biology, Molecular Biology, Physics, Chemical Engineering, Biomechanics, bioengineering (general), AFM, cell stiffness, microindentation, force spectroscopy, atomic force microscopy, microscopy
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Visualizing Protein-DNA Interactions in Live Bacterial Cells Using Photoactivated Single-molecule Tracking
Authors: Stephan Uphoff, David J. Sherratt, Achillefs N. Kapanidis.
Institutions: University of Oxford.
Protein-DNA interactions are at the heart of many fundamental cellular processes. For example, DNA replication, transcription, repair, and chromosome organization are governed by DNA-binding proteins that recognize specific DNA structures or sequences. In vitro experiments have helped to generate detailed models for the function of many types of DNA-binding proteins, yet the exact mechanisms of these processes and their organization in the complex environment of the living cell remain far less understood. We recently introduced a method for quantifying DNA-repair activities in live Escherichia coli cells using Photoactivated Localization Microscopy (PALM) combined with single-molecule tracking. Our general approach identifies individual DNA-binding events by the change in the mobility of a single protein upon association with the chromosome. The fraction of bound molecules provides a direct quantitative measure of protein activity and of the abundance of substrates or binding sites at the single-cell level. Here, we describe the concept of the method and demonstrate sample preparation, data acquisition, and data analysis procedures.
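The central readout, the fraction of bound molecules, can be illustrated with a minimal sketch that classifies tracks by their apparent diffusion coefficient (from the one-frame mean squared displacement) against a mobility threshold. The tracks, frame interval, and threshold below are hypothetical; the real analysis involves localization, tracking, and corrections not shown here.

```python
import numpy as np

dt = 0.015           # frame interval (s), hypothetical
d_threshold = 0.15   # apparent D (um^2/s) below which a molecule is called "bound" (hypothetical)

def apparent_diffusion(track, dt):
    """Apparent 2D diffusion coefficient from the one-frame mean squared displacement."""
    steps = np.diff(track, axis=0)            # per-frame displacements (um)
    msd = np.mean(np.sum(steps**2, axis=1))   # mean squared one-frame displacement
    return msd / (4.0 * dt)

# Hypothetical single-molecule tracks: arrays of (x, y) positions in microns.
rng = np.random.default_rng(0)
bound_like  = [np.cumsum(rng.normal(0, 0.02, (20, 2)), axis=0) for _ in range(30)]
mobile_like = [np.cumsum(rng.normal(0, 0.15, (20, 2)), axis=0) for _ in range(70)]
tracks = bound_like + mobile_like

d_app = np.array([apparent_diffusion(t, dt) for t in tracks])
bound_fraction = np.mean(d_app < d_threshold)
print(f"Bound fraction: {bound_fraction:.2f}")
```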
Immunology, Issue 85, Super-resolution microscopy, single-particle tracking, Live-cell imaging, DNA-binding proteins, DNA repair, molecular diffusion
Characterization of Recombination Effects in a Liquid Ionization Chamber Used for the Dosimetry of a Radiosurgical Accelerator
Authors: Antoine Wagner, Frederik Crop, Thomas Lacornerie, Nick Reynaert.
Institutions: Centre Oscar Lambret.
Most modern radiation therapy devices allow the use of very small fields, either through beamlets in Intensity-Modulated Radiation Therapy (IMRT) or via stereotactic radiotherapy where positioning accuracy allows delivering very high doses per fraction in a small volume of the patient. Dosimetric measurements on medical accelerators are conventionally realized using air-filled ionization chambers. However, in small beams these are subject to nonnegligible perturbation effects. This study focuses on liquid ionization chambers, which offer advantages in terms of spatial resolution and low fluence perturbation. Ion recombination effects are investigated for the microLion detector (PTW) used with the Cyberknife system (Accuray). The method consists of performing a series of water tank measurements at different source-surface distances, and applying corrections to the liquid detector readings based on simultaneous gaseous detector measurements. This approach facilitates isolating the recombination effects arising from the high density of the liquid sensitive medium and obtaining correction factors to apply to the detector readings. The main difficulty resides in achieving a sufficient level of accuracy in the setup to be able to detect small changes in the chamber response.
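As a simplified illustration of the correction strategy described above, the sketch below normalizes liquid chamber readings to doses derived from a simultaneously measured air-filled reference chamber at several source-surface distances; deviation of the normalized ratio from its value at the largest distance (lowest dose rate) is treated as a recombination loss. All readings are hypothetical, and the real protocol involves additional corrections not shown here.

```python
import numpy as np

# Hypothetical simultaneous readings at several source-surface distances (cm).
ssd          = np.array([75.0, 80.0, 85.0, 90.0, 100.0])
m_liquid     = np.array([1.000, 0.887, 0.792, 0.713, 0.585])   # liquid chamber signal (a.u.)
dose_air_ref = np.array([1.000, 0.883, 0.785, 0.703, 0.570])   # dose from air chamber (a.u.)

# Ratio of liquid signal to reference dose, normalized at the largest SSD
# (lowest dose rate), where recombination losses are assumed to be smallest.
ratio = m_liquid / dose_air_ref
ratio_norm = ratio / ratio[-1]

# Recombination correction factors to apply to the liquid chamber readings.
k_recomb = 1.0 / ratio_norm
for s, k in zip(ssd, k_recomb):
    print(f"SSD {s:5.1f} cm: correction factor {k:.4f}")
```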
Physics, Issue 87, Radiation therapy, dosimetry, small fields, Cyberknife, liquid ionization, recombination effects
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
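The keyword "minimum-norm estimation" refers to the linear inverse step of the source analysis. The sketch below shows only that core computation, with a hypothetical leadfield matrix and sensor data; it omits noise-covariance whitening, the head-model construction, and all preprocessing handled by the actual pipeline.

```python
import numpy as np

def minimum_norm_estimate(leadfield, data, lam=0.1):
    """L2 minimum-norm inverse: x_hat = L^T (L L^T + lam^2 I)^-1 y."""
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + (lam**2) * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, data)

# Hypothetical dimensions: 128 EEG channels, 5000 cortical sources, 200 time samples.
rng = np.random.default_rng(1)
L = rng.normal(size=(128, 5000))     # leadfield from a (hypothetical) head model
y = rng.normal(size=(128, 200))      # preprocessed, averaged EEG epochs

x_hat = minimum_norm_estimate(L, y)  # estimated source time courses (5000 x 200)
print(x_hat.shape)
```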
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Investigating the Neural Mechanisms of Aware and Unaware Fear Memory with fMRI
Authors: David C. Knight, Kimberly H. Wood.
Institutions: University of Alabama at Birmingham.
Pavlovian fear conditioning is often used in combination with functional magnetic resonance imaging (fMRI) in humans to investigate the neural substrates of associative learning [1-5]. In these studies, it is important to provide behavioral evidence of conditioning to verify that differences in brain activity are learning-related and correlated with human behavior. Fear conditioning studies often monitor autonomic responses (e.g. skin conductance response; SCR) as an index of learning and memory [6-8]. In addition, other behavioral measures can provide valuable information about the learning process and/or other cognitive functions that influence conditioning. For example, the impact unconditioned stimulus (UCS) expectancies have on the expression of the conditioned response (CR) and unconditioned response (UCR) has been a topic of interest in several recent studies [9-14]. SCR and UCS expectancy measures have recently been used in conjunction with fMRI to investigate the neural substrates of aware and unaware fear learning and memory processes [15]. Although these cognitive processes can be evaluated to some degree following the conditioning session, post-conditioning assessments cannot measure expectations on a trial-to-trial basis and are susceptible to interference and forgetting, as well as other factors that may distort results [16,17]. Monitoring autonomic and behavioral responses simultaneously with fMRI provides a mechanism by which the neural substrates that mediate complex relationships between cognitive processes and behavioral/autonomic responses can be assessed. However, monitoring autonomic and behavioral responses in the MRI environment poses a number of practical problems. Specifically, 1) standard behavioral and physiological monitoring equipment is constructed of ferrous material that cannot be safely used near the MRI scanner, 2) when this equipment is placed outside of the MRI scanning chamber, the cables projecting to the subject can carry RF noise that produces artifacts in brain images, 3) artifacts can be produced within the skin conductance signal by switching gradients during scanning, 4) the fMRI signal produced by the motor demands of behavioral responses may need to be distinguished from activity related to the cognitive processes of interest. Each of these issues can be resolved with modifications to the setup of physiological monitoring equipment and additional data analysis procedures. Here we present a methodology to simultaneously monitor autonomic and behavioral responses during fMRI, and demonstrate the use of these methods to investigate aware and unaware memory processes during fear conditioning.
Neuroscience, Issue 56, fMRI, conditioning, learning, memory, fear, contingency awareness, neuroscience, skin conductance
Use of the Open Field Maze to Measure Locomotor and Anxiety-like Behavior in Mice
Authors: Michael L. Seibenhener, Michael C. Wooten.
Institutions: Auburn University.
Animal models have proven to be invaluable to researchers trying to answer questions regarding the mechanisms of behavior. The Open Field Maze is one of the most commonly used platforms to measure behaviors in animal models. It is a fast and relatively easy test that provides a variety of behavioral information, ranging from general ambulatory ability to data regarding the emotionality of the subject animal. As it relates to rodent models, the procedure allows the study of different strains of mice or rats, both laboratory-bred and wild-captured. The technique also readily lends itself to the investigation of different pharmacological compounds for anxiolytic or anxiogenic effects. Here, a protocol for use of the open field maze to describe mouse behaviors is detailed, and a simple analysis of general locomotor ability and anxiety-related emotional behaviors between two strains of C57BL/6 mice is performed. Briefly, using the described protocol we show that wild-type mice exhibited significantly fewer anxiety-related behaviors than did age-matched knockout mice, while both strains exhibited similar ambulatory ability.
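To make the locomotor and anxiety-related measures concrete, the sketch below computes total distance travelled and the fraction of time spent in the center zone (an inverse index of thigmotaxis) from tracking coordinates. The arena size, center margin, frame rate, and coordinates are hypothetical placeholders rather than values from the protocol.

```python
import numpy as np

arena = 40.0          # square arena side length (cm), hypothetical
center_margin = 10.0  # distance from the walls that defines the center zone (hypothetical)
fps = 30.0            # tracking frame rate (frames/s), hypothetical

# Hypothetical (x, y) tracking coordinates in cm, one row per video frame.
rng = np.random.default_rng(2)
xy = np.clip(np.cumsum(rng.normal(0, 0.3, size=(9000, 2)), axis=0) + arena / 2, 0, arena)

# Total distance travelled (cm).
distance = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))

# Fraction of time in the center zone (lower values indicate more thigmotaxis).
in_center = np.all((xy > center_margin) & (xy < arena - center_margin), axis=1)
center_fraction = in_center.mean()

print(f"Distance travelled: {distance:.0f} cm over {xy.shape[0] / fps:.0f} s")
print(f"Time in center: {center_fraction * 100:.1f}%")
```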
Behavior, Issue 96, Open Field Maze, Behavior, Animal Model, Anxiety, Locomotor Activity, Thigmotaxis, Drug Treatment
Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone
Authors: Sandra Zehentmeier, Zoltan Cseresnyes, Juan Escribano Navarro, Raluca A. Niesner, Anja E. Hauser.
Institutions: German Rheumatism Research Center, a Leibniz Institute; Max-Delbrück Center for Molecular Medicine; Wimasis GmbH; Charité - University of Medicine.
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data.
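The simulation approach mentioned above tests whether observed contacts exceed what random positioning would produce. The minimal sketch below illustrates that logic: it counts contacts between two hypothetical cell populations and compares the observed count to a null distribution obtained by repeatedly repositioning one population at random. The contact radius, cell counts, and coordinates are placeholders, and this is not the published analysis workflow.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
field = 500.0         # image field size (um), hypothetical
contact_radius = 8.0  # center-to-center distance counted as a contact (um), hypothetical

# Hypothetical 2D positions of hematopoietic cells and stromal cells.
hemato = rng.uniform(0, field, size=(300, 2))
stroma = rng.uniform(0, field, size=(60, 2))

def n_contacts(a, b, r):
    """Number of cells in `a` with at least one cell of `b` within radius r."""
    tree = cKDTree(b)
    neighbours = tree.query_ball_point(a, r)   # list of neighbour-index lists
    return sum(1 for nb in neighbours if nb)

observed = n_contacts(hemato, stroma, contact_radius)

# Null distribution: reposition the hematopoietic cells uniformly at random.
null = np.array([
    n_contacts(rng.uniform(0, field, size=hemato.shape), stroma, contact_radius)
    for _ in range(1000)
])
p_value = (np.sum(null >= observed) + 1) / (null.size + 1)
print(f"Observed contacts: {observed}, random expectation: {null.mean():.1f}, p = {p_value:.3f}")
```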
Developmental Biology, Issue 98, Image analysis, neighborhood analysis, bone marrow, stromal cells, bone marrow niches, simulation, bone cryosectioning, bone histology
The Double-H Maze: A Robust Behavioral Test for Learning and Memory in Rodents
Authors: Robert D. Kirch, Richard C. Pinnell, Ulrich G. Hofmann, Jean-Christophe Cassel.
Institutions: University Hospital Freiburg, UMR 7364 Université de Strasbourg, CNRS, Neuropôle de Strasbourg.
Spatial cognition research in rodents typically employs the use of maze tasks, whose attributes vary from one maze to the next. These tasks vary by their behavioral flexibility and required memory duration, the number of goals and pathways, and also the overall task complexity. A confounding feature in many of these tasks is the lack of control over the strategy employed by the rodents to reach the goal, e.g., allocentric (declarative-like) or egocentric (procedural) based strategies. The double-H maze is a novel water-escape memory task that addresses this issue, by allowing the experimenter to direct the type of strategy learned during the training period. The double-H maze is a transparent device, which consists of a central alleyway with three arms protruding on both sides, along with an escape platform submerged at the extremity of one of these arms. Rats can be trained using an allocentric strategy by alternating the start position in the maze in an unpredictable manner (see protocol 1; §4.7), thus requiring them to learn the location of the platform based on the available allothetic cues. Alternatively, an egocentric learning strategy (protocol 2; §4.8) can be employed by releasing the rats from the same position during each trial, until they learn the procedural pattern required to reach the goal. This task has been proven to allow for the formation of stable memory traces. Memory can be probed following the training period in a misleading probe trial, in which the starting position for the rats alternates. Following an egocentric learning paradigm, rats typically resort to an allocentric-based strategy, but only when their initial view on the extra-maze cues differs markedly from their original position. This task is ideally suited to explore the effects of drugs/perturbations on allocentric/egocentric memory performance, as well as the interactions between these two memory systems.
Behavior, Issue 101, Double-H maze, spatial memory, procedural memory, consolidation, allocentric, egocentric, habits, rodents, video tracking system
Intravital Video Microscopy Measurements of Retinal Blood Flow in Mice
Authors: Norman R. Harris, Megan N. Watts, Wendy Leskova.
Institutions: Louisiana State University Health Sciences Center.
Alterations in retinal blood flow can contribute to, or be a consequence of, ocular disease and visual dysfunction. Therefore, quantitation of altered perfusion can aid research into the mechanisms of retinal pathologies. Intravital video microscopy of fluorescent tracers can be used to measure vascular diameters and bloodstream velocities of the retinal vasculature, specifically the arterioles branching from the central retinal artery and of the venules leading into the central retinal vein. Blood flow rates can be calculated from the diameters and velocities, with the summation of arteriolar flow, and separately venular flow, providing values of total retinal blood flow. This paper and associated video describe the methods for applying this technique to mice, which includes 1) the preparation of the eye for intravital microscopy of the anesthetized animal, 2) the intravenous infusion of fluorescent microspheres to measure bloodstream velocity, 3) the intravenous infusion of a high molecular weight fluorescent dextran, to aid the microscopic visualization of the retinal microvasculature, 4) the use of a digital microscope camera to obtain videos of the perfused retina, and 5) the use of image processing software to analyze the video. The same techniques can be used for measuring retinal blood flow rates in rats.
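The flow calculation described above combines the measured diameters and velocities directly: single-vessel flow is mean velocity times cross-sectional area, and total retinal flow is the sum over arterioles (or venules). The sketch below uses hypothetical measurements and assumes the velocities entered are already mean bloodstream velocities.

```python
import numpy as np

def vessel_flow(diameter_um, velocity_mm_s):
    """Volumetric flow (uL/min) from vessel diameter (um) and mean velocity (mm/s)."""
    radius_mm = (diameter_um / 1000.0) / 2.0
    area_mm2 = np.pi * radius_mm**2
    flow_mm3_s = velocity_mm_s * area_mm2   # mm^3/s is equivalent to uL/s
    return flow_mm3_s * 60.0                # convert to uL/min

# Hypothetical measurements for arterioles branching from the central retinal artery.
arteriole_d = np.array([52.0, 48.0, 55.0, 50.0, 47.0, 53.0])   # diameters (um)
arteriole_v = np.array([18.0, 16.5, 20.1, 17.3, 15.8, 19.0])   # mean velocities (mm/s)

total_arteriolar_flow = np.sum(vessel_flow(arteriole_d, arteriole_v))
print(f"Total arteriolar (retinal) blood flow: {total_arteriolar_flow:.2f} uL/min")
```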
Medicine, Issue 82, mouse, intravital, microscopy, microspheres, retinal vascular diameters, bloodstream velocities, retinal blood flow
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
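At the heart of the localization step is fitting the image of each isolated single molecule to estimate its position with a precision well below the diffraction limit. The sketch below fits a symmetric 2D Gaussian to a hypothetical single-molecule spot; the pixel size and data are placeholders, and background estimation, drift correction, and the multicolor aspects of the protocol are omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

pixel_nm = 100.0   # camera pixel size in sample space (nm), hypothetical

def gauss2d(coords, amplitude, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian evaluated on flattened (x, y) pixel coordinates."""
    x, y = coords
    return offset + amplitude * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))

# Hypothetical 11x11 pixel region around one single-molecule spot.
x, y = np.meshgrid(np.arange(11), np.arange(11))
rng = np.random.default_rng(4)
spot = gauss2d((x.ravel(), y.ravel()), 800, 5.3, 4.7, 1.3, 100) + rng.normal(0, 20, 121)

p0 = [spot.max() - spot.min(), 5, 5, 1.5, spot.min()]
params, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), spot, p0=p0)
x_nm, y_nm = params[1] * pixel_nm, params[2] * pixel_nm
print(f"Localized position: ({x_nm:.1f} nm, {y_nm:.1f} nm)")
```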
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited in the most sensitive frequency band by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that, with conventional sensing and control techniques known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
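For reference, the transverse field of a Laguerre-Gauss mode LG_{p,l} at the beam waist can be written (up to normalization) as below; the broader, ring-shaped intensity of the higher-order modes is what averages more effectively over mirror surface fluctuations. This is the standard textbook form, not a formula reproduced from the article; w_0 denotes the waist radius and L_p^{|l|} the associated Laguerre polynomial.

```latex
u_{p,l}(r,\phi) \;\propto\; \left(\frac{\sqrt{2}\,r}{w_0}\right)^{|l|}
  L_p^{|l|}\!\left(\frac{2r^2}{w_0^2}\right)
  \exp\!\left(-\frac{r^2}{w_0^2}\right) e^{\,i l \phi}
```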
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Setting Limits on Supersymmetry Using Simplified Models
Authors: Christian Gütschow, Zachary Marshall.
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
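The recasting procedure amounts to computing, for each signal region, the expected signal yield from the model cross section, the tabulated acceptance and efficiency, and the integrated luminosity, and comparing it to the published model-independent upper limit on new-physics events. The sketch below illustrates that arithmetic with hypothetical signal-region names and numbers; it omits signal-region combination and the statistical machinery of a full CLs limit.

```python
# Hypothetical recasting sketch: a model point is treated as excluded in a signal
# region if its expected yield exceeds the published 95% CL upper limit on the
# number of beyond-Standard-Model events in that region.
luminosity_fb = 20.3   # integrated luminosity (fb^-1), hypothetical

signal_regions = {
    # name   : (acceptance, efficiency, observed 95% CL upper limit on events)
    "SR-2j"  : (0.032, 0.71, 12.4),
    "SR-4j"  : (0.018, 0.65,  6.1),
}

def expected_yield(cross_section_fb, acceptance, efficiency, lumi_fb):
    """Expected signal events: sigma x A x epsilon x integrated luminosity."""
    return cross_section_fb * acceptance * efficiency * lumi_fb

model_xsec_fb = 15.0   # hypothetical cross section of the model point (fb)

for name, (acc, eff, limit) in signal_regions.items():
    n_sig = expected_yield(model_xsec_fb, acc, eff, luminosity_fb)
    verdict = "excluded" if n_sig > limit else "not excluded"
    print(f"{name}: expected {n_sig:.1f} events vs. limit {limit:.1f} -> {verdict}")
```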
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Concurrent Quantitative Conductivity and Mechanical Properties Measurements of Organic Photovoltaic Materials using AFM
Authors: Maxim P. Nikiforov, Seth B. Darling.
Institutions: Argonne National Laboratory, University of Chicago.
Organic photovoltaic (OPV) materials are inherently inhomogeneous at the nanometer scale. Nanoscale inhomogeneity of OPV materials affects performance of photovoltaic devices. Thus, understanding of spatial variations in composition as well as electrical properties of OPV materials is of paramount importance for moving PV technology forward [1,2]. In this paper, we describe a protocol for quantitative measurements of electrical and mechanical properties of OPV materials with sub-100 nm resolution. Currently, materials properties measurements performed using commercially available AFM-based techniques (PeakForce, conductive AFM) generally provide only qualitative information. The values for resistance as well as Young's modulus measured using our method on the prototypical ITO/PEDOT:PSS/P3HT:PC61BM system correspond well with literature data. The P3HT:PC61BM blend separates into PC61BM-rich and P3HT-rich domains. Mechanical properties of PC61BM-rich and P3HT-rich domains are different, which allows for domain attribution on the surface of the film. Importantly, combining mechanical and electrical data allows for correlation of the domain structure on the surface of the film with electrical properties variation measured through the thickness of the film.
Materials Science, Issue 71, Nanotechnology, Mechanical Engineering, Electrical Engineering, Computer Science, Physics, electrical transport properties in solids, condensed matter physics, thin films (theory, deposition and growth), conductivity (solid state), AFM, atomic force microscopy, electrical properties, mechanical properties, organic photovoltaics, microengineering, photovoltaics
Angle-resolved Photoemission Spectroscopy At Ultra-low Temperatures
Authors: Sergey V. Borisenko, Volodymyr B. Zabolotnyy, Alexander A. Kordyuk, Danil V. Evtushinsky, Timur K. Kim, Emanuela Carleschi, Bryan P. Doyle, Rosalba Fittipaldi, Mario Cuoco, Antonio Vecchione, Helmut Berger.
Institutions: IFW-Dresden, Institute of Metal Physics of National Academy of Sciences of Ukraine, Diamond Light Source LTD, University of Johannesburg, Università di Salerno, École Polytechnique Fédérale de Lausanne.
The physical properties of a material are defined by its electronic structure. Electrons in solids are characterized by energy (ω) and momentum (k), and the probability to find them in a particular state with given ω and k is described by the spectral function A(k, ω). This function can be directly measured in an experiment based on the well-known photoelectric effect, for the explanation of which Albert Einstein received the Nobel Prize back in 1921. In the photoelectric effect the light shone on a surface ejects electrons from the material. According to Einstein, energy conservation allows one to determine the energy of an electron inside the sample, provided the energy of the light photon and the kinetic energy of the outgoing photoelectron are known. Momentum conservation also makes it possible to estimate k by relating it to the momentum of the photoelectron, determined from the angle at which the photoelectron left the surface. The modern version of this technique is called Angle-Resolved Photoemission Spectroscopy (ARPES) and exploits both conservation laws in order to determine the electronic structure, i.e. the energy and momentum of electrons inside the solid. In order to resolve the details crucial for understanding the topical problems of condensed matter physics, three quantities need to be minimized: the uncertainty in photon energy, the uncertainty in kinetic energy of photoelectrons, and the temperature of the sample. In our approach we combine three recent achievements in the fields of synchrotron radiation, surface science and cryogenics. We use synchrotron radiation with tunable photon energy contributing an uncertainty of the order of 1 meV, an electron energy analyzer which detects the kinetic energies with a precision of the order of 1 meV, and a 3He cryostat which allows us to keep the temperature of the sample below 1 K. We discuss exemplary results obtained on single crystals of Sr2RuO4 and some other materials. The electronic structure of this material can be determined with unprecedented clarity.
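The two conservation laws the technique exploits can be written compactly: energy conservation gives the binding energy from the photon energy, the work function, and the measured kinetic energy, and momentum conservation gives the in-plane crystal momentum from the emission angle. These are the standard photoemission relations rather than formulas quoted from the article; φ denotes the work function, a quantity not spelled out in the abstract above.

```latex
E_{\mathrm{kin}} = h\nu - \phi - |E_B|, \qquad
k_{\parallel} = \frac{\sqrt{2 m_e E_{\mathrm{kin}}}}{\hbar}\,\sin\theta
```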
Physics, Issue 68, Chemistry, electron energy bands, band structure of solids, superconducting materials, condensed matter physics, ARPES, angle-resolved photoemission synchrotron, imaging
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity [1,2], the degree of these aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method [3] to assess the neural representation of the subjective values of risky and ambiguous options [4]. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes stay stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
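One common parametric form used in this line of work models the subjective value of a lottery as a function of amount, winning probability, and ambiguity level, with individual risk and ambiguity attitude parameters estimated from the choices. The exact specification used in the article is not reproduced here; the model, parameter values, and lottery amounts below are illustrative assumptions.

```python
# Assumed subjective value model for risky and ambiguous lotteries:
# SV = (p - beta * A / 2) * v**alpha, where p is the winning probability, A the
# ambiguity level (0 = fully known probability, 1 = fully unknown), v the amount,
# alpha the risk attitude, and beta the ambiguity attitude (hypothetical form).
def subjective_value(amount, probability, ambiguity, alpha, beta):
    return (probability - beta * ambiguity / 2.0) * amount**alpha

# Hypothetical subject: moderately risk averse (alpha < 1), ambiguity averse (beta > 0).
alpha, beta = 0.7, 0.6

risky     = subjective_value(amount=20.0, probability=0.5, ambiguity=0.0, alpha=alpha, beta=beta)
ambiguous = subjective_value(amount=20.0, probability=0.5, ambiguity=0.5, alpha=alpha, beta=beta)
certain   = subjective_value(amount=5.0,  probability=1.0, ambiguity=0.0, alpha=alpha, beta=beta)

print(f"SV(risky $20, p=0.5)     = {risky:.2f}")
print(f"SV(ambiguous $20, A=0.5) = {ambiguous:.2f}")
print(f"SV(certain $5)           = {certain:.2f}")
```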
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve our understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which indicates how accurate the predictions are expected to be in the absence of experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information could be collected by the users based on experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Atomically Traceable Nanostructure Fabrication
Authors: Josh B. Ballard, Don D. Dick, Stephen J. McDonnell, Maia Bischof, Joseph Fu, James H. G. Owen, William R. Owen, Justin D. Alexander, David L. Jaeger, Pradeep Namboodiri, Ehud Fuchs, Yves J. Chabal, Robert M. Wallace, Richard Reidy, Richard M. Silver, John N. Randall, James Von Ehr.
Institutions: Zyvex Labs, University of Texas at Dallas, University of North Texas, National Institute of Standards and Technology.
Reducing the scale of etched nanostructures below the 10 nm range eventually will require an atomic scale understanding of the entire fabrication process being used in order to maintain exquisite control over both feature size and feature density. Here, we demonstrate a method for tracking atomically resolved and controlled structures from initial template definition through final nanostructure metrology, opening up a pathway for top-down atomic control over nanofabrication. Hydrogen depassivation lithography is the first step of the nanoscale fabrication process followed by selective atomic layer deposition of up to 2.8 nm of titania to make a nanoscale etch mask. Contrast with the background is shown, indicating different mechanisms for growth on the desired patterns and on the H passivated background. The patterns are then transferred into the bulk using reactive ion etching to form 20 nm tall nanostructures with linewidths down to ~6 nm. To illustrate the limitations of this process, arrays of holes and lines are fabricated. The various nanofabrication process steps are performed at disparate locations, so process integration is discussed. Related issues are discussed including using fiducial marks for finding nanostructures on a macroscopic sample and protecting the chemically reactive patterned Si(100)-H surface against degradation due to atmospheric exposure.
Engineering, Issue 101, Nanolithography, Scanning Tunneling Microscopy, Atomic Layer Deposition, Reactive Ion Etching

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In those cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.