Pubmed Article
A currency for offsetting energy development impacts: horse-trading sage-grouse on the open market.
PLoS ONE
PUBLISHED: 03-08-2010
Biodiversity offsets provide a mechanism to compensate for unavoidable damage from new energy development as the U.S. increases its domestic production. Proponents argue that offsets provide a partial solution for funding conservation, while opponents contend the practice is flawed because offsets are negotiated without the science necessary to back up the resulting decisions. Missing from these negotiations are a biologically based currency for estimating the sufficiency of offsets and a framework for applying proceeds to maximize conservation benefits.
Authors: Amir Rosenthal, Stephan Kellnberger, Murad Omar, Daniel Razansky, Vasilis Ntziachristos.
Published: 05-11-2014
ABSTRACT
Optical sensors of ultrasound are a promising alternative to piezoelectric techniques, as has been recently demonstrated in the field of optoacoustic imaging. In medical applications, one of the major limitations of optical sensing technology is its susceptibility to environmental conditions, e.g. changes in pressure and temperature, which may saturate the detection. Additionally, the clinical environment often imposes stringent limits on the size and robustness of the sensor. In this work, the combination of pulse interferometry and fiber-based optical sensing is demonstrated for ultrasound detection. Pulse interferometry enables robust performance of the readout system in the presence of rapid variations in the environmental conditions, whereas the use of all-fiber technology leads to a mechanically flexible sensing element compatible with highly demanding medical applications such as intravascular imaging. In order to achieve a short sensor length, a pi-phase-shifted fiber Bragg grating is used, which acts as a resonator trapping light over an effective length of 350 µm. To enable high bandwidth, the sensor is used for sideway detection of ultrasound, which is highly beneficial in circumferential imaging geometries such as intravascular imaging. An optoacoustic imaging setup is used to determine the response of the sensor for acoustic point sources at different positions.
26 Related JoVE Articles!
Angle-resolved Photoemission Spectroscopy At Ultra-low Temperatures
Authors: Sergey V. Borisenko, Volodymyr B. Zabolotnyy, Alexander A. Kordyuk, Danil V. Evtushinsky, Timur K. Kim, Emanuela Carleschi, Bryan P. Doyle, Rosalba Fittipaldi, Mario Cuoco, Antonio Vecchione, Helmut Berger.
Institutions: IFW-Dresden, Institute of Metal Physics of National Academy of Sciences of Ukraine, Diamond Light Source LTD, University of Johannesburg, Università di Salerno, École Polytechnique Fédérale de Lausanne.
The physical properties of a material are defined by its electronic structure. Electrons in solids are characterized by energy (ω) and momentum (k), and the probability of finding them in a particular state with given ω and k is described by the spectral function A(k, ω). This function can be measured directly in an experiment based on the well-known photoelectric effect, for the explanation of which Albert Einstein received the Nobel Prize in 1921. In the photoelectric effect, light shone on a surface ejects electrons from the material. According to Einstein, energy conservation allows one to determine the energy of an electron inside the sample, provided the energy of the light photon and the kinetic energy of the outgoing photoelectron are known. Momentum conservation also makes it possible to estimate k, relating it to the momentum of the photoelectron by measuring the angle at which the photoelectron left the surface. The modern version of this technique is called Angle-Resolved Photoemission Spectroscopy (ARPES) and exploits both conservation laws in order to determine the electronic structure, i.e. the energy and momentum of electrons inside the solid. In order to resolve the details crucial for understanding the topical problems of condensed matter physics, three quantities need to be minimized: the uncertainty in photon energy, the uncertainty in the kinetic energy of photoelectrons, and the temperature of the sample. In our approach we combine three recent achievements in the fields of synchrotron radiation, surface science, and cryogenics. We use synchrotron radiation with tunable photon energy contributing an uncertainty of the order of 1 meV, an electron energy analyzer which detects the kinetic energies with a precision of the order of 1 meV, and a 3He cryostat which allows us to keep the temperature of the sample below 1 K. We discuss exemplary results obtained on single crystals of Sr2RuO4 and some other materials.
The electronic structure of this material can be determined with an unprecedented clarity.
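The two conservation laws underlying ARPES can be written out explicitly (standard textbook relations, not taken verbatim from this abstract; φ denotes the work function and θ the emission angle measured from the surface normal):

```latex
E_{\mathrm{kin}} = h\nu - \phi - |E_B|,
\qquad
k_{\parallel} = \frac{1}{\hbar}\sqrt{2 m E_{\mathrm{kin}}}\,\sin\theta
```

The first relation recovers the binding energy E_B of the electron inside the solid from the measured kinetic energy; the second converts the measured emission angle into the in-plane momentum.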
Physics, Issue 68, Chemistry, electron energy bands, band structure of solids, superconducting materials, condensed matter physics, ARPES, angle-resolved photoemission synchrotron, imaging
Reduced Itraconazole Concentration and Durations Are Successful in Treating Batrachochytrium dendrobatidis Infection in Amphibians
Authors: Laura A. Brannelly.
Institutions: James Cook University.
Amphibians are experiencing the greatest decline of any vertebrate class, and a leading cause of these declines is a fungal pathogen, Batrachochytrium dendrobatidis (Bd), which causes the disease chytridiomycosis. Captive assurance colonies are important worldwide for threatened amphibian species and may be the only lifeline for those in critical threat of extinction. Maintaining disease-free colonies is a priority for captive managers, yet safe and effective treatments for all species and across life stages have not been identified. The most widely used chemotherapeutic treatment is itraconazole, although the dosage commonly used can be harmful to some individuals and species. We performed a clinical treatment trial to assess whether a lower, safer, but still effective dose of itraconazole could be found to cure Bd infections. We found that by reducing the treatment concentration from 0.01% to 0.0025% and reducing the treatment duration from 11 to 6 days of 5 min baths, frogs could be cured of Bd infection with fewer side effects and less treatment-associated mortality.
Immunology, Issue 85, Batrachochytrium dendrobatidis, itraconazole, chytridiomycosis, captive assurance colonies, amphibian conservation
An Isolated Retinal Preparation to Record Light Response from Genetically Labeled Retinal Ganglion Cells
Authors: Tiffany M Schmidt, Paulo Kofuji.
Institutions: University of Minnesota.
The first steps in vertebrate vision take place when light stimulates the rod and cone photoreceptors of the retina [1]. This information is then segregated into what are known as the ON and OFF pathways. The photoreceptors signal light information to the bipolar cells (BCs), which depolarize in response to increases (On BCs) or decreases (Off BCs) in light intensity. This segregation of light information is maintained at the level of the retinal ganglion cells (RGCs), which have dendrites stratifying either in the Off sublamina of the inner plexiform layer (IPL), where they receive direct excitatory input from Off BCs, or in the On sublamina of the IPL, where they receive direct excitatory input from On BCs. This segregation of information regarding increases or decreases in illumination (the On and Off pathways) is conserved and signaled to the brain in parallel. The RGCs are the output cells of the retina, and are thus an important cell type to study in order to understand how light information is signaled to visual nuclei in the brain. Advances in mouse genetics over recent decades have resulted in a variety of fluorescent reporter mouse lines in which specific RGC populations are labeled with a fluorescent protein, allowing identification of RGC subtypes [2-4] and specific targeting for electrophysiological recording. Here, we present a method for recording light responses from fluorescently labeled ganglion cells in an intact, isolated retinal preparation. This isolated retinal preparation allows for recordings from RGCs in which the dendritic arbor is intact and the inputs across the entire RGC dendritic arbor are preserved. This method is applicable across a variety of ganglion cell subtypes and is amenable to a wide variety of single-cell physiological techniques.
Neuroscience, Issue 47, isolated, retina, ganglion cell, electrophysiology, patch clamp, transgenic, mouse, fluorescent
Applications of EEG Neuroimaging Data: Event-related Potentials, Spectral Power, and Multiscale Entropy
Authors: Jennifer J. Heisz, Anthony R. McIntosh.
Institutions: Baycrest.
When considering human neuroimaging data, an appreciation of signal variability represents a fundamental innovation in the way we think about brain signal. Typically, researchers represent the brain's response as the mean across repeated experimental trials and disregard signal fluctuations over time as "noise". However, it is becoming clear that brain signal variability conveys meaningful functional information about neural network dynamics. This article describes the novel method of multiscale entropy (MSE) for quantifying brain signal variability. MSE may be particularly informative of neural network dynamics because it shows timescale dependence and sensitivity to linear and nonlinear dynamics in the data.
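The coarse-graining and sample-entropy steps behind MSE can be sketched in a few lines (a minimal illustration: the parameter choices m = 2, r = 0.15 × SD, and all function names are ours, not the authors' implementation):

```python
import numpy as np

def coarse_grain(x, scale):
    # Average consecutive, non-overlapping windows of length `scale`
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.15):
    # SampEn = -ln(A/B), where A and B count template matches of length m+1 and m
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(t) - 1):
            # Chebyshev distance between template i and all later templates
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            total += int(np.sum(d <= tol))
        return total

    b = matches(m)
    a = matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5)):
    # Entropy of the coarse-grained series at each timescale
    return [sample_entropy(coarse_grain(np.asarray(x, float), s)) for s in scales]
```

Plotting the returned values against scale gives the MSE curve; timescale dependence of the curve is what distinguishes structured variability from white noise.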
Neuroscience, Issue 76, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Electroencephalography, EEG, electroencephalogram, Multiscale entropy, sample entropy, MEG, neuroimaging, variability, noise, timescale, non-linear, brain signal, information theory, brain, imaging
Flame Experiments at the Advanced Light Source: New Insights into Soot Formation Processes
Authors: Nils Hansen, Scott A. Skeen, Hope A. Michelsen, Kevin R. Wilson, Katharina Kohse-Höinghaus.
Institutions: Sandia National Laboratories, Lawrence Berkeley National Laboratory, Universität Bielefeld.
The following experimental protocols and the accompanying video concern the flame experiments performed at the Chemical Dynamics Beamline of the Advanced Light Source (ALS) of the Lawrence Berkeley National Laboratory [1-4]. This video demonstrates how the complex chemical structures of laboratory-based model flames are analyzed using flame-sampling mass spectrometry with tunable synchrotron-generated vacuum-ultraviolet (VUV) radiation. This experimental approach combines isomer-resolving capabilities with high sensitivity and a large dynamic range [5,6]. The first part of the video describes experiments involving burner-stabilized, reduced-pressure (20-80 mbar) laminar premixed flames. A small hydrocarbon fuel was used for the selected flame to demonstrate the general experimental approach. It is shown how species profiles are acquired as a function of distance from the burner surface and how the tunability of the VUV photon energy is used to identify many combustion intermediates based on their ionization energies. For example, this technique has been used to study gas-phase aspects of the soot-formation process, and the video shows how resonance-stabilized radicals, such as C3H3, C3H5, and i-C4H5, are identified as important intermediates [7]. The work has focused on soot-formation processes, which are chemically intriguing because structures containing millions of carbon atoms are assembled, in just milliseconds, from fuel molecules possessing only a few carbon atoms. The second part of the video highlights a new experiment in which an opposed-flow diffusion flame and synchrotron-based aerosol mass spectrometry are used to study the chemical composition of the combustion-generated soot particles [4].
The experimental results indicate that the widely accepted H-abstraction-C2H2-addition (HACA) mechanism is not the sole molecular growth process responsible for the formation of the observed large polycyclic aromatic hydrocarbons (PAHs).
Physics, Issue 87, Combustion, Flame, Energy Conversion, Mass Spectrometry, Photoionization, Synchrotron, Hydrocarbon, Soot, Aerosol, Isomer
Development of a Virtual Reality Assessment of Everyday Living Skills
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia, and these impairments predict poor long-term psychosocial outcomes. Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of "functional capacity" index the extent to which individuals have the potential to perform skills required for real-world functioning. Current data do not support the recommendation of any single instrument for the measurement of functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive, gaming-based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT's sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve the validity and sensitivity of measurements of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
DNA-based Fish Species Identification Protocol
Authors: Rachel Formosa, Harini Ravi, Scott Happe, Danielle Huffman, Natalia Novoradovskaya, Robert Kincaid, Steve Garrett.
Institutions: Agilent Technologies.
We have developed a fast, simple, and accurate DNA-based screening method to identify the fish species present in fresh and processed seafood samples. This versatile method employs PCR amplification of genomic DNA extracted from fish samples, followed by restriction fragment length polymorphism (RFLP) analysis to generate fragment patterns that can be resolved on the Agilent 2100 Bioanalyzer and matched to the correct species using RFLP pattern matching software. The fish identification method uses a simple, reliable, spin-column-based protocol to isolate DNA from fish samples. The samples are treated with proteinase K to release the nucleic acids into solution. DNA is then isolated by suspending the sample in binding buffer and loading onto a micro-spin cup containing a silica-based fiber matrix. The nucleic acids in the sample bind to the fiber matrix. The immobilized nucleic acids are washed to remove contaminants, and total DNA is recovered in a final volume of 100 μl. The isolated DNA is ready for PCR amplification with the provided primers that bind to sequences found in all fish genomes. The PCR products are then digested with three different restriction enzymes and resolved on the Agilent 2100 Bioanalyzer. The fragment lengths produced in the digestion reactions can be used to determine the species of fish from which the DNA sample was prepared, using the RFLP pattern matching software containing a database of experimentally derived RFLP patterns from commercially relevant fish species.
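The final pattern-matching step can be illustrated with a toy version (the species names and fragment lengths below are hypothetical, and this sketch is not the algorithm of the commercial pattern-matching software):

```python
def match_species(observed, database, tol=10):
    """Return the database species whose expected RFLP fragment lengths (bp)
    best match the observed pattern, or None if nothing matches within `tol`."""
    best, best_err = None, float("inf")
    for species, expected in database.items():
        if len(expected) != len(observed):
            continue  # a different number of fragments cannot match
        # Worst-case deviation between the sorted fragment-length lists
        err = max(abs(o - e) for o, e in zip(sorted(observed), sorted(expected)))
        if err <= tol and err < best_err:
            best, best_err = species, err
    return best

# Hypothetical database of expected fragment patterns (bp) per species
FISH_DB = {"salmon": [120, 260, 410], "cod": [95, 300, 480]}
```

For example, an observed pattern of [118, 265, 405] bp would be assigned to "salmon" because every fragment falls within the 10 bp tolerance of that entry.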
Cellular Biology, Issue 38, seafood, fish, mislabeling, authenticity, PCR, Bioanalyzer, food, RFLP, identity
Conducting Miller-Urey Experiments
Authors: Eric T. Parker, James H. Cleaves, Aaron S. Burton, Daniel P. Glavin, Jason P. Dworkin, Manshui Zhou, Jeffrey L. Bada, Facundo M. Fernández.
Institutions: Georgia Institute of Technology, Tokyo Institute of Technology, Institute for Advanced Study, NASA Johnson Space Center, NASA Goddard Space Flight Center, University of California at San Diego.
In 1953, Stanley Miller reported the production of biomolecules from simple gaseous starting materials, using an apparatus constructed to simulate the primordial Earth's atmosphere-ocean system. Miller introduced 200 ml of water, 100 mmHg of H2, 200 mmHg of CH4, and 200 mmHg of NH3 into the apparatus, then subjected this mixture, under reflux, to an electric discharge for a week while the water was simultaneously heated. The purpose of this manuscript is to provide the reader with a general experimental protocol that can be used to conduct a Miller-Urey type spark discharge experiment, using a simplified 3 L reaction flask. Since the experiment involves exposing flammable gases to a high-voltage electric discharge, it is worth highlighting the steps that reduce the risk of explosion. The general procedures described in this work can be extrapolated to design and conduct a wide variety of electric discharge experiments simulating primitive planetary environments.
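The gas loading above translates into simple mole fractions for the reduced-gas mixture, a quick back-of-the-envelope check of the quantities quoted in the abstract:

```python
# Gas loading from Miller's 1953 experiment, as partial pressures in mmHg
partial_pressures = {"H2": 100, "CH4": 200, "NH3": 200}

total = sum(partial_pressures.values())  # 500 mmHg of reduced gases in total
mole_fractions = {gas: p / total for gas, p in partial_pressures.items()}
# H2 makes up 20% of the gas mixture; CH4 and NH3 40% each
```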
Chemistry, Issue 83, Geosciences (General), Exobiology, Miller-Urey, Prebiotic chemistry, amino acids, spark discharge
Live Imaging of Mitosis in the Developing Mouse Embryonic Cortex
Authors: Louis-Jan Pilaz, Debra L. Silver.
Institutions: Duke University Medical Center.
Although of short duration, mitosis is a complex and dynamic multi-step process fundamental for development of organs including the brain. In the developing cerebral cortex, abnormal mitosis of neural progenitors can cause defects in brain size and function. Hence, there is a critical need for tools to understand the mechanisms of neural progenitor mitosis. Cortical development in rodents is an outstanding model for studying this process. Neural progenitor mitosis is commonly examined in fixed brain sections. This protocol will describe in detail an approach for live imaging of mitosis in ex vivo embryonic brain slices. We will describe the critical steps for this procedure, which include: brain extraction, brain embedding, vibratome sectioning of brain slices, staining and culturing of slices, and time-lapse imaging. We will then demonstrate and describe in detail how to perform post-acquisition analysis of mitosis. We include representative results from this assay using the vital dye Syto11, transgenic mice (histone H2B-EGFP and centrin-EGFP), and in utero electroporation (mCherry-α-tubulin). We will discuss how this procedure can be best optimized and how it can be modified for study of genetic regulation of mitosis. Live imaging of mitosis in brain slices is a flexible approach to assess the impact of age, anatomy, and genetic perturbation in a controlled environment, and to generate a large amount of data with high temporal and spatial resolution. Hence this protocol will complement existing tools for analysis of neural progenitor mitosis.
Neuroscience, Issue 88, mitosis, radial glial cells, developing cortex, neural progenitors, brain slice, live imaging
Investigating Protein-protein Interactions in Live Cells Using Bioluminescence Resonance Energy Transfer
Authors: Pelagia Deriziotis, Sarah A. Graham, Sara B. Estruch, Simon E. Fisher.
Institutions: Max Planck Institute for Psycholinguistics, Donders Institute for Brain, Cognition and Behaviour.
Assays based on Bioluminescence Resonance Energy Transfer (BRET) provide a sensitive and reliable means to monitor protein-protein interactions in live cells. BRET is the non-radiative transfer of energy from a 'donor' luciferase enzyme to an 'acceptor' fluorescent protein. In the most common configuration of this assay, the donor is Renilla reniformis luciferase and the acceptor is Yellow Fluorescent Protein (YFP). Because the efficiency of energy transfer is strongly distance-dependent, observation of the BRET phenomenon requires that the donor and acceptor be in close proximity. To test for an interaction between two proteins of interest in cultured mammalian cells, one protein is expressed as a fusion with luciferase and the second as a fusion with YFP. An interaction between the two proteins of interest may bring the donor and acceptor sufficiently close for energy transfer to occur. Compared to other techniques for investigating protein-protein interactions, the BRET assay is sensitive, requires little hands-on time and few reagents, and is able to detect interactions which are weak, transient, or dependent on the biochemical environment found within a live cell. It is therefore an ideal approach for confirming putative interactions suggested by yeast two-hybrid or mass spectrometry proteomics studies, and in addition it is well-suited for mapping interacting regions, assessing the effect of post-translational modifications on protein-protein interactions, and evaluating the impact of mutations identified in patient DNA.
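The BRET readout itself reduces to a simple ratio: acceptor emission over donor emission, corrected by the same ratio measured for a donor-only control (a schematic sketch; the 0.05 interaction threshold is illustrative, not a published cutoff):

```python
def bret_ratio(acceptor_emission, donor_emission, donor_only_ratio=0.0):
    """Corrected BRET ratio: acceptor/donor counts, minus the same ratio
    measured for a donor-only (luciferase-alone) control."""
    return acceptor_emission / donor_emission - donor_only_ratio

def shows_interaction(acceptor_emission, donor_emission, donor_only_ratio,
                      threshold=0.05):
    # `threshold` is an illustrative cutoff, not a published value
    return bret_ratio(acceptor_emission, donor_emission, donor_only_ratio) > threshold
```

Subtracting the donor-only ratio removes the bleed-through of luciferase emission into the YFP detection channel, so only energy transfer above background counts toward an interaction call.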
Cellular Biology, Issue 87, Protein-protein interactions, Bioluminescence Resonance Energy Transfer, Live cell, Transfection, Luciferase, Yellow Fluorescent Protein, Mutations
Fast Imaging Technique to Study Drop Impact Dynamics of Non-Newtonian Fluids
Authors: Qin Xu, Ivo Peters, Sam Wilken, Eric Brown, Heinrich Jaeger.
Institutions: The University of Chicago, Yale University.
In the field of fluid mechanics, many dynamical processes not only occur over a very short time interval but also require high spatial resolution for detailed observation, making them challenging to observe with conventional imaging systems. One of these is the drop impact of liquids, which usually happens within one tenth of a millisecond. To tackle this challenge, a fast imaging technique is introduced that combines a high-speed camera (capable of up to one million frames per second) with a macro lens with a long working distance to bring the spatial resolution of the image down to 10 µm/pixel. The imaging technique enables precise measurement of relevant fluid dynamic quantities, such as the flow field, the spreading distance, and the splashing speed, from analysis of the recorded video. To demonstrate the capabilities of this visualization system, the impact dynamics of droplets of non-Newtonian fluids impinging on a flat hard surface are characterized. Two situations are considered: for oxidized liquid metal droplets we focus on the spreading behavior, and for densely packed suspensions we determine the onset of splashing. More generally, the combination of high temporal and spatial imaging resolution introduced here offers advantages for studying fast dynamics across a wide range of microscale phenomena.
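The numbers quoted above can be combined into two quick estimates of what the camera resolves per impact (the droplet speed is an assumed, illustrative value, not taken from the article):

```python
# Numbers quoted in the text
frame_rate = 1.0e6        # frames per second (camera's top speed)
event_duration = 1.0e-4   # s, "one tenth of a millisecond" per drop impact
pixel_size = 10e-6        # m per pixel, the achieved spatial resolution

frames_captured = frame_rate * event_duration     # frames spanning one impact

# Motion-blur estimate between consecutive frames
impact_speed = 1.0                                # m/s (assumed droplet speed)
travel_per_frame = impact_speed / frame_rate      # m moved between frames
blur_pixels = travel_per_frame / pixel_size       # fraction of a pixel
```

At these settings a single impact spans roughly a hundred frames while the droplet moves only about a tenth of a pixel between frames, which is why both the temporal and spatial detail survive in the recording.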
Physics, Issue 85, fluid mechanics, fast camera, dense suspension, liquid metal, drop impact, splashing
Microfluidic Mixers for Studying Protein Folding
Authors: Steven A. Waldauer, Ling Wu, Shuhuai Yao, Olgica Bakajin, Lisa J. Lapidus.
Institutions: Michigan State University, Hong Kong University of Science and Technology, University of California, Davis .
The process by which a protein folds into its native conformation is highly relevant to biology and human health, yet still poorly understood. One reason for this is that folding takes place over a wide range of timescales, from nanoseconds to seconds or longer, depending on the protein [1]. Conventional stopped-flow mixers have allowed measurement of folding kinetics starting at about 1 ms. We have recently developed a microfluidic mixer that dilutes denaturant ~100-fold in ~8 μs [2]. Unlike a stopped-flow mixer, this mixer operates in the laminar flow regime, in which turbulence does not occur. The absence of turbulence allows precise numeric simulation of all flows within the mixer, with excellent agreement with experiment [3,4]. Laminar flow is achieved for Reynolds numbers Re ≤ 100. For aqueous solutions, this requires micron-scale geometries. We use a hard substrate, such as silicon or fused silica, to make channels 5-10 μm wide and 10 μm deep (see Figure 1). The smallest dimensions, at the entrance to the mixing region, are on the order of 1 μm in size. The chip is sealed with a thin glass or fused silica coverslip for optical access. Typical total linear flow rates are ~1 m/s, yielding Re ~10, but the protein consumption is only ~0.5 nL/s or 1.8 μL/hr. Protein concentration depends on the detection method: for tryptophan fluorescence the typical concentration is 100 μM (for 1 Trp/protein) and for FRET the typical concentration is ~100 nM. The folding process is initiated by rapid dilution of denaturant from 6 M to 0.06 M guanidine hydrochloride. The protein in high denaturant flows down a central channel and is met on either side at the mixing region by buffer without denaturant moving ~100 times faster (see Figure 2). This geometry causes rapid constriction of the protein flow into a narrow jet ~100 nm wide. Diffusion of the light denaturant molecules is very rapid, while diffusion of the heavy protein molecules is much slower, less than 1 μm in 1 ms.
The difference in diffusion constants of the denaturant and the protein results in rapid dilution of the denaturant from the protein stream, reducing the effective concentration of the denaturant around the protein. The protein jet flows at a constant rate down the observation channel, and fluorescence of the protein during folding can be observed using a scanning confocal microscope [5].
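The flow-regime and diffusion claims above can be checked with two one-line estimates, Re = ρvL/μ and x = √(2Dt) (the water properties and protein diffusivity below are assumed typical values, not figures from the article):

```python
import math

# Flow parameters quoted in the text, water-like solution assumed
rho = 1000.0    # kg/m^3, density of water (assumption)
mu = 1.0e-3     # Pa*s, dynamic viscosity of water (assumption)
v = 1.0         # m/s, typical total linear flow rate
L = 10e-6       # m, channel width/depth scale

Re = rho * v * L / mu            # Reynolds number -> ~10, laminar as stated

# Diffusion length x = sqrt(2*D*t) over the first millisecond
D_protein = 1.0e-10              # m^2/s, assumed typical protein diffusivity
t = 1.0e-3                       # s
x_protein = math.sqrt(2.0 * D_protein * t)   # ~0.45 um, i.e. < 1 um as stated
```

Both estimates reproduce the values in the text: Re ≈ 10 keeps the mixer well inside the laminar regime, and the protein diffuses less than a micron during the first millisecond of observation.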
Bioengineering, Issue 62, microfluidic mixing, laminar flow, protein folding, fluorescence, FRET
Dependence of Laser-induced Breakdown Spectroscopy Results on Pulse Energies and Timing Parameters Using Soil Simulants
Authors: Lauren Kurek, Maya L. Najarian, David A. Cremers, Rosemarie C. Chinni.
Institutions: Alvernia University, Applied Research Associates (ARA), Inc..
The dependence of some LIBS detection capabilities on lower pulse energies (<100 mJ) and timing parameters was examined using synthetic silicate samples. These samples were used as simulants for soil and contained minor and trace elements commonly found in soil, at a wide range of concentrations. For this study, over 100 calibration curves were prepared using different pulse energies and timing parameters; detection limits and sensitivities were determined from the calibration curves. Plasma temperatures were also measured using Boltzmann plots for the various energies and timing parameters tested. The electron density of the plasma was calculated using the full-width half maximum (FWHM) of the hydrogen line at 656.5 nm over the energies tested. Overall, the results indicate that the use of lower pulse energies and non-gated detection does not seriously compromise the analytical results. These results are very relevant to the design of field- and person-portable LIBS instruments.
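The Boltzmann-plot temperature determination mentioned above can be sketched as a straight-line fit of ln(Iλ/gA) against the upper-level energy, whose slope is −1/(k_B·T) (a generic illustration of the method, with synthetic line data; none of the numbers are from this study):

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(upper_energies_eV, intensities, wavelengths_nm, g_times_A):
    """Plasma temperature from a Boltzmann plot: ln(I*lambda/(g*A)) vs E_upper
    falls on a line of slope -1/(k_B*T) for a plasma in local equilibrium."""
    E = np.asarray(upper_energies_eV, dtype=float)
    y = np.log(np.asarray(intensities, float) * np.asarray(wavelengths_nm, float)
               / np.asarray(g_times_A, float))
    slope, _ = np.polyfit(E, y, 1)
    return -1.0 / (slope * K_B_EV)
```

Feeding in several emission lines of one species (intensities, wavelengths, upper-level energies, and gA values) returns the excitation temperature directly from the fitted slope.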
Chemistry, Issue 79, analytical chemistry, laser research, atomic physics, LIBS, laser-induced breakdown spectroscopy, gated and non-gated detection, energy study
Characterization of Electrode Materials for Lithium Ion and Sodium Ion Batteries Using Synchrotron Radiation Techniques
Authors: Marca M. Doeff, Guoying Chen, Jordi Cabana, Thomas J. Richardson, Apurva Mehta, Mona Shirpour, Hugues Duncan, Chunjoong Kim, Kinson C. Kam, Thomas Conry.
Institutions: Lawrence Berkeley National Laboratory, University of Illinois at Chicago, Stanford Synchrotron Radiation Lightsource, Haldor Topsøe A/S, PolyPlus Battery Company.
Intercalation compounds such as transition metal oxides or phosphates are the most commonly used electrode materials in Li-ion and Na-ion batteries. During insertion or removal of alkali metal ions, the redox states of transition metals in the compounds change and structural transformations such as phase transitions and/or lattice parameter increases or decreases occur. These behaviors in turn determine important characteristics of the batteries such as the potential profiles, rate capabilities, and cycle lives. The extremely bright and tunable x-rays produced by synchrotron radiation allow rapid acquisition of high-resolution data that provide information about these processes. Transformations in the bulk materials, such as phase transitions, can be directly observed using X-ray diffraction (XRD), while X-ray absorption spectroscopy (XAS) gives information about the local electronic and geometric structures (e.g. changes in redox states and bond lengths). In situ experiments carried out on operating cells are particularly useful because they allow direct correlation between the electrochemical and structural properties of the materials. These experiments are time-consuming and can be challenging to design due to the reactivity and air-sensitivity of the alkali metal anodes used in the half-cell configurations, and/or the possibility of signal interference from other cell components and hardware. For these reasons, it is appropriate to carry out ex situ experiments (e.g. on electrodes harvested from partially charged or cycled cells) in some cases. Here, we present detailed protocols for the preparation of both ex situ and in situ samples for experiments involving synchrotron radiation and demonstrate how these experiments are done.
Physics, Issue 81, X-Ray Absorption Spectroscopy, X-Ray Diffraction, inorganic chemistry, electric batteries (applications), energy storage, Electrode materials, Li-ion battery, Na-ion battery, X-ray Absorption Spectroscopy (XAS), in situ X-ray diffraction (XRD)
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
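A minimal sketch of the first step in any design-of-experiments workflow is enumerating the run combinations. The factors and levels below are invented examples, not the ones used in this study, and DoE software would typically select an optimal fraction of these runs rather than the full factorial shown here.

```python
from itertools import product

# Hypothetical factors and levels for a transient-expression experiment.
factors = {
    "promoter": ["35S", "nos"],       # regulatory element (assumed levels)
    "plant_age_d": [35, 42, 49],      # plant age at infiltration, in days
    "incubation_C": [22, 25],         # incubation temperature, in Celsius
}

# Full factorial design: one run per combination of factor levels.
names = list(factors)
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]
```

With 2 x 3 x 2 levels this yields 12 runs; step-wise design augmentation, as described in the abstract, would instead start from a smaller optimal subset and add runs only where the model is uncertain.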
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Rapid and Low-cost Prototyping of Medical Devices Using 3D Printed Molds for Liquid Injection Molding
Authors: Philip Chung, J. Alex Heller, Mozziyar Etemadi, Paige E. Ottoson, Jonathan A. Liu, Larry Rand, Shuvo Roy.
Institutions: University of California, San Francisco, University of California, San Francisco, University of Southern California.
Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be expensive due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. The low cost and rapid turnaround time of this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can later be adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.
Bioengineering, Issue 88, liquid injection molding, reaction injection molding, molds, 3D printing, fused deposition modeling, rapid prototyping, medical devices, low cost, low volume, rapid turnaround time.
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. Here, a number of applications of white light interferometry for the investigation of surface modifications in mass spectrometry, and of wear phenomena in tribology and lubrication, are demonstrated. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering) and laser irradiation (ablation), and report ex situ measurements of wear on tribological test specimens. Specifically, we discuss: (1) aspects of traditional ion sputtering-based mass spectrometry, such as measurements of sputtering rates/yields on Si and Cu and the subsequent time-to-depth conversion; (2) quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface, which is important for applications such as ablation mass spectrometry, where the quantity of evaporated material can be studied and controlled via pulse duration and energy per pulse, and where determining the crater geometry defines the depth and lateral resolution achievable under given experimental conditions; and (3) measurements of surface roughness parameters in two dimensions, together with quantitative measurements of the surface wear that occurs as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach are also discussed and explained.
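The time-to-depth conversion mentioned in the abstract reduces, in its simplest constant-rate form, to a single scaling: the profilometer gives the final crater depth, and each sputter time maps linearly onto depth. The numbers below are placeholders, not values from the article.

```python
# Time-to-depth conversion for sputter depth profiling, assuming a
# constant sputter rate over the whole profile.
crater_depth_nm = 850.0        # final depth measured by white light interferometry (assumed)
total_sputter_time_s = 1700.0  # total ion-beam exposure time (assumed)

# Average sputter rate inferred from the crater measurement.
sputter_rate_nm_per_s = crater_depth_nm / total_sputter_time_s

def depth_at(t_s):
    """Depth reached after t_s seconds of sputtering, constant-rate assumption."""
    return sputter_rate_nm_per_s * t_s
```

Real conversions are more involved when the sputter rate varies across layers of different materials; the constant-rate version is only valid within one homogeneous layer.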
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
Combining Computer Game-Based Behavioural Experiments With High-Density EEG and Infrared Gaze Tracking
Authors: Keith J. Yoder, Matthew K. Belmonte.
Institutions: Cornell University, University of Chicago, Manesar, India.
Experimental paradigms are valuable insofar as the timing and other parameters of their stimuli are well specified and controlled, and insofar as they yield data relevant to the cognitive processing that occurs under ecologically valid conditions. These two goals are often at odds, since well controlled stimuli often are too repetitive to sustain subjects' motivation. Studies employing electroencephalography (EEG) are especially sensitive to this dilemma between ecological validity and experimental control: attaining sufficient signal-to-noise in physiological averages demands large numbers of repeated trials within lengthy recording sessions, limiting the subject pool to individuals with the ability and patience to perform a set task over and over again. This constraint severely limits researchers' ability to investigate younger populations as well as clinical populations associated with heightened anxiety or attentional abnormalities. Even adult, non-clinical subjects may not be able to achieve their typical levels of performance or cognitive engagement: an unmotivated subject for whom an experimental task is little more than a chore is not the same, behaviourally, cognitively, or neurally, as a subject who is intrinsically motivated and engaged with the task. A growing body of literature demonstrates that embedding experiments within video games may provide a path between the horns of this dilemma. The narrative of a game provides a more realistic context in which tasks occur, enhancing their ecological validity (Chaytor & Schmitter-Edgecombe, 2003). Moreover, this context provides motivation to complete tasks. In our game, subjects perform various missions to collect resources, fend off pirates, intercept communications or facilitate diplomatic relations. 
In so doing, they also perform an array of cognitive tasks, including a Posner attention-shifting paradigm (Posner, 1980), a go/no-go test of motor inhibition, a psychophysical motion coherence threshold task, the Embedded Figures Test (Witkin, 1950, 1954) and a theory-of-mind (Wimmer & Perner, 1983) task. The game software automatically registers game stimuli and subjects' actions and responses in a log file, and sends event codes to synchronise with physiological data recorders. Thus the game can be combined with physiological measures such as EEG or fMRI, and with moment-to-moment tracking of gaze. Gaze tracking can verify subjects' compliance with behavioural tasks (e.g. fixation) and overt attention to experimental stimuli, and also physiological arousal as reflected in pupil dilation (Bradley et al., 2008). At great enough sampling frequencies, gaze tracking may also help assess covert attention as reflected in microsaccades - eye movements that are too small to foveate a new object, but are as rapid in onset and have the same relationship between angular distance and peak velocity as do saccades that traverse greater distances. The distribution of directions of microsaccades correlates with the (otherwise) covert direction of attention (Hafed & Clark, 2002).
Neuroscience, Issue 46, High-density EEG, ERP, ICA, gaze tracking, computer game, ecological validity
Setting Limits on Supersymmetry Using Simplified Models
Authors: Christian Gütschow, Zachary Marshall.
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
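The recasting recipe in the abstract, using acceptance and efficiency tables with observed event counts, amounts to a simple yield comparison per signal region. The sketch below is a toy version with invented numbers; a real analysis would choose the signal region by expected sensitivity and propagate systematic uncertainties, neither of which is modeled here.

```python
# For each signal region, the expected signal yield is
#   N = luminosity * cross_section * acceptance * efficiency,
# and a model point is excluded when N exceeds the observed 95% CL
# upper limit on signal events in some region.

def expected_events(lumi_ifb, xsec_fb, acceptance, efficiency):
    """Expected signal yield for one signal region (lumi in fb^-1, xsec in fb)."""
    return lumi_ifb * xsec_fb * acceptance * efficiency

def is_excluded(lumi_ifb, xsec_fb, signal_regions):
    """signal_regions: list of (acceptance, efficiency, n95_upper_limit) tuples.

    Excludes the point if any region's expected yield exceeds its
    observed 95% CL upper limit on signal events.
    """
    return any(
        expected_events(lumi_ifb, xsec_fb, a, e) > n95
        for a, e, n95 in signal_regions
    )
```

For example, with 20 fb^-1, a 5 fb cross section, and a region with acceptance 0.1, efficiency 0.5, and an upper limit of 3 events, the expected yield of 5 events excludes the point.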
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economic methods of finding the lowest-cost solution in the watershed context (e.g.,5,12,20) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of the optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. 
In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with a multiobjective evolutionary algorithm SPEA226, and user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.
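The simulation-optimization loop described above can be sketched in miniature: candidate solutions are binary vectors assigning a conservation practice (1) or none (0) to each field, and the two objectives, cost and pollution, are traded off via Pareto dominance. The "water quality model" below is a trivial stand-in for SWAT, and the costs and pollution reductions are invented; a real SPEA2 run also uses fitness assignment, archiving, and recombination, which are omitted here.

```python
import random

random.seed(0)
N_FIELDS = 12
COST = [random.uniform(1.0, 5.0) for _ in range(N_FIELDS)]       # assumed practice costs
REDUCTION = [random.uniform(0.5, 3.0) for _ in range(N_FIELDS)]  # assumed pollution cuts
BASELINE_POLLUTION = 25.0

def objectives(x):
    """Both objectives are minimized: (total cost, remaining pollution)."""
    cost = sum(c for c, bit in zip(COST, x) if bit)
    pollution = BASELINE_POLLUTION - sum(r for r, bit in zip(REDUCTION, x) if bit)
    return (cost, pollution)

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def mutate(x, rate=0.1):
    """Flip each practice assignment with small probability."""
    return [bit ^ (random.random() < rate) for bit in x]

def pareto_front(pop):
    objs = [objectives(x) for x in pop]
    return [x for x, o in zip(pop, objs)
            if not any(dominates(p, o) for p in objs)]

# Evolve: mutate the population, then keep only non-dominated solutions.
pop = [[random.randint(0, 1) for _ in range(N_FIELDS)] for _ in range(40)]
for _ in range(100):
    pop = pareto_front(pop + [mutate(x) for x in pop])[:40]

frontier = sorted(objectives(x) for x in pop)  # the cost/pollution tradeoff curve
```

The final `frontier` is the toy analogue of the cost-versus-water-quality tradeoff curves the program produces for watershed managers.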
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Use of Arabidopsis eceriferum Mutants to Explore Plant Cuticle Biosynthesis
Authors: Lacey Samuels, Allan DeBono, Patricia Lam, Miao Wen, Reinhard Jetter, Ljerka Kunst.
Institutions: University of British Columbia - UBC, University of British Columbia - UBC.
The plant cuticle is a waxy outer covering on plants that has a primary role in water conservation, but is also an important barrier against the entry of pathogenic microorganisms. The cuticle is made up of a tough crosslinked polymer called "cutin" and a protective wax layer that seals the plant surface. The waxy layer of the cuticle is obvious on many plants, appearing as a shiny film on the ivy leaf or as a dusty outer covering on the surface of a grape or a cabbage leaf thanks to light scattering crystals present in the wax. Because the cuticle is an essential adaptation of plants to a terrestrial environment, understanding the genes involved in plant cuticle formation has applications in both agriculture and forestry. Today, we'll show the analysis of plant cuticle mutants identified by forward and reverse genetics approaches.
Plant Biology, Issue 16, Annual Review, Cuticle, Arabidopsis, Eceriferum Mutants, Cryo-SEM, Gas Chromatography
Using Learning Outcome Measures to Assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed, providing information that facilitates closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Brain Banking: Making the Most of your Research Specimens
Authors: Mark W. Burke, Shahin Zangenehpour, Maurice Ptito.
Institutions: University of Montreal, University of Montreal.
Unbiased stereology is a method for accurately and efficiently estimating the total number of neurons (or of another cell type) in a given area of interest1. To achieve this goal, 6-10 systematic sections should be probed, covering the entire structure. Typically this involves processing every fifth section, which leaves a significant amount of material unprocessed. In order to maximize the material, we propose an inexpensive method for preserving fixed tissue as part of a long-term storage research plan. As tissue is sliced and processed for the desired stain or antibody, alternate sections should be systematically placed in antigen preserve at -20°C for future processing. Using 24-well plates, sections can be placed in order for future retrieval. Using this method, tissue can be stored and processed for immunohistochemistry over the course of years.
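The systematic-sampling arithmetic behind such estimates is short: in an optical-fractionator-style design, the total is the raw count scaled by the reciprocal of each sampling fraction. The counts and fractions below are invented for illustration; only the 1-in-5 section series echoes the text.

```python
# Neurons counted in a systematic 1-in-5 series of sections (assumed data).
counts_per_section = [38, 41, 35, 44, 40, 37, 39, 42]

ssf = 1 / 5    # section sampling fraction: every fifth section
asf = 1 / 20   # area sampling fraction: frame area / grid-step area (assumed)
tsf = 1 / 2    # thickness sampling fraction: disector height / section thickness (assumed)

# Fractionator estimate: raw count divided by each sampling fraction.
total_estimate = sum(counts_per_section) * (1 / ssf) * (1 / asf) * (1 / tsf)
```

With 316 neurons counted and combined fractions of 1/200, the estimated total is 63,200; the stored alternate sections make it possible to repeat such counts later with different antibodies.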
Neuroscience, Issue 29, brain bank, systematic sampling, stereology, cryostat, antigen preserve
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
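The matching procedure itself is not disclosed, but the idea of comparing abstract language to video descriptions can be illustrated with a toy text-similarity ranker. Everything below, including the tokenizer and the Jaccard score, is an assumption for illustration only, not JoVE's actual algorithm.

```python
# Rank candidate "videos" by word overlap (Jaccard similarity) with an abstract.

def tokens(text):
    """Crude tokenizer: lowercased words longer than 3 characters."""
    return {w.strip(".,").lower() for w in text.split() if len(w) > 3}

def jaccard(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def rank_videos(abstract, videos, top_n=3):
    """Return the top_n video descriptions most similar to the abstract."""
    return sorted(videos, key=lambda v: jaccard(abstract, v), reverse=True)[:top_n]
```

When the abstract's vocabulary has no overlap with any video description, every score is near zero, which is exactly the "only a slight relation" failure mode described above.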