Related JoVE Video
Pubmed Article
The cultural evolution of democracy: saltational changes in a political regime landscape.
PUBLISHED: 06-07-2011
Transitions to democracy are most often considered the outcome of historical modernization processes. Socio-economic changes, such as increases in per capita GNP, education levels, urbanization, and communication, have traditionally been found to be correlates or requisites of democratic reform. However, transition times and the number of reform steps have not been studied comprehensively. Here we show that, historically, transitions to democracy have mainly occurred through rapid leaps rather than slow and incremental transition steps, with a median time from autocracy to democracy of 2.4 years, and overnight in the reverse direction. Our results show that autocracy and democracy have acted as peaks in an evolutionary landscape of possible modes of institutional arrangements; slow, incremental transitions have been rare. We discuss our results in relation to the application of phylogenetic comparative methods in cultural evolution and point out that the evolving unit in this system is the institutional arrangement, not the individual country, which is better regarded as the host for the political system.
Authors: Kyle S. Hardman, Shayne Bennetts, John E. Debs, Carlos C. N. Kuhn, Gordon D. McDonald, Nick Robins.
Published: 04-24-2014
Since their development in the late 1980s, cheap, reliable external cavity diode lasers (ECDLs) have replaced complex and expensive traditional dye and Titanium:Sapphire lasers as the workhorse laser of atomic physics labs1,2. Their versatility and prolific use throughout atomic physics, in applications such as absorption spectroscopy and laser cooling1,2, make it imperative for incoming students to gain a firm practical understanding of these lasers. This publication builds upon the seminal work by Wieman3, updating components and providing a video tutorial. The setup, frequency locking and performance characterization of an ECDL will be described. Discussion of component selection and proper mounting of both diodes and gratings, the factors affecting mode selection within the cavity, proper alignment for optimal external feedback, optics setup for coarse and fine frequency-sensitive measurements, a brief overview of laser locking techniques, and laser linewidth measurements are included.
26 Related JoVE Articles!
Submillisecond Conformational Changes in Proteins Resolved by Photothermal Beam Deflection
Authors: Walter G. Gonzalez, Jaroslava Miksovska.
Institutions: Florida International University.
Photothermal beam deflection, together with photo-acoustic calorimetry and thermal grating, belongs to the family of photothermal methods that monitor the time profiles of volume and enthalpy changes accompanying light-induced conformational changes in proteins on microsecond to millisecond time-scales that are not accessible using traditional stopped-flow instruments. In addition, since overall changes in volume and/or enthalpy are probed, these techniques can be applied to proteins and other biomacromolecules that lack a fluorophore and/or a chromophore label. To monitor the dynamics and energetics of structural changes associated with Ca2+ binding to calcium transducers, such as neuronal calcium sensors, a caged calcium compound, DM-nitrophen, is employed to photo-trigger a fast (τ < 20 μsec) increase in free calcium concentration, and the associated volume and enthalpy changes are probed using the photothermal beam deflection technique.
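The separation of thermal and volumetric contributions in photothermal data can be sketched in a few lines. This is a generic illustration of the standard photothermal deconvolution, not the authors' analysis code, and every number below (amplitude ratio, heat fraction, photon energy, solvent thermoelastic term) is a hypothetical placeholder.

```python
# Generic photothermal analysis: the sample signal amplitude, scaled by a
# calorimetric reference that converts all absorbed photon energy to heat,
# decomposes as  S_sample/S_ref = q + dV * (Cp*rho/beta) / E_photon,
# where q is the fraction of photon energy released as heat and dV is the
# molar reaction volume change. All numbers below are illustrative.
def reaction_volume_change(amp_ratio, q_heat, e_photon_kcal_mol,
                           cp_rho_over_beta_kcal_ml):
    """Reaction volume change (ml/mol) from the sample/reference ratio."""
    return (amp_ratio - q_heat) * e_photon_kcal_mol / cp_rho_over_beta_kcal_ml

# e.g. an amplitude ratio of 0.9, 70% of the photon energy released as heat,
# 80 kcal/mol photons, and a solvent thermoelastic term of 4.0 kcal/ml:
print(round(reaction_volume_change(0.9, 0.7, 80.0, 4.0), 2))  # ml/mol
```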
Chemistry, Issue 84, photothermal techniques, photothermal beam deflection, volume change, enthalpy change, calcium sensors, potassium channel interaction protein, DM-nitrophen
Engineering Platform and Experimental Protocol for Design and Evaluation of a Neurally-controlled Powered Transfemoral Prosthesis
Authors: Fan Zhang, Ming Liu, Stephen Harper, Michael Lee, He Huang.
Institutions: North Carolina State University & University of North Carolina at Chapel Hill, University of North Carolina School of Medicine, Atlantic Prosthetics & Orthotics, LLC.
To enable intuitive operation of powered artificial legs, an interface between user and prosthesis that can recognize the user's movement intent is desired. A novel neural-machine interface (NMI) based on neuromuscular-mechanical fusion developed in our previous study has demonstrated great potential to accurately identify the intended movement of transfemoral amputees. However, this interface has not yet been integrated with a powered prosthetic leg for true neural control. This study aimed to report (1) a flexible platform to implement and optimize neural control of a powered lower limb prosthesis and (2) an experimental setup and protocol to evaluate neural prosthesis control on patients with lower limb amputations. First, a platform based on a PC and a visual programming environment was developed to implement the prosthesis control algorithms, including the NMI training algorithm, the NMI online testing algorithm, and the intrinsic control algorithm. To demonstrate the function of this platform, in this study the NMI based on neuromuscular-mechanical fusion was hierarchically integrated with intrinsic control of a prototypical transfemoral prosthesis. One patient with a unilateral transfemoral amputation was recruited to evaluate our implemented neural controller while performing activities such as standing, level-ground walking, ramp ascent, and ramp descent continuously in the laboratory. A novel experimental setup and protocol were developed in order to test the new prosthesis control safely and efficiently. The presented proof-of-concept platform and experimental setup and protocol could aid the future development and application of neurally-controlled powered artificial legs.
Biomedical Engineering, Issue 89, neural control, powered transfemoral prosthesis, electromyography (EMG), neural-machine interface, experimental setup and protocol
Use of Stopped-Flow Fluorescence and Labeled Nucleotides to Analyze the ATP Turnover Cycle of Kinesins
Authors: Jennifer T. Patel, Hannah R. Belsham, Alexandra J. Rathbone, Claire T. Friel.
Institutions: University of Nottingham.
The kinesin superfamily of microtubule associated motor proteins share a characteristic motor domain which both hydrolyses ATP and binds microtubules. Kinesins display differences across the superfamily both in ATP turnover and in microtubule interaction. These differences tailor specific kinesins to various functions such as cargo transport, microtubule sliding, microtubule depolymerization and microtubule stabilization. To understand the mechanism of action of a kinesin it is important to understand how the chemical cycle of ATP turnover is coupled to the mechanical cycle of microtubule interaction. To dissect the ATP turnover cycle, one approach is to utilize fluorescently labeled nucleotides to visualize individual steps in the cycle. Determining the kinetics of each nucleotide transition in the ATP turnover cycle allows the rate-limiting step or steps for the complete cycle to be identified. For a kinesin, it is important to know the rate-limiting step, in the absence of microtubules, as this step is generally accelerated several thousand fold when the kinesin interacts with microtubules. The cycle in the absence of microtubules is then compared to that in the presence of microtubules to fully understand a kinesin’s ATP turnover cycle. The kinetics of individual nucleotide transitions are generally too fast to observe by manually mixing reactants, particularly in the presence of microtubules. A rapid mixing device, such as a stopped-flow fluorimeter, which allows kinetics to be observed on timescales of as little as a few milliseconds, can be used to monitor such transitions. Here, we describe protocols in which rapid mixing of reagents by stopped-flow is used in conjunction with fluorescently labeled nucleotides to dissect the ATP turnover cycle of a kinesin.
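A stopped-flow transient for a single nucleotide-binding step is often well described by a single exponential, F(t) = F_inf + A·exp(-k_obs·t). The sketch below, with illustrative rate and amplitude values rather than data from the protocol, generates a noise-free transient and recovers k_obs by a log-linear fit.

```python
import math

# Illustrative single-exponential stopped-flow transient:
# F(t) = F_inf + A * exp(-k_obs * t), with made-up parameters.
def transient(t, f_inf=1.0, amp=0.5, k_obs=150.0):
    return f_inf + amp * math.exp(-k_obs * t)

# Sample the transient on a millisecond timescale, as a stopped-flow would.
times = [i * 1e-4 for i in range(1, 50)]
signal = [transient(t) for t in times]

# Linearize: ln(F - F_inf) = ln(A) - k_obs * t, then fit the slope.
x = times
y = [math.log(f - 1.0) for f in signal]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(round(-slope, 1))  # recovered observed rate k_obs, per second
```

With real data, the fluorescence endpoint F_inf would itself be a fitted parameter and the fit would be done by nonlinear least squares; the log-linear form is used here only to keep the sketch self-contained.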
Chemistry, Issue 92, Kinesin, ATP turnover, mantATP, mantADP, stopped-flow fluorescence, microtubules, enzyme kinetics, nucleotide
Magnetic Tweezers for the Measurement of Twist and Torque
Authors: Jan Lipfert, Mina Lee, Orkide Ordu, Jacob W. J. Kerssemakers, Nynke H. Dekker.
Institutions: Delft University of Technology.
Single-molecule techniques make it possible to investigate the behavior of individual biological molecules in solution in real time. These techniques include so-called force spectroscopy approaches such as atomic force microscopy, optical tweezers, flow stretching, and magnetic tweezers. Amongst these approaches, magnetic tweezers have distinguished themselves by their ability to apply torque while maintaining a constant stretching force. Here, it is illustrated how such a “conventional” magnetic tweezers experimental configuration can, through a straightforward modification of its field configuration to minimize the magnitude of the transverse field, be adapted to measure the degree of twist in a biological molecule. The resulting configuration is termed the freely-orbiting magnetic tweezers. Additionally, it is shown how further modification of the field configuration can yield a transverse field with a magnitude intermediate between that of the “conventional” magnetic tweezers and the freely-orbiting magnetic tweezers, which makes it possible to directly measure the torque stored in a biological molecule. This configuration is termed the magnetic torque tweezers. The accompanying video explains in detail how the conversion of conventional magnetic tweezers into freely-orbiting magnetic tweezers and magnetic torque tweezers can be accomplished, and demonstrates the use of these techniques. These adaptations maintain all the strengths of conventional magnetic tweezers while greatly expanding the versatility of this powerful instrument.
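The torque readout in magnetic torque tweezers rests on simple angular-trap arithmetic: the trap stiffness follows from the thermal angle fluctuations via equipartition, and the stored molecular torque from the shift of the equilibrium angle. The sketch below uses made-up numbers (angle variance, angle shift), not values from the instrument described here.

```python
# Angular-trap arithmetic for magnetic torque tweezers (illustrative values).
KBT_PN_NM = 4.1  # thermal energy kB*T at room temperature, in pN*nm

def trap_stiffness(angle_variance_rad2):
    """Angular trap stiffness (pN*nm/rad) from equipartition:
    k_theta = kB*T / var(theta)."""
    return KBT_PN_NM / angle_variance_rad2

def stored_torque(stiffness_pn_nm_per_rad, angle_shift_rad):
    """Molecular torque (pN*nm) from the shift of the equilibrium angle."""
    return stiffness_pn_nm_per_rad * angle_shift_rad

k = trap_stiffness(0.02)                 # 0.02 rad^2 variance -> ~205 pN*nm/rad
print(round(stored_torque(k, 0.05), 2))  # torque for a 0.05 rad shift, pN*nm
```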
Bioengineering, Issue 87, magnetic tweezers, magnetic torque tweezers, freely-orbiting magnetic tweezers, twist, torque, DNA, single-molecule techniques
Electronic Tongue Generating Continuous Recognition Patterns for Protein Analysis
Authors: Yanxia Hou, Maria Genua, Laurie-Amandine Garçon, Arnaud Buhot, Roberto Calemczuk, David Bonnaffé, Hugues Lortat-Jacob, Thierry Livache.
Institutions: Institut Nanosciences et Cryogénie, CEA-Grenoble, Université Paris-Sud, Institut de Biologie Structurale.
In the current protocol, a combinatorial approach has been developed to simplify the design and production of sensing materials for the construction of electronic tongues (eT) for protein analysis. By mixing a small number of simple and easily accessible molecules with different physicochemical properties, used as building blocks (BBs), in varying and controlled proportions and allowing the mixtures to self-assemble on the gold surface of a prism, an array of combinatorial surfaces featuring appropriate properties for protein sensing was created. In this way, a great number of cross-reactive receptors can be rapidly and efficiently obtained. By combining such an array of combinatorial cross-reactive receptors (CoCRRs) with an optical detection system such as surface plasmon resonance imaging (SPRi), the obtained eT can monitor binding events in real time and generate continuous recognition patterns, including a 2D continuous evolution profile (CEP) and a 3D continuous evolution landscape (CEL), for samples in liquid. Such an eT system is efficient for the discrimination of common purified proteins.
Bioengineering, Issue 91, electronic tongue, combinatorial cross-reactive receptor, surface plasmon resonance imaging, pattern recognition, continuous evolution profiles, continuous evolution landscapes, protein analysis
Design and Use of Multiplexed Chemostat Arrays
Authors: Aaron W. Miller, Corrie Befort, Emily O. Kerr, Maitreya J. Dunham.
Institutions: University of Washington.
Chemostats are continuous culture systems in which cells are grown in a tightly controlled, chemically constant environment where culture density is constrained by limiting specific nutrients.1,2 Data from chemostats are highly reproducible for the measurement of quantitative phenotypes as they provide a constant growth rate and environment at steady state. For these reasons, chemostats have become useful tools for fine-scale characterization of physiology through analysis of gene expression3-6 and other characteristics of cultures at steady-state equilibrium.7 Long-term experiments in chemostats can highlight specific trajectories that microbial populations adopt during adaptive evolution in a controlled environment. In fact, chemostats have been used for experimental evolution since their invention.8 A common result in evolution experiments is for each biological replicate to acquire a unique repertoire of mutations.9-13 This diversity suggests that there is much left to be discovered by performing evolution experiments with far greater throughput. We present here the design and operation of a relatively simple, low cost array of miniature chemostats—or ministats—and validate their use in determination of physiology and in evolution experiments with yeast. This approach entails growth of tens of chemostats run off a single multiplexed peristaltic pump. The cultures are maintained at a 20 ml working volume, which is practical for a variety of applications. It is our hope that increasing throughput, decreasing expense, and providing detailed building and operation instructions may also motivate research and industrial application of this design as a general platform for functionally characterizing large numbers of strains, species, and growth parameters, as well as genetic or drug libraries.
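The steady-state relationship underlying chemostat growth control can be sketched in a few lines: at steady state the specific growth rate equals the dilution rate D = F/V. The 20 ml working volume matches the ministats described above; the pump flow rate is an illustrative placeholder, not a setting from the protocol.

```python
import math

def dilution_rate(flow_ml_per_hr, volume_ml):
    """Dilution rate D (per hour) for a given pump flow and working volume."""
    return flow_ml_per_hr / volume_ml

def doubling_time_hr(dilution):
    """Population doubling time at steady state, t_d = ln(2) / D."""
    return math.log(2) / dilution

D = dilution_rate(3.4, 20.0)          # e.g. 3.4 ml/hr into a 20 ml ministat
print(round(D, 3))                    # dilution rate, per hour
print(round(doubling_time_hr(D), 2))  # hours per population doubling
```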
Genetics, Issue 72, Molecular Biology, Microbiology, Biochemistry, Cellular Biology, Basic Protocols, Genomics, Eukaryota, Bacteria, Biological Phenomena, Metabolic Phenomena, Genetic Phenomena, Microbiological Phenomena, Life sciences, chemostat, evolution, experimental evolution, Ministat, yeast, E. coli., Physiology, Continuous culture, high throughput, arrays, cell culture
An Inverse Analysis Approach to the Characterization of Chemical Transport in Paints
Authors: Matthew P. Willis, Shawn M. Stevenson, Thomas P. Pearl, Brent A. Mantooth.
Institutions: U.S. Army Edgewood Chemical Biological Center, OptiMetrics, Inc., a DCS Company.
The ability to directly characterize chemical transport and interactions that occur within a material (i.e., subsurface dynamics) is a vital component in understanding contaminant mass transport and the ability to decontaminate materials. If a material is contaminated, over time, the transport of highly toxic chemicals (such as chemical warfare agent species) out of the material can result in vapor exposure or transfer to the skin, which can result in percutaneous exposure to personnel who interact with the material. Due to the high toxicity of chemical warfare agents, the release of trace chemical quantities is of significant concern. Mapping the subsurface concentration distribution and transport characteristics of absorbed agents enables exposure hazards to be assessed in untested conditions. Furthermore, these tools can be used to characterize subsurface reaction dynamics to ultimately design improved decontaminants or decontamination procedures. To achieve this goal, an inverse analysis mass transport modeling approach was developed that utilizes time-resolved mass spectrometry measurements of vapor emission from contaminated paint coatings as the input parameter for calculation of subsurface concentration profiles. Details are provided on sample preparation, including contaminant and material handling, the application of mass spectrometry for the measurement of emitted contaminant vapor, and the implementation of inverse analysis using a physics-based diffusion model to determine transport properties of live chemical warfare agents including distilled mustard (HD) and the nerve agent VX.
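The forward model inside such an inverse analysis is typically 1D Fickian diffusion of contaminant through the coating. The following is a hedged sketch of that forward step only, using an explicit finite-difference scheme; the diffusivity, grid, and boundary choices are illustrative assumptions, not parameters from the study, which fits its model to measured vapor-emission data.

```python
# Explicit finite-difference sketch of 1D Fickian diffusion in a coating.
# Boundary assumptions (illustrative): zero flux at the substrate interface,
# a perfect sink at the free surface where vapor is carried away.
def diffuse(conc, d_coeff, dx, dt, steps):
    """Advance a 1D concentration profile `conc` by `steps` time steps."""
    c = list(conc)
    r = d_coeff * dt / dx ** 2        # must satisfy r <= 0.5 for stability
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
        new[0] = new[1]               # no flux into the substrate
        new[-1] = 0.0                 # sink at the emitting free surface
        c = new
    return c

# A uniformly contaminated layer slowly emptying through its surface:
profile = diffuse([1.0] * 11, 1e-2, 0.1, 0.1, 100)
print(round(sum(profile) / len(profile), 3))  # remaining mean concentration
```

In the inverse analysis, a solver like this would be run repeatedly while adjusting the transport parameters until the computed surface flux matches the measured vapor-emission history.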
Chemistry, Issue 90, Vacuum, vapor emission, chemical warfare agent, contamination, mass transport, inverse analysis, volatile organic compound, paint, coating
2D and 3D Chromosome Painting in Malaria Mosquitoes
Authors: Phillip George, Atashi Sharma, Igor V Sharakhov.
Institutions: Virginia Tech.
Fluorescence in situ hybridization (FISH) with whole-arm chromosome probes is a robust technique for mapping genomic regions of interest, detecting chromosomal rearrangements, and studying the three-dimensional (3D) organization of chromosomes in the cell nucleus. The advent of laser capture microdissection (LCM) and whole genome amplification (WGA) makes it possible to obtain large quantities of DNA from single cells. The increased sensitivity of WGA kits prompted us to develop chromosome paints and to use them for exploring chromosome organization and evolution in non-model organisms. Here, we present a simple method for isolating and amplifying the euchromatic segments of single polytene chromosome arms from ovarian nurse cells of the African malaria mosquito Anopheles gambiae. This procedure provides an efficient platform for obtaining chromosome paints, while reducing the overall risk of introducing foreign DNA to the sample. The use of WGA allows for several rounds of re-amplification, resulting in high quantities of DNA that can be utilized for multiple experiments, including 2D and 3D FISH. We demonstrated that the developed chromosome paints can be successfully used to establish the correspondence between euchromatic portions of polytene and mitotic chromosome arms in An. gambiae. Overall, the union of LCM and single-chromosome WGA provides an efficient tool for creating significant amounts of target DNA for future cytogenetic and genomic studies.
Immunology, Issue 83, Microdissection, whole genome amplification, malaria mosquito, polytene chromosome, mitotic chromosomes, fluorescence in situ hybridization, chromosome painting
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
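The flux calculation typically applied to such core incubations is short: the areal P release rate is the rise in P mass in the overlying water over the incubation, divided by the sediment surface area and elapsed time. The sketch below illustrates this arithmetic with made-up numbers, not data from the study.

```python
# Areal phosphorus release rate from a sediment core incubation
# (illustrative values throughout).
def p_flux(conc_change_mg_per_l, water_volume_l, core_area_m2, days):
    """P release rate in mg P per m^2 per day."""
    mass_released_mg = conc_change_mg_per_l * water_volume_l
    return mass_released_mg / (core_area_m2 * days)

# e.g. a 0.05 mg/L rise over 10 days in 1.0 L of overlying water
# above a 0.002 m^2 core:
print(round(p_flux(0.05, 1.0, 0.002, 10), 2))  # mg P m^-2 d^-1
```

In practice the rate is usually taken as the slope of a linear regression of P mass against time across several sampling dates rather than a two-point difference.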
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Authors: Angela J. Brandt, Gaston A. del Pino, Jean H. Burns.
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as being independent of the community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance, and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly necessitates understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, as well as additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
Design and Construction of an Urban Runoff Research Facility
Authors: Benjamin G. Wherley, Richard H. White, Kevin J. McInnes, Charles H. Fontanier, James C. Thomas, Jacqueline A. Aitkenhead-Peterson, Steven T. Kelly.
Institutions: Texas A&M University, The Scotts Miracle-Gro Company.
As the urban population increases, so does the area of irrigated urban landscape. Summer water use in urban areas can be 2-3x the winter baseline water use due to increased demand for landscape irrigation. Improper irrigation practices and large rainfall events can result in runoff from urban landscapes, which can carry nutrients and sediments into local streams and lakes where they may contribute to eutrophication. A 1,000 m2 facility was constructed which consists of 24 individual 33.6 m2 field plots, each equipped for measuring total runoff volumes with time and collection of runoff subsamples at selected intervals for quantification of chemical constituents in the runoff water from simulated urban landscapes. Runoff volumes from the first and second trials had coefficient of variability (CV) values of 38.2 and 28.7%, respectively. CV values for runoff pH, EC, and Na concentration for both trials were all under 10%. Concentrations of DOC, TDN, DON, PO4-P, K+, Mg2+, and Ca2+ had CV values less than 50% in both trials. Overall, the results of testing performed after sod installation at the facility indicated good uniformity between plots for runoff volumes and chemical constituents. The large plot size is sufficient to include much of the natural variability and therefore provides better simulation of urban landscape ecosystems.
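The uniformity statistic quoted above, the coefficient of variability, is simply the standard deviation expressed as a percentage of the mean. A minimal sketch, using the sample standard deviation and made-up per-plot runoff volumes rather than data from the facility:

```python
import statistics

def cv_percent(values):
    """Sample coefficient of variability: 100 * stdev / mean, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical per-plot runoff volumes (liters) from one simulated event:
runoff_l = [52.0, 64.0, 47.0, 58.0]
print(round(cv_percent(runoff_l), 1))  # CV, %
```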
Environmental Sciences, Issue 90, urban runoff, landscapes, home lawns, turfgrass, St. Augustinegrass, carbon, nitrogen, phosphorus, sodium
Rapid Determination of the Thermal Nociceptive Threshold in Diabetic Rats
Authors: Saeed Alshahrani, Filipe Fernandez-Conti, Amanda Araujo, Mauricio DiFulvio.
Institutions: Wright State University, Universidade São Judas Tadeu.
Painful diabetic neuropathy (PDN) is characterized by hyperalgesia, i.e., increased sensitivity to noxious stimuli, and allodynia, i.e., hypersensitivity to normally innocuous stimuli1. Hyperalgesia and allodynia have been studied in many different rodent models of diabetes mellitus2. However, as stated by Bölcskei et al., determination of "pain" in animal models is challenging due to its subjective nature3. Moreover, the traditional methods used to determine behavioral responses to noxious thermal stimuli usually lack reproducibility and pharmacological sensitivity3. For instance, using the hot-plate method of Ankier4, flinching, withdrawal and/or licking of the hind- and/or fore-paws is quantified as reflex latencies at a constant high thermal stimulus (52-55 °C). However, animals that are hyperalgesic to thermal stimuli do not reproducibly show differences in reflex latencies at those supra-threshold temperatures3,5. Like the recently described method of Bölcskei et al.6, the procedures described here allow for the rapid, sensitive and reproducible determination of thermal nociceptive thresholds (TNTs) in mice and rats. The method uses a slowly increasing thermal stimulus applied mostly to the skin of the mouse/rat plantar surface. The method is particularly well suited to studying anti-nociception during hyperalgesic states such as PDN. The procedures described below are based on those published in detail by Almási et al.5 and Bölcskei et al.3. The procedures described here have been approved by the Laboratory Animal Care and Use Committee (LACUC), Wright State University.
Neuroscience, Issue 63, Diabetes, painful diabetic neuropathy, nociception, thermal nociceptive threshold, nocifensive behavior
Quantification of Proteins Using Peptide Immunoaffinity Enrichment Coupled with Mass Spectrometry
Authors: Lei Zhao, Jeffrey R. Whiteaker, Matthew E. Pope, Eric Kuhn, Angela Jackson, N. Leigh Anderson, Terry W. Pearson, Steven A. Carr, Amanda G. Paulovich.
Institutions: Fred Hutchinson Cancer Research Center - FHCRC, University of Victoria, Broad Institute of MIT and Harvard, University of Victoria, Plasma Proteome Institute.
There is a great need for quantitative assays for measuring proteins. Traditional sandwich immunoassays, largely considered the gold standard in quantitation, are associated with high cost and long lead times, and are fraught with drawbacks (e.g., heterophilic antibodies, autoantibody interference, the 'hook effect').1 An alternative technique is affinity enrichment of peptides coupled with quantitative mass spectrometry, commonly referred to as SISCAPA (Stable Isotope Standards and Capture by Anti-Peptide Antibodies).2 In this technique, affinity enrichment of peptides with stable isotope dilution and detection by selected/multiple reaction monitoring mass spectrometry (SRM/MRM-MS) provides quantitative measurement of peptides as surrogates for their respective proteins. SRM/MRM-MS is well established for accurate quantitation of small molecules3,4 and more recently has been adapted to measure the concentrations of proteins in plasma and cell lysates.5-7 To achieve quantitation of proteins, these larger molecules are digested to component peptides using an enzyme such as trypsin. One or more selected peptides whose sequence is unique to the target protein in that species (i.e., "proteotypic" peptides) are then enriched from the sample using anti-peptide antibodies and measured as quantitative stoichiometric surrogates for protein concentration in the sample. Hence, coupled to stable isotope dilution (SID) methods (i.e., a spiked-in stable isotope labeled peptide standard), SRM/MRM can be used to measure concentrations of proteotypic peptides as surrogates for quantification of proteins in complex biological matrices. The assays have several advantages compared to traditional immunoassays.
The reagents are relatively less expensive to generate, the specificity for the analyte is excellent, the assays can be highly multiplexed, enrichment can be performed from neat plasma (no depletion required), and the technique is amenable to a wide array of proteins or modifications of interest.8-13 In this video we demonstrate the basic protocol as adapted to a magnetic bead platform.
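The stable-isotope-dilution arithmetic behind SRM/MRM quantitation reduces to one ratio: the endogenous ("light") peptide concentration is the light/heavy peak-area ratio multiplied by the known concentration of the spiked heavy-labeled standard. The peak areas and spike level below are illustrative placeholders, not values from this assay.

```python
# Stable isotope dilution quantitation for SRM/MRM (illustrative values).
def light_concentration(light_area, heavy_area, heavy_spike_fmol_per_ul):
    """Endogenous peptide concentration from the light/heavy area ratio."""
    return (light_area / heavy_area) * heavy_spike_fmol_per_ul

# e.g. light transition area 36,000, heavy area 90,000,
# heavy standard spiked at 50 fmol/uL:
print(round(light_concentration(36000.0, 90000.0, 50.0), 1))  # fmol/uL
```

In a real assay the ratio would be averaged across several transitions per peptide, with interference checks on each.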
Molecular Biology, Issue 53, Mass spectrometry, targeted assay, peptide, MRM, SISCAPA, protein quantitation
Measurement and Analysis of Atomic Hydrogen and Diatomic Molecular AlO, C2, CN, and TiO Spectra Following Laser-induced Optical Breakdown
Authors: Christian G. Parigger, Alexander C. Woods, Michael J. Witte, Lauren D. Swafford, David M. Surmick.
Institutions: University of Tennessee Space Institute.
In this work, we present time-resolved measurements of atomic and diatomic spectra following laser-induced optical breakdown. A typical LIBS arrangement is used. Here we operate a Nd:YAG laser at a repetition rate of 10 Hz at the fundamental wavelength of 1,064 nm. The 14 nsec pulses with an energy of 190 mJ/pulse are focused to a 50 µm spot size to generate a plasma from optical breakdown or laser ablation in air. The microplasma is imaged onto the entrance slit of a 0.6 m spectrometer, and spectra are recorded using an 1,800 grooves/mm grating with an intensified linear diode array and optical multichannel analyzer (OMA) or an ICCD. Of interest are Stark-broadened atomic lines of the hydrogen Balmer series, used to infer the electron density. We also elaborate on temperature measurements from diatomic emission spectra of aluminum monoxide (AlO), carbon (C2), cyanogen (CN), and titanium monoxide (TiO). The experimental procedures include wavelength and sensitivity calibrations. Analysis of the recorded molecular spectra is accomplished by fitting the data with tabulated line strengths. Furthermore, Monte Carlo type simulations are performed to estimate the error margins. Time-resolved measurements are essential for the transient plasma commonly encountered in LIBS.
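The conversion from a Stark-broadened Balmer line width to an electron density can be sketched with the commonly used power-law scaling n_e ∝ Δλ^(3/2). The prefactor below is an illustrative placeholder, not a calibrated value; the actual analysis relies on tabulated Stark-broadening parameters for the specific line, density, and temperature.

```python
# Illustrative Stark-broadening scaling: n_e ~ C * (FWHM)^(3/2).
# The constant C is a hypothetical placeholder; real work uses tabulated
# Stark parameters for the chosen Balmer line.
def electron_density(fwhm_nm, c=8.0e22):
    """Electron density estimate (m^-3) from the Stark FWHM in nm."""
    return c * fwhm_nm ** 1.5

# Doubling the observed width raises the inferred density by 2^(3/2) ~ 2.8x:
print(electron_density(1.0))
print(electron_density(4.0))
```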
Physics, Issue 84, Laser Induced Breakdown Spectroscopy, Laser Ablation, Molecular Spectroscopy, Atomic Spectroscopy, Plasma Diagnostics
Analyzing Protein Dynamics Using Hydrogen Exchange Mass Spectrometry
Authors: Nikolai Hentze, Matthias P. Mayer.
Institutions: University of Heidelberg.
All cellular processes depend on the functionality of proteins. Although the functionality of a given protein is the direct consequence of its unique amino acid sequence, it is only realized by the folding of the polypeptide chain into a single defined three-dimensional arrangement or, more commonly, into an ensemble of interconverting conformations. Investigating the connection between protein conformation and its function is therefore essential for a complete understanding of how proteins are able to fulfill their great variety of tasks. One possibility to study conformational changes a protein undergoes while progressing through its functional cycle is hydrogen (1H/2H) exchange in combination with high-resolution mass spectrometry (HX-MS). HX-MS is a versatile and robust method that adds a new dimension to structural information obtained by, e.g., crystallography. It is used to study protein folding and unfolding, binding of small molecule ligands, protein-protein interactions, conformational changes linked to enzyme catalysis, and allostery. In addition, HX-MS is often used when the amount of protein is very limited or crystallization of the protein is not feasible. Here we provide a general protocol for studying protein dynamics with HX-MS and describe as an example how to reveal the interaction interface of two proteins in a complex.
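The core HX-MS readout is the deuterium uptake of each peptide, usually reported relative to undeuterated and fully deuterated controls. A minimal sketch of that calculation, with illustrative centroid masses rather than measured values:

```python
# Standard HX-MS uptake calculation (illustrative centroid masses).
def relative_uptake(m_t, m_undeuterated, m_fully_deuterated):
    """Fractional deuterium uptake at time t:
    0 = no exchange, 1 = full exchange."""
    return (m_t - m_undeuterated) / (m_fully_deuterated - m_undeuterated)

# e.g. a peptide centroid of 1235.2 Da at time t, against controls at
# 1234.0 Da (undeuterated) and 1238.0 Da (fully deuterated):
print(round(relative_uptake(1235.2, 1234.0, 1238.0), 2))
```

A drop in uptake for a peptide upon complex formation is the signature used to map an interaction interface, as in the example the protocol describes.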
Chemistry, Issue 81, Molecular Chaperones, mass spectrometers, Amino Acids, Peptides, Proteins, Enzymes, Coenzymes, Protein dynamics, conformational changes, allostery, protein folding, secondary structure, mass spectrometry
Nanomanipulation of Single RNA Molecules by Optical Tweezers
Authors: William Stephenson, Gorby Wan, Scott A. Tenenbaum, Pan T. X. Li.
Institutions: University at Albany, State University of New York, University at Albany, State University of New York, University at Albany, State University of New York, University at Albany, State University of New York, University at Albany, State University of New York.
A large portion of the human genome is transcribed but not translated. In this post-genomic era, regulatory functions of RNA have been shown to be increasingly important. Because RNA function often depends on its ability to adopt alternative structures, it is difficult to predict RNA three-dimensional structures directly from sequence. Single-molecule approaches show potential to solve the problem of RNA structural polymorphism by monitoring molecular structures one molecule at a time. This work presents a method to precisely manipulate the folding and structure of single RNA molecules using optical tweezers. First, methods to synthesize molecules suitable for single-molecule mechanical work are described. Next, various calibration procedures to ensure the proper operation of the optical tweezers are discussed. Various experiments are then explained. To demonstrate the utility of the technique, results of mechanically unfolding RNA hairpins and a single RNA kissing complex are presented. In these examples, the nanomanipulation technique was used to study the folding of each structural domain, secondary and tertiary, independently. Lastly, the limitations and future applications of the method are discussed.
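One of the calibration procedures alluded to above, trap-stiffness calibration, is commonly done via the equipartition theorem: the trap stiffness equals the thermal energy divided by the variance of the bead position. A minimal sketch on a simulated bead-position trace, with illustrative numbers rather than the protocol's actual calibration values:

```python
import numpy as np

KBT = 4.11           # thermal energy at room temperature, in pN*nm

# Simulated bead-position trace (nm) in a trap of stiffness 0.05 pN/nm:
# in thermal equilibrium the position is Gaussian with variance kBT / k
rng = np.random.default_rng(3)
k_true = 0.05
x = rng.normal(0.0, np.sqrt(KBT / k_true), 100_000)

# Equipartition: (1/2) k <x^2> = (1/2) kBT  =>  k = kBT / <x^2>
k_est = KBT / np.var(x)
```

In practice this is cross-checked against power-spectrum calibration, since the equipartition estimate is sensitive to detector noise inflating the measured variance.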
Bioengineering, Issue 90, RNA folding, single-molecule, optical tweezers, nanomanipulation, RNA secondary structure, RNA tertiary structure
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages in comparison to traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article used protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials, including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Characterization of Electrode Materials for Lithium Ion and Sodium Ion Batteries Using Synchrotron Radiation Techniques
Authors: Marca M. Doeff, Guoying Chen, Jordi Cabana, Thomas J. Richardson, Apurva Mehta, Mona Shirpour, Hugues Duncan, Chunjoong Kim, Kinson C. Kam, Thomas Conry.
Institutions: Lawrence Berkeley National Laboratory, University of Illinois at Chicago, Stanford Synchrotron Radiation Lightsource, Haldor Topsøe A/S, PolyPlus Battery Company.
Intercalation compounds such as transition metal oxides or phosphates are the most commonly used electrode materials in Li-ion and Na-ion batteries. During insertion or removal of alkali metal ions, the redox states of transition metals in the compounds change and structural transformations such as phase transitions and/or lattice parameter increases or decreases occur. These behaviors in turn determine important characteristics of the batteries such as the potential profiles, rate capabilities, and cycle lives. The extremely bright and tunable x-rays produced by synchrotron radiation allow rapid acquisition of high-resolution data that provide information about these processes. Transformations in the bulk materials, such as phase transitions, can be directly observed using X-ray diffraction (XRD), while X-ray absorption spectroscopy (XAS) gives information about the local electronic and geometric structures (e.g. changes in redox states and bond lengths). In situ experiments carried out on operating cells are particularly useful because they allow direct correlation between the electrochemical and structural properties of the materials. These experiments are time-consuming and can be challenging to design due to the reactivity and air-sensitivity of the alkali metal anodes used in the half-cell configurations, and/or the possibility of signal interference from other cell components and hardware. For these reasons, it is appropriate to carry out ex situ experiments (e.g. on electrodes harvested from partially charged or cycled cells) in some cases. Here, we present detailed protocols for the preparation of both ex situ and in situ samples for experiments involving synchrotron radiation and demonstrate how these experiments are done.
Physics, Issue 81, X-Ray Absorption Spectroscopy, X-Ray Diffraction, inorganic chemistry, electric batteries (applications), energy storage, Electrode materials, Li-ion battery, Na-ion battery, X-ray Absorption Spectroscopy (XAS), in situ X-ray diffraction (XRD)
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene, isolated at acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized to study Gag-MJ4 chimeric viruses derived from 149 acutely infected, subtype C-positive Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super-resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. With this approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. The data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization that have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we describe here the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. Finally, we describe the use of PAFP and PSFP expression to image two protein species in fixed cells, and the extension of the technique to living cells.
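The ~10-30 nm localization precision quoted above follows from photon statistics: localizing a molecule from N detected photons spread over a point-spread function of width sigma gives a precision of roughly sigma/sqrt(N). A toy sketch with hypothetical photon counts and PSF width (background and pixelation corrections are omitted):

```python
import numpy as np

# Toy single-molecule image: N photons drawn from a 2D Gaussian PSF
rng = np.random.default_rng(4)
psf_sigma = 125.0                     # PSF standard deviation in nm
n_photons = 500
true_xy = np.array([10.0, -5.0])      # true molecule position in nm
photons = true_xy + psf_sigma * rng.standard_normal((n_photons, 2))

# Localize the molecule as the centroid of its detected photons
est_xy = photons.mean(axis=0)

# Photon-limited precision scales as sigma / sqrt(N)
precision = psf_sigma / np.sqrt(n_photons)   # well inside the ~10-30 nm regime
```

This is why brighter fluorophores (more photons per switching event) directly improve the resolution of the reconstructed map.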
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Simultaneous Scalp Electroencephalography (EEG), Electromyography (EMG), and Whole-body Segmental Inertial Recording for Multi-modal Neural Decoding
Authors: Thomas C. Bulea, Atilla Kilicarslan, Recep Ozdemir, William H. Paloski, Jose L. Contreras-Vidal.
Institutions: National Institutes of Health, University of Houston, University of Houston, University of Houston, University of Houston.
Recent studies support the involvement of supraspinal networks in the control of bipedal human walking. Part of this evidence encompasses studies, including our previous work, demonstrating that gait kinematics and limb coordination during treadmill walking can be inferred from the scalp electroencephalogram (EEG) with reasonably high decoding accuracies. These results provide impetus for the development of non-invasive brain-machine interface (BMI) systems for use in restoration and/or augmentation of gait, a primary goal of rehabilitation research. To date, studies examining EEG decoding of activity during gait have been limited to treadmill walking in a controlled environment. However, to be practically viable a BMI system must be applicable to everyday locomotor tasks such as over-ground walking and turning. Here, we present a novel protocol for non-invasive collection of brain activity (EEG), muscle activity (electromyography (EMG)), and whole-body kinematic data (head, torso, and limb trajectories) during both treadmill and over-ground walking tasks. By collecting these data in an uncontrolled environment, insight can be gained regarding the feasibility of decoding unconstrained gait and surface EMG from scalp EEG.
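Decoding continuous kinematics from scalp EEG, as referenced above, is commonly done with a linear model. The sketch below shows a closed-form ridge-regression decoder on synthetic data; the channel count, regularization strength, and correlation metric are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

# Synthetic data: 1,000 samples of 32-channel EEG and one joint-angle trace
rng = np.random.default_rng(2)
n, n_ch = 1000, 32
eeg = rng.standard_normal((n, n_ch))
true_w = rng.standard_normal(n_ch)
angle = eeg @ true_w + 0.1 * rng.standard_normal(n)   # angle is linear in EEG

# Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y
lam = 1.0
X = np.hstack([eeg, np.ones((n, 1))])                 # append a bias column
w = np.linalg.solve(X.T @ X + lam * np.eye(n_ch + 1), X.T @ angle)
pred = X @ w

# Decoding accuracy reported as the Pearson correlation with the true trace
r = np.corrcoef(pred, angle)[0, 1]
```

Real gait decoders typically add time-lagged EEG features and cross-validate across walking bouts rather than fitting and scoring on the same data.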
Behavior, Issue 77, Neuroscience, Neurobiology, Medicine, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Electroencephalography, EEG, Electromyography, EMG, electroencephalograph, gait, brain-computer interface, brain machine interface, neural decoding, over-ground walking, robotic gait, brain, imaging, clinical techniques
C. elegans Tracking and Behavioral Measurement
Authors: Jirapat Likitlersuang, Greg Stephens, Konstantine Palanski, William S. Ryu.
Institutions: University of Toronto, Vrije Universiteit, Okinawa Institute of Science and Technology, University of Toronto.
We have developed instrumentation, image processing, and data analysis techniques to quantify the locomotory behavior of C. elegans as it crawls on the surface of an agar plate. For the study of the genetic, biochemical, and neuronal basis of behavior, C. elegans is an ideal organism because it is genetically tractable, amenable to microscopy, and shows a number of complex behaviors, including taxis, learning, and social interaction1,2. Behavioral analysis based on tracking the movements of worms as they crawl on agar plates has been particularly useful in the study of sensory behavior3, locomotion4, and general mutational phenotyping5. Our system works by moving the camera and illumination system as the worm crawls on a stationary agar plate, which ensures that no mechanical stimulus is transmitted to the worm. Our tracking system is easy to use and includes a semi-automatic calibration feature. A challenge of all video tracking systems is that they generate an enormous amount of data that is intrinsically high dimensional. Our image processing and data analysis programs deal with this challenge by reducing the worm's shape into a set of independent components, which comprehensively reconstruct the worm's behavior as a function of only 3-4 dimensions6,7. As an example of the process, we show that the worm enters and exits its reversal state in a phase-specific manner.
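The reduction of the worm's shape to 3-4 independent components can be illustrated with a principal component ("eigenworm") sketch: each frame's midline is described by tangent angles, and an SVD of the mean-centered posture matrix yields the low-dimensional basis. The data below are simulated undulations, not real tracking output, and the frame and point counts are arbitrary.

```python
import numpy as np

# Simulated data: 500 frames, worm midline described by 100 tangent angles
rng = np.random.default_rng(1)
n_frames, n_points = 500, 100
s = np.linspace(0.0, 1.0, n_points)          # position along the body, 0..1
phases = rng.uniform(0.0, 2.0 * np.pi, n_frames)

# Each frame is a traveling sinusoid (an undulatory posture) plus noise
angles = (np.sin(4.0 * np.pi * s[None, :] + phases[:, None])
          + 0.1 * rng.standard_normal((n_frames, n_points)))

# "Eigenworms": SVD of the mean-centered posture matrix
X = angles - angles.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)              # variance fraction per component

# Project each frame onto the first 4 components (the low-dim behavior space)
amplitudes = X @ Vt[:4].T                    # shape: (n_frames, 4)
```

The phase of the worm's undulation then falls out naturally as the angle between the first two component amplitudes, which is how phase-specific events such as reversals can be detected.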
Neuroscience, Issue 69, Physics, Biophysics, Anatomy, Microscopy, Ethology, Behavior, Machine Vision, C. elegans, animal model
Light/dark Transition Test for Mice
Authors: Keizo Takao, Tsuyoshi Miyakawa.
Institutions: Graduate School of Medicine, Kyoto University.
Although all of the mouse genome sequences have been determined, we do not yet know the functions of most of these genes. Gene-targeting techniques, however, can be used to delete or manipulate a specific gene in mice. The influence of a given gene on a specific behavior can then be determined by conducting behavioral analyses of the mutant mice. As a test for behavioral phenotyping of mutant mice, the light/dark transition test is one of the most widely used tests to measure anxiety-like behavior in mice. The test is based on the natural aversion of mice to brightly illuminated areas and on their spontaneous exploratory behavior in novel environments. The test is sensitive to anxiolytic drug treatment. The apparatus consists of a dark chamber and a brightly illuminated chamber. Mice are allowed to move freely between the two chambers. The number of entries into the bright chamber and the duration of time spent there are indices of bright-space anxiety in mice. To obtain phenotyping results of a strain of mutant mice that can be readily reproduced and compared with those of other mutants, the behavioral test methods should be as identical as possible between laboratories. The procedural differences that exist between laboratories, however, make it difficult to replicate or compare the results among laboratories. Here, we present our protocol for the light/dark transition test as a movie so that the details of the protocol can be demonstrated. In our laboratory, we have assessed more than 60 strains of mutant mice using the protocol shown in the movie. Those data will be disclosed as a part of a public database that we are now constructing. Visualization of the protocol will facilitate understanding of the details of the entire experimental procedure, allowing for standardization of the protocols used across laboratories and comparisons of the behavioral phenotypes of various strains of mutant mice assessed using this test.
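The two indices described above, entries into the bright chamber and time spent there, can be computed directly from a time-stamped chamber log. A minimal sketch with a hypothetical log sampled at 1 Hz (real tracking software would use finer sampling and also score latency to first entry and distance traveled):

```python
# Hypothetical chamber log: (timestamp in seconds, chamber) sampled at 1 Hz
log = [(0, "dark"), (1, "dark"), (2, "light"), (3, "light"),
       (4, "dark"), (5, "light"), (6, "light"), (7, "dark")]

def light_dark_indices(samples):
    """Return (entries into the light chamber, total seconds spent in light)."""
    entries, light_time = 0, 0.0
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        if c0 == "dark" and c1 == "light":
            entries += 1                      # a dark-to-light transition
        if c0 == "light":
            light_time += t1 - t0             # credit the elapsed interval
    return entries, light_time

entries, light_time = light_dark_indices(log)
```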
Neuroscience, Issue 1, knockout mice, transgenic mice, behavioral test, phenotyping
Nanofabrication of Gate-defined GaAs/AlGaAs Lateral Quantum Dots
Authors: Chloé Bureau-Oxton, Julien Camirand Lemyre, Michel Pioro-Ladrière.
Institutions: Université de Sherbrooke.
A quantum computer is a computer composed of quantum bits (qubits) that takes advantage of quantum effects, such as superposition of states and entanglement, to solve certain problems exponentially faster than with the best known algorithms on a classical computer. Gate-defined lateral quantum dots on GaAs/AlGaAs are one of many avenues explored for the implementation of a qubit. When properly fabricated, such a device is able to trap a small number of electrons in a certain region of space. The spin states of these electrons can then be used to implement the logical 0 and 1 of the quantum bit. Given the nanometer scale of these quantum dots, cleanroom facilities offering specialized equipment, such as scanning electron microscopes and e-beam evaporators, are required for their fabrication. Great care must be taken throughout the fabrication process to maintain cleanliness of the sample surface and to avoid damaging the fragile gates of the structure. This paper presents the detailed fabrication protocol of gate-defined lateral quantum dots from the wafer to a working device. Characterization methods and representative results are also briefly discussed. Although this paper concentrates on double quantum dots, the fabrication process remains the same for single or triple dots or even arrays of quantum dots. Moreover, the protocol can be adapted to fabricate lateral quantum dots on other substrates, such as Si/SiGe.
Physics, Issue 81, Nanostructures, Quantum Dots, Nanotechnology, Electronics, microelectronics, solid state physics, Nanofabrication, Nanoelectronics, Spin qubit, Lateral quantum dot
Molecular Evolution of the Tre Recombinase
Authors: Frank Buchholz.
Institutions: Max Planck Institute for Molecular Cell Biology and Genetics, Dresden.
Here we report the generation of Tre recombinase through directed molecular evolution. Tre recombinase recognizes a pre-defined target sequence within the LTR sequences of the HIV-1 provirus, resulting in the excision and eradication of the provirus from infected human cells. We started with Cre, a 38-kDa recombinase that recognizes a 34-bp double-stranded DNA sequence known as loxP. Because Cre can effectively eliminate genomic sequences, we set out to tailor a recombinase that could remove the sequence between the 5'-LTR and 3'-LTR of an integrated HIV-1 provirus. As a first step we identified sequences within the LTR sites that were similar to loxP and tested them for recombination activity. Initially, Cre and mutagenized Cre libraries failed to recombine the chosen loxLTR sites of the HIV-1 provirus. As the start of any directed molecular evolution process requires at least residual activity, the original asymmetric loxLTR sequences were split into subsets and tested again for recombination activity. These subsets, acting as evolutionary intermediates, did show recombination activity. Next, recombinase libraries were enriched through reiterative evolution cycles. Subsequently, enriched libraries were shuffled and recombined. The combination of different mutations proved synergistic, and recombinases were created that were able to recombine loxLTR1 and loxLTR2. This was evidence that an evolutionary strategy through intermediates can be successful. After a total of 126 evolution cycles, individual recombinases were functionally and structurally analyzed. The most active recombinase, Tre, had 19 amino acid changes as compared to Cre. Tre recombinase was able to excise the HIV-1 provirus from the genome of HIV-1-infected HeLa cells (see "HIV-1 Proviral DNA Excision Using an Evolved Recombinase", Hauber J., Heinrich-Pette-Institute for Experimental Virology and Immunology, Hamburg, Germany).
While still in its infancy, directed molecular evolution will allow the creation of custom enzymes that will serve as tools of "molecular surgery" and molecular medicine.
Cell Biology, Issue 15, HIV-1, Tre recombinase, Site-specific recombination, molecular evolution
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.