Pubmed Article
The Effects of International Trade on Water Use.
PUBLISHED: 07-14-2015
The growing scarcity of water resources worldwide is conditioned not only by precipitation changes but also by changes to water use patterns; the latter are driven by social contexts such as capital intensity, trade openness, and income. This study explores the determinants of water use by focusing on the effect of trade openness on the degree to which water is withdrawn and consumed. Previous studies have analyzed the determinants of water use but have ignored the endogeneity of trade openness. To deal with this endogeneity problem, we adopt instrumental variable estimation and clarify the determinants of water use. The determinants of water use are divided into scale, technique, and composition effects. Calculating each trade-induced effect, we examine how trade openness affects the degree of water use. Our results show that while trade has a positive effect on water withdrawal/consumption through trade-induced scale effects and direct composition effects, the trade-induced technique effect and the indirect composition effect, both of which exhibit a negative sign, counteract the scale effect and the direct composition effect, resulting in reduced water withdrawal/consumption. The overall effect induced by trade is calculated as being in the range of -1.00 to -1.52; that is, a 1% increase in trade openness reduces water withdrawal/consumption by roughly 1.0-1.5%, on average. This result indicates that international bilateral trade would promote efficient water use through the diffusion of water-saving technologies and the reformation of industry composition.
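The elasticity arithmetic in the abstract can be made concrete with a short sketch. This is illustrative only: the overall range of -1.00 to -1.52 comes from the abstract, but the individual scale/technique/composition values below are hypothetical placeholders, not the study's estimates.

```python
# Illustrative sketch of the trade-openness elasticity reported in the abstract.
# The overall effect is the sum of trade-induced scale, technique, and
# composition effects; the component values here are invented placeholders.

def projected_water_use(base_use, overall_elasticity, pct_change_in_openness):
    """Approximate water use after a small percentage change in trade openness."""
    return base_use * (1 + overall_elasticity * pct_change_in_openness / 100)

# Hypothetical decomposition: positive scale and direct composition effects
# are outweighed by negative technique and indirect composition effects.
scale, technique, direct_comp, indirect_comp = 0.8, -1.5, 0.4, -0.9
overall = scale + technique + direct_comp + indirect_comp  # about -1.2, inside [-1.52, -1.00]

# A 1% rise in openness then reduces withdrawal/consumption by ~1.2% on average.
print(projected_water_use(100.0, overall, 1.0))  # roughly 98.8
```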
Authors: Charmaine Y. Pietersen, Maribel P. Lim, Tsung-Ung W. Woo.
Published: 08-06-2009
We proposed to investigate the gray matter reduction in the superior temporal gyrus seen in schizophrenia patients by interrogating gene expression profiles of pyramidal neurons in layer III. It is well known that the cerebral cortex is an exceptionally heterogeneous structure comprising diverse regions, layers, and cell types, each of which is characterized by distinct cellular and molecular compositions and therefore differential gene expression profiles. To circumvent the confounding effects of tissue heterogeneity, we used laser-capture microdissection (LCM) to isolate our specific cell type, i.e., pyramidal neurons. Approximately 500 pyramidal neurons stained with the Histogene staining solution were captured using the Arcturus XT LCM system. RNA was then isolated from captured cells and underwent two rounds of T7-based linear amplification using Arcturus/Molecular Devices kits. The Experion LabChip (Bio-Rad) gel and electropherogram indicated good-quality amplified (m)RNA, with a transcript length extending past the 600 nt required for microarrays. The amount of mRNA obtained averaged 51 μg, with an acceptable mean sample purity, as indicated by an A260/280 ratio of 2.5. Gene expression was profiled using the Human X3P GeneChip probe array from Affymetrix.
24 Related JoVE Articles!
Imaging Effector Memory T cells in the Ear After Induction of Adoptive DTH
Authors: Melanie P. Matheu, Christine Beeton, Ian Parker, K. George Chandy, Michael D. Cahalan.
Institutions: University of California, Irvine (UCI).
Delayed type hypersensitivity (DTH) is an immune reaction in which the main players are CCR7- effector/memory T lymphocytes. Here, we demonstrate a method for inducing and recording the progress of a DTH reaction in the rat ear. This is followed by a demonstration of the preparation of rat ear tissue for two-photon imaging of the CCR7- effector/memory T cell response. An adoptive DTH is induced by the intraperitoneal injection of a GFP-labeled Ova-specific CCR7- effector/memory T cell line (Beeton, C., J. Visualized Experiments, Issue 8). Cells are then allowed to equilibrate in the rat for 48 hours before challenge by injecting one ear with saline (control ear) and the other with a 1:1 mix of Ova and Ova conjugated to Texas-Red (Ova-TR) to allow visualization of resident antigen-presenting cells. We describe a method of tissue preparation useful for imaging the motility of cells within the deep dermal layer during an immune response, in conjunction with visualization of collagen fibers by second harmonic generation. Ear tissue is cut into 5 x 5 mm squares (slightly larger is better) and mounted onto plastic cover slips using Vetbond™; the cover slips are then secured with silicone grease in an imaging chamber and superfused with oxygen-bubbled tissue culture medium at 37°C.
Immunology, Issue 18, 2-photon imaging, delayed type hypersensitivity, inflammation, T cells, antigen presenting cells, ear, rat
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties.
Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
A Simple Stimulatory Device for Evoking Point-like Tactile Stimuli: A Searchlight for LFP to Spike Transitions
Authors: Antonio G. Zippo, Sara Nencini, Gian Carlo Caramenti, Maurizio Valente, Riccardo Storchi, Gabriele E.M. Biella.
Institutions: National Research Council, University of Manchester.
Current neurophysiological research aims to develop methodologies for investigating the signal route from neuron to neuron, namely the transitions from spikes to Local Field Potentials (LFPs) and from LFPs to spikes. LFPs have a complex dependence on spike activity, and their relation is still poorly understood1. The elucidation of these signal relations would be helpful both for clinical diagnostics (e.g. stimulation paradigms for Deep Brain Stimulation) and for a deeper comprehension of neural coding strategies in normal and pathological conditions (e.g. epilepsy, Parkinson's disease, chronic pain). To this aim, one has to solve technical issues related to stimulation devices, stimulation paradigms, and computational analyses. Therefore, a custom-made stimulation device was developed in order to deliver stimuli that are well regulated in space and time and do not incur mechanical resonance. Subsequently, as an exemplification, a set of reliable LFP-spike relationships was extracted. The performance of the device was investigated by extracellular recordings of joint spike and LFP responses to the applied stimuli from the rat primary somatosensory cortex. Then, by means of a multi-objective optimization strategy, a predictive model for spike occurrence based on LFPs was estimated. The application of this paradigm shows that the device is adequately suited to deliver high frequency tactile stimulation, outperforming common piezoelectric actuators. As a proof of the efficacy of the device, the following results were presented: 1) the timing and reliability of LFP responses well match the spike responses, 2) LFPs are sensitive to the stimulation history and capture not only the average response but also the trial-to-trial fluctuations in the spike activity and, finally, 3) by using the LFP signal it is possible to estimate a range of predictive models that capture different aspects of the spike activity.
Neuroscience, Issue 85, LFP, spike, tactile stimulus, Multiobjective function, Neuron, somatosensory cortex
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
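The tank-level γ calculation described above can be sketched as a simple mass balance: the PCB mass gained by the trout over the experiment divided by the PCB mass ingested with the prey, averaged across replicate tanks. This is a minimal illustration of the idea, not the authors' exact formula, and all numbers below are invented.

```python
import math
import statistics

def net_transfer_efficiency(w_start, c_start, w_end, c_end, food_mass, c_prey):
    """Net trophic transfer efficiency (gamma) for one congener in one tank:
    PCB mass gained by the fish divided by PCB mass ingested with the prey.
    Weights in g, concentrations in ug/g. Illustrative form only."""
    pcb_gained = w_end * c_end - w_start * c_start
    pcb_eaten = food_mass * c_prey
    return pcb_gained / pcb_eaten

# Invented per-tank values for a single congener (four hypothetical tanks).
tanks = [
    net_transfer_efficiency(1000, 0.10, 1500, 0.20, 2000, 0.15),
    net_transfer_efficiency(1100, 0.12, 1600, 0.21, 2100, 0.15),
    net_transfer_efficiency(950, 0.09, 1400, 0.19, 1900, 0.14),
    net_transfer_efficiency(1050, 0.11, 1550, 0.20, 2050, 0.15),
]

# Replication across tanks allows a standard error about the mean gamma.
mean_gamma = statistics.mean(tanks)
se_gamma = statistics.stdev(tanks) / math.sqrt(len(tanks))
print(mean_gamma, se_gamma)
```

In the experiment this calculation would be repeated for each of the 75 congeners in each of the eight tanks, separately for the active and inactive treatments.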
Eye Tracking, Cortisol, and a Sleep vs. Wake Consolidation Delay: Combining Methods to Uncover an Interactive Effect of Sleep and Cortisol on Memory
Authors: Kelly A. Bennion, Katherine R. Mickley Steinmetz, Elizabeth A. Kensinger, Jessica D. Payne.
Institutions: Boston College, Wofford College, University of Notre Dame.
Although rises in cortisol can benefit memory consolidation, as can sleep soon after encoding, there is currently a paucity of literature as to how these two factors may interact to influence consolidation. Here we present a protocol to examine the interactive influence of cortisol and sleep on memory consolidation, by combining three methods: eye tracking, salivary cortisol analysis, and behavioral memory testing across sleep and wake delays. To assess resting cortisol levels, participants gave a saliva sample before viewing negative and neutral objects within scenes. To measure overt attention, participants’ eye gaze was tracked during encoding. To manipulate whether sleep occurred during the consolidation window, participants either encoded scenes in the evening, slept overnight, and took a recognition test the next morning, or encoded scenes in the morning and remained awake during a comparably long retention interval. Additional control groups were tested after a 20 min delay in the morning or evening, to control for time-of-day effects. Together, results showed that there is a direct relation between resting cortisol at encoding and subsequent memory, but only following a period of sleep. Through eye tracking, it was further determined that for negative stimuli, this beneficial effect of cortisol on subsequent memory may be due to cortisol strengthening the relation between where participants look during encoding and what they are later able to remember. Overall, results obtained by a combination of these methods uncovered an interactive effect of sleep and cortisol on memory consolidation.
Behavior, Issue 88, attention, consolidation, cortisol, emotion, encoding, glucocorticoids, memory, sleep, stress
Fat Preference: A Novel Model of Eating Behavior in Rats
Authors: James M Kasper, Sarah B Johnson, Jonathan D. Hommel.
Institutions: University of Texas Medical Branch.
Obesity is a growing problem in the United States of America, with more than a third of the population classified as obese. One factor contributing to this multifactorial disorder is the consumption of a high fat diet, a behavior that has been shown to increase both caloric intake and body fat content. However, the elements regulating preference for high fat food over other foods remain understudied. To overcome this deficit, a model to quickly and easily test changes in the preference for dietary fat was developed. The Fat Preference model presents rats with a series of choices between foods with differing fat content. Like humans, rats have a natural bias toward consuming high fat food, making the rat model ideal for translational studies. Changes in preference can be ascribed to the effect of either genetic differences or pharmacological interventions. This model allows for the exploration of the determinants of fat preference and the screening of pharmacotherapeutic agents that influence the acquisition of obesity.
Behavior, Issue 88, obesity, fat, preference, choice, diet, macronutrient, animal model
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Authors: Angela J. Brandt, Gaston A. del Pino, Jean H. Burns.
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as being independent of the community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly necessitates understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of plants in the field from conspecific and heterospecific plant species. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, along with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood 33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner 41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration 34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices 27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
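The triage idea above, mapping data-set characteristics to one of the four segmentation approaches, can be caricatured as a decision function. The thresholds and rules below are entirely hypothetical, offered only to show the shape of such a mapping; the paper's actual scheme should be consulted for real criteria.

```python
from dataclasses import dataclass

# The four categorical approaches named in the abstract.
APPROACHES = [
    "fully manual model building",
    "manual tracing + surface rendering",
    "semi-automated + surface rendering",
    "automated custom algorithm + rendering and quantification",
]

@dataclass
class DataSet:
    snr: float                       # signal-to-noise ratio (arbitrary units)
    has_characteristic_shapes: bool  # easily identifiable features?
    crowded: bool                    # densely packed features?
    roi_fraction: float              # fraction of volume occupied by the ROI

def triage(d: DataSet) -> str:
    """Hypothetical triage: clean data with recognizable shapes can be
    automated; noisy, crowded data falls back to manual approaches."""
    if d.snr > 5 and d.has_characteristic_shapes:
        return APPROACHES[3]  # automation is feasible
    if d.snr > 5:
        return APPROACHES[2]  # semi-automated with human oversight
    if d.roi_fraction < 0.01 and not d.crowded:
        return APPROACHES[0]  # tiny, sparse ROI: quickest to model by hand
    return APPROACHES[1]      # default: manual tracing

print(triage(DataSet(snr=8.0, has_characteristic_shapes=True,
                     crowded=False, roi_fraction=0.1)))
```

As the abstract notes, more than one approach may succeed for a given data set, so any such function encodes one possible ordering of criteria rather than a unique answer.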
Rapid and Low-cost Prototyping of Medical Devices Using 3D Printed Molds for Liquid Injection Molding
Authors: Philip Chung, J. Alex Heller, Mozziyar Etemadi, Paige E. Ottoson, Jonathan A. Liu, Larry Rand, Shuvo Roy.
Institutions: University of California, San Francisco, University of Southern California.
Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.
Bioengineering, Issue 88, liquid injection molding, reaction injection molding, molds, 3D printing, fused deposition modeling, rapid prototyping, medical devices, low cost, low volume, rapid turnaround time.
A Procedure to Observe Context-induced Renewal of Pavlovian-conditioned Alcohol-seeking Behavior in Rats
Authors: Jean-Marie Maddux, Franca Lacroix, Nadia Chaudhri.
Institutions: Concordia University.
Environmental contexts in which drugs of abuse are consumed can trigger craving, a subjective Pavlovian-conditioned response that can facilitate drug-seeking behavior and prompt relapse in abstinent drug users. We have developed a procedure to study the behavioral and neural processes that mediate the impact of context on alcohol-seeking behavior in rats. Following acclimation to the taste and pharmacological effects of 15% ethanol in the home cage, male Long-Evans rats receive Pavlovian discrimination training (PDT) in conditioning chambers. In each daily (Mon-Fri) PDT session, 16 trials each of two different 10 sec auditory conditioned stimuli occur. During one stimulus, the CS+, 0.2 ml of 15% ethanol is delivered into a fluid port for oral consumption. The second stimulus, the CS-, is not paired with ethanol. Across sessions, entries into the fluid port during the CS+ increase, whereas entries during the CS- stabilize at a lower level, indicating that a predictive association between the CS+ and ethanol is acquired. During PDT each chamber is equipped with a specific configuration of visual, olfactory and tactile contextual stimuli. Following PDT, extinction training is conducted in the same chamber that is now equipped with a different configuration of contextual stimuli. The CS+ and CS- are presented as before, but ethanol is withheld, which causes a gradual decline in port entries during the CS+. At test, rats are placed back into the PDT context and presented with the CS+ and CS- as before, but without ethanol. This manipulation triggers a robust and selective increase in the number of port entries made during the alcohol predictive CS+, with no change in responding during the CS-. This effect, referred to as context-induced renewal, illustrates the powerful capacity of contexts associated with alcohol consumption to stimulate alcohol-seeking behavior in response to Pavlovian alcohol cues.
Behavior, Issue 91, Behavioral neuroscience, alcoholism, relapse, addiction, Pavlovian conditioning, ethanol, reinstatement, discrimination, conditioned approach
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
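The ~10-30 nm precision quoted above follows from photon statistics. One widely used estimate is the Thompson, Larson, and Webb (2002) formula, sketched here; the parameter values are illustrative assumptions for a fluorescent-protein emitter, not measurements from this protocol.

```python
import math

def localization_precision(s_nm, pixel_nm, photons, background):
    """Thompson et al. (2002) estimate of 2D localization precision (nm):
    sigma^2 = (s^2 + a^2/12)/N + 8*pi*s^4*b^2 / (a^2 * N^2),
    where s is the PSF standard deviation, a the pixel size,
    N the detected photons, and b the background noise per pixel."""
    s2 = s_nm ** 2
    a2 = pixel_nm ** 2
    var = (s2 + a2 / 12) / photons + 8 * math.pi * s2 ** 2 * background ** 2 / (a2 * photons ** 2)
    return math.sqrt(var)

# Assumed values: 150 nm PSF width, 100 nm pixels, 500 photons, background 3.
print(localization_precision(s_nm=150, pixel_nm=100, photons=500, background=3))
```

With these assumed numbers the estimate lands near 10 nm, consistent with the ~10-30 nm range stated above; brighter probes (more photons) tighten the precision, which is one reason probe choice matters in FPALM.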
Technique and Considerations in the Use of 4x1 Ring High-definition Transcranial Direct Current Stimulation (HD-tDCS)
Authors: Mauricio F. Villamar, Magdalena Sarah Volz, Marom Bikson, Abhishek Datta, Alexandre F. DaSilva, Felipe Fregni.
Institutions: Spaulding Rehabilitation Hospital and Massachusetts General Hospital, Harvard Medical School, Pontifical Catholic University of Ecuador, Charité University Medicine Berlin, The City College of The City University of New York, University of Michigan.
High-definition transcranial direct current stimulation (HD-tDCS) has recently been developed as a noninvasive brain stimulation approach that increases the accuracy of current delivery to the brain by using arrays of smaller "high-definition" electrodes, instead of the larger pad-electrodes of conventional tDCS. Targeting is achieved by energizing electrodes placed in predetermined configurations. One of these is the 4x1-ring configuration. In this approach, a center ring electrode (anode or cathode) overlying the target cortical region is surrounded by four return electrodes, which help circumscribe the area of stimulation. Delivery of 4x1-ring HD-tDCS is capable of inducing significant neurophysiological and clinical effects in both healthy subjects and patients. Furthermore, its tolerability is supported by studies using intensities as high as 2.0 milliamperes for up to twenty minutes. Even though 4x1 HD-tDCS is simple to perform, correct electrode positioning is important in order to accurately stimulate target cortical regions and exert its neuromodulatory effects. The use of electrodes and hardware that have specifically been tested for HD-tDCS is critical for safety and tolerability. Given that most published studies on 4x1 HD-tDCS have targeted the primary motor cortex (M1), particularly for pain-related outcomes, the purpose of this article is to systematically describe its use for M1 stimulation, as well as the considerations to be taken for safe and effective stimulation. However, the methods outlined here can be adapted for other HD-tDCS configurations and cortical targets.
Medicine, Issue 77, Neurobiology, Neuroscience, Physiology, Anatomy, Biomedical Engineering, Biophysics, Neurophysiology, Nervous System Diseases, Diagnosis, Therapeutics, Anesthesia and Analgesia, Investigative Techniques, Equipment and Supplies, Mental Disorders, Transcranial direct current stimulation, tDCS, High-definition transcranial direct current stimulation, HD-tDCS, Electrical brain stimulation, Transcranial electrical stimulation (tES), Noninvasive Brain Stimulation, Neuromodulation, non-invasive, brain, stimulation, clinical techniques
Detection and Isolation of Viable Mouse IL-17-Secreting T Cells
Authors: Anna Foerster, Mario Assenmacher, Michaela Niemoeller, Elly Rankin, Mariette Mohaupt, Anne Richter.
Institutions: Miltenyi Biotec GmbH.
The MACS Cytokine Secretion Assay technology allows detection of secreted cytokines at the single cell level and sensitive isolation of viable cytokine-secreting cells. In order to label IL-17-secreting cells, a single cell suspension of mouse splenocytes is prepared and stimulated at 37°C with PMA/ionomycin to induce cytokine secretion. To stop secretion, cells are then placed on ice and exposed to the IL-17 Catch Reagent, a bi-specific antibody that binds to CD45 on the cell surface of leukocytes and to IL-17 as it is secreted and caught near the cell surface. Secretion is then restarted by increasing the temperature to 37°C, and the secreted IL-17 is trapped by the Catch Reagent. Secretion is then stopped again by placing the cells on ice. To detect the trapped IL-17, cells are incubated with a second IL-17-specific antibody conjugated to biotin and an Anti-Biotin-PE antibody. Cells can now be directly analyzed by flow cytometry or prepared for isolation and enrichment by subsequent labeling with Anti-PE conjugated MicroBeads.
Immunology, Issue 22, Miltenyi, leukocytes, cytokine, IL-17, MACS, FACS, TH17, cell separation
Processing the Loblolly Pine PtGen2 cDNA Microarray
Authors: W. Walter Lorenz, Yuan-Sheng Yu, Marta Simões, Jeffrey F. D. Dean.
Institutions: University of Georgia (UGA), Instituto Tecnologia Química e Biológica UNL, Av. da República.
PtGen2 is a 26,496-feature cDNA microarray containing amplified loblolly pine ESTs. The array is produced in our laboratory for use by researchers studying gene expression in pine and other conifer species. PtGen2 was developed as a result of our gene discovery efforts in loblolly pine, and is comprised of sequences identified primarily from root tissues, but also from needle and stem.1,2 PtGen2 has been tested by hybridizing different Cy-dye-labeled conifer target cDNAs, using both amplified and non-amplified indirect labeling methods, and also tested with a number of hybridization and washing conditions. This video focuses on the handling and processing of slides before and after pre-hybridization, as well as after hybridization, using some modifications to procedures developed previously.3,4 Also included, in text form only, are the protocols used for the generation, labeling, and cleanup of target cDNAs, as well as information on software used for downstream data processing. PtGen2 is printed with a proprietary print buffer that contains high concentrations of salt that can be difficult to remove completely. The slides are washed first in a warm SDS solution prior to pre-hybridization. After pre-hybridization, the slides are washed vigorously in several changes of water to completely remove remaining salts. LifterSlips™ are then cleaned and positioned on the slides, and labeled cDNA is carefully loaded onto the microarray by capillary action, which provides even distribution of the sample across the slide and reduces the chance of bubble incorporation. Hybridization of targets to the array is done at 48°C in high-humidity conditions. After hybridization, a series of standard washes is done at 53°C and room temperature for extended times. Processing PtGen2 slides using this technique reduces the salt- and SDS-derived artifacts often seen when the array is processed less rigorously.
With targets derived from several different conifer RNA sources, this processing protocol yielded fewer artifacts, reduced background, and better consistency among different experimental groups of arrays.
Plant Biology, Issue 25, Loblolly pine, P. taeda, cDNA, microarray, slide processing
Obtaining Highly Purified Toxoplasma gondii Oocysts by a Discontinuous Cesium Chloride Gradient
Authors: Sarah E. Staggs, Mary Jean See, J P. Dubey, Eric N. Villegas.
Institutions: Dynamac, Inc., University of Cincinnati, McMicken College of Arts and Science, Agricultural Research Service, U.S. Department of Agriculture, US Environmental Protection Agency.
Toxoplasma gondii is an obligate intracellular protozoan pathogen that commonly infects humans. It is a well-characterized apicomplexan associated with food- and water-borne disease outbreaks. The definitive host is the feline species, in which sexual replication occurs, resulting in the development of the highly infectious and environmentally resistant oocyst. Infection occurs via ingestion of tissue cysts from contaminated meat or of oocysts from soil or water. Infection is typically asymptomatic in healthy individuals, but results in a life-long latent infection that can reactivate, causing toxoplasmic encephalitis and death if the individual becomes immunocompromised. Meat contaminated with T. gondii cysts has been the primary source of infection in Europe and the United States, but recent changes in animal management and husbandry practices and improved food handling and processing procedures have significantly reduced the prevalence of T. gondii cysts in meat1, 2. Nonetheless, seroprevalence in humans remains relatively high, suggesting that exposure from oocyst-contaminated soil or water is likely. Indeed, waterborne outbreaks of toxoplasmosis have been reported worldwide, supporting the theory that exposure to the environmental oocyst form poses a significant health risk3-5. To date, research on the prevalence of T. gondii oocysts in water and the environment is limited due to the lack of tools to detect oocysts in the environment5, 6. This is primarily due to the lack of efficient purification protocols for obtaining large numbers of highly purified T. gondii oocysts from infected cats for research purposes. This study describes the development of a modified CsCl method that easily purifies, from the feces of infected cats, T. gondii oocysts suitable for molecular biological and tissue culture manipulation7.
Infectious Diseases, Microbiology, Issue 33, Toxoplasma gondii, cesium chloride, oocysts, discontinuous gradient, apicomplexan
Using the Gene Pulser MXcell Electroporation System to Transfect Primary Cells with High Efficiency
Authors: Adam M. McCoy, Michelle L. Collins, Luis A. Ugozzoli.
Institutions: Bio-Rad Laboratories, Inc.
It is becoming increasingly apparent that electroporation is the most effective way to introduce plasmid DNA or siRNA into primary cells. The Gene Pulser MXcell electroporation system and Gene Pulser electroporation buffer (Bio-Rad) were specifically developed to easily transfect nucleic acids into mammalian cells, including difficult-to-transfect cells such as primary and stem cells. We will demonstrate how to perform a simple experiment to quickly identify the best electroporation conditions, and how to run several samples through a range of conditions so that an experiment can be conducted at the same time as optimization is performed. We will also show how optimal conditions identified using 96-well electroporation plates can be used with standard electroporation cuvettes, facilitating the switch from plates to cuvettes while maintaining the same electroporation efficiency. In the video, we also discuss some of the key factors that can lead to the success or failure of electroporation experiments.
Cellular Biology, Issue 35, Primary cell electroporation, MEF, Bio-Rad, Gene Pulser MXcell, transfection, GFP
Selection of Aptamers for Amyloid β-Protein, the Causative Agent of Alzheimer's Disease
Authors: Farid Rahimi, Gal Bitan.
Institutions: David Geffen School of Medicine, University of California, Los Angeles, University of California, Los Angeles.
Alzheimer's disease (AD) is a progressive, age-dependent, neurodegenerative disorder with an insidious course that renders its presymptomatic diagnosis difficult1. Definite AD diagnosis is achieved only postmortem; thus, establishing presymptomatic, early diagnosis of AD is crucial for developing and administering effective therapies2,3. Amyloid β-protein (Aβ) is central to AD pathogenesis. Soluble, oligomeric Aβ assemblies are believed to cause the neurotoxicity underlying synaptic dysfunction and neuron loss in AD4,5. Various forms of soluble Aβ assemblies have been described; however, their interrelationships and relevance to AD etiology and pathogenesis are complex and not well understood6. Specific molecular recognition tools may unravel the relationships among Aβ assemblies and facilitate detection and characterization of these assemblies early in the disease course, before symptoms emerge. Molecular recognition commonly relies on antibodies. However, an alternative class of molecular recognition tools, aptamers, offers important advantages relative to antibodies7,8. Aptamers are oligonucleotides generated by in-vitro selection: systematic evolution of ligands by exponential enrichment (SELEX)9,10. SELEX is an iterative process that, similar to Darwinian evolution, allows selection, amplification, enrichment, and perpetuation of a property, e.g., avid, specific ligand binding (aptamers) or catalytic activity (ribozymes and DNAzymes). Despite the emergence of aptamers as tools in modern biotechnology and medicine11, they have been underutilized in the amyloid field. A few RNA and ssDNA aptamers have been selected against various forms of prion proteins (PrP)12-16. An RNA aptamer generated against recombinant bovine PrP was shown to recognize bovine PrP-β17, a soluble, oligomeric, β-sheet-rich conformational variant of full-length PrP that forms amyloid fibrils18.
Aptamers generated using monomeric and several forms of fibrillar β2-microglobulin (β2m) were found to bind fibrils of certain other amyloidogenic proteins besides β2m fibrils19. Ylera et al. described RNA aptamers selected against immobilized monomeric Aβ4020. Unexpectedly, these aptamers bound fibrillar Aβ40. Altogether, these data raise several important questions. Why did aptamers selected against monomeric proteins recognize their polymeric forms? Could aptamers against monomeric and/or oligomeric forms of amyloidogenic proteins be obtained? To address these questions, we attempted to select aptamers for covalently stabilized oligomeric Aβ4021 generated using photo-induced cross-linking of unmodified proteins (PICUP)22,23. Similar to previous findings17,19,20, these aptamers reacted with fibrils of Aβ and several other amyloidogenic proteins, likely recognizing a potentially common amyloid structural aptatope21. Here, we present the SELEX methodology used in the production of these aptamers21.
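The core SELEX logic described above (bind, partition, amplify, repeat) can be caricatured in a few lines of code. The species names and binding probabilities below are invented for illustration; the actual selection is of course a wet-lab procedure, not a simulation:

```python
import random

def selex_round(pool, bind_prob, amplify_to=1000, rng=random):
    """One toy SELEX round: each sequence binds the target with its given
    probability (partitioning); bound sequences are then amplified back up."""
    bound = [s for s in pool if rng.random() < bind_prob[s]] or pool[:1]
    return [rng.choice(bound) for _ in range(amplify_to)]

rng = random.Random(42)
# Hypothetical per-round binding probabilities for three candidate species
bind_prob = {"apt_strong": 0.9, "apt_weak": 0.1, "apt_none": 0.01}
pool = [rng.choice(sorted(bind_prob)) for _ in range(1000)]

for _ in range(5):  # iterate selection and amplification, as in SELEX
    pool = selex_round(pool, bind_prob, rng=rng)

share_strong = pool.count("apt_strong") / len(pool)
```

After a handful of rounds the high-affinity species dominates the pool, which is the enrichment-by-iteration property the abstract attributes to SELEX.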
Neuroscience, Issue 39, Cellular Biology, Aptamer, RNA, amyloid β-protein, oligomer, amyloid fibrils, protein assembly
Western Blotting: Sample Preparation to Detection
Authors: Anna Eslami, Jesse Lujan.
Institutions: EMD Chemicals Inc..
Western blotting is an analytical technique used to detect specific proteins in a given sample of tissue homogenate or extract. It uses gel electrophoresis to separate native or denatured proteins by the length of the polypeptide (denaturing conditions) or by the 3-D structure of the protein (native/ non-denaturing conditions). The proteins are then transferred to a membrane (typically nitrocellulose or PVDF), where they are probed (detected) using antibodies specific to the target protein.
Basic Protocols, Issue 44, western blot, SDS-PAGE, electrophoresis, protein transfer, immunoblot, protein separation, PVDF, nitrocellulose, ECL
Magnetic Resonance Imaging Quantification of Pulmonary Perfusion using Calibrated Arterial Spin Labeling
Authors: Tatsuya J. Arai, G. Kim Prisk, Sebastiaan Holverda, Rui Carlos Sá, Rebecca J. Theilmann, A. Cortney Henderson, Matthew V. Cronin, Richard B. Buxton, Susan R. Hopkins.
Institutions: University of California San Diego - UCSD.
This article demonstrates an MR imaging method to measure the spatial distribution of pulmonary blood flow in healthy subjects during normoxia (inspired O2 fraction (FIO2) = 0.21), hypoxia (FIO2 = 0.125), and hyperoxia (FIO2 = 1.00). In addition, the physiological responses of the subject are monitored in the MR scan environment. MR images were obtained on a 1.5 T GE MRI scanner during a breath hold from a sagittal slice in the right lung at functional residual capacity. An arterial spin labeling sequence (ASL-FAIRER) was used to measure the spatial distribution of pulmonary blood flow1,2 and a multi-echo fast gradient echo (mGRE) sequence3 was used to quantify the regional proton (i.e., H2O) density, allowing the quantification of density-normalized perfusion for each voxel (milliliters blood per minute per gram lung tissue). With a pneumatic switching valve and a facemask equipped with a 2-way non-rebreathing valve, different oxygen concentrations were introduced to the subject in the MR scanner through the inspired gas tubing. A metabolic cart collected expiratory gas via expiratory tubing. Mixed expiratory O2 and CO2 concentrations, oxygen consumption, carbon dioxide production, respiratory exchange ratio, respiratory frequency, and tidal volume were measured. Heart rate and oxygen saturation were monitored using pulse oximetry. Data obtained from a normal subject showed that, as expected, heart rate was higher during hypoxia (60 bpm) than during normoxia (51 bpm) or hyperoxia (50 bpm), and arterial oxygen saturation (SpO2) was reduced during hypoxia to 86%. Mean ventilation was 8.31 L/min BTPS during hypoxia, 7.04 L/min during normoxia, and 6.64 L/min during hyperoxia. Tidal volume was 0.76 L during hypoxia, 0.69 L during normoxia, and 0.67 L during hyperoxia. Representative quantified ASL data showed that the mean density-normalized perfusion was 8.86 ml/min/g during hypoxia, 8.26 ml/min/g during normoxia, and 8.46 ml/min/g during hyperoxia.
In this subject, the relative dispersion4, an index of global heterogeneity, was increased in hypoxia (1.07 during hypoxia, 0.85 during normoxia, and 0.87 during hyperoxia), while the fractal dimension (Ds), another index of heterogeneity reflecting vascular branching structure, was unchanged (1.24 during hypoxia, 1.26 during normoxia, and 1.26 during hyperoxia). Overview: This protocol demonstrates the acquisition of data to measure the distribution of pulmonary perfusion noninvasively under conditions of normoxia, hypoxia, and hyperoxia using a magnetic resonance imaging technique known as arterial spin labeling (ASL). Rationale: Measurement of pulmonary blood flow and lung proton density with MR offers high-spatial-resolution images that can be quantified, and allows repeated measurements under several different physiological conditions. In human studies, PET, SPECT, and CT are the commonly used alternative techniques; however, these involve exposure to ionizing radiation and are therefore not suitable for repeated measurements in human subjects.
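The relative dispersion reported above is simply the standard deviation of the voxel-wise perfusion values divided by their mean. A minimal sketch of the calculation, using made-up voxel values rather than real ASL data (whether the published index uses the population or sample standard deviation is not stated in the abstract; the population form is used here for simplicity):

```python
import statistics

def relative_dispersion(perfusion):
    """Relative dispersion (RD) = standard deviation / mean of the
    density-normalized perfusion values across lung voxels."""
    return statistics.pstdev(perfusion) / statistics.mean(perfusion)

# Hypothetical density-normalized perfusion values (ml/min/g) for five voxels
voxels = [4.0, 6.0, 8.0, 10.0, 12.0]
rd = relative_dispersion(voxels)  # higher RD = more heterogeneous perfusion
```

A perfectly homogeneous lung would give RD = 0, and the hypoxia-induced rise in RD reported above corresponds to perfusion becoming more spread out around its mean.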
Medicine, Issue 51, arterial spin labeling, lung proton density, functional lung imaging, hypoxic pulmonary vasoconstriction, oxygen consumption, ventilation, magnetic resonance imaging
A High Throughput in situ Hybridization Method to Characterize mRNA Expression Patterns in the Fetal Mouse Lower Urogenital Tract
Authors: Lisa L. Abler, Vatsal Mehta, Kimberly P. Keil, Pinak S. Joshi, Chelsea-Leigh Flucus, Heather A. Hardin, Christopher T. Schmitz, Chad M. Vezina.
Institutions: University of Wisconsin-Madison.
Development of the lower urogenital tract (LUT) is an intricate process. This complexity is evidenced during formation of the prostate from the fetal male urethra, which relies on androgenic signals and epithelial-mesenchymal interactions1,2. Understanding the molecular mechanisms responsible for prostate development may reveal growth mechanisms that are inappropriately reawakened later in life to give rise to prostate diseases such as benign prostatic hyperplasia and prostate cancer. The developing LUT is anatomically complex. By the time prostatic budding begins at 16.5 days post conception (dpc), numerous cell types are present. Vasculature, nerves, and smooth muscle reside within the mesenchymal stroma3. This stroma surrounds a multilayered epithelium and gives rise to the fetal prostate through androgen receptor-dependent paracrine signals4. The identity of the stromal androgen receptor-responsive genes required for prostate development and the mechanism by which prostate ductal epithelium forms in response to these genes are not fully understood. The ability to precisely identify cell types and localize expression of specific factors within them is imperative to further understand prostate development. In situ hybridization (ISH) allows for localization of mRNAs within a tissue. Thus, this method can be used to identify the pattern and timing of expression of signaling molecules and their receptors, thereby elucidating potential prostate developmental regulators. Here, we describe a high-throughput ISH technique to identify mRNA expression patterns in the fetal mouse LUT using vibrating microtome-cut sections. This method offers several advantages over other ISH protocols. Performing ISH on thin sections adhered to a slide is technically difficult; cryosections frequently have poor structural quality, while both cryosections and paraffin sections often result in weak signal resolution. Performing ISH on whole-mount tissues can result in probe trapping.
In contrast, our high-throughput technique utilizes thick-cut sections that reveal detailed tissue architecture. Modified microfuge tubes allow easy handling of sections during the ISH procedure. A maximum of 4 mRNA transcripts can be screened from a single 17.5 dpc LUT, with up to 24 mRNA transcripts detected in a single run, thereby reducing cost and maximizing efficiency. This method allows multiple treatment groups to be processed identically and as a single unit, thereby removing any bias in interpreting data. Most pertinently for prostate researchers, this method provides the spatial and temporal location of low- and high-abundance mRNA transcripts in the fetal mouse urethra that gives rise to the prostate ductal network.
Developmental Biology, Issue 54, Urogenital, prostate, lower urinary tract, urethra, in situ hybridization
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
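Since primer design and melting temperature (Tm) feature in the optimization strategies above, a small worked example may help. The Wallace rule, Tm = 2(A+T) + 4(G+C), is the simplest common Tm estimate and is only reasonable for short oligos; a full protocol would likely use nearest-neighbor thermodynamic methods instead. The primer sequence below is invented for illustration:

```python
def wallace_tm(primer: str) -> int:
    """Estimate a primer's melting temperature (°C) via the Wallace rule:
    Tm = 2*(A+T) + 4*(G+C). Intended only for short oligos (roughly < 14 nt)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

# Hypothetical 12-mer primer with 6 A/T and 6 G/C bases
tm = wallace_tm("ACGTACGTACGT")  # 2*6 + 4*6 = 36 °C
```

A common rule of thumb is to set the annealing temperature a few degrees below the lower of the two primer Tm values, then refine empirically with a gradient run on the thermal cycler.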
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g., carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorous, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.
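JoVE does not disclose its matching algorithm, but the general idea of abstract-to-video text matching can be illustrated with a toy TF-IDF cosine-similarity comparison. Everything below (the mini-corpus, tokens, and scores) is invented for illustration:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors (term -> weight dicts) for tokenized docs."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))  # document frequency
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return [{t: c * idf[t] for t, c in Counter(doc).items()} for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors represented as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Invented mini-corpus: two PubMed-style abstracts and one video description
abstracts = ["pcr amplification of dna template".split(),
             "electroporation of primary cells".split()]
video = "pcr thermal cycling dna amplification".split()

vecs = tfidf_vectors(abstracts + [video])
scores = [cosine(vecs[-1], v) for v in vecs[:-1]]  # video vs. each abstract
```

Here the PCR abstract scores higher against the PCR video than the electroporation abstract does, which is the ranking behavior a related-videos list relies on; a production system would add stemming, stop-word removal, and an approximate nearest-neighbor index on top of this.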