JoVE Visualize
Related JoVE Video
PubMed Article
Factors influencing contrast sensitivity function in myopic eyes.
PUBLISHED: 01-01-2014
To evaluate the factors affecting the area under the log contrast sensitivity function (AULCSF) in healthy myopic eyes.
To follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination in the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along the visual pathways: Object From Motion (OFM) extraction and time-constrained stereo protocols. In the OFM test, an array of dots composes an object; the dots within the object move rightward while the dots outside it move leftward, or vice versa. The dot pattern generates a camouflaged object that cannot be detected when the dots are stationary or moving as a whole, so object recognition is critically dependent on motion perception. In the time-constrained stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical usage and provide a simple, yet powerful, way to identify and quantify processes of demyelination and remyelination along the visual pathways. These protocols may be useful for diagnosing and following optic neuritis and multiple sclerosis patients. In the diagnostic process, these protocols may reveal visual deficits that cannot be identified via current standard visual measurements. Moreover, these protocols may sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. In the longitudinal follow-up course, the protocols can be used as a sensitive marker of demyelinating and remyelinating processes over time. These protocols may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
23 Related JoVE Articles!
Simultaneous Quantification of T-Cell Receptor Excision Circles (TRECs) and K-Deleting Recombination Excision Circles (KRECs) by Real-time PCR
Authors: Alessandra Sottini, Federico Serana, Diego Bertoli, Marco Chiarini, Monica Valotti, Marion Vaglio Tessitore, Luisa Imberti.
Institutions: Spedali Civili di Brescia.
T-cell receptor excision circles (TRECs) and K-deleting recombination excision circles (KRECs) are circularized DNA elements formed during the recombination processes that create T- and B-cell receptors. Because TRECs and KRECs are unable to replicate, they are diluted with each cell division while persisting in the cells in which they were formed. Their quantity in peripheral blood can therefore be considered an estimate of thymic and bone marrow output. By combining the well-established and commonly used TREC assay with a modified version of the KREC assay, we have developed a duplex quantitative real-time PCR that allows quantification of both newly produced T and B lymphocytes in a single assay. The numbers of TRECs and KRECs are obtained using a standard curve prepared by serially diluting TREC and KREC signal joints cloned in a bacterial plasmid, together with a fragment of the T-cell receptor alpha constant gene that serves as a reference gene. Results are reported as the number of TRECs and KRECs per 10^6 cells or per ml of blood. The quantification of these DNA fragments has proven useful for monitoring immune reconstitution following bone marrow transplantation in both children and adults, for improved characterization of immune deficiencies, and for better understanding of the activity of certain immunomodulating drugs.
Immunology, Issue 94, B lymphocytes, primary immunodeficiency, real-time PCR, immune recovery, T-cell homeostasis, T lymphocytes, thymic output, bone marrow output
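The duplex assay above derives TREC and KREC copy numbers from a standard curve of serially diluted plasmid and normalizes them to the T-cell receptor alpha constant reference gene. The sketch below illustrates that calculation in Python; the Ct values and dilution series are invented placeholders, not data from the article.

```python
# Minimal sketch: derive copy numbers from qPCR Ct values using a plasmid
# standard curve (serial dilutions). All numbers are invented placeholders.
import numpy as np

# Standard curve: known plasmid copies per reaction and measured Ct values.
std_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
std_ct     = np.array([17.1, 20.5, 23.9, 27.2, 30.6])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1          # amplification efficiency
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")

def copies_from_ct(ct):
    """Interpolate the copy number of an unknown sample from its Ct."""
    return 10 ** ((ct - intercept) / slope)

# Example: express TRECs per 10^6 cells, using the reference-gene copies
# (two genomic copies per cell) to estimate the cell input of the reaction.
trec_copies = copies_from_ct(26.4)
ref_copies = copies_from_ct(18.9)
cells_in_reaction = ref_copies / 2.0
print(f"TRECs per 10^6 cells: {trec_copies / cells_in_reaction * 1e6:.0f}")
```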
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
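The detection method above begins by analyzing oriented tissue patterns with Gabor filters. Below is a minimal Python sketch of a Gabor filter bank and a per-pixel dominant-orientation map; the kernel parameters are illustrative and are not the values used by the authors.

```python
# Minimal sketch of oriented-texture analysis with a Gabor filter bank.
# Parameter values are illustrative, not those used in the article.
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(theta, wavelength=8.0, sigma=4.0, size=21):
    """Real (cosine) Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    k = envelope * carrier
    return k - k.mean()                      # zero-mean kernel

def dominant_orientation(image, n_orient=12):
    """Per-pixel orientation giving the strongest filter response."""
    thetas = np.linspace(0, np.pi, n_orient, endpoint=False)
    responses = np.stack(
        [np.abs(fftconvolve(image, gabor_kernel(t), mode="same")) for t in thetas]
    )
    return thetas[np.argmax(responses, axis=0)], responses.max(axis=0)

# Usage with a synthetic image containing vertical stripes:
img = np.sin(2 * np.pi * np.arange(128)[None, :] / 8.0) * np.ones((128, 1))
orientation_map, magnitude_map = dominant_orientation(img)
```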
A Laser-induced Mouse Model of Chronic Ocular Hypertension to Characterize Visual Defects
Authors: Liang Feng, Hui Chen, Genn Suyeoka, Xiaorong Liu.
Institutions: Northwestern University.
Glaucoma, frequently associated with elevated intraocular pressure (IOP), is one of the leading causes of blindness. We sought to establish a mouse model of ocular hypertension to mimic human high-tension glaucoma. Here, laser illumination is applied to the corneal limbus to photocoagulate the aqueous outflow pathway, inducing angle closure. Changes in IOP are monitored using a rebound tonometer before and after the laser treatment. An optomotor behavioral test is used to measure corresponding changes in visual capacity. A representative result from one mouse that developed sustained IOP elevation after laser illumination is shown: decreased visual acuity and contrast sensitivity are observed in this ocular hypertensive mouse. Together, our study introduces a valuable model system to investigate neuronal degeneration and the underlying molecular mechanisms in glaucomatous mice.
Medicine, Issue 78, Biomedical Engineering, Neurobiology, Anatomy, Physiology, Neuroscience, Cellular Biology, Molecular Biology, Ophthalmology, Retinal Neurons, Retinal Ganglion Cell (RGC), Neurodegenerative Diseases, Ocular Hypertension, Retinal Degeneration, Vision Tests, Visual Acuity, Eye Diseases, Laser Photocoagulation, Intraocular Pressure (IOP), Tonometer, Contrast Sensitivity, Optomotor, animal model
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
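The abstract above notes that all three assays produce signals that are linear in the number of cells plated. A quick way to verify that linearity is an ordinary least-squares fit of signal against plated cell number; the sketch below uses invented numbers.

```python
# Minimal sketch: check that assay signal scales linearly with the number
# of cells plated. The numbers below are invented placeholders.
import numpy as np

cells_plated = np.array([0, 5e3, 1e4, 2e4, 4e4, 8e4])         # cells per well
signal       = np.array([120, 950, 1800, 3700, 7200, 14500])  # arbitrary units

slope, intercept = np.polyfit(cells_plated, signal, 1)
predicted = slope * cells_plated + intercept
ss_res = np.sum((signal - predicted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"signal ≈ {slope:.3f} * cells + {intercept:.1f}, R² = {r_squared:.3f}")
```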
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
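As a minimal illustration of the design-of-experiments idea described above, the sketch below enumerates a two-level full factorial design over three hypothetical incubation factors and extracts a half-fraction; the factors, levels, and defining relation are assumptions for illustration, not the design used in the study.

```python
# Minimal sketch of a two-level factorial design for three hypothetical
# factors relevant to transient expression. Factors and levels are
# illustrative placeholders, not the article's actual design.
from itertools import product

factors = {
    "incubation_temp_C": (22, 25),
    "incubation_days":   (4, 6),
    "leaf_age":          ("young", "old"),
}

# Full factorial: every combination of factor levels (2^3 = 8 runs).
full_design = [dict(zip(factors, combo)) for combo in product(*factors.values())]

def coded(run):
    """Represent a run as coded levels: -1 for the low level, +1 for the high."""
    return [1 if run[f] == levels[1] else -1 for f, levels in factors.items()]

# Half-fraction (4 runs): keep runs satisfying the defining relation I = ABC,
# i.e. the product of the coded levels equals +1.
half_fraction = [run for run in full_design
                 if coded(run)[0] * coded(run)[1] * coded(run)[2] == 1]

for run in half_fraction:
    print(run)
```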
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
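Detection thresholds in the protocol above are obtained with an ascending forced-choice triangle procedure. The sketch below simulates such a procedure; the concentration series and the "three correct presentations in a row" stopping rule are assumptions for illustration and may differ from the published protocol.

```python
# Minimal sketch of an ascending forced-choice triangle procedure for
# detection thresholds. The concentration series and the stopping rule
# ("3 correct in a row") are illustrative assumptions.
import random

C18_1_SERIES_MM = [0.02, 0.06, 1.0, 1.4, 2.0, 2.8, 3.8, 5.0, 6.4, 8.0, 9.8, 12.0]

def triangle_trial(concentration, detect_prob):
    """One triangle trial: pick the odd (fatty-acid) sample out of three.
    detect_prob(c) gives the probability the subject truly detects it;
    otherwise the subject guesses among the three samples."""
    if random.random() < detect_prob(concentration):
        return True
    return random.random() < 1.0 / 3.0       # chance-level guess

def ascending_threshold(detect_prob, n_required=3):
    """Lowest concentration identified correctly n_required times in a row."""
    for conc in C18_1_SERIES_MM:
        if all(triangle_trial(conc, detect_prob) for _ in range(n_required)):
            return conc
    return None                              # no threshold within the series

# Simulated subject whose detection probability rises with concentration:
subject = lambda c: min(1.0, c / 6.0)
print("Estimated C18:1 detection threshold (mM):", ascending_threshold(subject))
```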
Analysis of Oxidative Stress in Zebrafish Embryos
Authors: Vera Mugoni, Annalisa Camporeale, Massimo M. Santoro.
Institutions: University of Torino, Vesalius Research Center, VIB.
High levels of reactive oxygen species (ROS) may cause a change of cellular redox state towards oxidative stress condition. This situation causes oxidation of molecules (lipid, DNA, protein) and leads to cell death. Oxidative stress also impacts the progression of several pathological conditions such as diabetes, retinopathies, neurodegeneration, and cancer. Thus, it is important to define tools to investigate oxidative stress conditions not only at the level of single cells but also in the context of whole organisms. Here, we consider the zebrafish embryo as a useful in vivo system to perform such studies and present a protocol to measure in vivo oxidative stress. Taking advantage of fluorescent ROS probes and zebrafish transgenic fluorescent lines, we develop two different methods to measure oxidative stress in vivo: i) a “whole embryo ROS-detection method” for qualitative measurement of oxidative stress and ii) a “single-cell ROS detection method” for quantitative measurements of oxidative stress. Herein, we demonstrate the efficacy of these procedures by increasing oxidative stress in tissues by oxidant agents and physiological or genetic methods. This protocol is amenable for forward genetic screens and it will help address cause-effect relationships of ROS in animal models of oxidative stress-related pathologies such as neurological disorders and cancer.
Developmental Biology, Issue 89, Danio rerio, zebrafish embryos, endothelial cells, redox state analysis, oxidative stress detection, in vivo ROS measurements, FACS (fluorescence activated cell sorter), molecular probes
A Simple Behavioral Assay for Testing Visual Function in Xenopus laevis
Authors: Andrea S. Viczian, Michael E. Zuber.
Institutions: Center for Vision Research, SUNY Eye Institute, Upstate Medical University.
Measurement of visual function in tadpoles of the frog Xenopus laevis allows screening for blindness in live animals. The optokinetic response is a vision-based, reflexive behavior that has been observed in all vertebrates tested. Tadpole eyes are small, so the tail-flip response has been used as an alternative measure, which requires a trained technician to record the subtle response. We developed an alternative behavior assay based on the fact that tadpoles prefer to swim on the white side of a tank when placed in a tank with both black and white sides. The assay presented here is an inexpensive, simple alternative that creates a response that is easily measured. The setup consists of a tripod, webcam and nested testing tanks, readily available in most Xenopus laboratories. This article includes a movie showing the behavior of tadpoles, before and after severing the optic nerve. In order to test the function of one eye, we also include representative results of a tadpole in which each eye underwent retinal axotomy on consecutive days. Future studies could develop an automated version of this assay for testing the vision of many tadpoles at once.
Neuroscience, Issue 88, eye, retina, vision, color preference, Xenopus laevis, behavior, light, guidance, visual assay
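Because the assay above scores where tadpoles swim in a black/white tank, the outcome reduces to the fraction of observation time spent on the white side. A minimal sketch, with invented frame labels standing in for scored webcam video:

```python
# Minimal sketch: compute a white-side preference index from per-frame
# side classifications of a tadpole (e.g. scored from webcam video).
# The frame labels below are invented placeholders.
def preference_index(frame_sides):
    """frame_sides: iterable of 'white' or 'black', one entry per frame.
    Returns the fraction of time on the white side (0.5 = no preference)."""
    frames = list(frame_sides)
    return frames.count("white") / len(frames)

# Sighted tadpoles are expected to spend most of their time on the white
# side; blind tadpoles should score near 0.5 (chance).
sighted = ["white"] * 170 + ["black"] * 30
axotomized = ["white"] * 105 + ["black"] * 95
print("sighted:", preference_index(sighted))        # 0.85
print("axotomized:", preference_index(axotomized))  # 0.525
```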
Bladder Smooth Muscle Strip Contractility as a Method to Evaluate Lower Urinary Tract Pharmacology
Authors: F. Aura Kullmann, Stephanie L. Daugherty, William C. de Groat, Lori A. Birder.
Institutions: University of Pittsburgh School of Medicine.
We describe an in vitro method to measure bladder smooth muscle contractility, and its use for investigating physiological and pharmacological properties of the smooth muscle as well as changes induced by pathology. This method provides critical information for understanding bladder function while overcoming major methodological difficulties encountered in in vivo experiments, such as surgical and pharmacological manipulations that affect stability and survival of the preparations, the use of human tissue, and/or the use of expensive chemicals. It also provides a way to investigate the properties of each bladder component (i.e. smooth muscle, mucosa, nerves) in healthy and pathological conditions. The urinary bladder is removed from an anesthetized animal, placed in Krebs solution and cut into strips. Strips are placed into a chamber filled with warm Krebs solution. One end is attached to an isometric tension transducer to measure contraction force, the other end is attached to a fixed rod. Tissue is stimulated by directly adding compounds to the bath or by electric field stimulation electrodes that activate nerves, similar to triggering bladder contractions in vivo. We demonstrate the use of this method to evaluate spontaneous smooth muscle contractility during development and after an experimental spinal cord injury, the nature of neurotransmission (transmitters and receptors involved), factors involved in modulation of smooth muscle activity, the role of individual bladder components, and species and organ differences in response to pharmacological agents. Additionally, it could be used for investigating intracellular pathways involved in contraction and/or relaxation of the smooth muscle, drug structure-activity relationships and evaluation of transmitter release. The in vitro smooth muscle contractility method has been used extensively for over 50 years, and has provided data that significantly contributed to our understanding of bladder function as well as to pharmaceutical development of compounds currently used clinically for bladder management.
Medicine, Issue 90, Krebs, species differences, in vitro, smooth muscle contractility, neural stimulation
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco; Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and plays both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator-dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
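FMD in the protocol above is expressed as the percent change in brachial artery diameter from baseline to peak reactive hyperemia. A minimal worked example with invented diameters:

```python
# Minimal sketch: flow-mediated vasodilation (FMD) expressed as the percent
# diameter change from baseline to peak reactive hyperemia.
# The diameters below are invented example values in millimeters.
def fmd_percent(baseline_diameter_mm, peak_diameter_mm):
    return (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm * 100.0

baseline = 4.10   # mm, mean of baseline B-mode measurements
peak = 4.38       # mm, largest diameter during reactive hyperemia
print(f"FMD = {fmd_percent(baseline, peak):.1f}%")   # FMD = 6.8%
```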
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
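The SSM/PCA workflow described above (log transformation, removal of subject and regional means, PCA of the covariance structure, subject scores, and logistic-regression combination of components) can be sketched in a few lines. The code below uses random numbers in place of FDG PET data and standard scikit-learn routines; it is not the authors' in-house software.

```python
# Minimal sketch of a scaled-subprofile-model-style (SSM/PCA) analysis:
# log transform, double mean centering, PCA, subject scores, and a
# logistic-regression combination of components. Random numbers stand in
# for FDG PET data; this is not the authors' implementation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients, n_controls, n_voxels = 20, 20, 5000
data = rng.lognormal(mean=2.0, sigma=0.3, size=(n_patients + n_controls, n_voxels))
labels = np.array([1] * n_patients + [0] * n_controls)   # 1 = patient

# 1) Log transform and double centering (remove subject and voxel means).
log_data = np.log(data)
centered = log_data - log_data.mean(axis=1, keepdims=True)
centered -= centered.mean(axis=0, keepdims=True)

# 2) PCA: each component is a spatial covariance pattern (GIS);
#    subject scores are the projections onto it.
pca = PCA(n_components=5)
scores = pca.fit_transform(centered)          # (subjects x components)
patterns = pca.components_                    # (components x voxels)

# 3) Logistic regression on subject scores combines components into a
#    single disease-related composite pattern.
clf = LogisticRegression().fit(scores, labels)
composite_pattern = (clf.coef_ @ patterns).ravel()   # weighted sum of patterns
composite_scores = centered @ composite_pattern      # one score per subject
```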
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
An Experimental Paradigm for the Prediction of Post-Operative Pain (PPOP)
Authors: Ruth Landau, John C. Kraft, Lisa Y. Flint, Brendan Carvalho, Philippe Richebé, Monica Cardoso, Patricia Lavand'homme, Michal Granot, David Yarnitsky, Alex Cahana.
Institutions: University of Washington School of Medicine.
Many women undergo cesarean delivery without problems; however, some experience significant pain after cesarean section. Pain is associated with negative short-term and long-term effects on the mother. Prior to women undergoing surgery, can we predict who is at risk for developing significant postoperative pain and potentially prevent or minimize its negative consequences? These are the fundamental questions that a team from the University of Washington, Stanford University, the Catholic University in Brussels, Belgium, Santa Joana Women's Hospital in São Paulo, Brazil, and Rambam Medical Center in Israel is currently evaluating in an international research collaboration. The ultimate goal of this project is to provide optimal pain relief during and after cesarean section by offering individualized anesthetic care to women who appear to be more 'susceptible' to pain after surgery. A significant number of women experience moderate or severe acute post-partum pain after vaginal and cesarean deliveries.1 Furthermore, 10-15% of women suffer chronic persistent pain after cesarean section.2 With the constant increase in cesarean rates in the US3 and the already high rate in Brazil, this is bound to create a significant public health problem. When questioning women's fears and expectations from cesarean section, pain during and after it is their greatest concern.4 Individual variability in severity of pain after vaginal or operative delivery is influenced by multiple factors including sensitivity to pain, psychological factors, age, and genetics. The unique birth experience leads to unpredictable requirements for analgesics, from 'none at all' to 'very high' doses of pain medication. Pain after cesarean section is an excellent model to study post-operative pain because it is performed on otherwise young and healthy women. It is therefore recommended to attenuate pain during the acute phase, because untreated acute pain may lead to chronic pain disorders. The impact of developing persistent pain is immense, since it may impair not only the ability of women to care for their child in the immediate postpartum period, but also their own well-being for a long period of time. In a series of projects, an international research network is currently investigating the effect of pregnancy on pain modulation and ways to predict who will suffer acute severe pain and potentially chronic pain, by using simple pain tests and questionnaires in combination with genetic analysis. A relatively recent approach to investigate pain modulation is via the psychophysical measure of Diffuse Noxious Inhibitory Control (DNIC). This pain-modulating process is the neurophysiological basis for the well-known phenomenon of 'pain inhibits pain' from remote areas of the body. The DNIC paradigm has evolved recently into a clinical tool and simple test and has been shown to be a predictor of post-operative pain.5 Since pregnancy is associated with decreased pain sensitivity and/or enhanced processes of pain modulation, using tests that investigate pain modulation should provide a better understanding of the pathways involved with pregnancy-induced analgesia and may help predict pain outcomes during labor and delivery. For those women delivering by cesarean section, a DNIC test performed prior to surgery along with psychosocial questionnaires and genetic tests should enable one to identify women prone to suffer severe post-cesarean pain and persistent pain.
These clinical tests should allow anesthesiologists to offer not only personalized medicine to women with the promise to improve well-being and satisfaction, but also a reduction in the overall cost of perioperative and long term care due to pain and suffering. On a larger scale, these tests that explore pain modulation may become bedside screening tests to predict the development of pain disorders following surgery.
JoVE Medicine, Issue 35, diffuse noxious inhibitory control, DNIC, temporal summation, TS, psychophysical testing, endogenous analgesia, pain modulation, pregnancy-induced analgesia, cesarean section, post-operative pain, prediction
Using the optokinetic response to study visual function of zebrafish
Authors: Su-Qi Zou, Wu Yin, Ming-Jing Zhang, Chun-Rui Hu, Yu-Bin Huang, Bing Hu.
Institutions: University of Science and Technology of China (USTC).
The optokinetic response (OKR) is a behavior in which an animal moves its eyes to follow a rotating grating around it. It has been widely used to assess the visual functions of larval zebrafish1-5. Nevertheless, the standard protocol for larval fish is not yet readily applicable to adult zebrafish. Here, we introduce how to measure the OKR of adult zebrafish with our simple custom-built apparatus, using a new protocol established in our lab. Both our apparatus and the step-by-step OKR procedure for adult zebrafish are illustrated in this video. In addition, measurement of the larval OKR, as well as the optomotor response (OMR) test of adult zebrafish, is also demonstrated. This OKR assay of adult zebrafish may last for up to 4 hours. Applying the OKR test to adult fish will allow visual function to be investigated more efficiently when the adult visual system is manipulated. Su-Qi Zou and Wu Yin contributed equally to this paper.
Neuroscience, Issue 36, Zebrafish, OKR, OMR, behavior, optokinetic, vision
How to Create and Use Binocular Rivalry
Authors: David Carmel, Michael Arcaro, Sabine Kastner, Uri Hasson.
Institutions: New York University, Princeton University.
Each of our eyes normally sees a slightly different image of the world around us. The brain can combine these two images into a single coherent representation. However, when the eyes are presented with images that are sufficiently different from each other, an interesting thing happens: Rather than fusing the two images into a combined conscious percept, what transpires is a pattern of perceptual alternations where one image dominates awareness while the other is suppressed; dominance alternates between the two images, typically every few seconds. This perceptual phenomenon is known as binocular rivalry. Binocular rivalry is considered useful for studying perceptual selection and awareness in both human and animal models, because unchanging visual input to each eye leads to alternations in visual awareness and perception. To create a binocular rivalry stimulus, all that is necessary is to present each eye with a different image at the same perceived location. There are several ways of doing this, but newcomers to the field are often unsure which method would best suit their specific needs. The purpose of this article is to describe a number of inexpensive and straightforward ways to create and use binocular rivalry. We detail methods that do not require expensive specialized equipment and describe each method's advantages and disadvantages. The methods described include the use of red-blue goggles, mirror stereoscopes and prism goggles.
Neuroscience, Issue 45, Binocular rivalry, continuous flash suppression, vision, visual awareness, perceptual competition, unconscious processing, neuroimaging
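The simplest of the methods described above, red-blue goggles, carries each eye's image in a different color channel of a single picture. A minimal sketch that builds such an anaglyph rivalry stimulus from two orthogonal gratings (grating parameters are illustrative):

```python
# Minimal sketch: build a red-blue rivalry stimulus by placing one grating
# in the red channel and an orthogonal grating in the blue channel. Viewed
# through red-blue goggles, each eye receives a different image.
# Orientation and spatial-frequency values are illustrative.
import numpy as np

size, cycles = 256, 8
ramp = np.linspace(0, 2 * np.pi * cycles, size)
vertical_grating   = (np.sin(ramp)[None, :] * np.ones((size, 1)) + 1) / 2
horizontal_grating = (np.sin(ramp)[:, None] * np.ones((1, size)) + 1) / 2

rivalry_stimulus = np.zeros((size, size, 3))
rivalry_stimulus[..., 0] = vertical_grating    # red channel  -> one eye
rivalry_stimulus[..., 2] = horizontal_grating  # blue channel -> other eye

# Save the stimulus as an image, e.g. with matplotlib if available:
# import matplotlib.pyplot as plt; plt.imsave("rivalry.png", rivalry_stimulus)
```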
Membrane Potentials, Synaptic Responses, Neuronal Circuitry, Neuromodulation and Muscle Histology Using the Crayfish: Student Laboratory Exercises
Authors: Brittany Baierlein, Alison L. Thurow, Harold L. Atwood, Robin L. Cooper.
Institutions: University of Kentucky, University of Toronto.
The purpose of this report is to help develop an understanding of the effects caused by ion gradients across a biological membrane. Two aspects that influence a cell's membrane potential and which we address in these experiments are: (1) Ion concentration of K+ on the outside of the membrane, and (2) the permeability of the membrane to specific ions. The crayfish abdominal extensor muscles are in groupings with some being tonic (slow) and others phasic (fast) in their biochemical and physiological phenotypes, as well as in their structure; the motor neurons that innervate these muscles are correspondingly different in functional characteristics. We use these muscles as well as the superficial, tonic abdominal flexor muscle to demonstrate properties in synaptic transmission. In addition, we introduce a sensory-CNS-motor neuron-muscle circuit to demonstrate the effect of cuticular sensory stimulation as well as the influence of neuromodulators on certain aspects of the circuit. With the techniques obtained in this exercise, one can begin to answer many questions remaining in other experimental preparations as well as in physiological applications related to medicine and health. We have demonstrated the usefulness of model invertebrate preparations to address fundamental questions pertinent to all animals.
Neuroscience, Issue 47, Invertebrate, Crayfish, neurophysiology, muscle, anatomy, electrophysiology
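The exercises above examine how external K+ concentration and relative ionic permeabilities set the membrane potential. The relevant relations are the Nernst and Goldman-Hodgkin-Katz equations; the sketch below evaluates both, using illustrative concentrations and permeability ratios rather than measured crayfish values.

```python
# Minimal sketch: Nernst and Goldman-Hodgkin-Katz (GHK) potentials, which
# underlie the K+-concentration and permeability manipulations described
# above. Concentrations (mM) and permeability ratios are illustrative, not
# measurements from the crayfish preparation.
import math

R, F = 8.314, 96485.0          # J/(mol*K), C/mol
T = 293.15                     # 20 °C in kelvin

def nernst(z, conc_out, conc_in):
    """Equilibrium potential (volts) for an ion of valence z."""
    return (R * T) / (z * F) * math.log(conc_out / conc_in)

def ghk(pK, pNa, pCl, K_o, K_i, Na_o, Na_i, Cl_o, Cl_i):
    """GHK voltage equation (volts); Cl- terms are inverted (valence -1)."""
    num = pK * K_o + pNa * Na_o + pCl * Cl_i
    den = pK * K_i + pNa * Na_i + pCl * Cl_o
    return (R * T) / F * math.log(num / den)

# Effect of raising external K+ on E_K and on the resting potential Vm:
for K_out in (5.4, 10, 20, 54):
    E_K = nernst(+1, K_out, 140.0) * 1000                        # mV
    Vm = ghk(1.0, 0.04, 0.45, K_out, 140, 145, 12, 120, 4) * 1000
    print(f"[K+]o = {K_out:5.1f} mM   E_K = {E_K:6.1f} mV   Vm = {Vm:6.1f} mV")
```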
Experimental Manipulation of Body Size to Estimate Morphological Scaling Relationships in Drosophila
Authors: R. Craig Stillwell, Ian Dworkin, Alexander W. Shingleton, W. Anthony Frankino.
Institutions: University of Houston, Michigan State University.
The scaling of body parts is a central feature of animal morphology1-7. Within species, morphological traits need to be correctly proportioned to the body for the organism to function; larger individuals typically have larger body parts and smaller individuals generally have smaller body parts, such that overall body shape is maintained across a range of adult body sizes. The requirement for correct proportions means that individuals within species usually exhibit low variation in relative trait size. In contrast, relative trait size can vary dramatically among species and is a primary mechanism by which morphological diversity is produced. Over a century of comparative work has established these intra- and interspecific patterns3,4. Perhaps the most widely used approach to describe this variation is to calculate the scaling relationship between the size of two morphological traits using the allometric equation y = bx^α, where x and y are the size of the two traits, such as organ and body size8,9. This equation describes the within-group (e.g., species, population) scaling relationship between two traits as both vary in size. Log-transformation of this equation produces a simple linear equation, log(y) = log(b) + α·log(x), and log-log plots of the size of different traits among individuals of the same species typically reveal linear scaling with an intercept of log(b) and a slope of α, called the 'allometric coefficient'9,10. Morphological variation among groups is described by differences in scaling relationship intercepts or slopes for a given trait pair. Consequently, variation in the parameters of the allometric equation (b and α) elegantly describes the shape variation captured in the relationship between organ and body size within and among biological groups (see 11,12). Not all traits scale linearly with each other or with body size (e.g., 13,14). Hence, morphological scaling relationships are most informative when the data are taken from the full range of trait sizes. Here we describe how simple experimental manipulation of diet can be used to produce the full range of body size in insects. This permits an estimation of the full scaling relationship for any given pair of traits, allowing a complete description of how shape covaries with size and a robust comparison of scaling relationship parameters among biological groups. Although we focus on Drosophila, our methodology should be applicable to nearly any fully metamorphic insect.
Developmental Biology, Issue 56, Drosophila, allometry, morphology, body size, scaling, insect
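Because the allometric equation above becomes linear after log transformation, the allometric coefficient α and intercept log(b) can be estimated by regressing log trait size on log body size. A minimal sketch with simulated measurements (ordinary least squares is used here for simplicity; allometry studies sometimes prefer other line-fitting methods):

```python
# Minimal sketch: estimate the allometric coefficient (slope alpha) and
# intercept log(b) from log-log regression of trait size on body size,
# following log(y) = log(b) + alpha*log(x). Simulated measurements only.
import numpy as np

rng = np.random.default_rng(1)
body_size = rng.uniform(0.4, 1.6, size=60)             # e.g. body mass (mg)
true_alpha, true_b = 0.8, 1.2
trait_size = true_b * body_size**true_alpha * rng.lognormal(0, 0.05, 60)

log_x, log_y = np.log(body_size), np.log(trait_size)
alpha, log_b = np.polyfit(log_x, log_y, 1)              # slope, intercept
print(f"estimated alpha = {alpha:.2f}, b = {np.exp(log_b):.2f}")
```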
Multimodal Imaging of Stem Cell Implantation in the Central Nervous System of Mice
Authors: Nathalie De Vocht, Kristien Reekmans, Irene Bergwerf, Jelle Praet, Chloé Hoornaert, Debbie Le Blon, Jasmijn Daans, Zwi Berneman, Annemie Van der Linden, Peter Ponsaerts.
Institutions: University of Antwerp.
During the past decade, stem cell transplantation has gained increasing interest as primary or secondary therapeutic modality for a variety of diseases, both in preclinical and clinical studies. However, to date results regarding functional outcome and/or tissue regeneration following stem cell transplantation are quite diverse. Generally, a clinical benefit is observed without profound understanding of the underlying mechanism(s)1. Therefore, multiple efforts have led to the development of different molecular imaging modalities to monitor stem cell grafting with the ultimate aim to accurately evaluate survival, fate and physiology of grafted stem cells and/or their micro-environment. Changes observed in one or more parameters determined by molecular imaging might be related to the observed clinical effect. In this context, our studies focus on the combined use of bioluminescence imaging (BLI), magnetic resonance imaging (MRI) and histological analysis to evaluate stem cell grafting. BLI is commonly used to non-invasively perform cell tracking and monitor cell survival in time following transplantation2-7, based on a biochemical reaction where cells expressing the Luciferase-reporter gene are able to emit light following interaction with its substrate (e.g. D-luciferin)8, 9. MRI on the other hand is a non-invasive technique which is clinically applicable10 and can be used to precisely locate cellular grafts with very high resolution11-15, although its sensitivity highly depends on the contrast generated after cell labeling with an MRI contrast agent. Finally, post-mortem histological analysis is the method of choice to validate research results obtained with non-invasive techniques with highest resolution and sensitivity. Moreover end-point histological analysis allows us to perform detailed phenotypic analysis of grafted cells and/or the surrounding tissue, based on the use of fluorescent reporter proteins and/or direct cell labeling with specific antibodies. In summary, we here visually demonstrate the complementarities of BLI, MRI and histology to unravel different stem cell- and/or environment-associated characteristics following stem cell grafting in the CNS of mice. As an example, bone marrow-derived stromal cells, genetically engineered to express the enhanced Green Fluorescent Protein (eGFP) and firefly Luciferase (fLuc), and labeled with blue fluorescent micron-sized iron oxide particles (MPIOs), will be grafted in the CNS of immune-competent mice and outcome will be monitored by BLI, MRI and histology (Figure 1).
Neuroscience, Issue 64, Stem cell biology, Cell labeling, Cell Transplantation, Brain, Bioluminescence Imaging, Magnetic Resonance Imaging, Histology
The Measurement and Treatment of Suppression in Amblyopia
Authors: Joanna M. Black, Robert F. Hess, Jeremy R. Cooperstock, Long To, Benjamin Thompson.
Institutions: University of Auckland, McGill University.
Amblyopia, a developmental disorder of the visual cortex, is one of the leading causes of visual dysfunction in the working-age population. Current estimates put the prevalence of amblyopia at approximately 1-3%1-3, the majority of cases being monocular2. Amblyopia is most frequently caused by ocular misalignment (strabismus), blur induced by unequal refractive error (anisometropia), and in some cases by form deprivation. Although amblyopia is initially caused by abnormal visual input in infancy, once established, the visual deficit often remains when normal visual input has been restored using surgery and/or refractive correction. This is because amblyopia is the result of abnormal visual cortex development rather than a problem with the amblyopic eye itself4,5. Amblyopia is characterized by both monocular and binocular deficits6,7, which include impaired visual acuity and poor or absent stereopsis, respectively. The visual dysfunction in amblyopia is often associated with a strong suppression of the inputs from the amblyopic eye under binocular viewing conditions8. Recent work has indicated that suppression may play a central role in both the monocular and binocular deficits associated with amblyopia9,10. Current clinical tests for suppression tend to verify the presence or absence of suppression rather than giving a quantitative measurement of the degree of suppression. Here we describe a technique for measuring amblyopic suppression with a compact, portable device11,12. The device consists of a laptop computer connected to a pair of virtual reality goggles. The novelty of the technique lies in the way we present visual stimuli to measure suppression. Stimuli are shown to the amblyopic eye at high contrast while the contrast of the stimuli shown to the non-amblyopic eye is varied. Patients perform a simple signal/noise task that allows for a precise measurement of the strength of excitatory binocular interactions. The contrast offset at which neither eye has a performance advantage is a measure of the "balance point" and is a direct measure of suppression. This technique has been validated psychophysically both in control13,14 and patient6,9,11 populations. In addition to measuring suppression, this technique also forms the basis of a novel form of treatment to decrease suppression over time and improve binocular and often monocular function in adult patients with amblyopia12,15,16. This new treatment approach can be deployed either on the goggle system described above or on a specially modified iPod touch device15.
Medicine, Issue 70, Ophthalmology, Neuroscience, Anatomy, Physiology, Amblyopia, suppression, visual cortex, binocular vision, plasticity, strabismus, anisometropia
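Suppression in the technique above is quantified as the interocular contrast offset at which neither eye has a performance advantage. The sketch below finds that balance point by locating the crossing of two performance-versus-contrast curves; the accuracy values are invented and the linear interpolation is an assumption, not the authors' fitting procedure.

```python
# Minimal sketch: estimate the "balance point" -- the fellow-eye contrast at
# which performance is the same whichever eye receives the signal. The
# accuracy values are invented placeholders; linear interpolation of the
# crossing point is an illustrative assumption.
import numpy as np

fellow_eye_contrast = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
acc_signal_in_amblyopic = np.array([0.90, 0.85, 0.74, 0.65, 0.58, 0.52])
acc_signal_in_fellow    = np.array([0.55, 0.62, 0.73, 0.82, 0.88, 0.92])

# Find where the two curves cross (difference changes sign) and interpolate.
diff = acc_signal_in_amblyopic - acc_signal_in_fellow
i = np.where(np.diff(np.sign(diff)) != 0)[0][0]
frac = diff[i] / (diff[i] - diff[i + 1])
balance_point = fellow_eye_contrast[i] + frac * (
    fellow_eye_contrast[i + 1] - fellow_eye_contrast[i])
print(f"balance point ≈ {balance_point:.2f} fellow-eye contrast")
```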
Multifocal Electroretinograms
Authors: Donnell J. Creel.
Institutions: University of Utah.
A limitation of traditional full-field electroretinograms (ERG) for the diagnosis of retinopathy is lack of sensitivity. Generally, ERG results are normal unless more than approximately 20% of the retina is affected. In practical terms, a patient might be legally blind as a result of macular degeneration or other scotomas and still appear normal according to traditional full-field ERG. An important development in ERGs is the multifocal ERG (mfERG). Erich Sutter adapted the mathematical sequences called binary m-sequences, enabling the isolation, from a single electrical signal, of electroretinograms each representing less than a square millimeter of retina in response to a visual stimulus1. Results generated by mfERG appear similar to those generated by flash ERG; in contrast, flash ERG best generates data appropriate for whole-eye disorders. The basic mfERG result is based on the calculated mathematical average of an approximation of the positive deflection component of the traditional ERG response, known as the b-wave1. Multifocal ERG programs measure electrical activity from more than a hundred retinal areas per eye in a few minutes. The enhanced spatial resolution enables scotomas and retinal dysfunction to be mapped and quantified. In the protocol below, we describe the recording of mfERGs using a bipolar speculum contact lens. Components of mfERG systems vary between manufacturers. For the presentation of the visual stimulus, some suitable CRT monitors are available, but most systems have adopted flat-panel liquid crystal displays (LCD). The visual stimuli depicted here were produced by an LCD microdisplay subtending 35 - 40 degrees horizontally and 30 - 35 degrees vertically of visual field, and calibrated to produce multifocal flash intensities of 2.7 cd·s·m^-2. Amplification was 50K. Lower and upper bandpass limits were 10 and 300 Hz. The software packages used were VERIS versions 5 and 6.
Medicine, Issue 58, Multifocal electroretinogram, mfERG, electroretinogram, ERG
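mfERG stimulation is driven by binary m-sequences, which can be generated with a linear-feedback shift register (LFSR). The sketch below produces one full period of a maximal-length sequence; the register length and tap positions are a standard textbook choice, not those of VERIS or any other commercial system.

```python
# Minimal sketch: generate a binary m-sequence with a linear-feedback shift
# register (LFSR). mfERG systems use such sequences to drive the independent
# flashing of each stimulus element. The register length and tap positions
# below correspond to a standard primitive polynomial of degree 5, chosen
# only for illustration.
def m_sequence(n=5, taps=(5, 3)):
    """Return one full period (2^n - 1 bits) of an m-sequence."""
    state = [1] * n                      # any nonzero initial state works
    seq = []
    for _ in range(2**n - 1):
        seq.append(state[-1])            # output the oldest bit
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]     # XOR of the tapped stages
        state = [feedback] + state[:-1]  # shift the register
    return seq

seq = m_sequence()
print(len(seq), "bits; ones:", sum(seq), "zeros:", len(seq) - sum(seq))
# A maximal-length sequence of length 31 contains 16 ones and 15 zeros.
```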
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed with a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Genetic Studies of Human DNA Repair Proteins Using Yeast as a Model System
Authors: Monika Aggarwal, Robert M. Brosh Jr..
Institutions: National Institute on Aging, NIH.
Understanding the roles of human DNA repair proteins in genetic pathways is a formidable challenge to many researchers. Genetic studies in mammalian systems have been limited due to the lack of readily available tools, including defined mutant genetic cell lines, regulatory expression systems, and appropriate selectable markers. To circumvent these difficulties, model genetic systems in lower eukaryotes have become an attractive choice for the study of functionally conserved DNA repair proteins and pathways. We have developed a model yeast system to study the poorly defined genetic functions of the Werner syndrome helicase-nuclease (WRN) in nucleic acid metabolism. Cellular phenotypes associated with defined genetic mutant backgrounds can be investigated to clarify the cellular and molecular functions of WRN through its catalytic activities and protein interactions. The human WRN gene and associated variants, cloned into DNA plasmids for expression in yeast, can be placed under the control of a regulatory plasmid element. The expression construct can then be transformed into the appropriate yeast mutant background, and genetic function assayed by a variety of methodologies. Using this approach, we determined that WRN, like its related RecQ family members BLM and Sgs1, operates in a Top3-dependent pathway that is likely to be important for genomic stability. This is described in our recent publication [1]. Detailed methods of specific assays for genetic complementation studies in yeast are provided in this paper.
Microbiology, Issue 37, Werner syndrome, helicase, topoisomerase, RecQ, Bloom's syndrome, Sgs1, genomic instability, genetics, DNA repair, yeast
Phase Contrast and Differential Interference Contrast (DIC) Microscopy
Authors: Victoria Centonze Frohlich.
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
Phase-contrast microscopy is often used to produce contrast for transparent, non-light-absorbing biological specimens. The technique was discovered in 1942 by Zernike, who received the Nobel Prize for his achievement. DIC microscopy, introduced in the late 1960s, has been popular in biomedical research because it highlights edges of specimen structural detail, provides high-resolution optical sections of thick specimens, including tissue cells, eggs, and embryos, and does not suffer from the phase halos typical of phase-contrast images. This protocol highlights the principles and practical applications of these microscopy techniques.
Basic protocols, Issue 18, Current Protocols Wiley, Microscopy, Phase Contrast, Differential Interference Contrast

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
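The kind of abstract-to-video matching described above can be illustrated with a simple text-similarity baseline: TF-IDF vectors compared by cosine similarity. The sketch below is only an illustration of the general idea, not JoVE's actual matching algorithm; the video titles and abstract text are made up.

```python
# Minimal sketch of abstract-to-video matching by text similarity:
# TF-IDF vectors plus cosine similarity. Illustrative only; this is not
# JoVE's actual matching algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = [
    "Measuring the optokinetic response to assess visual function in zebrafish",
    "Quantitative real-time PCR of T-cell receptor excision circles",
    "Ultrasound assessment of flow-mediated vasodilation of the brachial artery",
]
pubmed_abstract = ("Factors influencing the contrast sensitivity function "
                   "in healthy myopic eyes were evaluated.")

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(video_descriptions + [pubmed_abstract])

n = len(video_descriptions)
scores = cosine_similarity(matrix[n], matrix[:n]).ravel()

# Rank videos by similarity to the abstract (highest first).
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.3f}  {video_descriptions[idx]}")
```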