JoVE Visualize

PubMed Article
Refractive Error, Visual Acuity and Causes of Vision Loss in Children in Shandong, China. The Shandong Children Eye Study.
PLoS ONE
PUBLISHED: 01-01-2013
To examine the prevalence of refractive errors and prevalence and causes of vision loss among preschool and school children in East China.
Related JoVE Video
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Published: 03-30-2014
ABSTRACT
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
20 Related JoVE Articles!
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
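The keywords above mention minimum-norm estimation. Below is a minimal numpy sketch of a regularized L2 minimum-norm inverse, assuming a precomputed lead-field (gain) matrix and whitened sensor data; the matrix sizes, SNR value, and random data are illustrative and are not taken from the article.

```python
import numpy as np

def minimum_norm_estimate(leadfield, data, snr=3.0):
    """L2 minimum-norm inverse: s_hat = L^T (L L^T + lambda^2 I)^-1 m.

    leadfield : (n_channels, n_sources) gain matrix (illustrative)
    data      : (n_channels, n_times) whitened sensor measurements
    snr       : assumed signal-to-noise ratio used to set regularization
    """
    n_channels = leadfield.shape[0]
    gram = leadfield @ leadfield.T                 # (n_channels, n_channels)
    lam2 = 1.0 / snr ** 2
    reg = lam2 * np.trace(gram) / n_channels       # scale regularizer to the data
    inverse_operator = leadfield.T @ np.linalg.inv(gram + reg * np.eye(n_channels))
    return inverse_operator @ data                 # (n_sources, n_times) source estimates

# Toy example with random numbers (purely illustrative).
rng = np.random.default_rng(0)
L = rng.standard_normal((128, 5000))   # 128 channels, 5000 cortical sources
m = rng.standard_normal((128, 250))    # 250 time samples
print(minimum_norm_estimate(L, m).shape)   # (5000, 250)
```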
The Measurement and Treatment of Suppression in Amblyopia
Authors: Joanna M. Black, Robert F. Hess, Jeremy R. Cooperstock, Long To, Benjamin Thompson.
Institutions: University of Auckland, McGill University.
Amblyopia, a developmental disorder of the visual cortex, is one of the leading causes of visual dysfunction in the working age population. Current estimates put the prevalence of amblyopia at approximately 1-3%1-3, the majority of cases being monocular2. Amblyopia is most frequently caused by ocular misalignment (strabismus), blur induced by unequal refractive error (anisometropia), and in some cases by form deprivation. Although amblyopia is initially caused by abnormal visual input in infancy, once established, the visual deficit often remains when normal visual input has been restored using surgery and/or refractive correction. This is because amblyopia is the result of abnormal visual cortex development rather than a problem with the amblyopic eye itself4,5 . Amblyopia is characterized by both monocular and binocular deficits6,7 which include impaired visual acuity and poor or absent stereopsis respectively. The visual dysfunction in amblyopia is often associated with a strong suppression of the inputs from the amblyopic eye under binocular viewing conditions8. Recent work has indicated that suppression may play a central role in both the monocular and binocular deficits associated with amblyopia9,10 . Current clinical tests for suppression tend to verify the presence or absence of suppression rather than giving a quantitative measurement of the degree of suppression. Here we describe a technique for measuring amblyopic suppression with a compact, portable device11,12 . The device consists of a laptop computer connected to a pair of virtual reality goggles. The novelty of the technique lies in the way we present visual stimuli to measure suppression. Stimuli are shown to the amblyopic eye at high contrast while the contrast of the stimuli shown to the non-amblyopic eye are varied. Patients perform a simple signal/noise task that allows for a precise measurement of the strength of excitatory binocular interactions. The contrast offset at which neither eye has a performance advantage is a measure of the "balance point" and is a direct measure of suppression. This technique has been validated psychophysically both in control13,14 and patient6,9,11 populations. In addition to measuring suppression this technique also forms the basis of a novel form of treatment to decrease suppression over time and improve binocular and often monocular function in adult patients with amblyopia12,15,16 . This new treatment approach can be deployed either on the goggle system described above or on a specially modified iPod touch device15.
Medicine, Issue 70, Ophthalmology, Neuroscience, Anatomy, Physiology, Amblyopia, suppression, visual cortex, binocular vision, plasticity, strabismus, anisometropia
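As a rough illustration of the "balance point" described above (the interocular contrast offset at which neither eye has a performance advantage), the following numpy sketch interpolates where the amblyopic-eye advantage crosses zero. The contrast levels and accuracies are invented numbers, not data from the article.

```python
import numpy as np

# Contrast shown to the non-amblyopic (fellow) eye; the amblyopic eye always sees 100%.
fellow_eye_contrast = np.array([10, 20, 40, 60, 80, 100])   # percent (illustrative)

# Proportion correct when the signal dots go to the amblyopic eye and the noise dots
# to the fellow eye, and vice versa (made-up data).
acc_signal_in_amblyopic = np.array([0.92, 0.85, 0.74, 0.63, 0.55, 0.48])
acc_signal_in_fellow    = np.array([0.35, 0.45, 0.58, 0.69, 0.78, 0.86])

# The balance point is the fellow-eye contrast at which the two curves meet,
# i.e. where the performance difference changes sign.
diff = acc_signal_in_amblyopic - acc_signal_in_fellow
balance_point = np.interp(0.0, diff[::-1], fellow_eye_contrast[::-1])  # diff is decreasing
print(f"Estimated balance point: {balance_point:.1f}% fellow-eye contrast")
```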
Measurement Of Neuromagnetic Brain Function In Pre-school Children With Custom Sized MEG
Authors: Graciela Tesan, Blake W. Johnson, Melanie Reid, Rosalind Thornton, Stephen Crain.
Institutions: Macquarie University.
Magnetoencephalography is a technique that detects magnetic fields associated with cortical activity [1]. The electrophysiological activity of the brain generates electric fields, which can be recorded using electroencephalography (EEG), and their concomitant magnetic fields, which are detected by MEG. MEG signals are detected by specialized sensors known as superconducting quantum interference devices (SQUIDs). Superconducting sensors require cooling with liquid helium at -270 °C. They are contained inside a vacuum-insulated helmet called a dewar, which is filled with liquid helium. SQUIDs are placed in fixed positions inside the helmet dewar in the helium coolant, and a subject's head is placed inside the helmet dewar for MEG measurements. The helmet dewar must be sized to satisfy opposing constraints. Clearly, it must be large enough to fit most or all of the heads in the population that will be studied. However, the helmet must also be small enough to keep most of the SQUID sensors within range of the tiny cerebral fields that they are to measure. Conventional whole-head MEG systems are designed to accommodate more than 90% of adult heads. However, adult systems are not well suited for measuring brain function in pre-school children, whose heads have a radius several cm smaller than those of adults. The KIT-Macquarie Brain Research Laboratory at Macquarie University uses a MEG system custom sized to fit the heads of pre-school children. This child system has 64 first-order axial gradiometers with a 50 mm baseline [2] and is contained inside a magnetically-shielded room (MSR) together with a conventional adult-sized MEG system [3,4]. There are three main advantages of the customized helmet dewar for studying children. First, the smaller radius of the sensor configuration brings the SQUID sensors into range of the neuromagnetic signals of children's heads. Second, the smaller helmet allows full insertion of a child's head into the dewar. Full insertion is prevented in adult dewar helmets because of the smaller crown-to-shoulder distance in children. These two factors are fundamental in recording brain activity using MEG because neuromagnetic signals attenuate rapidly with distance. Third, the customized child helmet aids in the symmetric positioning of the head and limits the freedom of movement of the child's head within the dewar. When used with a protocol that aligns the requirements of data collection with the motivational and behavioral capacities of children, these features significantly facilitate setup, positioning, and measurement of MEG signals.
Neuroscience, Issue 36, Magnetoencephalography, Pediatrics, Brain Mapping, Language, Brain Development, Cognitive Neuroscience, Language Acquisition, Linguistics
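The abstract stresses that neuromagnetic signals attenuate rapidly with distance, which is why a few centimeters of extra sensor-to-scalp separation matters. As a back-of-the-envelope illustration only, assuming an inverse-square falloff (a simplification, not a full MEG forward model), the sketch below compares relative signal strength for a child-sized and an adult-sized helmet gap; both distances are hypothetical.

```python
# Rough illustration of signal attenuation with sensor-to-source distance.
# The inverse-square falloff is a simplification, not a full MEG forward model.
def relative_signal(distance_cm, reference_cm):
    return (reference_cm / distance_cm) ** 2

child_helmet_gap = 2.0   # cm from cortical source to sensor (hypothetical)
adult_helmet_gap = 5.0   # cm when a small head sits in an adult-sized dewar (hypothetical)

gain = relative_signal(child_helmet_gap, adult_helmet_gap)
print(f"Signal is roughly {gain:.1f}x stronger with the closer, child-sized sensor array")
# -> roughly 6.2x under these illustrative assumptions
```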
Clinical Examination Protocol to Detect Atypical and Classical Scrapie in Sheep
Authors: Timm Konold, Laura Phelan.
Institutions: Animal Health and Veterinary Laboratories Agency Weybridge.
The diagnosis of scrapie, a transmissible spongiform encephalopathy (TSE) of sheep and goats, is currently based on the detection of disease-associated prion protein by post mortem tests. Unless a random sample of the sheep or goat population is actively monitored for scrapie, identification of scrapie cases relies on the reporting of clinical suspects, which is dependent on the individual's familiarity with the disease and ability to recognize clinical signs associated with scrapie. Scrapie may not be considered in the differential diagnosis of neurological diseases in small ruminants, particularly in countries with low scrapie prevalence, or may not be recognized if it presents as a nonpruritic form, such as atypical scrapie. To aid in the identification of clinical suspects, a short examination protocol is presented to assess the display of specific clinical signs associated with pruritic and nonpruritic forms of TSEs in sheep, which could also be applied to goats. This includes assessment of behavior, vision (by testing of the menace response), pruritus (by testing the response to scratching), and movement (with and without blindfolding). This may lead to a more detailed neurologic examination of animals reported as scrapie suspects. It could also be used in experimental TSE studies of sheep or goats to evaluate disease progression or to identify the clinical endpoint.
Infectious Diseases, Issue 83, transmissible spongiform encephalopathy, sheep, atypical scrapie, classical scrapie, neurologic examination, scratch test, menace response, blindfolding
Obtaining Highly Purified Toxoplasma gondii Oocysts by a Discontinuous Cesium Chloride Gradient
Authors: Sarah E. Staggs, Mary Jean See, J P. Dubey, Eric N. Villegas.
Institutions: Dynamac, Inc., University of Cincinnati, McMicken College of Arts and Science, Agricultural Research Service, U.S. Department of Agriculture, US Environmental Protection Agency.
Toxoplasma gondii is an obligate intracellular protozoan pathogen that commonly infects humans. It is a well characterized apicomplexan associated with causing food- and water-borne disease outbreaks. The definitive host is the feline species, in which sexual replication occurs, resulting in the development of the highly infectious and environmentally resistant oocyst. Infection occurs via ingestion of tissue cysts from contaminated meat or oocysts from soil or water. Infection is typically asymptomatic in healthy individuals, but results in a life-long latent infection that can reactivate, causing toxoplasmic encephalitis and death if the individual becomes immunocompromised. Meat contaminated with T. gondii cysts has been the primary source of infection in Europe and the United States, but recent changes in animal management and husbandry practices and improved food handling and processing procedures have significantly reduced the prevalence of T. gondii cysts in meat1, 2. Nonetheless, seroprevalence in humans remains relatively high, suggesting that exposure from oocyst-contaminated soil or water is likely. Indeed, waterborne outbreaks of toxoplasmosis have been reported worldwide, supporting the theory that exposure to the environmental oocyst form poses a significant health risk3-5. To date, research on understanding the prevalence of T. gondii oocysts in water and the environment is limited due to the lack of tools to detect oocysts in the environment 5, 6. This is primarily due to the lack of efficient purification protocols for obtaining large numbers of highly purified T. gondii oocysts from infected cats for research purposes. This study describes the development of a modified CsCl method that easily purifies T. gondii oocysts, suitable for molecular biological and tissue culture manipulation, from the feces of infected cats7.
Jove Infectious Diseases, Microbiology, Issue 33, Toxoplasma gondii, cesium chloride, oocysts, discontinuous gradient, apicomplexan
Recognition of Epidermal Transglutaminase by IgA and Tissue Transglutaminase 2 Antibodies in a Rare Case of Rhesus Dermatitis
Authors: Karol Sestak, Kaushiki Mazumdar, Cecily C. Midkiff, Jason Dufour, Juan T. Borda, Xavier Alvarez.
Institutions: Tulane National Primate Research Center.
Tissue transglutaminase 2 (tTG2) is an intestinal digestive enzyme which deamidates partially digested dietary gluten, e.g., gliadin peptides. In genetically predisposed individuals, tTG2 triggers autoimmune responses that are characterized by the production of tTG2 antibodies and their direct deposition into the small intestinal wall 1,2. The presence of such antibodies constitutes one of the major hallmarks of celiac disease (CD). Epidermal transglutaminase (eTG) is another member of the transglutaminase family that can also function as an autoantigen in a small minority of CD patients. In these relatively rare cases, eTG triggers an autoimmune reaction (a skin rash) clinically known as dermatitis herpetiformis (DH). Although the exact mechanism of CD and DH pathogenesis is not well understood, it is known that tTG2 and eTG share antigenic epitopes that can be recognized by serum antibodies from both CD and DH patients 3,4. In this study, confocal microscopy examination of biopsy samples from skin lesions of two rhesus macaques (Macaca mulatta) with dermatitis (Table 1, Fig. 1 and 2) was used to study the affected tissues. In one animal (EM96), a spectral overlap of IgA and tTG2 antibodies (Fig. 3) was demonstrated. Double-positive tTG2+IgA+ cells were concentrated in the deep epidermis, around the dermal papillae. This is consistent with lesions described in DH patients 3. When EM96 was placed on a gluten-free diet, the dermatitis as well as the tTG2+IgA+ deposits disappeared and were no longer detectable (Figs. 1-3). Dermatitis reappeared, however, upon re-introduction of dietary gluten in EM96 (not shown). In other macaques, including an animal with unrelated dermatitis, tTG2+IgA+ deposits were not detected. The gluten-free diet-dependent remission of dermatitis in EM96, together with the presence of tTG2+IgA+ cells in its skin, suggests an autoimmune, DH-like mechanism for the development of this condition. This is the first report of DH-like dermatitis in any non-human primate.
Immunology, Issue 58, Gluten sensitivity, transglutaminase, autoimmunity, dermatitis, confocal microscopy, skin, rhesus monkey, Macaca mulatta
The Optokinetic Response as a Quantitative Measure of Visual Acuity in Zebrafish
Authors: Donald Joshua Cameron, Faydim Rassamdana, Peony Tam, Kathleen Dang, Carolina Yanez, Saman Ghaemmaghami, Mahsa Iranpour Dehkordi.
Institutions: Western University of Health Sciences.
Zebrafish are a proven model for vision research; however, many of the earlier methods focused on larval fish or demonstrated a simple response. More recently, adult visual behavior in zebrafish has become of interest, but methods to measure specific responses are only now emerging. To address this gap, we set out to develop a methodology to repeatedly and accurately utilize the optokinetic response (OKR) to measure visual acuity in adult zebrafish. Here we show that the adult zebrafish's visual acuity can be measured, including both binocular and monocular acuities. Because the fish is not harmed during the procedure, visual acuity can be measured and compared over short or long periods of time. The visual acuity measurements described here can also be done quickly, allowing for high throughput and for additional visual procedures if desired. This type of analysis is conducive to drug intervention studies or investigations of disease progression.
Neuroscience, Issue 80, Zebrafish, Eye Movements, Visual Acuity, optokinetic, behavior, adult
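OKR-based visual acuity is typically reported as the finest grating, in cycles per degree, that still drives tracking eye movements. Below is a minimal sketch of that conversion, assuming a grating of known stripe width viewed at a known distance; the numbers are hypothetical and do not describe the authors' apparatus.

```python
import math

def cycles_per_degree(stripe_width_mm, viewing_distance_mm):
    """Spatial frequency of a square-wave grating as seen by the fish.

    One cycle = one dark + one light stripe. The angle subtended by one cycle
    is computed from the stripe width and the eye-to-grating distance.
    """
    cycle_width = 2 * stripe_width_mm
    cycle_angle_deg = 2 * math.degrees(math.atan(cycle_width / (2 * viewing_distance_mm)))
    return 1.0 / cycle_angle_deg

# Hypothetical example: 2 mm stripes on a drum 50 mm from the fish's eye.
print(f"{cycles_per_degree(2.0, 50.0):.2f} cycles/degree")
```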
Eye Tracking Young Children with Autism
Authors: Noah J. Sasson, Jed T. Elison.
Institutions: University of Texas at Dallas, University of North Carolina at Chapel Hill.
The rise of accessible commercial eye-tracking systems has fueled a rapid increase in their use in psychological and psychiatric research. By providing a direct, detailed and objective measure of gaze behavior, eye-tracking has become a valuable tool for examining abnormal perceptual strategies in clinical populations and has been used to identify disorder-specific characteristics1, promote early identification2, and inform treatment3. In particular, investigators of autism spectrum disorders (ASD) have benefited from integrating eye-tracking into their research paradigms4-7. Eye-tracking has largely been used in these studies to reveal mechanisms underlying impaired task performance8 and abnormal brain functioning9, particularly during the processing of social information1,10-11. While older children and adults with ASD comprise the preponderance of research in this area, eye-tracking may be especially useful for studying young children with the disorder as it offers a non-invasive tool for assessing and quantifying early-emerging developmental abnormalities2,12-13. Implementing eye-tracking with young children with ASD, however, is associated with a number of unique challenges, including issues with compliant behavior resulting from specific task demands and disorder-related psychosocial considerations. In this protocol, we detail methodological considerations for optimizing research design, data acquisition and psychometric analysis while eye-tracking young children with ASD. The provided recommendations are also designed to be more broadly applicable for eye-tracking children with other developmental disabilities. By offering guidelines for best practices in these areas based upon lessons derived from our own work, we hope to help other investigators make sound research design and analysis choices while avoiding common pitfalls that can compromise data acquisition while eye-tracking young children with ASD or other developmental difficulties.
Medicine, Issue 61, eye tracking, autism, neurodevelopmental disorders, toddlers, perception, attention, social cognition
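Gaze recordings from studies like this are often summarized as the proportion of valid samples falling inside areas of interest (AOIs). The sketch below shows that generic computation, not the authors' specific pipeline; the AOI coordinates and gaze samples are invented.

```python
import numpy as np

# Hypothetical gaze samples in screen pixels; NaN marks lost tracking.
gaze = np.array([[512, 400], [530, 410], [np.nan, np.nan], [900, 300], [880, 290]])

# Hypothetical rectangular AOIs: (x_min, y_min, x_max, y_max).
aois = {"face": (450, 350, 600, 500), "toy": (850, 250, 1000, 400)}

valid = ~np.isnan(gaze).any(axis=1)
for name, (x0, y0, x1, y1) in aois.items():
    inside = (
        (gaze[valid, 0] >= x0) & (gaze[valid, 0] <= x1)
        & (gaze[valid, 1] >= y0) & (gaze[valid, 1] <= y1)
    )
    print(f"{name}: {100 * inside.mean():.1f}% of valid samples")
print(f"tracking ratio: {100 * valid.mean():.1f}% of all samples")
```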
Methylnitrosourea (MNU)-induced Retinal Degeneration and Regeneration in the Zebrafish: Histological and Functional Characteristics
Authors: Ellinor Maurer, Markus Tschopp, Christoph Tappeiner, Pauline Sallin, Anna Jazwinska, Volker Enzmann.
Institutions: University of Bern, University Hospital of Basel, University of Fribourg.
Retinal degenerative diseases, e.g. retinitis pigmentosa, with resulting photoreceptor damage account for the majority of vision loss in the industrial world. Animal models are of pivotal importance to study such diseases. In this regard the photoreceptor-specific toxin N-methyl-N-nitrosourea (MNU) has been widely used in rodents to pharmacologically induce retinal degeneration. Previously, we have established an MNU-induced retinal degeneration model in the zebrafish, another popular model system in visual research. A fascinating difference from mammals is the persistent neurogenesis in the adult zebrafish retina and its regeneration after damage. To quantify this observation, we have employed visual acuity measurements in the adult zebrafish. To this end, the optokinetic reflex was used to follow functional changes in non-anesthetized fish. This was supplemented with histology as well as immunohistochemical staining for apoptosis (TUNEL) and proliferation (PCNA) to correlate the developing morphological changes. In summary, apoptosis of photoreceptors occurs three days after MNU treatment, which is followed by a marked reduction of cells in the outer nuclear layer (ONL). Thereafter, proliferation of cells in the inner nuclear layer (INL) and ONL is observed. Herein, we reveal that not only a complete histological but also a functional regeneration occurs over a time course of 30 days. Here we illustrate, in video format, the methods to quantify and follow zebrafish retinal de- and regeneration using MNU.
Cellular Biology, Issue 92, N-methyl-N-nitrosourea (MNU), retina, degeneration, photoreceptors, Müller cells, regeneration, zebrafish, visual function
Using the optokinetic response to study visual function of zebrafish
Authors: Su-Qi Zou, Wu Yin, Ming-Jing Zhang, Chun-Rui Hu, Yu-Bin Huang, Bing Hu.
Institutions: University of Science and Technology of China (USTC).
The optokinetic response (OKR) is a behavior in which an animal moves its eyes to follow a grating rotating around it. It has been widely used to assess the visual functions of larval zebrafish1-5. Nevertheless, the standard protocol for larval fish is not yet readily applicable in adult zebrafish. Here, we introduce how to measure the OKR of adult zebrafish with our simple custom-built apparatus, using a new protocol established in our lab. Both our apparatus and the step-by-step OKR procedure for adult zebrafish are illustrated in this video. In addition, the measurements of the larval OKR, as well as the optomotor response (OMR) test of adult zebrafish, are also demonstrated in this video. This OKR assay of adult zebrafish in our experiment may last for up to 4 hours. Such an OKR test in adult fish will make investigations of visual function more efficient when the adult visual system is manipulated. Su-Qi Zou and Wu Yin contributed equally to this paper.
Neuroscience, Issue 36, Zebrafish, OKR, OMR, behavior, optokinetic, vision
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from its use at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
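The outcome measures quoted above (time within the target glucose range and the prevalence of severe hypoglycemia, clinical hypoglycemia, and hyperglycemia) can be computed directly from a series of blood glucose readings. Below is a minimal sketch using the thresholds stated in the abstract; the readings and the 80-150 mg/dL target band are illustrative assumptions, since the abstract does not specify the exact target range.

```python
# Blood glucose readings in mg/dL (hypothetical data).
readings = [95, 142, 188, 210, 67, 130, 118, 39, 156, 175, 99, 260, 145, 84, 122]

TARGET = (80, 150)    # hypothetical target band; not specified in the abstract
SEVERE_HYPO = 40      # BG < 40 mg/dL, as defined in the abstract
CLINICAL_HYPO = 70    # BG < 70 mg/dL
HYPERGLYCEMIA = 180   # BG > 180 mg/dL

n = len(readings)
in_target   = sum(TARGET[0] <= bg <= TARGET[1] for bg in readings) / n
severe_hypo = sum(bg < SEVERE_HYPO for bg in readings) / n
clin_hypo   = sum(bg < CLINICAL_HYPO for bg in readings) / n
hyper       = sum(bg > HYPERGLYCEMIA for bg in readings) / n

print(f"in target range: {in_target:.1%}")
print(f"severe hypoglycemia (<40): {severe_hypo:.1%}")
print(f"clinical hypoglycemia (<70): {clin_hypo:.1%}")
print(f"hyperglycemia (>180): {hyper:.1%}")
```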
Automated Visual Cognitive Tasks for Recording Neural Activity Using a Floor Projection Maze
Authors: Tara K. Jacobson, Jonathan W. Ho, Brendon W. Kent, Fang-Chi Yang, Rebecca D. Burwell.
Institutions: Brown University.
Neuropsychological tasks used in primates to investigate mechanisms of learning and memory are typically visually guided cognitive tasks. We have developed visual cognitive tasks for rats using the Floor Projection Maze1,2 that are optimized for visual abilities of rats permitting stronger comparisons of experimental findings with other species. In order to investigate neural correlates of learning and memory, we have integrated electrophysiological recordings into fully automated cognitive tasks on the Floor Projection Maze1,2. Behavioral software interfaced with an animal tracking system allows monitoring of the animal's behavior with precise control of image presentation and reward contingencies for better trained animals. Integration with an in vivo electrophysiological recording system enables examination of behavioral correlates of neural activity at selected epochs of a given cognitive task. We describe protocols for a model system that combines automated visual presentation of information to rodents and intracranial reward with electrophysiological approaches. Our model system offers a sophisticated set of tools as a framework for other cognitive tasks to better isolate and identify specific mechanisms contributing to particular cognitive processes.
Neurobiology, Issue 84, Rat behavioral tasks, visual discrimination, chronic electrophysiological recordings, Floor Projection Maze, neuropsychology, learning, memory
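The abstract describes behavioral software that couples animal tracking to image presentation and reward delivery. The outline below is a generic, hypothetical trial loop for an automated visual discrimination task, not the authors' actual software; names such as get_animal_position and deliver_reward stand in for whatever tracking and reward interfaces are used.

```python
import random
import time

def run_trial(get_animal_position, show_images, deliver_reward, timeout_s=30.0):
    """One automated visual discrimination trial (hypothetical sketch).

    get_animal_position() -> (x, y) in normalized coordinates from the tracking system
    show_images(left, right) projects the two floor stimuli
    deliver_reward() triggers reward delivery (e.g. intracranial stimulation)
    """
    target_side = random.choice(["left", "right"])
    images = ("plus", "circle") if target_side == "left" else ("circle", "plus")
    show_images(*images)                      # "plus" is the rewarded image here

    start = time.time()
    while time.time() - start < timeout_s:
        x, _ = get_animal_position()
        if x < 0.4:                           # entered left response zone
            choice = "left"
            break
        if x > 0.6:                           # entered right response zone
            choice = "right"
            break
        time.sleep(0.02)
    else:
        return {"outcome": "timeout"}

    correct = choice == target_side
    if correct:
        deliver_reward()
    return {"outcome": "correct" if correct else "incorrect", "choice": choice}
```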
Cell-based Assay Protocol for the Prognostic Prediction of Idiopathic Scoliosis Using Cellular Dielectric Spectroscopy
Authors: Marie-Yvonne Akoume, Anita Franco, Alain Moreau.
Institutions: Sainte-Justine University Hospital Research Center, Université de Montréal.
This protocol details the experimental and analytical procedure for a cell-based assay developed in our laboratory as a functional test to predict the prognosis of idiopathic scoliosis in asymptomatic and affected children. The assay consists of the evaluation of the functional status of Gi and Gs proteins in peripheral blood mononuclear cells (PBMCs) by cellular dielectric spectroscopy (CDS), using an automated CDS-based instrument, and the classification of children into three functional groups (FG1, FG2, FG3) with respect to the profile of imbalance between the degree of response to Gi and Gs protein stimulation. The classification is further confirmed by the differential effect of osteopontin (OPN) on the response to Gi stimulation among groups, and severe progression of the disease is referenced by FG2. A volume of approximately 10 ml of blood is required to extract PBMCs by Ficoll gradient, and cells are then stored in liquid nitrogen. The adequate number of PBMCs to perform the assay is obtained after two days of cell culture. Essentially, cells are first incubated with phytohemagglutinin (PHA). After 24 hr incubation, the medium is replaced by a PHA-free culture medium for an additional 24 hr prior to cell seeding and OPN treatment. Cells are then spectroscopically screened for their responses to somatostatin and isoproterenol, which respectively activate Gi and Gs proteins through their cognate receptors. Both somatostatin and isoproterenol are injected simultaneously with an integrated fluidics system and the cells' responses are monitored for 15 min. The assay can be performed with fresh or frozen PBMCs and the procedure is completed within 4 days.
Medicine, Issue 80, Blood Cells, Lymphocytes, Spinal Diseases, Diagnostic Techniques and Procedures, Clinical Laboratory Techniques, Dielectric Spectroscopy, Musculoskeletal Diseases, Idiopathic scoliosis, classification, prognosis, G proteins, cellular dielectric spectroscopy, PBMCs
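The assay above groups children by the profile of imbalance between Gi- and Gs-mediated CDS responses. The sketch below shows one hypothetical way such a grouping rule could be expressed in code; the response units, ratio, and cutoffs are invented for illustration and are not the published classification criteria.

```python
def classify_functional_group(gi_response, gs_response, low_cut=0.8, high_cut=1.2):
    """Hypothetical grouping by Gi/Gs response imbalance.

    gi_response, gs_response : magnitudes of the CDS responses to somatostatin (Gi)
    and isoproterenol (Gs) stimulation, in arbitrary units.
    The cutoffs are illustrative placeholders, not the published criteria.
    """
    ratio = gi_response / gs_response
    if ratio < low_cut:
        return "FG1"
    if ratio <= high_cut:
        return "FG2"
    return "FG3"

print(classify_functional_group(45.0, 60.0))   # ratio 0.75 -> FG1 (illustrative)
print(classify_functional_group(55.0, 50.0))   # ratio 1.10 -> FG2 (illustrative)
```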
EEG Mu Rhythm in Typical and Atypical Development
Authors: Raphael Bernier, Benjamin Aaronson, Anna Kresse.
Institutions: University of Washington.
Electroencephalography (EEG) is an effective, efficient, and noninvasive method of assessing and recording brain activity. Given the excellent temporal resolution, EEG can be used to examine the neural response related to specific behaviors, states, or external stimuli. An example of this utility is the assessment of the mirror neuron system (MNS) in humans through the examination of the EEG mu rhythm. The EEG mu rhythm, oscillatory activity in the 8-12 Hz frequency range recorded from centrally located electrodes, is suppressed when an individual executes, or simply observes, goal directed actions. As such, it has been proposed to reflect activity of the MNS. It has been theorized that dysfunction in the mirror neuron system (MNS) plays a contributing role in the social deficits of autism spectrum disorder (ASD). The MNS can then be noninvasively examined in clinical populations by using EEG mu rhythm attenuation as an index for its activity. The described protocol provides an avenue to examine social cognitive functions theoretically linked to the MNS in individuals with typical and atypical development, such as ASD. 
Medicine, Issue 86, Electroencephalography (EEG), mu rhythm, imitation, autism spectrum disorder, social cognition, mirror neuron system
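Mu suppression is commonly quantified as the log ratio of 8-12 Hz power during action observation or execution to power during a baseline condition, with negative values indicating suppression. The sketch below follows that common convention using scipy on simulated single-channel data; it is not necessarily the authors' exact pipeline, and the sampling rate and signal parameters are invented.

```python
import numpy as np
from scipy.signal import welch

FS = 500  # sampling rate in Hz (illustrative)

def mu_band_power(epoch, fs=FS, band=(8.0, 12.0)):
    """Mean power spectral density in the mu band for one channel."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def mu_suppression_index(condition_epoch, baseline_epoch):
    """log(condition power / baseline power); values below 0 indicate suppression."""
    return np.log(mu_band_power(condition_epoch) / mu_band_power(baseline_epoch))

# Simulated single-channel data: a 10 Hz oscillation that weakens during observation.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(1)
baseline = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
observe  = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
print(f"mu suppression index: {mu_suppression_index(observe, baseline):.2f}")  # negative
```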
Corneal Donor Tissue Preparation for Descemet's Membrane Endothelial Keratoplasty
Authors: Hassan N. Tausif, Lauren Johnson, Michael Titus, Kyle Mavin, Navasuja Chandrasekaran, Maria A. Woodward, Roni M. Shtein, Shahzad I. Mian.
Institutions: University of Michigan, MidWest Eye Banks.
Descemet’s Membrane Endothelial Keratoplasty (DMEK) is a form of corneal transplantation in which only a single cell layer, the corneal endothelium, along with its basement membrane (Descemet's membrane) is introduced onto the recipient's posterior stroma3. Unlike Descemet’s Stripping Automated Endothelial Keratoplasty (DSAEK), where additional donor stroma is introduced, no unnatural stroma-to-stroma interface is created. As a result, the natural anatomy of the cornea is preserved as much as possible allowing for improved recovery time and visual acuity4. Endothelial Keratoplasty (EK) is the procedure of choice for treatment of endothelial dysfunction. The advantages of EK include rapid recovery of vision, preservation of ocular integrity and minimal refractive change due to use of a small, peripheral incision1. DSAEK utilizes donor tissue prepared with partial thickness stroma and endothelium. The rapid success and utilization of this procedure can be attributed to availability of eye-bank prepared precut tissue. The benefits of eye-bank preparation of donor tissue include elimination of need for specialized equipment in the operating room and availability of back up donor tissue in case of tissue perforation during preparation. In addition, high volume preparation of donor tissue by eye-bank technicians may provide improved quality of donor tissue. DSAEK may have limited best corrected visual acuity due to creation of a stromal interface between the donor and recipient cornea. Elimination of this interface with transplantation of only donor Descemet's membrane and endothelium in DMEK may improve visual outcomes and reduce complications after EK5. Similar to DSAEK, long term success and acceptance of DMEK is dependent on ease of availability of precut, eye-bank prepared donor tissue. Here we present a stepwise approach to donor tissue preparation which may reduce some barriers eye-banks face in providing DMEK grafts.
Medicine, Issue 91, DMEK, EK, endothelial keratoplasty, Descemet’s membrane endothelial keratoplasty, corneal transplantation, eye bank, donor tissue preparation
Dynamic Visual Tests to Identify and Quantify Visual Damage and Repair Following Demyelination in Optic Neuritis Patients
Authors: Noa Raz, Michal Hallak, Tamir Ben-Hur, Netta Levin.
Institutions: Hadassah Hebrew-University Medical Center.
In order to follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination in the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along visual pathways. These include Object From Motion (OFM) extraction and Time-constrained stereo protocols. In the OFM test, an array of dots composes an object: the dots within the object move rightward while the dots outside it move leftward, or vice versa. The dot pattern generates a camouflaged object that cannot be detected when the dots are stationary or moving as a whole. Importantly, object recognition is critically dependent on motion perception. In the Time-constrained Stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical usage and provide a simple, yet powerful, way to identify and quantify processes of demyelination and remyelination along visual pathways. These protocols may be useful for diagnosing and following optic neuritis and multiple sclerosis patients. In the diagnostic process, these protocols may reveal visual deficits that cannot be identified via current standard visual measurements. Moreover, these protocols sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. During longitudinal follow-up, the protocols can be used as sensitive markers of demyelinating and remyelinating processes over time. These protocols may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
Medicine, Issue 86, Optic neuritis, visual impairment, dynamic visual functions, motion perception, stereopsis, demyelination, remyelination
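As described above, the OFM stimulus defines a camouflaged shape purely by motion: dots inside the object move in one direction while dots outside move in the other, so no single static frame reveals the object. Below is a minimal numpy sketch of one animation step under that rule; the circular object, dot count, and speed are arbitrary illustrative choices, not the stimulus parameters used in the article.

```python
import numpy as np

rng = np.random.default_rng(42)
WIDTH, HEIGHT = 800, 600
N_DOTS, SPEED = 400, 2.0            # pixels per frame (illustrative)

# Random dot field; the "object" is a hidden circle in the middle of the display.
dots = rng.uniform([0, 0], [WIDTH, HEIGHT], size=(N_DOTS, 2))
OBJECT_CENTER, OBJECT_RADIUS = np.array([400.0, 300.0]), 120.0

def step(dots):
    """Advance one frame: dots inside the hidden object move right, the rest move left."""
    inside = np.linalg.norm(dots - OBJECT_CENTER, axis=1) < OBJECT_RADIUS
    dots = dots.copy()
    dots[inside, 0] += SPEED
    dots[~inside, 0] -= SPEED
    dots[:, 0] %= WIDTH             # wrap around so dot density stays constant
    return dots

dots = step(dots)
print(f"{(np.linalg.norm(dots - OBJECT_CENTER, axis=1) < OBJECT_RADIUS).sum()} dots inside the object")
```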
A Standardized Obstacle Course for Assessment of Visual Function in Ultra Low Vision and Artificial Vision
Authors: Amy Catherine Nau, Christine Pintar, Christopher Fisher, Jong-Hyeon Jeong, KwonHo Jeong.
Institutions: University of Pittsburgh.
We describe an indoor, portable, standardized course that can be used to evaluate obstacle avoidance in persons who have ultra low vision. Six sighted controls and 36 completely blind but otherwise healthy adult male (n=29) and female (n=13) subjects (age range 19-85 years) were enrolled in one of three studies involving testing of the BrainPort sensory substitution device. Subjects were asked to navigate the course prior to, and after, BrainPort training. They completed a total of 837 course runs in two different locations. Means and standard deviations were calculated across control types, courses, lights, and visits. We used a linear mixed effects model to compare different categories in the PPWS (percent preferred walking speed) and error percent data to show that the course iterations were properly designed. The course is relatively inexpensive, simple to administer, and has been shown to be a feasible way to test mobility function. Data analysis demonstrates that, for the outcomes of percent error and percentage preferred walking speed, each of the three courses is different and, within each level, the three iterations are equivalent. This allows for randomization of the courses during administration. Abbreviations: preferred walking speed (PWS); course speed (CS); percentage preferred walking speed (PPWS).
Medicine, Issue 84, Obstacle course, navigation assessment, BrainPort, wayfinding, low vision
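The two outcomes analyzed above, percent preferred walking speed (PPWS) and percent error, can be computed per course run as sketched below. Here PPWS is taken as course walking speed expressed as a percentage of the subject's preferred walking speed, and percent error as contacted obstacles over total obstacles; the numbers are invented and the study's exact operational definitions may differ.

```python
def ppws(course_length_m, course_time_s, preferred_speed_m_s):
    """Course walking speed as a percentage of preferred walking speed (PWS)."""
    course_speed = course_length_m / course_time_s
    return 100.0 * course_speed / preferred_speed_m_s

def percent_error(obstacles_contacted, obstacles_total):
    return 100.0 * obstacles_contacted / obstacles_total

# Hypothetical run: a 20 m course walked in 40 s by a subject whose PWS is 1.2 m/s,
# contacting 3 of 18 obstacles.
print(f"PPWS: {ppws(20, 40, 1.2):.1f}%")               # ~41.7%
print(f"percent error: {percent_error(3, 18):.1f}%")   # ~16.7%
```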
Determining Soil-transmitted Helminth Infection Status and Physical Fitness of School-aged Children
Authors: Peiling Yap, Thomas Fürst, Ivan Müller, Susi Kriemler, Jürg Utzinger, Peter Steinmann.
Institutions: Swiss Tropical and Public Health Institute, Basel, Switzerland, University of Basel, Basel, Switzerland.
Soil-transmitted helminth (STH) infections are common. Indeed, more than 1 billion people are affected, mainly in the developing world where poverty prevails and hygiene behavior, water supply, and sanitation are often deficient1,2. Ascaris lumbricoides, Trichuris trichiura, and the two hookworm species, Ancylostoma duodenale and Necator americanus, are the most prevalent STHs3. The estimated global burden due to hookworm disease, ascariasis, and trichuriasis is 22.1, 10.5, and 6.4 million disability-adjusted life years (DALYs), respectively4. Furthermore, an estimated 30-100 million people are infected with Strongyloides stercoralis, the most neglected STH species of global significance which arguably also causes a considerable public health impact5,6. Multiple-species infections (i.e., different STHs harbored in a single individual) are common, and infections have been linked to lowered productivity and thus economic outlook of developing countries1,3. For the diagnosis of common STHs, the World Health Organization (WHO) recommends the Kato-Katz technique7,8, which is a relatively straightforward method for determining the prevalence and intensity of such infections. It facilitates the detection of parasite eggs that infected subjects pass in their feces. With regard to the diagnosis of S.stercoralis, there is currently no simple and accurate tool available. The Baermann technique is the most widely employed method for its diagnosis. The principle behind the Baermann technique is that active S.stercoralis larvae migrate out of an illuminated fresh fecal sample as the larvae are phototactic9. It requires less sophisticated laboratory materials and is less time consuming than culture and immunological methods5. Morbidities associated with STH infections range from acute but common symptoms, such as abdominal pain, diarrhea, and pruritus, to chronic symptoms, such as anemia, under- and malnutrition, and cognitive impairment10. Since the symptoms are generally unspecific and subtle, they often go unnoticed, are considered a normal condition by affected individuals, or are treated as symptoms of other diseases that might be more common in a given setting. Hence, it is conceivable that the true burden of STH infections is underestimated by assessment tools relying on self-declared signs and symptoms as is usually the case in population-based surveys. In the late 1980s and early 1990s, Stephenson and colleagues highlighted the possibility of STH infections lowering the physical fitness of boys aged 6-12 years11,12. This line of scientific inquiry gained new momentum recently13,14,15. The 20-meter (m) shuttle run test was developed and validated by Léger et al.16 and is used worldwide to measure the aerobic fitness of children17. The test is easy to standardize and can be performed wherever a 20-m long and flat running course and an audio source are available, making its use attractive in resource-constrained settings13. To facilitate and standardize attempts at assessing whether STH infections have an effect on the physical fitness of school-aged children, we present methodologies that diagnose STH infections or measure physical fitness that are simple to execute and yet, provide accurate and reproducible outcomes. This will help to generate new evidence regarding the health impact of STH infections.
Infection, Issue 66, Immunology, Medicine, Infectious Diseases, Soil-transmitted helminths, physical fitness, Kato-Katz technique, Baermann technique, 20-meter shuttle run test, children
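Kato-Katz results are conventionally reported as eggs per gram of stool (EPG): the egg count from the standard 41.7 mg template is multiplied by 24, and duplicate slides are averaged. Below is a minimal sketch of that conversion with invented counts; classification against intensity thresholds is left out here.

```python
# Kato-Katz thick smears use a template holding about 41.7 mg of stool,
# so egg counts are multiplied by 24 to give eggs per gram (EPG).
TEMPLATE_FACTOR = 24

def eggs_per_gram(slide_counts):
    """Average EPG across duplicate Kato-Katz slides for one stool sample."""
    return TEMPLATE_FACTOR * sum(slide_counts) / len(slide_counts)

# Hypothetical duplicate slide counts for one child.
counts = {"Ascaris lumbricoides": [35, 41], "Trichuris trichiura": [4, 7], "hookworm": [0, 1]}
for species, slides in counts.items():
    print(f"{species}: {eggs_per_gram(slides):.0f} EPG")
```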
Measuring Attentional Biases for Threat in Children and Adults
Authors: Vanessa LoBue.
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
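The dependent measure in this paradigm is the latency to touch the target, and an attentional bias for threat is typically expressed as the difference between mean latencies for neutral and threatening targets. Below is a minimal sketch with made-up reaction times, not data from the article.

```python
from statistics import mean

# Hypothetical touch latencies in milliseconds for one participant.
latencies = {
    "threat":  [810, 765, 840, 790, 755, 802],   # e.g. snakes, angry faces
    "neutral": [905, 870, 930, 888, 915, 860],   # e.g. flowers, happy faces
}

bias_ms = mean(latencies["neutral"]) - mean(latencies["threat"])
print(f"mean threat latency : {mean(latencies['threat']):.0f} ms")
print(f"mean neutral latency: {mean(latencies['neutral']):.0f} ms")
print(f"threat detection advantage: {bias_ms:.0f} ms")
```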
A Laser-induced Mouse Model of Chronic Ocular Hypertension to Characterize Visual Defects
Authors: Liang Feng, Hui Chen, Genn Suyeoka, Xiaorong Liu.
Institutions: Northwestern University.
Glaucoma, frequently associated with elevated intraocular pressure (IOP), is one of the leading causes of blindness. We sought to establish a mouse model of ocular hypertension to mimic human high-tension glaucoma. Here, laser illumination is applied to the corneal limbus to photocoagulate the aqueous outflow, inducing angle closure. Changes in IOP are monitored using a rebound tonometer before and after the laser treatment. An optomotor behavioral test is used to measure corresponding changes in visual capacity. A representative result from one mouse that developed sustained IOP elevation after laser illumination is shown. Decreased visual acuity and contrast sensitivity are observed in this ocular hypertensive mouse. Together, our study introduces a valuable model system to investigate neuronal degeneration and the underlying molecular mechanisms in glaucomatous mice.
Medicine, Issue 78, Biomedical Engineering, Neurobiology, Anatomy, Physiology, Neuroscience, Cellular Biology, Molecular Biology, Ophthalmology, Retinal Neurons, Retinal Ganglion Cell (RGC), Neurodegenerative Diseases, Ocular Hypertension, Retinal Degeneration, Vision Tests, Visual Acuity, Eye Diseases, Laser Photocoagulation, Intraocular Pressure (IOP), Tonometer, Contrast Sensitivity, Optomotor, animal model

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
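JoVE has not published the details of its matching algorithm; the snippet below is only a toy sketch of one standard way to relate an abstract to a library of video descriptions, using TF-IDF vectors and cosine similarity from scikit-learn, with invented example texts.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus: short stand-ins for JoVE video descriptions (invented).
video_descriptions = [
    "Measuring visual acuity in zebrafish with the optokinetic response",
    "Cortical source analysis of high-density EEG recordings in children",
    "Purifying Toxoplasma gondii oocysts on a cesium chloride gradient",
]
abstract = "Prevalence of refractive errors and causes of vision loss in school children"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(video_descriptions + [abstract])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

# Rank the videos by similarity to the abstract.
for score, title in sorted(zip(scores, video_descriptions), reverse=True):
    print(f"{score:.2f}  {title}")
```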

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.