JoVE Visualize
Related JoVE Video
 
Pubmed Article
Analysis of quality of clinical practice guidelines for otorhinolaryngology in China.
PLoS ONE
PUBLISHED: 01-18-2013
To evaluate the quality of clinical practice guidelines (CPGs) for otorhinolaryngology in China.
Authors: Ronald W. Jensen, Jason Rivest, Wei Li, Varalakshmi Vissa.
Published: 07-15-2011
ABSTRACT
The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO [1], leprosy remains endemic in many countries with approximately 250,000 new cases each year [2]. The entire M. leprae genome has been mapped [3,4] and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites) [5]. Clinical strains of M. leprae may vary in the number of tandem repeated segments (short tandem repeats, STR) at many of these loci [5,6,7]. Variable number tandem repeat (VNTR) [5] analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes in the same patient. While the variability of certain VNTRs has brought up questions regarding their suitability for strain typing [7,8,9], the emerging data suggest that analyzing multiple loci, which are diverse in their stability, can be used as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA) [10] has been used to study leprosy evolution and transmission in several countries including China [11,12], Malawi [8], the Philippines [10,13], and Brazil [14]. MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS) [10]. The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently-labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions [10]. The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo-passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner software and fragment length is converted to number of VNTR copies (allele). Finally, the VNTR haplotypes are analyzed for patterns, and when combined with patient clinical data can be used to track distribution of strain types.
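The conversion from measured fragment length to VNTR copy number can be illustrated with a minimal sketch. The flanking length and repeat unit size below are hypothetical placeholders, not the published locus-specific values, which come from the M. leprae reference genome and the cited MLVA protocols.

```python
# Convert a fragment length from capillary electrophoresis (FLA) into a VNTR
# copy number (allele) for one locus. The flanking length and repeat unit size
# used here are hypothetical placeholders for illustration only.

def fragment_length_to_copies(fragment_len_bp, flanking_bp, repeat_unit_bp):
    """Return the estimated number of tandem repeat copies at one locus."""
    repeats = (fragment_len_bp - flanking_bp) / repeat_unit_bp
    return round(repeats)

# Example with made-up numbers: a 150 bp amplicon, 96 bp of flanking sequence,
# and a 6 bp repeat unit correspond to ~9 copies.
if __name__ == "__main__":
    print(fragment_length_to_copies(150, 96, 6))  # -> 9
```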
19 Related JoVE Articles!
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars, made from three different feedstocks at two temperatures, were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
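A screening step like the one described (comparing measured contaminant levels against guideline limits) can be sketched as follows. The numeric limits in the sketch are placeholders, not the actual IBI criteria, and the measurements are invented for illustration.

```python
# Screen a biochar's measured contaminant concentrations against user-supplied
# limits (e.g. taken from the relevant guidelines). The limits below are
# hypothetical placeholders, NOT the IBI criteria; supply real values in use.

LIMITS_MG_PER_KG = {"arsenic": 10.0, "chromium": 100.0, "copper": 100.0, "lead": 100.0}

def screen_biochar(measured: dict, limits: dict = LIMITS_MG_PER_KG) -> list:
    """Return the list of analytes that exceed their limit."""
    return [analyte for analyte, value in measured.items()
            if analyte in limits and value > limits[analyte]]

# Invented example measurements, loosely mimicking a failing sample.
old_biochar = {"arsenic": 20.0, "chromium": 150.0, "copper": 90.0, "lead": 200.0}
print(screen_biochar(old_biochar))  # -> ['arsenic', 'chromium', 'lead']
```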
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Computerized Dynamic Posturography for Postural Control Assessment in Patients with Intermittent Claudication
Authors: Natalie Vanicek, Stephanie A. King, Risha Gohil, Ian C. Chetter, Patrick A. Coughlin.
Institutions: University of Sydney, University of Hull, Hull and East Yorkshire Hospitals, Addenbrookes Hospital.
Computerized dynamic posturography with the EquiTest is an objective technique for measuring postural strategies under challenging static and dynamic conditions. As part of a diagnostic assessment, the early detection of postural deficits is important so that appropriate and targeted interventions can be prescribed. The Sensory Organization Test (SOT) on the EquiTest determines an individual's use of the sensory systems (somatosensory, visual, and vestibular) that are responsible for postural control. Somatosensory and visual input are altered by the calibrated sway-referenced support surface and visual surround, which move in the anterior-posterior direction in response to the individual's postural sway. This creates a conflicting sensory experience. The Motor Control Test (MCT) challenges postural control by creating unexpected postural disturbances in the form of backwards and forwards translations. The translations are graded in magnitude and the time to recover from the perturbation is computed. Intermittent claudication, the most common symptom of peripheral arterial disease, is characterized by a cramping pain in the lower limbs and caused by muscle ischemia secondary to reduced blood flow to working muscles during physical exertion. Claudicants often display poor balance, making them susceptible to falls and activity avoidance. The Ankle Brachial Pressure Index (ABPI) is a noninvasive method for indicating the presence of peripheral arterial disease and intermittent claudication, a common symptom in the lower extremities. ABPI is measured as the highest systolic pressure from either the dorsalis pedis or posterior tibial artery divided by the highest brachial artery systolic pressure from either arm. This paper will focus on the use of computerized dynamic posturography in the assessment of balance in claudicants.
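The ABPI calculation described above is simple enough to show directly; the sketch below follows the stated definition (highest ankle pressure over highest brachial pressure), with made-up example pressures.

```python
# Compute the Ankle Brachial Pressure Index (ABPI) as described above: the
# highest ankle systolic pressure (dorsalis pedis or posterior tibial artery)
# divided by the highest brachial systolic pressure from either arm.
# Pressures are in mmHg; the example values are invented.

def abpi(dorsalis_pedis, posterior_tibial, left_brachial, right_brachial):
    ankle = max(dorsalis_pedis, posterior_tibial)
    brachial = max(left_brachial, right_brachial)
    return ankle / brachial

# An ABPI well below 1.0 is consistent with peripheral arterial disease.
print(round(abpi(dorsalis_pedis=90, posterior_tibial=85,
                 left_brachial=130, right_brachial=128), 2))  # -> 0.69
```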
Medicine, Issue 82, Posture, Computerized dynamic posturography, Ankle brachial pressure index, Peripheral arterial disease, Intermittent claudication, Balance, EquiTest, Sensory Organization Test, Motor Control Test
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
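The MSPT builds on the MSFC, which combines timed walk, peg-test, and cognitive measures into a composite of z-scores. The sketch below illustrates that general idea with hypothetical reference means and standard deviations; it is not the MSPT's actual scoring algorithm.

```python
# Illustrative composite score in the spirit of the MSFC: convert each raw
# measure to a z-score against a reference population and average them, with
# signs arranged so that higher always means better. Reference values below
# are hypothetical; this is not the MSPT's actual scoring method.
from statistics import mean

REFERENCE = {  # hypothetical reference population values
    "walk_time_s":   {"mean": 5.0,  "sd": 1.5,  "higher_is_better": False},
    "peg_time_s":    {"mean": 20.0, "sd": 4.0,  "higher_is_better": False},
    "cognitive_raw": {"mean": 55.0, "sd": 10.0, "higher_is_better": True},
}

def composite_score(measures: dict) -> float:
    zs = []
    for name, value in measures.items():
        ref = REFERENCE[name]
        z = (value - ref["mean"]) / ref["sd"]
        zs.append(z if ref["higher_is_better"] else -z)
    return mean(zs)

print(round(composite_score({"walk_time_s": 7.0, "peg_time_s": 25.0,
                             "cognitive_raw": 45.0}), 2))  # negative = worse than reference
```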
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Tissue Triage and Freezing for Models of Skeletal Muscle Disease
Authors: Hui Meng, Paul M.L. Janssen, Robert W. Grange, Lin Yang, Alan H. Beggs, Lindsay C. Swanson, Stacy A. Cossette, Alison Frase, Martin K. Childers, Henk Granzier, Emanuela Gussoni, Michael W. Lawlor.
Institutions: Medical College of Wisconsin, The Ohio State University, Virginia Tech, University of Kentucky, Boston Children's Hospital, Harvard Medical School, Cure Congenital Muscular Dystrophy, Joshua Frase Foundation, University of Washington, University of Arizona.
Skeletal muscle is a unique tissue because of its structure and function, which requires specific protocols for tissue collection to obtain optimal results from functional, cellular, molecular, and pathological evaluations. Due to the subtlety of some pathological abnormalities seen in congenital muscle disorders and the potential for fixation to interfere with the recognition of these features, pathological evaluation of frozen muscle is preferable to fixed muscle when evaluating skeletal muscle for congenital muscle disease. Additionally, the potential to produce severe freezing artifacts in muscle requires specific precautions when freezing skeletal muscle for histological examination that are not commonly used when freezing other tissues. This manuscript describes a protocol for rapid freezing of skeletal muscle using isopentane (2-methylbutane) cooled with liquid nitrogen to preserve optimal skeletal muscle morphology. This procedure is also effective for freezing tissue intended for genetic or protein expression studies. Furthermore, we have integrated our freezing protocol into a broader procedure that also describes preferred methods for the short term triage of tissue for (1) single fiber functional studies and (2) myoblast cell culture, with a focus on the minimum effort necessary to collect tissue and transport it to specialized research or reference labs to complete these studies. Overall, this manuscript provides an outline of how fresh tissue can be effectively distributed for a variety of phenotypic studies and thereby provides standard operating procedures (SOPs) for pathological studies related to congenital muscle disease.
Basic Protocol, Issue 89, Tissue, Freezing, Muscle, Isopentane, Pathology, Functional Testing, Cell Culture
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
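The keyword list for this article mentions minimum-norm estimation. A generic L2 minimum-norm inverse can be sketched with toy matrices as below; real pipelines use an MRI-derived leadfield and noise-covariance whitening, and this is not the London Baby Lab's exact implementation.

```python
# Generic L2 minimum-norm estimate (MNE) of source amplitudes from sensor data,
# given a leadfield (gain) matrix L: x_hat = L^T (L L^T + lambda I)^-1 y.
# Toy random matrices only; a real leadfield comes from the head model.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 500
L = rng.standard_normal((n_sensors, n_sources))          # toy leadfield
x_true = np.zeros(n_sources); x_true[42] = 1.0            # one active source
y = L @ x_true + 0.01 * rng.standard_normal(n_sensors)    # simulated sensor data

lam = 0.1                                                  # regularization parameter
W = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_sensors)) # inverse operator
x_hat = W @ y                                              # estimated source amplitudes
print(int(np.argmax(np.abs(x_hat))))                       # strongest estimated source (often 42)
```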
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango classes, taught by the trainees, in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed and 30 sec chair stand), safety and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and provides both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species which regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
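The percent diameter change described above is commonly reported as FMD%; a minimal sketch of the calculation follows, with made-up diameters.

```python
# Flow-mediated vasodilation (FMD) is commonly reported as the percent change
# in brachial artery diameter from the baseline scan to the peak diameter
# during reactive hyperemia. The diameters below are invented example values.

def fmd_percent(baseline_diameter_mm: float, peak_diameter_mm: float) -> float:
    return (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm * 100.0

print(round(fmd_percent(baseline_diameter_mm=3.80, peak_diameter_mm=4.05), 1))  # -> 6.6
```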
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Transplantation of Olfactory Ensheathing Cells to Evaluate Functional Recovery after Peripheral Nerve Injury
Authors: Nicolas Guerout, Alexandre Paviot, Nicolas Bon-Mardion, Axel Honoré, Rais OBongo, Célia Duclos, Jean-Paul Marie.
Institutions: University of Rouen, Karolinska Institutet, Rouen University Hospital, Amiens University Hospital.
Olfactory ensheathing cells (OECs) are neural crest cells which allow growth and regrowth of the primary olfactory neurons. Indeed, the primary olfactory system is characterized by its ability to give rise to new neurons even in adult animals. This particular ability is partly due to the presence of OECs, which create a favorable microenvironment for neurogenesis. This property of OECs has been used for cellular transplantation, such as in spinal cord injury models. Although the peripheral nervous system has a greater capacity to regenerate after nerve injury than the central nervous system, complete sections induce misrouting during axonal regrowth, in particular after facial or laryngeal nerve transection. Specifically, full sectioning of the recurrent laryngeal nerve (RLN) induces aberrant axonal regrowth, resulting in synkinesis of the vocal cords. In this specific model, we showed that OEC transplantation efficiently increases axonal regrowth. OECs comprise several subpopulations present in both the olfactory mucosa (OM-OECs) and the olfactory bulbs (OB-OECs). We present here a model of cellular transplantation based on the use of these different subpopulations of OECs in an RLN injury model. Using this paradigm, primary cultures of OB-OECs and OM-OECs were transplanted in Matrigel after section and anastomosis of the RLN. Two months after surgery, we evaluated transplanted animals by complementary analyses based on videolaryngoscopy, electromyography (EMG), and histological studies. First, videolaryngoscopy allowed us to evaluate laryngeal function, in particular muscular co-contraction phenomena. Then, EMG analyses demonstrated the richness and synchronization of muscular activities. Finally, histological studies based on toluidine blue staining allowed quantification of the number and profile of myelinated fibers. Altogether, we describe here how to isolate, culture, identify and transplant OECs from OM and OB after RLN section-anastomosis, and how to evaluate and analyze the efficiency of these transplanted cells on axonal regrowth and laryngeal function.
Neuroscience, Issue 84, olfactory ensheathing cells, spinal cord injury, transplantation, larynx, recurrent laryngeal nerve, peripheral nerve injury, vocal cords
Best Current Practice for Obtaining High Quality EEG Data During Simultaneous fMRI
Authors: Karen J. Mullinger, Pierluigi Castellone, Richard Bowtell.
Institutions: University of Nottingham, Brain Products GmbH.
Simultaneous EEG-fMRI allows the excellent temporal resolution of EEG to be combined with the high spatial accuracy of fMRI. The data from these two modalities can be combined in a number of ways, but all rely on the acquisition of high quality EEG and fMRI data. EEG data acquired during simultaneous fMRI are affected by several artifacts, including the gradient artifact (due to the changing magnetic field gradients required for fMRI), the pulse artifact (linked to the cardiac cycle) and movement artifacts (resulting from movements in the strong magnetic field of the scanner, and muscle activity). Post-processing methods for successfully correcting the gradient and pulse artifacts require a number of criteria to be satisfied during data acquisition. Minimizing head motion during EEG-fMRI is also imperative for limiting the generation of artifacts. Interactions between the radio frequency (RF) pulses required for MRI and the EEG hardware may occur and can cause heating. This is only a significant risk if safety guidelines are not satisfied. Hardware design and set-up, as well as careful selection of which MR sequences are run with the EEG hardware present, must therefore be considered. The above issues highlight the importance of the choice of the experimental protocol employed when performing a simultaneous EEG-fMRI experiment. Based on previous research, we describe an optimal experimental set-up. This provides high quality EEG data during simultaneous fMRI when using commercial EEG and fMRI systems, with safety risks to the subject minimized. We demonstrate this set-up in an EEG-fMRI experiment using a simple visual stimulus; however, much more complex stimuli can be used. Here we show the EEG-fMRI set-up using a Brain Products GmbH (Gilching, Germany) MRplus, 32-channel EEG system in conjunction with a Philips Achieva (Best, Netherlands) 3T MR scanner, although many of the techniques are transferable to other systems.
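One widely used post-processing approach for the gradient artifact is average artifact subtraction (epoch the EEG at each MR volume trigger, average the epochs into a template, and subtract it). The toy single-channel sketch below illustrates the idea with synthetic data; it is not necessarily the exact correction pipeline the authors use, and real implementations also handle up-sampling, template drift, and the pulse artifact separately.

```python
# Toy average artifact subtraction (AAS) for the gradient artifact: one channel,
# synthetic data, epochs defined by assumed volume triggers every TR.
import numpy as np

fs = 5000                        # sampling rate (Hz), assumed
tr_samples = 10000               # samples per MR volume (TR), assumed
n_volumes = 20

rng = np.random.default_rng(1)
eeg = rng.standard_normal(n_volumes * tr_samples) * 5.0              # "brain" signal (uV)
artifact = 500.0 * np.sin(np.linspace(0, 40 * np.pi, tr_samples))    # repeating gradient artifact
data = eeg + np.tile(artifact, n_volumes)

epochs = data.reshape(n_volumes, tr_samples)   # one epoch per volume trigger
template = epochs.mean(axis=0)                 # average artifact template
cleaned = (epochs - template).ravel()          # subtract template from every epoch

print(round(float(np.std(data)), 1), round(float(np.std(cleaned)), 1))  # artifact power drops
```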
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Biophysics, Medicine, Neuroimaging, Functional Neuroimaging, Investigative Techniques, neurosciences, EEG, functional magnetic resonance imaging, fMRI, magnetic resonance imaging, MRI, simultaneous, recording, imaging, clinical techniques
Clinical Testing and Spinal Cord Removal in a Mouse Model for Amyotrophic Lateral Sclerosis (ALS)
Authors: René Günther, Martin Suhr, Jan C. Koch, Mathias Bähr, Paul Lingor, Lars Tönges.
Institutions: University Medicine Göttingen, Göttingen, Germany.
Amyotrophic lateral sclerosis (ALS) is a fatal neurodegenerative disorder resulting in progressive degeneration of motoneurons. The peak age of onset is around 60 years for the sporadic disease and around 50 years for the familial disease. Due to its progressive course, 50% of patients die within 30 months of symptom onset. In order to evaluate novel treatment options for this disease, genetic mouse models of ALS have been generated based on human familial mutations in the SOD1 gene, such as the SOD1 (G93A) mutation. The most important aspects that have to be evaluated in the model are overall survival, clinical course and motor function. Here, we demonstrate the clinical evaluation, show the conduct of two behavioural motor tests, and provide quantitative scoring systems for all parameters. Because an in-depth analysis of the ALS mouse model usually requires an immunohistochemical examination of the spinal cord, we demonstrate its preparation in detail using the dorsal laminectomy method. Exemplary histological findings are demonstrated. The comprehensive application of the depicted examination methods in studies on the mouse model of ALS will enable researchers to reliably test future therapeutic options, which can provide a basis for later human clinical trials.
Medicine, Issue 61, neuroscience, amyotrophic lateral sclerosis, ALS, spinal cord, mouse, rotarod, hanging wire
Prediction of HIV-1 Coreceptor Usage (Tropism) by Sequence Analysis using a Genotypic Approach
Authors: Saleta Sierra, Rolf Kaiser, Nadine Lübke, Alexander Thielen, Eugen Schuelter, Eva Heger, Martin Däumer, Stefan Reuter, Stefan Esser, Gerd Fätkenheuer, Herbert Pfister, Mark Oette, Thomas Lengauer.
Institutions: University of Cologne, Max Planck Institute for Informatics, Institute for Immune genetics, University of Duesseldorf, University of Essen, Augustinerinnen Hospital.
Maraviroc (MVC) is the first licensed antiretroviral drug from the class of coreceptor antagonists. It binds to the host coreceptor CCR5, which is used by the majority of HIV strains to infect human immune cells (Fig. 1). Other HIV isolates use a different coreceptor, CXCR4. Which receptor is used is determined by the viral Env protein (Fig. 2). Depending on the coreceptor used, the viruses are classified as R5 or X4, respectively. MVC binds to the CCR5 receptor, inhibiting the entry of R5 viruses into the target cell. During the course of disease, X4 viruses may emerge and outgrow the R5 viruses. Determination of coreceptor usage (also called tropism) is therefore mandatory prior to administration of MVC, as required by the EMA and FDA. The MVC efficacy studies MOTIVATE, MERIT and 1029 were performed with the Trofile assay from Monogram, San Francisco, U.S.A. This is a high-quality assay based on sophisticated recombinant tests. Acceptance of this test for daily routine use is rather low outside the U.S.A., since European physicians tend to work with decentralized expert laboratories, which also provide concomitant resistance testing. These laboratories have undergone several quality assurance evaluations, the most recent of which was presented in 2011 [1]. For several years now, we have performed tropism determinations based on sequence analysis of the HIV env-V3 gene region (V3) [2]. This region carries enough information to allow a reliable prediction. The genotypic determination of coreceptor usage offers advantages such as: shorter turnaround time (equivalent to resistance testing), lower costs, the possibility of adapting the results to the patient's needs, and the possibility of analysing clinical samples with very low or even undetectable viral load (VL), particularly since the number of samples analysed with VL <1000 copies/μl has increased in recent years (Fig. 3). The main steps of tropism testing (Fig. 4) demonstrated in this video are: (1) collection of a blood sample; (2) isolation of HIV RNA from the plasma and/or HIV proviral DNA from blood mononuclear cells; (3) amplification of the env region; (4) amplification of the V3 region; (5) sequencing reaction of the V3 amplicon; (6) purification of the sequencing samples; (7) sequencing of the purified samples; (8) sequence editing; (9) sequencing data interpretation and tropism prediction.
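Routine genotypic interpretation relies on dedicated prediction tools (for example geno2pheno). A much simpler, classical illustration of sequence-based tropism prediction is the "11/25 charge rule" on the 35-residue V3 loop, sketched below; this is only an illustration, not the prediction algorithm used by the authors.

```python
# Classical "11/25 rule" illustration: call a virus X4 if position 11 or 25 of
# the 35-residue V3 loop carries a positively charged residue (R or K),
# otherwise R5. Dedicated tools are used in practice; this is a simplified
# illustration, not the authors' prediction method.

def predict_tropism_11_25(v3_aa: str) -> str:
    """v3_aa: 35-residue V3 loop amino-acid sequence (one-letter codes)."""
    if len(v3_aa) != 35:
        raise ValueError("expected a 35-residue V3 sequence")
    pos11, pos25 = v3_aa[10], v3_aa[24]          # 1-based positions 11 and 25
    return "X4" if pos11 in "RK" or pos25 in "RK" else "R5"

# Consensus-B-like V3 sequence (illustrative only).
v3 = "CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC"
print(predict_tropism_11_25(v3))  # -> R5
```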
Immunology, Issue 58, HIV-1, coreceptor, coreceptor antagonist, prediction of coreceptor usage, tropism, R5, X4, maraviroc, MVC
DNA Methylation: Bisulphite Modification and Analysis
Authors: Kate Patterson, Laura Molloy, Wenjia Qu, Susan Clark.
Institutions: Garvan Institute of Medical Research, University of NSW.
Epigenetics describes heritable changes in gene function that occur independently of the DNA sequence. The molecular basis of epigenetic gene regulation is complex, but essentially involves modifications to the DNA itself or to the proteins with which DNA associates. The predominant epigenetic modification of DNA in mammalian genomes is methylation of cytosine nucleotides (5-MeC). DNA methylation provides instruction to the gene expression machinery as to where and when a gene should be expressed. The primary target sequence for DNA methylation in mammals is the 5'-CpG-3' dinucleotide (Figure 1). CpG dinucleotides are not uniformly distributed throughout the genome, but are concentrated in regions of repetitive genomic sequence and in CpG "islands" commonly associated with gene promoters (Figure 1). DNA methylation patterns are established early in development, modulated during tissue-specific differentiation and disrupted in many disease states including cancer. To understand the biological role of DNA methylation and its role in human disease, precise, efficient and reproducible methods are required to detect and quantify individual 5-MeCs. This protocol for bisulphite conversion is the "gold standard" for DNA methylation analysis and facilitates identification and quantification of DNA methylation at single-nucleotide resolution. The chemistry of cytosine deamination by sodium bisulphite involves three steps (Figure 2): (1) sulphonation, the addition of bisulphite to the 5-6 double bond of cytosine; (2) hydrolytic deamination of the resulting cytosine-bisulphite derivative to give a uracil-bisulphite derivative; and (3) alkali desulphonation, removal of the sulphonate group by alkali treatment to give uracil. Bisulphite preferentially deaminates cytosine to uracil in single-stranded DNA, whereas 5-MeC is refractory to bisulphite-mediated deamination. Upon PCR amplification, uracil is amplified as thymine while 5-MeC residues remain as cytosines, allowing methylated CpGs to be distinguished from unmethylated CpGs by the presence of a cytosine "C" versus a thymine "T" residue during sequencing. DNA modification by bisulphite conversion is a well-established protocol that can be exploited for many methods of DNA methylation analysis. Since the detection of 5-MeC by bisulphite conversion was first demonstrated by Frommer et al. [1] and Clark et al. [2], methods based on bisulphite conversion of genomic DNA account for the majority of new data on DNA methylation. Different methods of post-PCR analysis may be utilized, depending on the degree of specificity and resolution of methylation required. Cloning and sequencing is still the most readily available method that can give single-nucleotide resolution for methylation across the DNA molecule.
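The C-versus-T readout after bisulphite conversion can be illustrated with a minimal sketch that compares a reference sequence to an aligned bisulphite read and calls each CpG as methylated or not. This is for illustration only; real analyses work on alignments and account for sequencing errors and incomplete conversion.

```python
# After bisulphite conversion and PCR, unmethylated cytosines read as T while
# methylated cytosines (5-MeC) remain C. Given a reference and an aligned
# bisulphite-converted read of the same length, call each CpG site.

def call_cpg_methylation(reference: str, bisulphite_read: str) -> dict:
    """Return {position: True/False} for every CpG 'C' in the reference."""
    calls = {}
    for i in range(len(reference) - 1):
        if reference[i:i + 2] == "CG":            # a CpG site in the reference
            calls[i] = bisulphite_read[i] == "C"  # C retained -> methylated
    return calls

reference = "ACGTTCGATCGA"
read      = "ACGTTTGATTGA"   # first CpG retained as C, the other two converted to T
print(call_cpg_methylation(reference, read))  # {1: True, 5: False, 9: False}
```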
Genetics, Issue 56, epigenetics, DNA methylation, Bisulphite, 5-methylcytosine (5-MeC), PCR
Electrospinning Fundamentals: Optimizing Solution and Apparatus Parameters
Authors: Michelle K. Leach, Zhang-Qi Feng, Samuel J. Tuck, Joseph M. Corey.
Institutions: University of Michigan, Southeast University, Veterans Affairs Ann Arbor Healthcare Center.
Electrospun nanofiber scaffolds have been shown to accelerate the maturation, improve the growth, and direct the migration of cells in vitro. Electrospinning is a process in which a charged polymer jet is collected on a grounded collector; a rapidly rotating collector results in aligned nanofibers while stationary collectors result in randomly oriented fiber mats. The polymer jet is formed when an applied electrostatic charge overcomes the surface tension of the solution. There is a minimum concentration for a given polymer, termed the critical entanglement concentration, below which a stable jet cannot be achieved and no nanofibers will form - although nanoparticles may be achieved (electrospray). A stable jet has two domains, a streaming segment and a whipping segment. While the whipping jet is usually invisible to the naked eye, the streaming segment is often visible under appropriate lighting conditions. Observing the length, thickness, consistency and movement of the stream is useful to predict the alignment and morphology of the nanofibers being formed. A short, non-uniform, inconsistent, and/or oscillating stream is indicative of a variety of problems, including poor fiber alignment, beading, splattering, and curlicue or wavy patterns. The stream can be optimized by adjusting the composition of the solution and the configuration of the electrospinning apparatus, thus optimizing the alignment and morphology of the fibers being produced. In this protocol, we present a procedure for setting up a basic electrospinning apparatus, empirically approximating the critical entanglement concentration of a polymer solution and optimizing the electrospinning process. In addition, we discuss some common problems and troubleshooting techniques.
Bioengineering, Issue 47, electrospinning, nanofibers, scaffold, alignment
Guidelines for Elective Pediatric Fiberoptic Intubation
Authors: Roland N. Kaddoum, Zulfiqar Ahmed, Alan A. D'Augsutine, Maria M. Zestos.
Institutions: St. Jude Children's Research Hospital, Children's Hospital of Michigan.
Fiberoptic intubation in pediatric patients is often required, especially in the difficult airways of syndromic patients (e.g. Pierre Robin Syndrome). Small babies will desaturate very quickly if ventilation is interrupted, mainly due to their high metabolic rate. We describe guidelines for performing a safe fiberoptic intubation while maintaining spontaneous breathing throughout the procedure. Steps requiring the use of a propofol pump, fentanyl, glycopyrrolate, a red rubber catheter, a metal insufflation hook, Afrin, lubricant and lidocaine spray are shown.
Medicine, Issue 47, Fiberoptic, Intubation, Pediatric, elective
Using the optokinetic response to study visual function of zebrafish
Authors: Su-Qi Zou, Wu Yin, Ming-Jing Zhang, Chun-Rui Hu, Yu-Bin Huang, Bing Hu.
Institutions: University of Science and Technology of China (USTC).
The optokinetic response (OKR) is a behavior in which an animal moves its eyes to follow a rotating grating around it. It has been widely used to assess the visual function of larval zebrafish [1-5]. Nevertheless, the standard protocol for larval fish is not yet readily applicable to adult zebrafish. Here, we introduce how to measure the OKR of adult zebrafish with our simple custom-built apparatus, using a new protocol established in our lab. Both our apparatus and the step-by-step procedure for the OKR in adult zebrafish are illustrated in this video. In addition, the measurement of the larval OKR, as well as the optomotor response (OMR) test of adult zebrafish, is also demonstrated. This OKR assay of adult zebrafish may last for up to 4 hours. Such an OKR test in adult fish will make investigations of visual function more efficient when the adult visual system is manipulated. Su-Qi Zou and Wu Yin contributed equally to this paper.
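OKR performance is often quantified as a gain: the ratio of slow-phase eye velocity to stimulus velocity. The abstract does not specify the exact metric used with this apparatus, so the sketch below, with simulated eye-position data, is only an illustration of that common quantification.

```python
# OKR gain illustration: slow-phase eye velocity divided by the angular
# velocity of the rotating grating (gain near 1 = close tracking).
# Simulated data; the paper's apparatus may report other metrics.
import numpy as np

def okr_gain(eye_position_deg: np.ndarray, fs_hz: float, stimulus_velocity_dps: float) -> float:
    """Median slow-phase eye velocity / stimulus velocity.

    The median of the absolute velocity trace is a crude way to ignore the
    fast resetting saccades, which appear as brief large-velocity excursions.
    """
    velocity = np.gradient(eye_position_deg) * fs_hz      # deg/s
    slow_phase = np.median(np.abs(velocity))
    return slow_phase / stimulus_velocity_dps

# Simulated tracking of a grating rotating at 10 deg/s, sampled at 50 Hz.
fs, stim = 50.0, 10.0
t = np.arange(0, 10, 1 / fs)
eye = (8.0 * t) % 20 - 10                 # eye follows at ~8 deg/s with periodic resets
print(round(okr_gain(eye, fs, stim), 2))  # -> ~0.8
```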
Neuroscience, Issue 36, Zebrafish, OKR, OMR, behavior, optokinetic, vision
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes [1-4]. The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes" [5]. It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control [6]. Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others [7,8], Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels obtained using the paper protocol to those obtained with the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
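The classification of readings against the thresholds cited above, and the fraction of readings in target, can be sketched as follows. The 70-180 mg/dL target band is an assumption for illustration, not WMC's exact protocol target.

```python
# Classify point-of-care blood glucose (BG) readings using the thresholds cited
# above (severe hypoglycemia < 40 mg/dL, hypoglycemia < 70 mg/dL,
# hyperglycemia > 180 mg/dL). The 70-180 mg/dL target band is an assumption.

def classify_bg(bg_mg_dl: float) -> str:
    if bg_mg_dl < 40:
        return "severe hypoglycemia"
    if bg_mg_dl < 70:
        return "hypoglycemia"
    if bg_mg_dl > 180:
        return "hyperglycemia"
    return "in target"

def percent_in_target(readings) -> float:
    in_target = sum(1 for bg in readings if classify_bg(bg) == "in target")
    return 100.0 * in_target / len(readings)

readings = [62, 95, 110, 150, 178, 190, 210, 130, 118, 84]  # made-up values
print(percent_in_target(readings))  # -> 70.0
```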
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Electrophysiological Measurements and Analysis of Nociception in Human Infants
Authors: L. Fabrizi, A. Worley, D. Patten, S. Holdridge, L. Cornelissen, J. Meek, S. Boyd, R. Slater.
Institutions: University College London, Great Ormond Street Hospital, University College Hospital, University of Oxford.
Pain is an unpleasant sensory and emotional experience. Since infants cannot verbally report their experiences, current methods of pain assessment are based on behavioural and physiological body reactions, such as crying, body movements or changes in facial expression. While these measures demonstrate that infants mount a response following noxious stimulation, they are limited: they are based on activation of subcortical somatic and autonomic motor pathways that may not be reliably linked to central sensory processing in the brain. Knowledge of how the central nervous system responds to noxious events could provide an insight to how nociceptive information and pain is processed in newborns. The heel lancing procedure used to extract blood from hospitalised infants offers a unique opportunity to study pain in infancy. In this video we describe how electroencephalography (EEG) and electromyography (EMG) time-locked to this procedure can be used to investigate nociceptive activity in the brain and spinal cord. This integrative approach to the measurement of infant pain has the potential to pave the way for an effective and sensitive clinical measurement tool.
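Time-locking the EEG to the heel lance amounts to cutting an epoch around the recorded event marker; the sketch below illustrates this with synthetic data. The sampling rate and epoch window are assumptions for illustration, not the parameters used in the study.

```python
# Extract an EEG epoch time-locked to the heel-lance event marker.
# Synthetic data; window and sampling rate are assumed, not the study's values.
import numpy as np

def extract_epoch(eeg: np.ndarray, fs_hz: float, event_sample: int,
                  pre_s: float = 0.5, post_s: float = 1.0) -> np.ndarray:
    """Return the (channels x samples) epoch from -pre_s to +post_s around the event."""
    start = event_sample - int(pre_s * fs_hz)
    stop = event_sample + int(post_s * fs_hz)
    return eeg[:, start:stop]

fs = 500.0                                                           # assumed sampling rate (Hz)
eeg = np.random.default_rng(2).standard_normal((8, 60 * int(fs)))    # 8 channels, 60 s of data
lance_sample = 30 * int(fs)                                          # event marker at t = 30 s
epoch = extract_epoch(eeg, fs, lance_sample)
print(epoch.shape)                                                   # -> (8, 750): 0.5 s pre + 1.0 s post
```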
Neuroscience, Issue 58, pain, infant, electrophysiology, human development
Preparing and Using Phantom Lesions to Practice Fine Needle Aspiration Biopsies
Authors: Vinod B. Shidham, George M. Varsegi, Krista D'Amore, Anjani Shidham.
Institutions: University of Wisconsin - Milwaukee, BioInnovation LLC.
Currently, health workers, including residents and fellows, do not have a suitable phantom model on which to practice the fine-needle aspiration biopsy (FNAB) procedure. In the past, we standardized a model consisting of a latex glove containing fresh cattle liver for practicing FNAB. However, this model is difficult to organize and prepare on short notice, with the procurement of fresh cattle liver being the most challenging aspect. Handling of the liver, with its contamination-related problems, is also a significant drawback. In addition, the glove material leaks after a few needle passes, creating a mess. We have established a novel, simple method of embedding a small piece of sausage or banana in commercially available silicone rubber caulk. This model allows the retention of a vacuum seal and aspiration of material from the embedded specimen, resembling an actual FNAB procedure on clinical mass lesions. The aspirated material in the needle hub can be processed similarly to specimens procured during an actual FNAB procedure, facilitating additional proficiency in smear preparation and staining.
Medicine, Issue 31, FNA, FNAB, Fine Needle Aspiration Biopsy, Proficiency, procedure, Cytopathology, cytology

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
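The kind of abstract-to-video matching described above can be illustrated with a standard text-similarity approach. The actual JoVE matching algorithm is not described here, so the TF-IDF/cosine sketch below, with invented titles, is only a generic stand-in.

```python
# Generic illustration: rank method-video descriptions by TF-IDF cosine
# similarity to a PubMed abstract. Invented titles; not JoVE's actual algorithm.
import math
from collections import Counter

def tfidf_vectors(docs):
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for tokens in tokenized for term in set(tokens))  # document frequency
    n = len(docs)
    return [{t: c * math.log(n / df[t]) for t, c in Counter(tokens).items()}
            for tokens in tokenized]

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

videos = ["vntr genotyping of mycobacterium leprae strains",
          "eeg source analysis in children",
          "biochar characterization for soil remediation"]
abstract = "strain typing of leprosy bacilli using vntr loci"
vecs = tfidf_vectors(videos + [abstract])
scores = [(cosine(vecs[-1], v), title) for v, title in zip(vecs[:-1], videos)]
print(max(scores))  # the leprosy VNTR video ranks highest
```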