JoVE Visualize
 
PubMed Article
Meta-analyses of 8 polymorphisms associated with the risk of Alzheimer's disease.
PLoS ONE
PUBLISHED: 01-01-2013
The aim of this study was to evaluate the combined contribution of 8 polymorphisms to the risk of Alzheimer's disease (AD).
ABSTRACT
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
26 Related JoVE Articles!
An Allele-specific Gene Expression Assay to Test the Functional Basis of Genetic Associations
Authors: Silvia Paracchini, Anthony P. Monaco, Julian C. Knight.
Institutions: University of Oxford.
The number of significant genetic associations with common complex traits is constantly increasing. However, most of these associations have not been understood at the molecular level. One of the mechanisms mediating the effect of DNA variants on phenotypes is gene expression, which has been shown to be particularly relevant for complex traits1. This method tests, in a cellular context, the effect of specific DNA sequences on gene expression. The principle is to measure the relative abundance of transcripts arising from the two alleles of a gene, analyzing cells which carry one copy of the DNA sequences associated with disease (the risk variants)2,3. Therefore, the cells used for this method should meet two fundamental genotypic requirements: they have to be heterozygous both for the DNA risk variants and for DNA markers, typically coding polymorphisms, which can distinguish transcripts based on their chromosomal origin (Figure 1). DNA risk variants and DNA markers do not need to have the same allele frequency, but the phase (haplotypic) relationship of the genetic markers needs to be understood. It is also important to choose cell types which express the gene of interest. This protocol refers specifically to the procedure adopted to extract nucleic acids from fibroblasts, but the method is equally applicable to other cell types, including primary cells. DNA and RNA are extracted from the selected cell lines and cDNA is generated. DNA and cDNA are analyzed with a primer extension assay designed to target the coding DNA markers4. The primer extension assay is carried out using the MassARRAY (Sequenom)5 platform according to the manufacturer's specifications. Primer extension products are then analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/MS). Because the selected markers are heterozygous, they will generate two peaks on the MS profiles. The area of each peak is proportional to the transcript abundance and can be measured with a function of the MassARRAY Typer software to generate an allelic ratio (allele 1 : allele 2). The allelic ratio obtained for cDNA is normalized using that measured from genomic DNA, where the allelic ratio is expected to be 1:1, to correct for technical artifacts. Markers with a normalized allelic ratio significantly different from 1 indicate that the amount of transcript generated from the two chromosomes in the same cell is different, suggesting that the DNA variants associated with the phenotype have an effect on gene expression. Experimental controls should be used to confirm the results.
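As a minimal illustration of the allelic-ratio normalization described above, the short Python sketch below divides the cDNA allele ratio by the genomic DNA ratio; the peak areas are hypothetical stand-ins for MassARRAY Typer output, not data from the study.

```python
# Minimal sketch of the allelic-ratio normalization described above.
# Peak areas are hypothetical values standing in for MassARRAY Typer output.

def allelic_ratio(area_allele1, area_allele2):
    """Ratio of transcript (or genomic) abundance, allele 1 : allele 2."""
    return area_allele1 / area_allele2

# Peak areas from the MALDI-TOF spectra (arbitrary units)
cdna_ratio = allelic_ratio(area_allele1=5200.0, area_allele2=3100.0)   # transcript level
gdna_ratio = allelic_ratio(area_allele1=4050.0, area_allele2=3950.0)   # expected ~1:1

# Normalize the cDNA ratio by the genomic DNA ratio to correct technical bias
normalized_ratio = cdna_ratio / gdna_ratio
print(f"normalized allelic ratio (allele 1 : allele 2) = {normalized_ratio:.2f}")
# A value significantly different from 1 suggests allele-specific expression.
```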
Cellular Biology, Issue 45, Gene expression, regulatory variant, haplotype, association study, primer extension, MALDI-TOF mass spectrometry, single nucleotide polymorphism, allele-specific
Anterior Cervical Discectomy and Fusion in the Ovine Model
Authors: Tony Goldschlager, Jeffrey V. Rosenfeld, Ian R. Young, Graham Jenkin.
Institutions: Monash University.
Anterior cervical discectomy and fusion (ACDF) is the most common surgical operation for cervical radiculopathy and/or myelopathy in patients who have failed conservative treatment1,5. Since the operation was first described by Cloward2 and Smith and Robinson6 in 1958, a variety of refinements in technique, graft material, and implants have been made3. In particular, there is a need for safe osteoinductive agents that could benefit selected patients. The ovine model has been shown to have anatomical, biomechanical, bone density, and radiological properties that are similar to the human counterpart, the most similar level being C3/44. It is therefore an ideal model in which preclinical studies can be performed. In particular, this methodology may be useful to researchers interested in evaluating different devices and biologics, including stem cells, for potential application in human spinal surgery.
Medicine, Issue 32, Anterior cervical discectomy, interbody fusion, spine fusion, stem cells, biologics, spine instrumentation, interbody cage
Preparation of Oligomeric β-amyloid1-42 and Induction of Synaptic Plasticity Impairment on Hippocampal Slices
Authors: Mauro Fa, Ian J. Orozco, Yitshak I. Francis, Faisal Saeed, Yimin Gong, Ottavio Arancio.
Institutions: Columbia University.
Impairment of synaptic connections is likely to underlie the subtle amnesic changes occurring at the early stages of Alzheimer's Disease (AD). β-amyloid (Aβ), a peptide produced in high amounts in AD, is known to reduce Long-Term Potentiation (LTP), a cellular correlate of learning and memory. Indeed, LTP impairment caused by Aβ is a useful experimental paradigm for studying synaptic dysfunctions in AD models and for screening drugs capable of mitigating or reverting such synaptic impairments. Studies have shown that Aβ produces the LTP disruption preferentially via its oligomeric form. Here we provide a detailed protocol for impairing LTP by perfusion of oligomerized synthetic Aβ1-42 peptide onto acute hippocampal slices. In this video, we outline a step-by-step procedure for the preparation of oligomeric Aβ1-42. Then, we follow an individual experiment in which LTP is reduced in hippocampal slices exposed to oligomerized Aβ1-42, compared to slices in a control experiment where no Aβ1-42 exposure had occurred.
JoVE Neuroscience, Issue 41, brain, mouse, hippocampus, plasticity, LTP, amyloid
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Authors: Oswald J. Schmitz, Mark A. Bradford, Michael S. Strickland, Dror Hawlena.
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determines the rate of decomposition by microbial communities as well as rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey, which in turn alters soil microbial functioning when the herbivore carcasses are decomposed6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9, because stress from predation risk increases herbivore basal energy demands, which in nutrient-limited systems forces herbivores to shift their consumption from N-rich resources, which support growth and reproduction, to C-rich carbohydrate resources, which support heightened metabolism6. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer-quality resources for the soil microbial pool, likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11. We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to induce stress in herbivores from predation risk, measure those stress responses, and measure the consequences for microbial decomposition. We use insights from a model grassland ecosystem comprising the hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
Infinium Assay for Large-scale SNP Genotyping Applications
Authors: Adam J. Adler, Graham B. Wiley, Patrick M. Gaffney.
Institutions: Oklahoma Medical Research Foundation.
Genotyping variants in the human genome has proven to be an efficient method to identify genetic associations with phenotypes. The distribution of variants within families or populations can facilitate identification of the genetic factors of disease. Illumina's panel of genotyping BeadChips allows investigators to genotype thousands or millions of single nucleotide polymorphisms (SNPs) or to analyze other genomic variants, such as copy number, across a large number of DNA samples. These SNPs can be spread throughout the genome or targeted in specific regions in order to maximize potential discovery. The Infinium assay has been optimized to yield high-quality, accurate results quickly. With proper setup, a single technician can process from a few hundred to over a thousand DNA samples per week, depending on the type of array. This assay guides users through every step, starting with genomic DNA and ending with the scanning of the array. Using proprietary reagents, samples are amplified, fragmented, precipitated, resuspended, hybridized to the chip, extended by a single base, stained, and scanned on either an iScan or HiScan high-resolution optical imaging system. One overnight step is required to amplify the DNA. The DNA is denatured and isothermally amplified by whole-genome amplification; therefore, no PCR is required. Samples are hybridized to the arrays during a second overnight step. By the third day, the samples are ready to be scanned and analyzed. Amplified DNA may be stockpiled in large quantities, allowing bead arrays to be processed every day of the week, thereby maximizing throughput.
Basic Protocol, Issue 81, genomics, SNP, Genotyping, Infinium, iScan, HiScan, Illumina
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and serves both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species which regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
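The percent diameter change mentioned above is commonly expressed as FMD%; the short sketch below shows the arithmetic with hypothetical ultrasound diameter measurements.

```python
# Minimal sketch of the flow-mediated dilation (FMD) calculation.
# Diameters are hypothetical ultrasound measurements in millimeters.

baseline_diameter_mm = 3.80   # brachial artery diameter at rest
peak_diameter_mm = 4.05       # peak diameter during reactive hyperemia

fmd_percent = (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm * 100
print(f"FMD = {fmd_percent:.1f}%")   # ~6.6% in this example
```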
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
A Practical and Novel Method to Extract Genomic DNA from Blood Collection Kits for Plasma Protein Preservation
Authors: Jon Waters, Vishal Dhere, Adam Benjamin, Arvind Sekar, Archana Kumar, Sampath Prahalad, David T. Okou, Subra Kugathasan.
Institutions: Emory University School of Medicine and Children's Health Care of Atlanta.
Laboratory tests can be done on the cellular or fluid portions of the blood. The use of different blood collection tubes determines the portion of the blood that can be analyzed (whole blood, plasma or serum). Laboratories involved in studying the genetic basis of human disorders rely on anticoagulated whole blood collected in an EDTA-containing vacutainer as the source of DNA for genetic / genomic analysis. Because most clinical laboratories perform biochemical, serologic and viral testing as a first step in phenotypic outcome investigation, anticoagulated blood is also collected in a heparin-containing tube (plasma tube). Therefore, when DNA and plasma are needed for simultaneous and parallel analyses of both genomic and proteomic data, it is customary to collect blood in both EDTA and heparin tubes. If blood could be collected in a single tube and serve as a source for both plasma and DNA, that method would be considered an advancement to existing methods. The use of the compacted blood after plasma extraction represents an alternative source for genomic DNA, thus minimizing the amount of blood samples processed and reducing the number of samples required from each patient. This would ultimately save time and resources. The BD P100 blood collection system for plasma protein preservation was created as an improved method over previous plasma or serum collection tubes1, to stabilize the protein content of blood, enabling better protein biomarker discovery and proteomics experimentation from human blood. The BD P100 tubes contain 15.8 ml of spray-dried K2EDTA and a lyophilized proprietary broad-spectrum cocktail of protease inhibitors to prevent coagulation and stabilize the plasma proteins. They also include a mechanical separator, which provides a physical barrier between plasma and cell pellets after centrifugation. Few methods have been devised to extract DNA from clotted blood samples collected in old plasma tubes2-4. Challenges from these methods were mainly associated with the type of separator inside the tubes (gel separator) and included difficulty in recovering the clotted blood, the inconvenience of fragmenting or dispersing the clot, and obstruction of the clot extraction by the separation gel. We present the first method that extracts and purifies genomic DNA from blood drawn in the new BD P100 tubes. We compare the quality of the DNA sample from P100 tubes to that from EDTA tubes. Our approach is simple and efficient. It involves four major steps as follows: 1) the use of a plasma BD P100 (BD Diagnostics, Sparks, MD, USA) tube with mechanical separator for blood collection, 2) the removal of the mechanical separator using a combination of sucrose and a sterile paperclip metallic hook, 3) the separation of the buffy coat layer containing the white cells, and 4) the isolation of the genomic DNA from the buffy coat using a regular commercial DNA extraction kit or a similar standard protocol.
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biochemistry, Hematology, Proteins, Genomics, genomic DNA, blood collection, P100 tubes, DNA extraction, buffy coat isolation, genotyping assays, red blood, whole blood, plasma, DNA, assay, genotyping
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
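Since all three assays rest on the assumption that signal scales linearly with cell number, a quick linearity check on a plating dilution series is useful; the sketch below shows one way to do it, with hypothetical plate-reader values rather than data from the study.

```python
# Sketch of checking that assay signal scales linearly with cells plated,
# as assumed above. Values are hypothetical plate-reader readouts.
import numpy as np

cells_plated = np.array([5e3, 1e4, 2e4, 4e4, 8e4])          # cells per well
signal = np.array([1.1e4, 2.0e4, 4.3e4, 8.1e4, 1.65e5])     # arbitrary units

slope, intercept = np.polyfit(cells_plated, signal, 1)       # linear fit
r = np.corrcoef(cells_plated, signal)[0, 1]                  # Pearson correlation
print(f"slope = {slope:.2f} AU/cell, Pearson r = {r:.3f}")
# A high r supports using signal strength as a proxy for cell number.
```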
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
An In vitro Model to Study Immune Responses of Human Peripheral Blood Mononuclear Cells to Human Respiratory Syncytial Virus Infection
Authors: Marloes Vissers, Marrit N. Habets, Inge M. L. Ahout, Jop Jans, Marien I. de Jonge, Dimitri A. Diavatopoulos, Gerben Ferwerda.
Institutions: Radboud university medical center.
Human respiratory syncytial virus (HRSV) infections present a broad spectrum of disease severity, ranging from mild infections to life-threatening bronchiolitis. An important part of the pathogenesis of severe disease is an enhanced immune response leading to immunopathology. Here, we describe a protocol used to investigate the immune response of human immune cells to an HRSV infection. First, we describe methods used for culturing, purification and quantification of HRSV. Subsequently, we describe a human in vitro model in which peripheral blood mononuclear cells (PBMCs) are stimulated with live HRSV. This model system can be used to study multiple parameters that may contribute to disease severity, including the innate and adaptive immune response. These responses can be measured at the transcriptional and translational level. Moreover, viral infection of cells can easily be measured using flow cytometry. Taken together, stimulation of PBMC with live HRSV provides a fast and reproducible model system to examine mechanisms involved in HRSV-induced disease.
Immunology, Issue 82, Blood Cells, Respiratory Syncytial Virus, Human, Respiratory Tract Infections, Paramyxoviridae Infections, Models, Immunological, Immunity, HRSV culture, purification, quantification, PBMC isolation, stimulation, inflammatory pathways
A Comprehensive Protocol for Manual Segmentation of the Medial Temporal Lobe Structures
Authors: Matthew Moore, Yifan Hu, Sarah Woo, Dylan O'Hearn, Alexandru D. Iordan, Sanda Dolcos, Florin Dolcos.
Institutions: University of Illinois Urbana-Champaign.
The present paper describes a comprehensive protocol for manual tracing of the set of brain regions comprising the medial temporal lobe (MTL): amygdala, hippocampus, and the associated parahippocampal regions (perirhinal, entorhinal, and parahippocampal proper). Unlike most other tracing protocols available, typically focusing on certain MTL areas (e.g., amygdala and/or hippocampus), the integrative perspective adopted by the present tracing guidelines allows for clear localization of all MTL subregions. By integrating information from a variety of sources, including extant tracing protocols separately targeting various MTL structures, histological reports, and brain atlases, and with the complement of illustrative visual materials, the present protocol provides an accurate, intuitive, and convenient guide for understanding the MTL anatomy. The need for such tracing guidelines is also emphasized by illustrating possible differences between automatic and manual segmentation protocols. This knowledge can be applied toward research involving not only structural MRI investigations but also structural-functional colocalization and fMRI signal extraction from anatomically defined ROIs, in healthy and clinical groups alike.
Neuroscience, Issue 89, Anatomy, Segmentation, Medial Temporal Lobe, MRI, Manual Tracing, Amygdala, Hippocampus, Perirhinal Cortex, Entorhinal Cortex, Parahippocampal Cortex
Detection of Neuritic Plaques in Alzheimer's Disease Mouse Model
Authors: Philip T.T. Ly, Fang Cai, Weihong Song.
Institutions: The University of British Columbia.
Alzheimer's disease (AD) is the most common neurodegenerative disorder leading to dementia. Neuritic plaque formation is one of the pathological hallmarks of Alzheimer's disease. The central component of neuritic plaques is a small filamentous protein called amyloid β protein (Aβ)1, which is derived from sequential proteolytic cleavage of the beta-amyloid precursor protein (APP) by β-secretase and γ-secretase. The amyloid hypothesis posits that Aβ-containing plaques are the underlying toxic mechanism in AD pathology2. Postmortem analysis of the presence of neuritic plaques confirms the diagnosis of AD. To further our understanding of Aβ neurobiology in AD pathogenesis, various mouse strains expressing AD-related mutations in the human APP genes were generated. Depending on the severity of the disease, these mice will develop neuritic plaques at different ages. These mice serve as invaluable tools for studying the pathogenesis of AD and for developing drugs that could affect the APP processing pathway and neuritic plaque formation. In this protocol, we employ an immunohistochemical method for specific detection of neuritic plaques in AD model mice. We specifically discuss the preparation steps, from extraction of the half brain, paraformaldehyde fixation, and cryosectioning, to two methods for detecting neuritic plaques in AD transgenic mice: immunohistochemical detection using the ABC and DAB method, and fluorescent detection using the thioflavin S staining method.
Neuroscience, Issue 53, Alzheimer’s disease, neuritic plaques, Amyloid β protein, APP, transgenic mouse
Assessment of Morphine-induced Hyperalgesia and Analgesic Tolerance in Mice Using Thermal and Mechanical Nociceptive Modalities
Authors: Khadija Elhabazi, Safia Ayachi, Brigitte Ilien, Frédéric Simonin.
Institutions: Université de Strasbourg.
Opioid-induced hyperalgesia and tolerance severely impact the clinical efficacy of opiates as pain relievers in animals and humans. The molecular mechanisms underlying both phenomena are not well understood and their elucidation should benefit from the study of animal models and from the design of appropriate experimental protocols. We describe here a methodological approach for inducing, recording and quantifying morphine-induced hyperalgesia as well as for evidencing analgesic tolerance, using the tail-immersion and tail pressure tests in wild-type mice. As shown in the video, the protocol is divided into five sequential steps. Handling and habituation phases allow a safe determination of the basal nociceptive response of the animals. Chronic morphine administration induces significant hyperalgesia as shown by an increase in both thermal and mechanical sensitivity, whereas the comparison of analgesia time-courses after acute or repeated morphine treatment clearly indicates the development of tolerance manifested by a decline in analgesic response amplitude. This protocol may be similarly adapted to genetically modified mice in order to evaluate the role of individual genes in the modulation of nociception and morphine analgesia. It also provides a model system to investigate the effectiveness of potential therapeutic agents to improve opiate analgesic efficacy.
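The abstract reports thermal and mechanical sensitivity but does not give the analysis formula; one common convention for tail-immersion data is the percent maximum possible effect (%MPE). The sketch below is a minimal illustration under that assumption, with hypothetical latencies and cutoff.

```python
# One common way to quantify analgesia in the tail-immersion test is the
# percent maximum possible effect (%MPE). The protocol above does not state
# its analysis, so the latencies and cutoff here are illustrative assumptions.

def percent_mpe(baseline_s, test_s, cutoff_s=10.0):
    """%MPE = (test - baseline) / (cutoff - baseline) * 100, capped at the cutoff."""
    test_s = min(test_s, cutoff_s)          # cutoff protects against tissue damage
    return (test_s - baseline_s) / (cutoff_s - baseline_s) * 100

print(percent_mpe(baseline_s=2.5, test_s=8.0))   # acute morphine: strong analgesia (~73%)
print(percent_mpe(baseline_s=2.5, test_s=3.5))   # after chronic treatment: tolerance (~13%)
```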
Neuroscience, Issue 89, mice, nociception, tail immersion test, tail pressure test, morphine, analgesia, opioid-induced hyperalgesia, tolerance
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests1,2. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
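The end product of a volumetric pipeline like this is typically a count of labeled voxels converted into a volume. The sketch below shows that generic conversion for a hypothetical segmentation mask using the nibabel package; it is an illustration, not part of the LE pipeline itself, and the file name and mask are assumptions.

```python
# Sketch of turning an edited segmentation mask into a volume, the usual end
# product of MRI volumetric pipelines. File name is hypothetical; requires nibabel.
import nibabel as nib
import numpy as np

img = nib.load("subject01_SH_mask.nii.gz")              # binary lesion mask (hypothetical file)
voxel_volume_mm3 = np.prod(img.header.get_zooms()[:3])  # voxel dimensions from the header
lesion_volume_ml = img.get_fdata().astype(bool).sum() * voxel_volume_mm3 / 1000.0
print(f"subcortical hyperintensity volume: {lesion_volume_ml:.2f} ml")
```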
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
High-throughput Functional Screening using a Homemade Dual-glow Luciferase Assay
Authors: Jessica M. Baker, Frederick M. Boyce.
Institutions: Massachusetts General Hospital.
We present a rapid and inexpensive high-throughput screening protocol to identify transcriptional regulators of alpha-synuclein, a gene associated with Parkinson's disease. 293T cells are transiently transfected with plasmids from an arrayed ORF expression library, together with luciferase reporter plasmids, in a one-gene-per-well microplate format. Firefly luciferase activity is assayed after 48 hr to determine the effects of each library gene upon alpha-synuclein transcription, normalized to expression from an internal control construct (a hCMV promoter directing Renilla luciferase). This protocol is facilitated by a bench-top robot enclosed in a biosafety cabinet, which performs aseptic liquid handling in 96-well format. Our automated transfection protocol is readily adaptable to high-throughput lentiviral library production or other functional screening protocols requiring triple-transfections of large numbers of unique library plasmids in conjunction with a common set of helper plasmids. We also present an inexpensive and validated alternative to commercially-available, dual luciferase reagents which employs PTC124, EDTA, and pyrophosphate to suppress firefly luciferase activity prior to measurement of Renilla luciferase. Using these methods, we screened 7,670 human genes and identified 68 regulators of alpha-synuclein. This protocol is easily modifiable to target other genes of interest.
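As a minimal illustration of the normalization step described above (firefly reporter signal divided by the Renilla internal control, then compared to a control well), here is a Python sketch; all well names and luminescence counts are hypothetical, not values from the screen.

```python
# Minimal sketch of dual-luciferase normalization: firefly (alpha-synuclein
# reporter) counts divided by Renilla (hCMV internal control) counts, then
# compared to an empty-vector control well. All values are hypothetical.

firefly = {"empty_vector": 52000, "library_gene_A": 181000, "library_gene_B": 14500}
renilla = {"empty_vector": 90000, "library_gene_A": 95000, "library_gene_B": 88000}

control_ratio = firefly["empty_vector"] / renilla["empty_vector"]
for well in ("library_gene_A", "library_gene_B"):
    fold_change = (firefly[well] / renilla[well]) / control_ratio
    print(f"{well}: {fold_change:.2f}-fold change in reporter activity")
# Library genes that reproducibly shift this ratio would be candidate regulators.
```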
Cellular Biology, Issue 88, Luciferases, Gene Transfer Techniques, Transfection, High-Throughput Screening Assays, Transfections, Robotics
Bladder Smooth Muscle Strip Contractility as a Method to Evaluate Lower Urinary Tract Pharmacology
Authors: F. Aura Kullmann, Stephanie L. Daugherty, William C. de Groat, Lori A. Birder.
Institutions: University of Pittsburgh School of Medicine.
We describe an in vitro method to measure bladder smooth muscle contractility, and its use for investigating physiological and pharmacological properties of the smooth muscle as well as changes induced by pathology. This method provides critical information for understanding bladder function while overcoming major methodological difficulties encountered in in vivo experiments, such as surgical and pharmacological manipulations that affect stability and survival of the preparations, the use of human tissue, and/or the use of expensive chemicals. It also provides a way to investigate the properties of each bladder component (i.e. smooth muscle, mucosa, nerves) in healthy and pathological conditions. The urinary bladder is removed from an anesthetized animal, placed in Krebs solution and cut into strips. Strips are placed into a chamber filled with warm Krebs solution. One end is attached to an isometric tension transducer to measure contraction force, the other end is attached to a fixed rod. Tissue is stimulated by directly adding compounds to the bath or by electric field stimulation electrodes that activate nerves, similar to triggering bladder contractions in vivo. We demonstrate the use of this method to evaluate spontaneous smooth muscle contractility during development and after an experimental spinal cord injury, the nature of neurotransmission (transmitters and receptors involved), factors involved in modulation of smooth muscle activity, the role of individual bladder components, and species and organ differences in response to pharmacological agents. Additionally, it could be used for investigating intracellular pathways involved in contraction and/or relaxation of the smooth muscle, drug structure-activity relationships and evaluation of transmitter release. The in vitro smooth muscle contractility method has been used extensively for over 50 years, and has provided data that significantly contributed to our understanding of bladder function as well as to pharmaceutical development of compounds currently used clinically for bladder management.
Medicine, Issue 90, Krebs, species differences, in vitro, smooth muscle contractility, neural stimulation
Getting to Compliance in Forced Exercise in Rodents: A Critical Standard to Evaluate Exercise Impact in Aging-related Disorders and Disease
Authors: Jennifer C. Arnold, Michael F. Salvatore.
Institutions: Louisiana State University Health Sciences Center.
There is a major increase in the awareness of the positive impact of exercise on improving several disease states with neurobiological basis; these include improving cognitive function and physical performance. As a result, there is an increase in the number of animal studies employing exercise. It is argued that one intrinsic value of forced exercise is that the investigator has control over the factors that can influence the impact of exercise on behavioral outcomes, notably exercise frequency, duration, and intensity of the exercise regimen. However, compliance in forced exercise regimens may be an issue, particularly if potential confounds of employing foot-shock are to be avoided. It is also important to consider that since most cognitive and locomotor impairments strike in the aged individual, determining impact of exercise on these impairments should consider using aged rodents with a highest possible level of compliance to ensure minimal need for test subjects. Here, the pertinent steps and considerations necessary to achieve nearly 100% compliance to treadmill exercise in an aged rodent model will be presented and discussed. Notwithstanding the particular exercise regimen being employed by the investigator, our protocol should be of use to investigators that are particularly interested in the potential impact of forced exercise on aging-related impairments, including aging-related Parkinsonism and Parkinson’s disease.
Behavior, Issue 90, Exercise, locomotor, Parkinson’s disease, aging, treadmill, bradykinesia, Parkinsonism
Evaluation of Integrated Anaerobic Digestion and Hydrothermal Carbonization for Bioenergy Production
Authors: M. Toufiq Reza, Maja Werner, Marcel Pohl, Jan Mumme.
Institutions: Leibniz Institute for Agricultural Engineering.
Lignocellulosic biomass is one of the most abundant yet underutilized renewable energy resources. Both anaerobic digestion (AD) and hydrothermal carbonization (HTC) are promising technologies for bioenergy production from biomass, in terms of biogas and HTC biochar, respectively. In this study, the combination of AD and HTC is proposed to increase overall bioenergy production. Wheat straw was anaerobically digested in a novel upflow anaerobic solid state reactor (UASS) under both mesophilic (37 °C) and thermophilic (55 °C) conditions. Wet digestate from thermophilic AD was hydrothermally carbonized at 230 °C for 6 hr for HTC biochar production. At thermophilic temperature, the UASS system yielded an average of 165 L CH4/kgVS (VS: volatile solids), and 121 L CH4/kgVS under mesophilic AD, over the continuous operation of 200 days. Meanwhile, 43.4 g of HTC biochar with 29.6 MJ/kgdry_biochar was obtained from HTC of 1 kg digestate (dry basis) from mesophilic AD. The combination of AD and HTC, in this particular set of experiments, yielded 13.2 MJ of energy per 1 kg of dry wheat straw, which is at least 20% higher than HTC alone and 60.2% higher than AD only.
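As a rough, back-of-the-envelope illustration of how the reported methane yields translate into energy, the sketch below multiplies them by a standard lower heating value for methane (about 35.8 MJ per normal cubic meter). This is not the study's own energy accounting, which additionally credits the calorific value of the HTC biochar toward the combined 13.2 MJ/kg figure.

```python
# Back-of-the-envelope sketch converting the reported methane yields into
# energy, using a standard lower heating value for methane (~35.8 MJ/Nm3).
# Yields are per kg of volatile solids (VS), as reported in the abstract.

CH4_LHV_MJ_PER_M3 = 35.8

for condition, yield_l_per_kgvs in {"thermophilic": 165, "mesophilic": 121}.items():
    energy_mj_per_kgvs = yield_l_per_kgvs / 1000 * CH4_LHV_MJ_PER_M3
    print(f"{condition}: {energy_mj_per_kgvs:.1f} MJ per kg VS")
# The study's combined 13.2 MJ per kg straw also counts the calorific value
# of the HTC biochar produced from the digestate.
```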
Environmental Sciences, Issue 88, Biomethane, Hydrothermal Carbonization (HTC), Calorific Value, Lignocellulosic Biomass, UASS, Anaerobic Digestion
Consensus Brain-derived Protein, Extraction Protocol for the Study of Human and Murine Brain Proteome Using Both 2D-DIGE and Mini 2DE Immunoblotting
Authors: Francisco-Jose Fernandez-Gomez, Fanny Jumeau, Maxime Derisbourg, Sylvie Burnouf, Hélène Tran, Sabiha Eddarkaoui, Hélène Obriot, Virginie Dutoit-Lefevre, Vincent Deramecourt, Valérie Mitchell, Didier Lefranc, Malika Hamdane, David Blum, Luc Buée, Valérie Buée-Scherrer, Nicolas Sergeant.
Institutions: Inserm UMR 837, CHRU-Lille, Faculté de Médecine - Pôle Recherche, CHRU-Lille.
Two-dimensional gel electrophoresis (2DE) is a powerful tool to uncover proteome modifications potentially related to different physiological or pathological conditions. Basically, this technique is based on the separation of proteins according to their isoelectric point in a first step, and according to their molecular weights by SDS polyacrylamide gel electrophoresis (SDS-PAGE) in a second step. In this report, an optimized sample preparation protocol for small amounts of human post-mortem and mouse brain tissue is described. This method enables one to perform both two-dimensional fluorescence difference gel electrophoresis (2D-DIGE) and mini 2DE immunoblotting. The combination of these approaches allows one not only to find new proteins and/or protein modifications and changes in their expression, thanks to its compatibility with mass spectrometry detection, but also to gain new insight into marker validation. Thus, mini-2DE coupled to western blotting permits the identification and validation of post-translational modifications and protein catabolism, and provides a qualitative comparison among different conditions and/or treatments. Herein, we provide a method to study components of protein aggregates found in AD and Lewy body dementia, such as the amyloid-beta peptide and alpha-synuclein. Our method can thus be adapted for the analysis of the proteome and of insoluble protein extracts from human brain tissue and mouse models. In parallel, it may provide useful information for the study of molecular and cellular pathways involved in neurodegenerative diseases, as well as potential novel biomarkers and therapeutic targets.
Neuroscience, Issue 86, proteomics, neurodegeneration, 2DE, human and mouse brain tissue, fluorescence, immunoblotting. Abbreviations: 2DE (two-dimensional gel electrophoresis), 2D-DIGE (two-dimensional fluorescence difference gel electrophoresis), mini-2DE (mini 2DE immunoblotting), IPG (Immobilized pH Gradients), IEF (isoelectrofocusing), AD (Alzheimer's disease)
Gene-environment Interaction Models to Unmask Susceptibility Mechanisms in Parkinson's Disease
Authors: Vivian P. Chou, Novie Ko, Theodore R. Holman, Amy B. Manning-Boğ.
Institutions: SRI International, University of California-Santa Cruz.
Lipoxygenase (LOX) activity has been implicated in neurodegenerative disorders such as Alzheimer's disease, but its effects in Parkinson's disease (PD) pathogenesis are less understood. Gene-environment interaction models have utility in unmasking the impact of specific cellular pathways in toxicity that may not be observed using a solely genetic or toxicant disease model alone. To evaluate if distinct LOX isozymes selectively contribute to PD-related neurodegeneration, transgenic (i.e. 5-LOX and 12/15-LOX deficient) mice can be challenged with a toxin that mimics cell injury and death in the disorder. Here we describe the use of a neurotoxin, 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP), which produces a nigrostriatal lesion to elucidate the distinct contributions of LOX isozymes to neurodegeneration related to PD. The use of MPTP in mouse, and nonhuman primate, is well-established to recapitulate the nigrostriatal damage in PD. The extent of MPTP-induced lesioning is measured by HPLC analysis of dopamine and its metabolites and semi-quantitative Western blot analysis of striatum for tyrosine hydroxylase (TH), the rate-limiting enzyme for the synthesis of dopamine. To assess inflammatory markers, which may demonstrate LOX isozyme-selective sensitivity, glial fibrillary acidic protein (GFAP) and Iba-1 immunohistochemistry are performed on brain sections containing substantia nigra, and GFAP Western blot analysis is performed on striatal homogenates. This experimental approach can provide novel insights into gene-environment interactions underlying nigrostriatal degeneration and PD.
Medicine, Issue 83, MPTP, dopamine, Iba1, TH, GFAP, lipoxygenase, transgenic, gene-environment interactions, mouse, Parkinson's disease, neurodegeneration, neuroinflammation
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
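The sketch below illustrates, in heavily simplified form, the SSM workflow summarized above: log transform, mean centering, PCA to obtain covariance patterns and subject scores, then logistic regression on the scores. The random data, dimensions, and component count are placeholders, and this is not the authors' in-house software or settings.

```python
# Highly simplified sketch of the SSM/PCA steps described above: log
# transform, double mean-centering, PCA, then logistic regression on the
# resulting subject scores. Data are random stand-ins for regional PET values.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
data = rng.lognormal(mean=2.0, sigma=0.3, size=(40, 500))   # 40 subjects x 500 voxels/ROIs
labels = np.array([0] * 20 + [1] * 20)                      # 0 = control, 1 = patient

log_data = np.log(data)
centered = log_data - log_data.mean(axis=1, keepdims=True)  # remove subject global means
centered -= centered.mean(axis=0, keepdims=True)            # remove regional group means

pca = PCA(n_components=5)
scores = pca.fit_transform(centered)        # subject scores on each covariance pattern
patterns = pca.components_                  # candidate pattern (GIS) topographies

clf = LogisticRegression().fit(scores, labels)
# clf.coef_ gives weights for combining components into a single
# disease-related pattern; real analyses add cross-validation (e.g. bootstrap).
```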
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Murine Endoscopy for In Vivo Multimodal Imaging of Carcinogenesis and Assessment of Intestinal Wound Healing and Inflammation
Authors: Markus Brückner, Philipp Lenz, Tobias M. Nowacki, Friederike Pott, Dirk Foell, Dominik Bettenworth.
Institutions: University Hospital Münster, University Children's Hospital Münster.
Mouse models are widely used to study the pathogenesis of human diseases and to evaluate diagnostic procedures as well as therapeutic interventions preclinically. However, valid assessment of pathological alterations often requires histological analysis, which, when performed ex vivo, necessitates death of the animal. Therefore, in conventional experimental settings, intra-individual follow-up examinations are rarely possible. Thus, the development of murine endoscopy in live mice enables investigators, for the first time, both to directly visualize the gastrointestinal mucosa and to repeat the procedure to monitor for alterations. Numerous applications for in vivo murine endoscopy exist, including studying intestinal inflammation or wound healing, obtaining mucosal biopsies repeatedly, and locally administering diagnostic or therapeutic agents using miniature injection catheters. Most recently, molecular imaging has extended diagnostic imaging modalities, allowing specific detection of distinct target molecules using specific photoprobes. In conclusion, murine endoscopy has emerged as a novel cutting-edge technology for diagnostic experimental in vivo imaging and may significantly impact preclinical research in various fields.
Medicine, Issue 90, gastroenterology, in vivo imaging, murine endoscopy, diagnostic imaging, carcinogenesis, intestinal wound healing, experimental colitis
A New Single Chamber Implantable Defibrillator with Atrial Sensing: A Practical Demonstration of Sensing and Ease of Implantation
Authors: Dietmar Bänsch, Ralph Schneider, Ibrahim Akin, Cristoph A. Nienaber.
Institutions: University Hospital of Rostock, Germany.
Implantable cardioverter-defibrillators (ICDs) terminate ventricular tachycardia (VT) and ventricular fibrillation (VF) with high efficacy and can protect patients from sudden cardiac death (SCD). However, inappropriate shocks may occur if tachycardias are misdiagnosed. Inappropriate shocks are harmful and impair patient quality of life. The risk of inappropriate therapy increases with lower detection rates programmed in the ICD. Single-chamber detection poses greater risks for misdiagnosis when compared with dual-chamber devices that have the benefit of additional atrial information. However, using a dual-chamber device merely for the sake of detection is generally not accepted, since the risks associated with the second electrode may outweigh the benefits of detection. Therefore, BIOTRONIK developed a ventricular lead called the LinoxSMART S DX, which allows for the detection of atrial signals from two electrodes positioned at the atrial part of the ventricular electrode. This device contains two ring electrodes; one that contacts the atrial wall at the junction of the superior vena cava (SVC) and one positioned at the free floating part of the electrode in the atrium. The excellent signal quality can only be achieved by a special filter setting in the ICD (Lumax 540 and 740 VR-T DX, BIOTRONIK). Here, the ease of implantation of the system will be demonstrated.
Medicine, Issue 60, Implantable defibrillator, dual chamber, single chamber, tachycardia detection
Methods for ECG Evaluation of Indicators of Cardiac Risk, and Susceptibility to Aconitine-induced Arrhythmias in Rats Following Status Epilepticus
Authors: Steven L. Bealer, Cameron S. Metcalf, Jason G. Little.
Institutions: University of Utah.
Lethal cardiac arrhythmias contribute to mortality in a number of pathological conditions. Several parameters derived from a non-invasive, easily obtained electrocardiogram (ECG) are established, well-validated prognostic indicators of cardiac risk in patients suffering from a number of cardiomyopathies. Increased heart rate, decreased heart rate variability (HRV), and increased duration and variability of cardiac ventricular electrical activity (QT interval) are all indicative of enhanced cardiac risk 1-4. In animal models, it is valuable to compare these ECG-derived variables and susceptibility to experimentally induced arrhythmias. Intravenous infusion of the arrhythmogenic agent aconitine has been widely used to evaluate susceptibility to arrhythmias in a range of experimental conditions, including animal models of depression 5 and hypertension 6, following exercise 7 and exposure to air pollutants 8, as well as determination of the antiarrhythmic efficacy of pharmacological agents 9,10. It should be noted that QT dispersion in humans is a measure of QT interval variation across the full set of leads from a standard 12-lead ECG. Consequently, the measure of QT dispersion from the 2-lead ECG in the rat described in this protocol is different than that calculated from human ECG records. This represents a limitation in the translation of the data obtained from rodents to human clinical medicine. Status epilepticus (SE) is a single seizure or series of continuously recurring seizures lasting more than 30 min 11,12, and results in mortality in 20% of cases 13. Many individuals survive the SE, but die within 30 days 14,15. The mechanism(s) of this delayed mortality is not fully understood. It has been suggested that lethal ventricular arrhythmias contribute to many of these deaths 14-17. In addition to SE, patients experiencing spontaneously recurring seizures, i.e. epilepsy, are at risk of premature sudden and unexpected death associated with epilepsy (SUDEP) 18. As with SE, the precise mechanisms mediating SUDEP are not known. It has been proposed that ventricular abnormalities and resulting arrhythmias make a significant contribution 18-22. To investigate the mechanisms of seizure-related cardiac death, and the efficacy of cardioprotective therapies, it is necessary to obtain both ECG-derived indicators of risk and evaluate susceptibility to cardiac arrhythmias in animal models of seizure disorders 23-25. Here we describe methods for implanting ECG electrodes in the Sprague-Dawley laboratory rat (Rattus norvegicus) following SE, collection and analysis of ECG recordings, and induction of arrhythmias during iv infusion of aconitine. These procedures can be used to directly determine the relationships between ECG-derived measures of cardiac electrical activity and susceptibility to ventricular arrhythmias in rat models of seizure disorders, or any pathology associated with increased risk of sudden cardiac death.
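For orientation, the sketch below computes the ECG-derived indicators named above (heart rate, heart rate variability, and QT duration/variability) from a hypothetical series of beat-to-beat RR and QT intervals; it is illustrative only and not the recording or analysis software used in the protocol.

```python
# Sketch of the ECG-derived risk indicators named above, computed from
# hypothetical beat-to-beat RR and QT interval series (in milliseconds).
import numpy as np

rr_ms = np.array([165, 170, 168, 172, 166, 169, 171, 167], dtype=float)
qt_ms = np.array([58, 60, 57, 61, 59, 60, 62, 58], dtype=float)

heart_rate_bpm = 60000.0 / rr_ms.mean()
sdnn = rr_ms.std(ddof=1)                              # overall heart rate variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))         # beat-to-beat variability
qt_mean, qt_sd = qt_ms.mean(), qt_ms.std(ddof=1)      # duration and variability of QT

print(f"HR {heart_rate_bpm:.0f} bpm, SDNN {sdnn:.1f} ms, "
      f"RMSSD {rmssd:.1f} ms, QT {qt_mean:.1f} +/- {qt_sd:.1f} ms")
```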
Medicine, Issue 50, cardiac, seizure disorders, QTc, QTd, cardiac arrhythmias, rat
A Technique for Serial Collection of Cerebrospinal Fluid from the Cisterna Magna in Mouse
Authors: Li Liu, Karen Duff.
Institutions: Columbia University.
Alzheimer's disease (AD) is a progressive neurodegenerative disease that is pathologically characterized by extracellular deposition of β-amyloid peptide (Aβ) and intraneuronal accumulation of hyperphosphorylated tau protein. Because cerebrospinal fluid (CSF) is in direct contact with the extracellular space of the brain, it provides a reflection of the biochemical changes in the brain in response to pathological processes. CSF from AD patients shows a decrease in the 42 amino-acid form of Aβ (Aβ42), and increases in total tau and hyperphosphorylated tau, though the mechanisms responsible for these changes are still not fully understood. Transgenic (Tg) mouse models of AD provide an excellent opportunity to investigate how and why Aβ or tau levels in CSF change as the disease progresses. Here, we demonstrate a refined cisterna magna puncture technique for CSF sampling from the mouse. This extremely gentle sampling technique allows serial CSF samples to be obtained from the same mouse at 2-3 month intervals which greatly minimizes the confounding effect of between-mouse variability in Aβ or tau levels, making it possible to detect subtle alterations over time. In combination with Aβ and tau ELISA, this technique will be useful for studies designed to investigate the relationship between the levels of CSF Aβ42 and tau, and their metabolism in the brain in AD mouse models. Studies in Tg mice could provide important validation as to the potential of CSF Aβ or tau levels to be used as biological markers for monitoring disease progression, and to monitor the effect of therapeutic interventions. As the mice can be sacrificed and the brains can be examined for biochemical or histological changes, the mechanisms underlying the CSF changes can be better assessed. These data are likely to be informative for interpretation of human AD CSF changes.
Neuroscience, Issue 21, Cerebrospinal fluid, Alzheimer's disease, Transgenic mouse, β-amyloid, tau
Pyrosequencing: A Simple Method for Accurate Genotyping
Authors: Cristi King, Tiffany Scott-Horton.
Institutions: Washington University in St. Louis.
Pharmacogenetic research benefits first-hand from the abundance of information provided by the completion of the Human Genome Project. With such a tremendous amount of data available comes an explosion of genotyping methods. Pyrosequencing® is one of the most thorough yet simple methods to date used to analyze polymorphisms. It can also identify tri-allelic polymorphisms, indels, and short-repeat polymorphisms, and can determine allele percentages for methylation or pooled-sample assessment. In addition, there is a standardized control sequence that provides internal quality control. This method has led to rapid and efficient single-nucleotide polymorphism evaluation, including many clinically relevant polymorphisms. The technique and methodology of Pyrosequencing are explained.
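As a small illustration of the allele-percentage output mentioned above (for example in pooled-sample or methylation assays), the sketch below converts two peak heights into allele percentages; the peak values are hypothetical instrument readouts, not data from this article.

```python
# Sketch of estimating allele percentages from pyrosequencing peak heights,
# as mentioned above. Peak heights are hypothetical instrument readouts.

peak_height_allele_c = 142.0
peak_height_allele_t = 58.0

total = peak_height_allele_c + peak_height_allele_t
print(f"C allele: {peak_height_allele_c / total:.1%}, "
      f"T allele: {peak_height_allele_t / total:.1%}")
```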
Cellular Biology, Issue 11, Springer Protocols, Pyrosequencing, genotype, polymorphism, SNP, pharmacogenetics, pharmacogenomics, PCR
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
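JoVE does not describe its matching algorithm here, but as a generic illustration of how abstract-to-video text matching can work, the sketch below ranks a few video titles against a PubMed abstract using TF-IDF and cosine similarity (scikit-learn). It is purely illustrative and not the actual Visualize implementation; all titles and the abstract text are example strings.

```python
# Generic illustration of abstract-to-video text matching using TF-IDF and
# cosine similarity (scikit-learn); not the actual Visualize implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = [
    "Infinium assay for large-scale SNP genotyping applications",
    "Ultrasound assessment of flow-mediated vasodilation of the brachial artery",
    "Detection of neuritic plaques in an Alzheimer's disease mouse model",
]
pubmed_abstract = ["Meta-analysis of polymorphisms associated with Alzheimer's disease risk"]

vectorizer = TfidfVectorizer(stop_words="english")
video_matrix = vectorizer.fit_transform(video_descriptions)   # index the video library
abstract_vector = vectorizer.transform(pubmed_abstract)       # vectorize the abstract

scores = cosine_similarity(abstract_vector, video_matrix)[0]
for score, title in sorted(zip(scores, video_descriptions), reverse=True):
    print(f"{score:.2f}  {title}")   # higher score = closer textual match
```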

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.