Pubmed Article
Plasma protein profiling reveals protein clusters related to BMI and insulin levels in middle-aged overweight subjects.
Published: 07-15-2010
Biomarkers that allow detection of disease onset are of high interest, since early detection would allow intervention with lifestyle and nutritional changes before the disease manifests and pharmacological therapy is required. Our study aimed to improve the phenotypic characterization of overweight but apparently healthy subjects and to identify new candidate profiles for early biomarkers of obesity-related diseases such as cardiovascular disease and type 2 diabetes.
Authors: Bryan Fiema, Andrew C. Harris, Aurelie Gomez, Praechompoo Pongtornpipat, Kelly Lamiman, Mark T. Vander Lugt, Sophie Paczesny.
Published: 10-31-2012
Unbiased discovery proteomics strategies have the potential to identify large numbers of novel biomarkers that can improve diagnostic and prognostic testing in a clinical setting and may help guide therapeutic interventions. When large numbers of candidate proteins are identified, it may be difficult to validate candidate biomarkers in a timely and efficient fashion from patient plasma samples that are event-driven, of finite volume, and irreplaceable, such as at the onset of acute graft-versus-host disease (GVHD), a potentially life-threatening complication of allogeneic hematopoietic stem cell transplantation (HSCT). Here we describe the process of performing commercially available ELISAs for six validated GVHD proteins: IL-2Rα, TNFR1, HGF, IL-8, elafin, and REG3α (also known as PAP1) in a sequential fashion to minimize freeze-thaw cycles, the time plasma spends thawed, and total plasma usage. For this procedure, we perform the ELISAs in a sequential order determined by sample dilution factor, as established in our laboratory using the manufacturers' ELISA kits and protocols with minor adjustments to facilitate optimal sequential ELISA performance. The resulting plasma biomarker concentrations can then be compiled and analyzed for significant findings within a patient cohort. While these biomarkers are currently for research purposes only, their incorporation into clinical care is currently being investigated in clinical trials. This technique can be applied to perform ELISAs for multiple proteins/cytokines of interest on the same sample(s) provided the samples do not need to be mixed with other reagents. If ELISA kits do not come with pre-coated plates, 96-well half-well plates or 384-well plates can be used to further minimize use of samples/reagents.
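As a rough sketch of the planning step behind a sequential ELISA workflow, the assays can be ordered by dilution factor and the neat-plasma cost per well tallied in advance. The dilution factors, volumes, and the highest-dilution-first ordering below are illustrative placeholders, not the laboratory's validated values.

```python
# Hypothetical planning helper for a sequential ELISA workflow.
# All dilution factors and volumes are invented for illustration,
# NOT the laboratory-validated values from the protocol.

assays = {  # biomarker -> (plasma dilution factor, uL of diluted sample per well)
    "IL-2Ralpha": (4, 100),
    "TNFR1": (10, 100),
    "HGF": (2, 100),
    "IL-8": (20, 100),
    "elafin": (50, 100),
    "REG3alpha": (100, 100),
}

# Assumed ordering heuristic: run the most-diluted assays first,
# so the plasma-hungry (least-diluted) assays come last.
run_order = sorted(assays, key=lambda name: assays[name][0], reverse=True)

# Neat plasma consumed per biomarker, assuming duplicate wells.
plasma_ul = {name: 2 * vol / dil for name, (dil, vol) in assays.items()}

print(run_order[0])      # REG3alpha
print(plasma_ul["HGF"])  # 100.0
```

Tallying the neat-plasma cost up front makes it easy to confirm a finite, irreplaceable sample can cover all six assays before any plate is run.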
24 Related JoVE Articles
Analytical Techniques for Assaying Nitric Oxide Bioactivity
Authors: Hong Jiang, Deepa Parthasarathy, Ashley C. Torregrossa, Asad Mian, Nathan S. Bryan.
Institutions: University of Texas Health Science Center at Houston , Baylor College of Medicine .
Nitric oxide (NO) is a diatomic free radical that is extremely short lived in biological systems (less than 1 second in circulating blood)1. NO may be considered one of the most important signaling molecules produced in the body, regulating essential functions including, but not limited to, blood pressure regulation, immune response, and neural communication. Therefore, its accurate detection and quantification in biological matrices is critical to understanding the role of NO in health and disease. Given the short physiological half-life of NO, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of relevant NO metabolites in multiple biological compartments provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. The ability to compare blood with select tissues in experimental animals will help bridge the gap between basic science and clinical medicine regarding the diagnostic and prognostic utility of NO biomarkers in health and disease. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The established paradigm of NO biochemistry, from production by NO synthases to activation of soluble guanylyl cyclase (sGC) to eventual oxidation to nitrite (NO2-) and nitrate (NO3-), may only represent part of NO's effects in vivo.
The interaction of NO and NO-derived metabolites with protein thiols, secondary amines, and metals to form S-nitrosothiols (RSNOs), N-nitrosamines (RNNOs), and nitrosyl-heme, respectively, represents cGMP-independent effects of NO that are likely just as important physiologically as activation of sGC by NO. A true understanding of NO in physiology is derived from in vivo experiments sampling multiple compartments simultaneously. Nitric oxide (NO) methodology is a complex and often confusing science and the focus of much debate and discussion concerning NO biochemistry. The elucidation of new mechanisms and signaling pathways involving NO hinges on our ability to specifically, selectively and sensitively detect and quantify NO and all relevant NO products and metabolites in complex biological matrices. Here, we present a method for the rapid and sensitive analysis of nitrite and nitrate by HPLC, as well as detection of free NO in biological samples using in vitro ozone-based chemiluminescence with chemical derivatization to determine the molecular source of NO, as well as ex vivo with organ bath myography.
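For the HPLC and chemiluminescence readouts described above, quantification ultimately reduces to a calibration curve built from nitrite standards. A minimal sketch, assuming a linear detector response; the standard concentrations and signal values are invented for illustration, not measured data.

```python
# Hypothetical sketch: convert detector peak areas to nitrite concentrations
# via a linear calibration curve (ordinary least squares). All numbers below
# are illustrative, not real standards.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Nitrite standards (uM) and their peak areas (arbitrary units)
standards_um = [0.0, 0.5, 1.0, 2.0, 5.0]
signals = [12.0, 262.0, 512.0, 1012.0, 2512.0]  # perfectly linear, for illustration

slope, intercept = linear_fit(standards_um, signals)

def signal_to_conc(signal):
    """Interpolate an unknown sample's concentration from its signal."""
    return (signal - intercept) / slope

print(round(slope, 1))                   # 500.0
print(round(signal_to_conc(762.0), 2))   # 1.5 (uM)
```

Real standards scatter around the fit line, so the fit quality (not shown here) should be checked before interpolating unknowns.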
Medicine, Issue 64, Molecular Biology, Nitric oxide, nitrite, nitrate, endothelium-derived relaxing factor, HPLC, chemiluminescence
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Authors: Cila Herman, Muge Pirtini Cetingul.
Institutions: The Johns Hopkins University.
In 2010 approximately 68,720 melanomas will be diagnosed in the US alone, with around 8,650 resulting in death 1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection 2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid the diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Along with the surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which trade effectiveness and accuracy against ease of use and cost. Details about these techniques and comparisons are available in the literature 4. Infrared (IR) imaging has been shown to be a useful method for diagnosing the signs of certain diseases by measuring the local skin temperature. There is a large body of evidence showing that disease or deviation from normal functioning is accompanied by changes in the temperature of the body, which in turn affect the temperature of the skin 5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular the deviation from normal conditions, often caused by disease.
However, IR imaging has not been widely recognized in medicine due to the premature use of the technology 7,8 several decades ago, when temperature measurement accuracy and spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s-2000s. Advances in IR instrumentation, the implementation of digital image processing algorithms, and dynamic IR imaging, which enables scientists to analyze not only the spatial but also the temporal thermal behavior of the skin 9, allowed breakthroughs in the field. In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost-effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma 10-13. In this study, we show data obtained in a patient study in which patients who have a pigmented lesion with a clinical indication for biopsy were selected for imaging. We measured the difference in thermal response between healthy and malignant tissue and compared our data with biopsy results. We concluded that the increased metabolic activity of the melanoma lesion can be detected by dynamic infrared imaging.
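The dynamic (transient) analysis described above compares how a lesion and adjacent healthy skin behave after a thermal stress. A toy sketch on simulated frames; the temperatures, grid size, and ROI placement are all invented for illustration.

```python
# Illustrative sketch (not the authors' code): compare a lesion region of
# interest (ROI) against healthy skin in an IR frame recorded during thermal
# recovery. Frames are simulated 4x4 temperature grids in deg C.

def roi_mean(frame, rows, cols):
    """Mean temperature over a rectangular ROI (half-open row/col ranges)."""
    r0, r1 = rows
    c0, c1 = cols
    vals = [frame[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    return sum(vals) / len(vals)

# Simulated frame during recovery after a cooling stress
frame_t1 = [[29.0, 29.0, 29.0, 29.0],
            [29.0, 30.2, 30.2, 29.0],   # warmer central patch: hypothetical lesion
            [29.0, 30.2, 30.2, 29.0],
            [29.0, 29.0, 29.0, 29.0]]

lesion = ((1, 3), (1, 3))    # central 2x2 block
healthy = ((0, 1), (0, 4))   # top row as healthy reference

# A lesion rewarming faster than surrounding skin is the signature associated
# here with increased metabolic activity.
delta_t1 = roi_mean(frame_t1, *lesion) - roi_mean(frame_t1, *healthy)
print(round(delta_t1, 2))  # 1.2
```

In the actual protocol this difference is tracked over many frames to capture the transient response, not a single snapshot.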
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
Getting to Compliance in Forced Exercise in Rodents: A Critical Standard to Evaluate Exercise Impact in Aging-related Disorders and Disease
Authors: Jennifer C. Arnold, Michael F. Salvatore.
Institutions: Louisiana State University Health Sciences Center.
Awareness of the positive impact of exercise on several disease states with a neurobiological basis, including improvements in cognitive function and physical performance, has increased markedly. As a result, there is an increase in the number of animal studies employing exercise. One intrinsic value of forced exercise is that the investigator has control over the factors that can influence the impact of exercise on behavioral outcomes, notably the frequency, duration, and intensity of the exercise regimen. However, compliance in forced exercise regimens may be an issue, particularly if the potential confounds of employing foot-shock are to be avoided. It is also important to consider that since most cognitive and locomotor impairments strike in the aged individual, determining the impact of exercise on these impairments should consider using aged rodents with the highest possible level of compliance, to minimize the number of test subjects required. Here, the pertinent steps and considerations necessary to achieve nearly 100% compliance to treadmill exercise in an aged rodent model will be presented and discussed. Notwithstanding the particular exercise regimen being employed by the investigator, our protocol should be of use to investigators who are particularly interested in the potential impact of forced exercise on aging-related impairments, including aging-related Parkinsonism and Parkinson's disease.
Behavior, Issue 90, Exercise, locomotor, Parkinson’s disease, aging, treadmill, bradykinesia, Parkinsonism
Assessment of Age-related Changes in Cognitive Functions Using EmoCogMeter, a Novel Tablet-computer Based Approach
Authors: Philipp Fuge, Simone Grimm, Anne Weigand, Yan Fan, Matti Gärtner, Melanie Feeser, Malek Bajbouj.
Institutions: Freie Universität Berlin, Charité Berlin, Freie Universität Berlin, Psychiatric University Hospital Zurich.
The main goal of this study was to assess the usability of a tablet-computer-based application (EmoCogMeter) in investigating the effects of age on cognitive functions across the lifespan in a sample of 378 healthy subjects (age range 18-89 years). Consistent with previous findings, we found an age-related cognitive decline across a wide range of neuropsychological domains (memory, attention, executive functions), thereby demonstrating the usability of our tablet-based application. Regardless of prior computer experience, subjects of all age groups were able to perform the tasks without instruction or feedback from an experimenter. Increased motivation and compliance proved to be beneficial for task performance, thereby potentially increasing the validity of the results. Our promising findings underline the great clinical and practical potential of a tablet-based application for the detection and monitoring of cognitive dysfunction.
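The age-related decline reported above is, at its simplest, a correlation between age and task score. A hand-rolled Pearson correlation on fabricated data points (the real study analyzed 378 subjects across multiple domains):

```python
# Illustrative sketch: Pearson correlation between age and a task score.
# The data points below are fabricated and perfectly linear for clarity;
# real neuropsychological data would be noisier.
import math

ages =   [20, 35, 50, 65, 80]
scores = [90, 82, 74, 66, 58]   # invented memory-task scores

n = len(ages)
ma = sum(ages) / n
ms = sum(scores) / n
cov = sum((a - ma) * (s - ms) for a, s in zip(ages, scores))
sd_a = math.sqrt(sum((a - ma) ** 2 for a in ages))
sd_s = math.sqrt(sum((s - ms) ** 2 for s in scores))
r = cov / (sd_a * sd_s)

print(round(r, 3))  # -1.0 (scores fall perfectly linearly with age here)
```

A strongly negative r summarizes "age-related decline" in one number; the study's actual statistics are of course richer than this.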
Behavior, Issue 84, Neuropsychological Testing, cognitive decline, age, tablet-computer, memory, attention, executive functions
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation: inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low-abundance molecules, with a resulting increase in sensitivity, is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and a strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification, since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Authors: Subarna Bhattacharya, Paul W. Burridge, Erin M. Kropp, Sandra L. Chuppa, Wai-Meng Kwok, Joseph C. Wu, Kenneth R. Boheler, Rebekah L. Gundry.
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Medical College of Wisconsin, Hong Kong University, Johns Hopkins University School of Medicine, Medical College of Wisconsin.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
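For the flow cytometry characterization, the percentage of marker-positive cells is typically derived by gating against an unstained or isotype control. A sketch using a 99th-percentile-of-control threshold, one common heuristic; the intensity values are simulated, not data from the protocol.

```python
# Hedged sketch: estimate the fraction of cardiomyocyte-marker-positive cells
# (e.g. TNNT2) from single-cell fluorescence intensities, gating at the 99th
# percentile of a control sample. All intensities are simulated.

def percentile(values, p):
    """Nearest-rank percentile of a list of numbers."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(p / 100.0 * (len(s) - 1)))))
    return s[k]

control = [10 + i * 0.5 for i in range(100)]   # unstained-control intensities
stained = [12 + i for i in range(100)]         # simulated stained sample

gate = percentile(control, 99)                 # positivity threshold
positive_fraction = sum(1 for v in stained if v > gate) / len(stained)

print(gate)               # 59.0
print(positive_fraction)  # 0.52
```

Standardizing the gating rule, like standardizing the staining itself, is what makes percent-positive numbers comparable across differentiation runs.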
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
Polysome Fractionation and Analysis of Mammalian Translatomes on a Genome-wide Scale
Authors: Valentina Gandin, Kristina Sikström, Tommy Alain, Masahiro Morita, Shannon McLaughlan, Ola Larsson, Ivan Topisirovic.
Institutions: McGill University, Karolinska Institutet, McGill University.
mRNA translation plays a central role in the regulation of gene expression and represents the most energy consuming process in mammalian cells. Accordingly, dysregulation of mRNA translation is considered to play a major role in a variety of pathological states including cancer. Ribosomes also host chaperones, which facilitate folding of nascent polypeptides, thereby modulating function and stability of newly synthesized polypeptides. In addition, emerging data indicate that ribosomes serve as a platform for a repertoire of signaling molecules, which are implicated in a variety of post-translational modifications of newly synthesized polypeptides as they emerge from the ribosome, and/or components of translational machinery. Herein, a well-established method of ribosome fractionation using sucrose density gradient centrifugation is described. In conjunction with the in-house developed “anota” algorithm this method allows direct determination of differential translation of individual mRNAs on a genome-wide scale. Moreover, this versatile protocol can be used for a variety of biochemical studies aiming to dissect the function of ribosome-associated protein complexes, including those that play a central role in folding and degradation of newly synthesized polypeptides.
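Downstream of fractionation, per-mRNA translational activity is often summarized as the ratio of polysome-associated to total mRNA. The anota algorithm mentioned above does this with proper per-gene statistics and correction for total-mRNA changes; the toy log2-ratio example below, with invented counts, does not attempt that.

```python
# Illustrative sketch only: per-gene log2 ratio of polysome-associated mRNA to
# total mRNA as a crude proxy for translational activity. Counts are invented;
# the protocol's anota analysis is statistically far more rigorous.
import math

polysome_counts = {"MYC": 800.0, "ACTB": 500.0, "GAPDH": 250.0}
total_counts =    {"MYC": 200.0, "ACTB": 500.0, "GAPDH": 500.0}

translation_log2 = {
    gene: math.log2(polysome_counts[gene] / total_counts[gene])
    for gene in polysome_counts
}

print(translation_log2["MYC"])    # 2.0  -> enriched on polysomes
print(translation_log2["GAPDH"])  # -1.0 -> depleted from polysomes
```

A positive log2 ratio flags an mRNA that is preferentially loaded onto polysomes, which is the kind of signal genome-wide translatome analysis looks for.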
Biochemistry, Issue 87, Cells, Eukaryota, Nutritional and Metabolic Diseases, Neoplasms, Metabolic Phenomena, Cell Physiological Phenomena, mRNA translation, ribosomes, protein synthesis, genome-wide analysis, translatome, mTOR, eIF4E, 4E-BP1
RNA-seq Analysis of Transcriptomes in Thrombin-treated and Control Human Pulmonary Microvascular Endothelial Cells
Authors: Dilyara Cheranova, Margaret Gibson, Suman Chaudhary, Li Qin Zhang, Daniel P. Heruth, Dmitry N. Grigoryev, Shui Qing Ye.
Institutions: Children's Mercy Hospital and Clinics, School of Medicine, University of Missouri-Kansas City.
The characterization of gene expression in cells via measurement of mRNA levels is a useful tool in determining how the transcriptional machinery of the cell is affected by external signals (e.g. drug treatment), or how cells differ between a healthy state and a diseased state. With the advent and continuous refinement of next-generation DNA sequencing technology, RNA-sequencing (RNA-seq) has become an increasingly popular method of transcriptome analysis to catalog all species of transcripts, to determine the transcriptional structure of all expressed genes and to quantify the changing expression levels of the total set of transcripts in a given cell, tissue or organism1,2. RNA-seq is gradually replacing DNA microarrays as a preferred method for transcriptome analysis because it has the advantages of profiling a complete transcriptome, providing digital data (the copy number of any transcript) and not relying on any known genomic sequence3. Here, we present a complete and detailed protocol to apply RNA-seq to profile transcriptomes in human pulmonary microvascular endothelial cells with or without thrombin treatment. This protocol is based on our recently published study entitled "RNA-seq Reveals Novel Transcriptome of Genes and Their Isoforms in Human Pulmonary Microvascular Endothelial Cells Treated with Thrombin,"4 in which we successfully performed the first complete transcriptome analysis of human pulmonary microvascular endothelial cells treated with thrombin using RNA-seq. It yielded unprecedented resources for further experimentation to gain insights into the molecular mechanisms underlying thrombin-mediated endothelial dysfunction in the pathogenesis of inflammatory conditions, cancer, diabetes, and coronary heart disease, and provides potential new leads for therapeutic targets for those diseases. The descriptive text of this protocol is divided into four parts.
The first part describes the treatment of human pulmonary microvascular endothelial cells with thrombin and RNA isolation, quality analysis and quantification. The second part describes library construction and sequencing. The third part describes the data analysis. The fourth part describes an RT-PCR validation assay. Representative results of several key steps are displayed. Useful tips or precautions to boost success in key steps are provided in the Discussion section. Although this protocol uses human pulmonary microvascular endothelial cells treated with thrombin, it can be generalized to profile transcriptomes in both mammalian and non-mammalian cells and in tissues treated with different stimuli or inhibitors, or to compare transcriptomes in cells or tissues between a healthy state and a disease state.
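One routine quantification step in the data-analysis part is normalizing raw read counts for gene length and sequencing depth, for example as TPM (transcripts per million). The gene names, lengths, and counts below are invented for illustration, not taken from the study.

```python
# Illustrative sketch: convert raw RNA-seq read counts to TPM
# (transcripts per million). All counts and gene lengths are invented.

def tpm(counts, lengths_kb):
    """TPM from raw counts and gene lengths in kilobases."""
    rpk = {g: counts[g] / lengths_kb[g] for g in counts}  # reads per kilobase
    scale = sum(rpk.values()) / 1e6                       # per-million scaling
    return {g: v / scale for g, v in rpk.items()}

counts = {"GENE_A": 300, "GENE_B": 600, "GENE_C": 3000}
lengths_kb = {"GENE_A": 1.0, "GENE_B": 2.0, "GENE_C": 2.0}

result = tpm(counts, lengths_kb)
print(round(result["GENE_A"]))  # 142857
```

Because TPM sums to one million per sample, values are directly comparable across libraries of different depths, which matters when contrasting thrombin-treated and control samples.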
Genetics, Issue 72, Molecular Biology, Immunology, Medicine, Genomics, Proteins, RNA-seq, Next Generation DNA Sequencing, Transcriptome, Transcription, Thrombin, Endothelial cells, high-throughput, DNA, genomic DNA, RT-PCR, PCR
Barnes Maze Testing Strategies with Small and Large Rodent Models
Authors: Cheryl S. Rosenfeld, Sherry A. Ferguson.
Institutions: University of Missouri, Food and Drug Administration.
Spatial learning and memory of laboratory rodents is often assessed via navigational ability in mazes, the most popular of which are the water and dry-land (Barnes) mazes. Improved performance over sessions or trials is thought to reflect learning and memory of the escape cage/platform location. Considered less stressful than water mazes, the Barnes maze is a relatively simple design of a circular platform top with several holes equally spaced around the perimeter edge. All but one of the holes are false-bottomed or blind-ending, while one leads to an escape cage. Mildly aversive stimuli (e.g. bright overhead lights) provide motivation to locate the escape cage. Latency to locate the escape cage can be measured during the session; however, additional endpoints typically require video recording. From those video recordings, automated tracking software can generate a variety of endpoints that are similar to those produced in water mazes (e.g. distance traveled, velocity/speed, time spent in the correct quadrant, time spent moving/resting, and confirmation of latency). The type of search strategy (i.e. random, serial, or direct) can be categorized as well. Barnes maze construction and testing methodologies can differ for small rodents, such as mice, and large rodents, such as rats. For example, while extra-maze cues are effective for rats, smaller wild rodents may require intra-maze cues with a visual barrier around the maze. Appropriate stimuli must be identified which motivate the rodent to locate the escape cage. Both Barnes and water mazes can be time-consuming, as 4-7 test trials are typically required to detect improved learning and memory performance (e.g. shorter latencies or path lengths to locate the escape platform or cage) and/or differences between experimental groups.
Even so, the Barnes maze is a widely employed behavioral assessment measuring spatial navigational abilities and their potential disruption by genetic or neurobehavioral manipulations, or by drug/toxicant exposure.
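The tracking-derived endpoints mentioned above (distance traveled, velocity) reduce to simple geometry on the recorded x,y coordinates. A minimal sketch with invented coordinates and frame rate:

```python
# Illustrative sketch: derive path length and mean velocity from x,y tracking
# coordinates, as automated tracking software does for Barnes maze videos.
# Coordinates (cm) and frame rate are invented for this example.
import math

track = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]   # one point per video frame
fps = 2.0                                       # assumed frames per second

path_length = sum(
    math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)
)
duration_s = (len(track) - 1) / fps
mean_velocity = path_length / duration_s

print(path_length)    # 10.0 cm
print(mean_velocity)  # 10.0 cm/s
```

Endpoints such as time in the correct quadrant follow the same pattern: classify each coordinate pair against a region, then count frames.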
Behavior, Issue 84, spatial navigation, rats, Peromyscus, mice, intra- and extra-maze cues, learning, memory, latency, search strategy, escape motivation
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
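The ~10-30 nm precision quoted above scales roughly as the point-spread-function width divided by the square root of the number of detected photons (a simplified form of the photon-limited localization estimate that ignores pixelation and background terms). The numbers below are illustrative assumptions, not measurements.

```python
# Hedged sketch: photon-limited localization precision, sigma / sqrt(N).
# This drops the pixelation and background terms of the full estimate;
# the PSF width and photon counts are illustrative assumptions.
import math

def localization_precision_nm(psf_sigma_nm, photons):
    """Simplified photon-limited localization precision."""
    return psf_sigma_nm / math.sqrt(photons)

psf_sigma_nm = 130.0  # assumed PSF standard deviation for visible light

for n_photons in (25, 100, 400):
    print(round(localization_precision_nm(psf_sigma_nm, n_photons), 1))
# 26.0, 13.0, 6.5 (nm)
```

This is why bright, well-labeled probes matter: quadrupling the photon count halves the localization uncertainty.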
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Hydrogel Nanoparticle Harvesting of Plasma or Urine for Detecting Low Abundance Proteins
Authors: Ruben Magni, Benjamin H. Espina, Lance A. Liotta, Alessandra Luchini, Virginia Espina.
Institutions: George Mason University, Ceres Nanosciences.
Novel biomarker discovery plays a crucial role in providing more sensitive and specific disease detection. Unfortunately, many low-abundance biomarkers that exist in biological fluids cannot be easily detected with mass spectrometry or immunoassays because they are present in very low concentration, are labile, and are often masked by high-abundance proteins such as albumin or immunoglobulin. Bait-containing, poly(N-isopropylacrylamide) (NIPAm)-based nanoparticles are able to overcome these physiological barriers. In one step, they capture, concentrate and preserve biomarkers from body fluids. Low-molecular weight analytes enter the core of the nanoparticle and are captured by different organic chemical dyes, which act as high affinity protein baits. The nanoparticles are able to concentrate the proteins of interest by several orders of magnitude. This concentration factor is sufficient to raise protein levels to within the detection limits of current mass spectrometers, western blotting, and immunoassays. Nanoparticles can be incubated with a plethora of biological fluids, and they are able to greatly enrich the concentration of low-molecular weight proteins and peptides while excluding albumin and other high-molecular weight proteins. Our data show that a 10,000-fold amplification in the concentration of a particular analyte can be achieved, enabling mass spectrometry and immunoassays to detect previously undetectable biomarkers.
Bioengineering, Issue 90, biomarker, hydrogel, low abundance, mass spectrometry, nanoparticle, plasma, protein, urine
Dried Blood Spot Collection of Health Biomarkers to Maximize Participation in Population Studies
Authors: Michael W. Ostler, James H. Porter, Orfeu M. Buxton.
Institutions: Harvard School of Public Health, Brigham and Women's Hospital, Harvard Medical School, Pennsylvania State University.
Biomarkers are directly measured biological indicators of disease, health, exposures, or other biological information. In population and social sciences, biomarkers need to be easy to obtain, transport, and analyze. Dried Blood Spots meet this need and can be collected in the field with high response rates. These elements are particularly important in longitudinal study designs, including interventions, where attrition is critical to avoid and high response rates improve the interpretation of results. Dried Blood Spot sample collection is simple, quick, relatively painless, less invasive than venipuncture, and has minimal field storage requirements (i.e. samples do not need to be immediately frozen and can be stored for a long period of time in a stable freezer environment before assay). The samples can be analyzed for a variety of different analytes, including cholesterol, C-reactive protein, glycosylated hemoglobin, and numerous cytokines, and can also provide genetic material. DBS collection is depicted here as employed in several recent studies.
Medicine, Issue 83, dried blood spots (DBS), Biomarkers, cardiometabolic risk, Inflammation, standard precautions, blood collection
A Sensitive and Specific Quantitation Method for Determination of Serum Cardiac Myosin Binding Protein-C by Electrochemiluminescence Immunoassay
Authors: Diederik W.D. Kuster, David Barefield, Suresh Govindan, Sakthivel Sadayappan.
Institutions: Loyola University Chicago.
Biomarkers are becoming increasingly more important in clinical decision-making, as well as basic science. Diagnosing myocardial infarction (MI) is largely driven by detecting cardiac-specific proteins in patients' serum or plasma as an indicator of myocardial injury. Having recently shown that cardiac myosin binding protein-C (cMyBP-C) is detectable in the serum after MI, we have proposed it as a potential biomarker for MI. Biomarkers are typically detected by traditional sandwich enzyme-linked immunosorbent assays. However, this technique requires a large sample volume, has a small dynamic range, and can measure only one protein at a time. Here we show a multiplex immunoassay in which three cardiac proteins can be measured simultaneously with high sensitivity. Measuring cMyBP-C in uniplex or together with creatine kinase MB and cardiac troponin I showed comparable sensitivity. This technique uses the Meso Scale Discovery (MSD) method of multiplexing in a 96-well plate combined with electrochemiluminescence for detection. While only small sample volumes are required, high sensitivity and a large dynamic range are achieved. Using this technique, we measured cMyBP-C, creatine kinase MB, and cardiac troponin I levels in serum samples from 16 subjects with MI and compared the results with 16 control subjects. We were able to detect all three markers in these samples and found all three biomarkers to be increased after MI. This technique is, therefore, suitable for the sensitive detection of cardiac biomarkers in serum samples.
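Concentrations from immunoassays like this are typically read off a fitted standard curve, and a four-parameter logistic (4PL) is the common model. Below is a sketch of the forward curve and its inversion for back-calculating sample concentrations; the parameter values are invented, not fitted to MSD data.

```python
# Illustrative sketch: four-parameter logistic (4PL) standard curve and its
# inverse for back-calculating concentrations from measured signals.
# Parameters a (lower asymptote), b (slope), c (inflection), d (upper
# asymptote) are hypothetical, not from any real assay.

def four_pl(conc, a, b, c, d):
    """Signal as a function of concentration on a 4PL curve."""
    return d + (a - d) / (1.0 + (conc / c) ** b)

def inverse_four_pl(signal, a, b, c, d):
    """Concentration back-calculated from a signal on the same curve."""
    return c * ((a - d) / (signal - d) - 1.0) ** (1.0 / b)

params = dict(a=50.0, b=1.0, c=100.0, d=20050.0)  # invented fitted values

# Round trip: a known concentration survives forward + inverse evaluation
signal = four_pl(250.0, **params)
conc = inverse_four_pl(signal, **params)
print(round(conc, 6))  # 250.0
```

The wide dynamic range claimed for electrochemiluminescence shows up here as a large gap between the asymptotes a and d, over which the inverse remains well conditioned.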
Molecular Biology, Issue 78, Cellular Biology, Biochemistry, Genetics, Biomedical Engineering, Medicine, Cardiology, Heart Diseases, Myocardial Ischemia, Myocardial Infarction, Cardiovascular Diseases, cardiovascular disease, immunoassay, cardiac myosin binding protein-C, cardiac troponin I, creatine kinase MB, electrochemiluminescence, multiplex biomarkers, ELISA, assay
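Immunoassay platforms like the one described typically interpolate unknown concentrations from a standard curve, most often fit with a four-parameter logistic (4PL). The sketch below illustrates that standard calculation only; the parameter values are hypothetical, not taken from the MSD assay.

```python
def four_pl(x, a, b, c, d):
    """4-parameter logistic: expected signal at concentration x.
    a = response at zero concentration, d = response at saturation,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Back-calculate a concentration from a measured signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Hypothetical fitted parameters for one analyte's standard curve
a, b, c, d = 50.0, 1.2, 400.0, 30000.0

signal = four_pl(120.0, a, b, c, d)         # expected signal at 120 pg/ml
conc = inverse_four_pl(signal, a, b, c, d)  # round-trips to ~120 pg/ml
```

In practice the four parameters are fit to the kit's calibrator dilution series before unknown samples are interpolated.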
Analysis of Oxidative Stress in Zebrafish Embryos
Authors: Vera Mugoni, Annalisa Camporeale, Massimo M. Santoro.
Institutions: University of Torino, Vesalius Research Center, VIB.
High levels of reactive oxygen species (ROS) may shift the cellular redox state toward an oxidative stress condition. This situation causes oxidation of molecules (lipids, DNA, proteins) and leads to cell death. Oxidative stress also impacts the progression of several pathological conditions such as diabetes, retinopathies, neurodegeneration, and cancer. Thus, it is important to define tools to investigate oxidative stress conditions not only at the level of single cells but also in the context of whole organisms. Here, we consider the zebrafish embryo as a useful in vivo system for such studies and present a protocol to measure oxidative stress in vivo. Taking advantage of fluorescent ROS probes and zebrafish transgenic fluorescent lines, we developed two different methods to measure oxidative stress in vivo: i) a “whole embryo ROS-detection method” for qualitative measurement of oxidative stress and ii) a “single-cell ROS detection method” for quantitative measurements of oxidative stress. Herein, we demonstrate the efficacy of these procedures by increasing oxidative stress in tissues with oxidant agents and physiological or genetic methods. This protocol is amenable to forward genetic screens and will help address cause-effect relationships of ROS in animal models of oxidative stress-related pathologies such as neurological disorders and cancer.
Developmental Biology, Issue 89, Danio rerio, zebrafish embryos, endothelial cells, redox state analysis, oxidative stress detection, in vivo ROS measurements, FACS (fluorescence activated cell sorter), molecular probes
Consensus Brain-derived Protein, Extraction Protocol for the Study of Human and Murine Brain Proteome Using Both 2D-DIGE and Mini 2DE Immunoblotting
Authors: Francisco-Jose Fernandez-Gomez, Fanny Jumeau, Maxime Derisbourg, Sylvie Burnouf, Hélène Tran, Sabiha Eddarkaoui, Hélène Obriot, Virginie Dutoit-Lefevre, Vincent Deramecourt, Valérie Mitchell, Didier Lefranc, Malika Hamdane, David Blum, Luc Buée, Valérie Buée-Scherrer, Nicolas Sergeant.
Institutions: Inserm UMR 837, CHRU-Lille, Faculté de Médecine - Pôle Recherche, CHRU-Lille.
Two-dimensional gel electrophoresis (2DE) is a powerful tool to uncover proteome modifications potentially related to different physiological or pathological conditions. This technique separates proteins according to their isoelectric point in a first step, and then according to their molecular weight by SDS polyacrylamide gel electrophoresis (SDS-PAGE). In this report, an optimized sample preparation protocol for small amounts of human post-mortem and mouse brain tissue is described. This method makes it possible to perform both two-dimensional fluorescence difference gel electrophoresis (2D-DIGE) and mini 2DE immunoblotting. The combination of these approaches not only allows new proteins and/or changes in protein expression to be found, thanks to its compatibility with mass spectrometry detection, but also offers new insight into marker validation. Mini-2DE coupled to western blotting permits the identification and validation of post-translational modifications and protein catabolism, and provides a qualitative comparison among different conditions and/or treatments. Herein, we provide a method to study components of protein aggregates found in AD and Lewy body dementia, such as the amyloid-beta peptide and alpha-synuclein. Our method can thus be adapted for the analysis of the proteome and insoluble protein extracts from human brain tissue and mouse models. In parallel, it may provide useful information for the study of molecular and cellular pathways involved in neurodegenerative diseases, as well as potential novel biomarkers and therapeutic targets.
Neuroscience, Issue 86, proteomics, neurodegeneration, 2DE, human and mouse brain tissue, fluorescence, immunoblotting. Abbreviations: 2DE (two-dimensional gel electrophoresis), 2D-DIGE (two-dimensional fluorescence difference gel electrophoresis), mini-2DE (mini 2DE immunoblotting), IPG (immobilized pH gradients), IEF (isoelectrofocusing), AD (Alzheimer's disease)
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
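The ascending forced-choice triangle procedure can be summarized computationally. A common stopping criterion, assumed here for illustration (the authors' exact rule may differ), is the lowest concentration the subject identifies correctly in three consecutive triangle trials.

```python
def detection_threshold(responses, runs_required=3):
    """Lowest concentration at which the subject gave `runs_required`
    consecutive correct identifications in an ascending series of
    forced-choice triangle trials. `responses` maps each concentration
    step to the list of trial outcomes (True = correct) at that step."""
    for conc in sorted(responses):
        run = 0
        for correct in responses[conc]:
            run = run + 1 if correct else 0
            if run >= runs_required:
                return conc
    return None  # series exhausted without meeting the criterion

# Hypothetical C18:1 concentration steps (mM) and one subject's outcomes
responses = {0.02: [False], 0.06: [True, False], 1.4: [True, True, True]}
threshold = detection_threshold(responses)  # -> 1.4
```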
Telomere Length and Telomerase Activity; A Yin and Yang of Cell Senescence
Authors: Mary Derasmo Axelrad, Temuri Budagov, Gil Atzmon.
Institutions: Albert Einstein College of Medicine.
Telomeres are repeating DNA sequences at the ends of the chromosomes that are diverse in length and in humans can reach a length of 15,000 base pairs. The telomere serves as a bioprotective mechanism against chromosome attrition at each cell division. At a certain length, telomeres become too short to allow replication, a process that may lead to chromosome instability or cell death. Telomere length is regulated by two opposing mechanisms: attrition and elongation. Attrition occurs as each cell divides. In contrast, elongation is partially modulated by the enzyme telomerase, which adds repeating sequences to the ends of the chromosomes. In this way, telomerase could possibly reverse an aging mechanism and rejuvenate cell viability. These are crucial elements in maintaining cell life and are used to assess cellular aging. In this manuscript we describe an accurate, quick, and inexpensive method to assess telomere length in multiple tissues and species. This method takes advantage of two key elements: the tandem repeat of the telomere sequence and the sensitivity of qRT-PCR in detecting differential copy numbers of tested samples. In addition, we describe a simple assay to assess telomerase activity as a complementary test for telomere length.
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biomedical Engineering, Genomics, Telomere length, telomerase activity, telomerase, telomeres, telomere, DNA, PCR, polymerase chain reaction, qRT-PCR, sequencing, aging, telomerase assay
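qRT-PCR-based telomere measurement is conventionally reported as a telomere-to-single-copy-gene (T/S) ratio computed with the 2^-ΔΔCt method. A minimal sketch of that standard calculation follows; the Ct values are hypothetical.

```python
def ts_ratio(ct_telo_sample, ct_scg_sample, ct_telo_ref, ct_scg_ref):
    """Relative telomere length as a T/S ratio via the 2^-ddCt method.
    ct_telo_* are Ct values from the telomere-repeat reaction, ct_scg_*
    from a single-copy reference gene; *_ref is the calibrator sample."""
    d_ct_sample = ct_telo_sample - ct_scg_sample
    d_ct_ref = ct_telo_ref - ct_scg_ref
    return 2.0 ** -(d_ct_sample - d_ct_ref)

# Hypothetical Ct values: the sample amplifies telomere repeats one
# cycle earlier (relative to the reference gene) than the calibrator,
# i.e. it has roughly twice the relative telomere content.
ratio = ts_ratio(14.0, 16.0, 15.0, 16.0)  # -> 2.0
```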
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco; Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and plays both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance among the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase in intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve.
This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
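The core FMD measurement reduces to the percent change in arterial diameter from baseline to peak hyperemia. A minimal sketch with illustrative diameters:

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation: percent change in brachial artery
    diameter from the resting baseline to the peak diameter reached
    during reactive hyperemia after cuff release."""
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# Illustrative values: a 4.0 mm baseline dilating to 4.3 mm
fmd = fmd_percent(4.0, 4.3)  # -> 7.5 (%)
```

Intra-operator variability enters mainly through the diameter measurements themselves, which is why standardized imaging and edge-detection protocols matter.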
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. 
We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
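The preprocessing and decomposition steps described (logarithmic conversion, removal of subject and group means, then PCA) can be sketched in a few lines. This is a simplified illustration on synthetic data, not the authors' in-house software, and the variable names are hypothetical.

```python
import numpy as np

def ssm_patterns(images, n_components=2):
    """Minimal SSM-style decomposition: log conversion, double mean
    centering (per-subject global mean and group mean image), then SVD
    to obtain spatial covariance patterns (GIS) and subject scores."""
    log_data = np.log(images)                                   # subjects x voxels
    centered = log_data - log_data.mean(axis=1, keepdims=True)  # remove each subject's global mean
    centered -= centered.mean(axis=0, keepdims=True)            # remove the group mean image
    u, svals, patterns = np.linalg.svd(centered, full_matrices=False)
    scores = u[:, :n_components] * svals[:n_components]         # subject expression of each pattern
    return scores, patterns[:n_components]

rng = np.random.default_rng(0)
images = np.exp(rng.normal(size=(20, 500)))  # synthetic positive image data
subject_scores, gis = ssm_patterns(images)
```

In the full method, logistic regression on these subject scores combines several components into a single disease-related covariance pattern.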
Assessing Endothelial Vasodilator Function with the Endo-PAT 2000
Authors: Andrea L. Axtell, Fatemeh A. Gomari, John P. Cooke.
Institutions: Stanford University.
The endothelium is a delicate monolayer of cells that lines all blood vessels and comprises the systemic and lymphatic capillaries. By virtue of the panoply of paracrine factors that it secretes, the endothelium regulates the contractile and proliferative state of the underlying vascular smooth muscle, as well as the interaction of the vessel wall with circulating blood elements. Because of its central role in mediating vessel tone and growth, its position as gateway to circulating immune cells, and its local regulation of hemostasis and coagulation, the properly functioning endothelium is the key to cardiovascular health. Conversely, the earliest disorder in most vascular diseases is endothelial dysfunction. In the arterial circulation, the healthy endothelium generally exerts a vasodilator influence on the vascular smooth muscle. There are a number of methods to assess endothelial vasodilator function. The Endo-PAT 2000 is a new device used to assess endothelial vasodilator function in a rapid and non-invasive fashion. Unlike the commonly used technique of duplex ultrasonography to assess flow-mediated vasodilation, it is totally non-operator-dependent, and the equipment is an order of magnitude less expensive. The device records endothelium-mediated changes in the digital pulse waveform known as the PAT (Peripheral Arterial Tone) signal, measured with a pair of novel modified plethysmographic probes situated on the index finger of each hand. Endothelium-mediated changes in the PAT signal are elicited by creating a downstream hyperemic response. Hyperemia is induced by occluding blood flow through the brachial artery for 5 minutes using an inflatable cuff on one arm. The response to reactive hyperemia is calculated automatically by the system. A PAT ratio is created using the post- and pre-occlusion values.
These values are normalized to measurements from the contralateral arm, which serves as a control for non-endothelium-dependent systemic effects. Most notably, this normalization controls for fluctuations in sympathetic nerve outflow that may induce changes in peripheral arterial tone superimposed on the hyperemic response. In this video we demonstrate how to use the Endo-PAT 2000 to perform a clinically relevant assessment of endothelial vasodilator function.
Medicine, Issue 44, endothelium, endothelial dysfunction, Endo-PAT 2000, peripheral arterial tone, reactive hyperemia
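The normalized PAT ratio described above can be sketched as a simple calculation. The device itself averages signal amplitudes over defined time windows and applies its own corrections, so the function below is only an illustration with hypothetical amplitudes.

```python
def pat_ratio(post_test, pre_test, post_control, pre_control):
    """Reactive hyperemia PAT ratio: post/pre occlusion amplitude on the
    occluded (test) arm, normalized to the same ratio on the contralateral
    (control) arm to remove systemic, non-endothelial effects such as
    fluctuations in sympathetic tone."""
    return (post_test / pre_test) / (post_control / pre_control)

# Hypothetical signal amplitudes (arbitrary units)
ratio = pat_ratio(post_test=600.0, pre_test=300.0,
                  post_control=330.0, pre_control=300.0)
```

Here the test arm doubles its amplitude while the control arm drifts up 10% systemically, so the normalized ratio is about 1.8 rather than 2.0.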
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, but with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, which prevents more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
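The statistical-power argument can be made concrete: with weak signal spread across many voxels, the univariate route needs one test per voxel, each paying a multiple-comparison penalty, while a multivariate pattern score collapses each subject's image into a single comparison. A toy sketch on simulated data follows; the uniform pattern is an assumption made purely for simplicity.

```python
import numpy as np

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))

rng = np.random.default_rng(1)
n, voxels = 20, 200
patients = rng.normal(0.25, 1.0, size=(n, voxels))  # weak signal at every voxel
controls = rng.normal(0.0, 1.0, size=(n, voxels))

# Univariate route: one t test per voxel, each of which must survive a
# correction for 200 comparisons.
voxel_t = np.array([two_sample_t(patients[:, v], controls[:, v])
                    for v in range(voxels)])

# Multivariate route: project each subject's image onto a single pattern
# (here simply the uniform mean), yielding one comparison instead of 200.
pattern = np.ones(voxels) / voxels
score_t = two_sample_t(patients @ pattern, controls @ pattern)
```

Averaging over voxels shrinks the noise on the pattern score, so the single score test concentrates the distributed signal that no individual voxel test can see after correction.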
Regulatory T cells: Therapeutic Potential for Treating Transplant Rejection and Type I Diabetes
Authors: Jeffry A. Bluestone.
Institutions: University of California, San Francisco - UCSF.
Issue 7, Immunology, Pancreatic Islets, Cell Culture, Diabetes, Ficoll Gradient, Translational Research
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. 
This is not what one would predict in diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, there are the effects of paternal age: the risk of disease increases significantly with paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases such as autism and schizophrenia, and why disease-causing mutations have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
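The balance between negative selection and de novo mutation described above has a textbook quantitative form: for a fully penetrant dominant allele with selection coefficient s, the equilibrium prevalence of affected individuals is roughly 2μ/s, where μ is the per-generation de novo mutation rate. The numbers below are illustrative only, not estimates for autism or schizophrenia.

```python
def equilibrium_prevalence_dominant(mu, s):
    """Mutation-selection balance for a fully penetrant dominant allele:
    equilibrium prevalence of affected individuals is ~2*mu/s, where mu
    is the per-generation de novo mutation rate producing the allele and
    s is the selection coefficient (fractional loss of reproductive
    fitness in carriers)."""
    return 2.0 * mu / s

# Illustration: sustaining a 1% prevalence against a 50% fitness
# reduction requires an aggregate de novo rate of 0.0025 per generation,
# plausibly spread across many contributing loci.
prevalence = equilibrium_prevalence_dominant(mu=0.0025, s=0.5)  # -> 0.01
```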
Ole Isacson: Development of New Therapies for Parkinson's Disease
Authors: Ole Isacson.
Institutions: Harvard Medical School.
Medicine, Issue 3, Parkinson's disease, Neuroscience, dopamine, neuron, L-DOPA, stem cell, transplantation
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X