JoVE Visualize
Related JoVE Video
 
Pubmed Article
Blood amyloid beta levels in healthy, mild cognitive impairment and Alzheimer's disease individuals: replication of diastolic blood pressure correlations and analysis of critical covariates.
PLoS ONE
PUBLISHED: 01-01-2013
Plasma amyloid beta (Aβ) levels are being investigated as potential biomarkers for Alzheimer's disease. In the AB128 cross-sectional study, a number of medically relevant correlates of blood Aβ40 or Aβ42 were analyzed in 140 subjects (51 Alzheimer's disease patients, 53 healthy controls and 36 individuals diagnosed with mild cognitive impairment). We determined the association of multiple variables with Aβ40 and Aβ42 levels measured in three different blood compartments: i) Aβ directly accessible (DA) in the plasma, ii) Aβ recovered from the plasma matrix (RP) after diluting the plasma sample in a formulated buffer, and iii) Aβ associated with the remaining cellular pellet (CP). We confirmed that diastolic blood pressure (DBP) is consistently correlated with blood DA Aβ40 levels (r=-0.19, P=0.032). These results were consistent in the three phenotypic groups studied. Importantly, the observation resisted covariation with age, gender or creatinine levels. The observed effect size and direction of the Aβ40/DBP correlation are in accordance with previous reports. Of note, DA Aβ40 and RP Aβ40 were also strongly associated with creatinine levels (r=0.599, P<0.001) and, to a lesser extent, with urea, age, hematocrit, uric acid and homocysteine (p<0.001). DBP and the other statistically significant correlates identified should be considered potential confounding factors in studies investigating blood Aβ levels as potential AD biomarkers. Remarkably, the factors affecting Aβ levels in the plasma (DA, RP) and blood cell (CP) compartments appear to be completely different.
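To illustrate the kind of covariate-adjusted correlation reported above, the following minimal Python sketch computes a plain Pearson correlation and a partial correlation controlling for age and creatinine. The data are randomly generated placeholders, not the AB128 dataset, and all variable names are illustrative only.

# Minimal sketch: Pearson correlation between DBP and plasma Abeta40, then a partial
# correlation adjusted for age and creatinine via regression residuals.
# Placeholder (random) data, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 140
age = rng.normal(72, 8, n)
creatinine = rng.normal(0.9, 0.2, n)
dbp = rng.normal(78, 10, n)
abeta40 = 180 - 0.4 * dbp + 20 * creatinine + rng.normal(0, 15, n)

# Unadjusted Pearson correlation
r, p = stats.pearsonr(dbp, abeta40)

# Partial correlation: correlate residuals after regressing out age and creatinine
X = np.column_stack([np.ones(n), age, creatinine])
resid = lambda y: y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
r_adj, p_adj = stats.pearsonr(resid(dbp), resid(abeta40))
print(f"r = {r:.2f} (p = {p:.3f}); adjusted r = {r_adj:.2f} (p = {p_adj:.3f})")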
Authors: Peter Novak.
Published: 07-19-2011
ABSTRACT
Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for the evaluation of these disorders, both in the clinic and in research. There are a number of autonomic tests; however, only a few have been validated clinically or are quantitative. Here, a fully quantitative and clinically validated protocol for testing autonomic functions is presented. At a bare minimum, the clinical autonomic laboratory should have a tilt table, an ECG monitor, a continuous noninvasive blood pressure monitor, a respiratory monitor and a means of evaluating the sudomotor domain. Software for recording and evaluating autonomic tests is critical for correct evaluation of the data. The presented protocol evaluates the three major autonomic domains: cardiovagal, adrenergic and sudomotor. The tests include deep breathing, the Valsalva maneuver, head-up tilt, and the quantitative sudomotor axon reflex test (QSART). The severity and distribution of dysautonomia are quantified using the Composite Autonomic Severity Score (CASS). A detailed protocol is provided highlighting the essential aspects of testing, with emphasis on proper data acquisition, obtaining the relevant parameters and unbiased evaluation of autonomic signals. Normative data and the CASS algorithm for interpretation of results are provided as well.
24 Related JoVE Articles!
A High-throughput Method for Measurement of Glomerular Filtration Rate in Conscious Mice
Authors: Timo Rieg.
Institutions: University of California, San Diego, San Diego VA Healthcare System.
The measurement of glomerular filtration rate (GFR) is the gold standard in kidney function assessment. Currently, investigators determine GFR by measuring the level of the endogenous biomarker creatinine or of exogenously applied, radioactively labeled inulin (3H or 14C). Creatinine has the substantial drawback that proximal tubular secretion accounts for ~50% of total renal creatinine excretion, and therefore creatinine is not a reliable GFR marker. Depending on the experiment performed, inulin clearance can be determined by an intravenous single bolus injection or by continuous infusion (intravenous or osmotic minipump). These approaches require the collection of plasma, or of plasma and urine, respectively. Other drawbacks of radioactively labeled inulin include the use of isotopes, time-consuming surgical preparation of the animals, and the requirement of a terminal experiment. Here we describe a method that uses a single bolus injection of fluorescein isothiocyanate (FITC)-labeled inulin and the measurement of its fluorescence in 1-2 μl of diluted plasma. By applying a two-compartment model, with 8 blood collections per mouse, it is possible to measure GFR in up to 24 mice per day using a special workflow protocol. This method requires only brief isoflurane anesthesia, with all blood samples collected from a non-restrained, awake mouse. Another advantage is that it is possible to follow mice over a period of several months and treatments (e.g., paired experiments with dietary changes or drug applications). We hope that this technique for measuring GFR is useful to other investigators studying mouse kidney function and will replace less accurate methods of estimating kidney function, such as plasma creatinine and blood urea nitrogen.
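As an illustration of the two-compartment analysis of a single-bolus FITC-inulin experiment, the sketch below fits a bi-exponential decay to plasma concentrations and estimates GFR as dose divided by the area under the curve. This is a minimal sketch, not the authors' analysis code; the dose and the eight sample values are hypothetical.

# Minimal sketch: two-compartment (bi-exponential) fit of plasma FITC-inulin decay
# and GFR = dose / AUC. All numerical values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def two_compartment(t, A, alpha, B, beta):
    # C(t) = A*exp(-alpha*t) + B*exp(-beta*t)
    return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

t_min = np.array([3, 7, 10, 15, 35, 55, 75, 90])        # 8 collections per mouse (min)
conc = np.array([395, 208, 147, 103, 60, 40, 27, 20.0])  # µg/ml from FITC fluorescence (hypothetical)
dose_ug = 1850.0                                         # injected FITC-inulin (hypothetical)

popt, _ = curve_fit(two_compartment, t_min, conc, p0=(400, 0.2, 100, 0.02))
A, alpha, B, beta = popt
auc = A / alpha + B / beta                               # µg·min/ml, integral from 0 to infinity
gfr_ul_min = dose_ug / auc * 1000                        # µl/min
print(f"GFR ≈ {gfr_ul_min:.0f} µl/min")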
Medicine, Issue 75, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Nephrology, Kidney Function Tests, Glomerular filtration rate, rats, mice, conscious, creatinine, inulin, Jaffe, hypertension, HPLC, animal model
50330
Rapid Point-of-Care Assay of Enoxaparin Anticoagulant Efficacy in Whole Blood
Authors: Mario A. Inchiosa Jr., Suryanarayana Pothula, Keshar Kubal, Vajubhai T. Sanchala, Iris Navarro.
Institutions: New York Medical College.
There is a need for a clinical assay to determine the extent to which a patient's blood is effectively anticoagulated by the low-molecular-weight heparin (LMWH) enoxaparin. There are also urgent clinical situations in which it would be important to determine this rapidly. The present assay is designed to accomplish this. We assayed only human blood samples that were spiked with known concentrations of enoxaparin. The essential feature of the present assay is the quantification of the efficacy of enoxaparin in a patient's blood sample by degrading it to complete inactivity with heparinase. Two blood samples were drawn into Vacutainer tubes (Becton Dickinson; Franklin Lakes, NJ) that were spiked with enoxaparin; one sample was digested with heparinase for 5 min at 37 °C, while the other represented the patient's baseline anticoagulated status. The percent shortening of clotting time in the heparinase-treated sample, as compared to the baseline state, yielded the anticoagulant contribution of enoxaparin. We used the portable, battery-operated Hemochron 801 apparatus (International Technidyne Corp., Edison, NJ) for measurements of clotting times. The apparatus has 2 thermostatically controlled (37 °C) assay tube wells. We conducted the assays in two types of assay cartridges that are available from the manufacturer of the instrument. One cartridge was modified to increase its sensitivity: we removed the kaolin from the FTK-ACT cartridge by extensive rinsing with distilled water, leaving only the glass surface of the tube, and perhaps the detection magnet, as activators. We called this our minimally activated assay (MAA). The use of a minimally activated assay has been studied by us and others.2-4 The second cartridge studied was an activated partial thromboplastin time (aPTT) assay (A104), used as supplied by the manufacturer. The thermostated wells of the instrument were used for both the heparinase digestion and the coagulation assays. The assay can be completed within 10 min. The MAA showed robust changes in clotting time after heparinase digestion of enoxaparin over a typical clinical concentration range. At 0.2 anti-Xa I.U. of enoxaparin per ml of blood, heparinase digestion caused an average decrease of 9.8% (20.4 sec) in clotting time; at 1.0 I.U. per ml there was a 41.4% decrease (148.8 sec). This report presents only the experimental application of the assay; its value in a clinical setting must still be established.
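The core calculation, percent shortening of clotting time after heparinase digestion, can be summarized in a few lines of Python. This sketch reproduces the 0.2 anti-Xa I.U./ml example from the abstract; the baseline clotting time (~208 sec) is back-calculated from the reported 20.4 sec (9.8%) shortening and is therefore approximate.

# Minimal sketch: anticoagulant contribution of enoxaparin expressed as the
# shortening of clotting time after heparinase digestion.
def enoxaparin_contribution(baseline_sec, heparinase_sec):
    # Returns (absolute shortening in seconds, percent shortening)
    delta = baseline_sec - heparinase_sec
    return delta, 100.0 * delta / baseline_sec

# Baseline ~208.2 s (approximate); after heparinase digestion the same blood clots in ~187.8 s
print(enoxaparin_contribution(208.2, 187.8))   # -> (20.4 s, ~9.8 %)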
Medicine, Issue 68, Immunology, Physiology, Pharmacology, low-molecular-weight-heparin, low-molecular-weight-heparin assay, LMWH point-of-care assay, anti-Factor-Xa activity, enoxaparin, heparinase, whole blood, assay
3852
Renal Ischaemia Reperfusion Injury: A Mouse Model of Injury and Regeneration
Authors: Emily E. Hesketh, Alicja Czopek, Michael Clay, Gary Borthwick, David Ferenbach, David Kluth, Jeremy Hughes.
Institutions: University of Edinburgh.
Renal ischaemia reperfusion injury (IRI) is a common cause of acute kidney injury (AKI) in patients, and occlusion of renal blood flow is unavoidable during renal transplantation. Experimental models that accurately and reproducibly recapitulate renal IRI are crucial in dissecting the pathophysiology of AKI and in the development of novel therapeutic agents. Presented here is a mouse model of renal IRI that results in reproducible AKI. This is achieved by a midline laparotomy approach in which a single incision allows both a right nephrectomy, which provides control tissue, and clamping of the left renal pedicle to induce ischaemia of the left kidney. By careful monitoring of the clamp position and body temperature during the period of ischaemia, this model achieves reproducible functional and structural injury. Mice sacrificed 24 hr following surgery demonstrate loss of renal function, with elevation of the serum or plasma creatinine level, as well as structural kidney damage with evident acute tubular necrosis. Renal function improves and the acute tissue injury resolves over the course of 7 days following renal IRI, such that this model may also be used to study renal regeneration. This model of renal IRI has been utilized to study the molecular and cellular pathophysiology of AKI as well as the subsequent renal regeneration.
Medicine, Issue 88, Murine, Acute Kidney Injury, Ischaemia, Reperfusion, Nephrectomy, Regeneration, Laparotomy
51816
High Throughput Sequential ELISA for Validation of Biomarkers of Acute Graft-Versus-Host Disease
Authors: Bryan Fiema, Andrew C. Harris, Aurelie Gomez, Praechompoo Pongtornpipat, Kelly Lamiman, Mark T. Vander Lugt, Sophie Paczesny.
Institutions: University of Michigan.
Unbiased discovery proteomics strategies have the potential to identify large numbers of novel biomarkers that can improve diagnostic and prognostic testing in a clinical setting and may help guide therapeutic interventions. When large numbers of candidate proteins are identified, it may be difficult to validate candidate biomarkers in a timely and efficient fashion from patient plasma samples that are event-driven, of finite volume and irreplaceable, such as those collected at the onset of acute graft-versus-host disease (GVHD), a potentially life-threatening complication of allogeneic hematopoietic stem cell transplantation (HSCT). Here we describe the process of performing commercially available ELISAs for six validated GVHD proteins: IL-2Rα, TNFR1, HGF, IL-8, elafin, and REG3α (also known as PAP1), in a sequential fashion to minimize freeze-thaw cycles, the time plasma spends thawed, and plasma usage. For this procedure we perform the ELISAs in a sequential order determined by sample dilution factor, as established in our laboratory, using the manufacturers' ELISA kits and protocols with minor adjustments to facilitate optimal sequential ELISA performance. The resulting plasma biomarker concentrations can then be compiled and analyzed for significant findings within a patient cohort. While these biomarkers are currently for research purposes only, their incorporation into clinical care is being investigated in clinical trials. This technique can be applied to perform ELISAs for multiple proteins/cytokines of interest on the same sample(s), provided the samples do not need to be mixed with other reagents. If ELISA kits do not come with pre-coated plates, 96-well half-well plates or 384-well plates can be used to further minimize the use of samples and reagents.
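A minimal sketch of the bookkeeping behind a sequential ELISA run is shown below: assays are ordered by sample dilution factor (here, least-diluted first, an assumed rule) and a well's optical density is converted to concentration with a four-parameter logistic (4PL) standard curve. The analyte names come from the article, but the dilution factors and 4PL parameters are hypothetical.

# Minimal sketch: order ELISAs by dilution factor and back-calculate concentration
# from OD with an inverse 4PL standard curve. Numbers are hypothetical.
import numpy as np

panel = {"IL-2Ralpha": 4, "TNFR1": 10, "HGF": 4, "IL-8": 2, "elafin": 100, "REG3alpha": 50}
run_order = sorted(panel, key=panel.get)   # least-diluted assays first (assumed rule)

def conc_from_od(od, a, b, c, d):
    # inverse of 4PL: OD = d + (a - d) / (1 + (conc / c)**b)
    return c * ((a - d) / (od - d) - 1) ** (1.0 / b)

# e.g. back-calculate one well, then correct for the dilution used in that assay
raw = conc_from_od(1.2, a=0.05, b=1.1, c=250.0, d=2.4)   # pg/ml on the plate
print(run_order, raw * panel["TNFR1"])                   # pg/ml in neat plasma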
Medicine, Issue 68, ELISA, Sequential ELISA, Cytokine, Blood plasma, biomarkers, proteomics, graft-versus-host disease, Small sample, Quantification
4247
Low Molecular Weight Protein Enrichment on Mesoporous Silica Thin Films for Biomarker Discovery
Authors: Jia Fan, James W. Gallagher, Hung-Jen Wu, Matthew G. Landry, Jason Sakamoto, Mauro Ferrari, Ye Hu.
Institutions: The Methodist Hospital Research Institute, National Center for Nanoscience and Technology.
The identification of circulating biomarkers holds great potential for noninvasive approaches in early diagnosis and prognosis, as well as for the monitoring of therapeutic efficiency.1-3 The circulating low molecular weight proteome (LMWP), composed of small proteins shed from tissues and cells or peptide fragments derived from the proteolytic degradation of larger proteins, has been associated with the pathological condition in patients and likely reflects the state of disease.4,5 Despite these potential clinical applications, the use of mass spectrometry (MS) to profile the LMWP from biological fluids has proven to be very challenging due to the large dynamic range of protein and peptide concentrations in serum.6 Without sample pre-treatment, some of the more highly abundant proteins obscure the detection of low-abundance species in serum/plasma. Current proteomic-based approaches, such as two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) and shotgun proteomics methods, are labor-intensive, low-throughput and offer limited suitability for clinical applications.7-9 Therefore, a more effective strategy is needed to isolate LMWP from blood and allow the high-throughput screening of clinical samples. Here, we present a fast, efficient and reliable multi-fractionation system based on mesoporous silica chips to specifically target and enrich LMWP.10,11 Mesoporous silica (MPS) thin films with tunable features at the nanoscale were fabricated using the triblock copolymer template pathway. By using different polymer templates and polymer concentrations in the precursor solution, various pore size distributions, pore structures, connectivities and surface properties were obtained and applied for the selective recovery of low mass proteins. The selective parsing of the enriched peptides into different subclasses according to their physicochemical properties will enhance the efficiency of recovery and detection of low-abundance species. In combination with mass spectrometry and statistical analysis, we demonstrated the correlation between the nanophase characteristics of the mesoporous silica thin films and the specificity and efficacy of low mass proteome harvesting. The results presented herein reveal the potential of this nanotechnology-based approach to provide a powerful alternative to conventional methods for LMWP harvesting from complex biological fluids. Because of the ability to tune the material properties, the capability for low-cost production, the simplicity and rapidity of sample collection, and the greatly reduced sample requirements for analysis, this novel nanotechnology will substantially impact the field of proteomic biomarker research and clinical proteomic assessment.
Bioengineering, Issue 62, Nanoporous silica chip, Low molecular weight proteomics, Peptidomics, MALDI-TOF mass spectrometry, early diagnostics, proteomics
3876
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested in 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could also be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
51318
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed and 30 sec chair stand), safety and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
52066
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Authors: Noah S. Philip, S. Louisa Carpenter, Lawrence H. Sweet.
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD). Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices and medial temporal regions. Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
51651
Development of a Virtual Reality Assessment of Everyday Living Skills
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia and these impairments predict poor long term psychosocial outcomes.  Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements.  Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real world functioning.  Current data do not support the recommendation of any single instrument for measurement of functional capacity.  The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
51405
Consensus Brain-derived Protein, Extraction Protocol for the Study of Human and Murine Brain Proteome Using Both 2D-DIGE and Mini 2DE Immunoblotting
Authors: Francisco-Jose Fernandez-Gomez, Fanny Jumeau, Maxime Derisbourg, Sylvie Burnouf, Hélène Tran, Sabiha Eddarkaoui, Hélène Obriot, Virginie Dutoit-Lefevre, Vincent Deramecourt, Valérie Mitchell, Didier Lefranc, Malika Hamdane, David Blum, Luc Buée, Valérie Buée-Scherrer, Nicolas Sergeant.
Institutions: Inserm UMR 837, CHRU-Lille, Faculté de Médecine - Pôle Recherche.
Two-dimensional gel electrophoresis (2DE) is a powerful tool to uncover proteome modifications potentially related to different physiological or pathological conditions. Basically, this technique separates proteins according to their isoelectric point in a first step, and then according to their molecular weight by SDS polyacrylamide gel electrophoresis (SDS-PAGE). In this report, an optimized sample preparation protocol for small amounts of human post-mortem and mouse brain tissue is described. This method makes it possible to perform both two-dimensional fluorescence difference gel electrophoresis (2D-DIGE) and mini 2DE immunoblotting. The combination of these approaches allows one not only to find new proteins and/or modifications in their expression, thanks to its compatibility with mass spectrometry detection, but also provides new insight into marker validation. Thus, mini-2DE coupled to western blotting permits the identification and validation of post-translational modifications and protein catabolism, and provides a qualitative comparison among different conditions and/or treatments. Herein, we provide a method to study components of protein aggregates found in AD and Lewy body dementia, such as the amyloid-beta peptide and alpha-synuclein. Our method can thus be adapted for the analysis of the proteome and of insoluble protein extracts from human brain tissue and mouse models. In parallel, it may provide useful information for the study of molecular and cellular pathways involved in neurodegenerative diseases, as well as potential novel biomarkers and therapeutic targets.
Neuroscience, Issue 86, proteomics, neurodegeneration, 2DE, human and mice brain tissue, fluorescence, immunoblotting. Abbreviations: 2DE (two-dimensional gel electrophoresis), 2D-DIGE (two-dimensional fluorescence difference gel electrophoresis), mini-2DE (mini 2DE immunoblotting), IPG (immobilized pH gradients), IEF (isoelectrofocusing), AD (Alzheimer's disease)
51339
Analytical Techniques for Assaying Nitric Oxide Bioactivity
Authors: Hong Jiang, Deepa Parthasarathy, Ashley C. Torregrossa, Asad Mian, Nathan S. Bryan.
Institutions: University of Texas Health Science Center at Houston, Baylor College of Medicine.
Nitric oxide (NO) is a diatomic free radical that is extremely short-lived in biological systems (less than 1 second in circulating blood).1 NO may be considered one of the most important signaling molecules produced in the body, regulating essential functions including, but not limited to, regulation of blood pressure, immune response and neural communication. Therefore its accurate detection and quantification in biological matrices is critical to understanding the role of NO in health and disease. Given the short physiological half-life of NO, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of relevant NO metabolites in multiple biological compartments provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole-body NO status, particularly in tissues. The ability to compare blood with select tissues in experimental animals will help bridge the gap between basic science and clinical medicine with respect to the diagnostic and prognostic utility of NO biomarkers in health and disease. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The established paradigm of NO biochemistry, from production by NO synthases to activation of soluble guanylyl cyclase (sGC) to eventual oxidation to nitrite (NO2-) and nitrate (NO3-), may only represent part of NO's effects in vivo. The interaction of NO and NO-derived metabolites with protein thiols, secondary amines, and metals to form S-nitrosothiols (RSNOs), N-nitrosamines (RNNOs), and nitrosyl-heme, respectively, represents cGMP-independent effects of NO and is likely just as important physiologically as activation of sGC by NO. A true understanding of NO in physiology is derived from in vivo experiments sampling multiple compartments simultaneously. NO methodology is a complex and often confusing science and the focus of much debate and discussion concerning NO biochemistry. The elucidation of new mechanisms and signaling pathways involving NO hinges on our ability to specifically, selectively and sensitively detect and quantify NO and all relevant NO products and metabolites in complex biological matrices. Here, we present a method for the rapid and sensitive analysis of nitrite and nitrate by HPLC, as well as the detection of free NO in biological samples using in vitro ozone-based chemiluminescence, with chemical derivatization to determine the molecular source of NO, and ex vivo using organ bath myography.
Medicine, Issue 64, Molecular Biology, Nitric oxide, nitrite, nitrate, endothelium derived relaxing factor, HPLC, chemiluminescence
3722
Measuring Ascending Aortic Stiffness In Vivo in Mice Using Ultrasound
Authors: Maggie M. Kuo, Viachaslau Barodka, Theodore P. Abraham, Jochen Steppan, Artin A. Shoukas, Mark Butlin, Alberto Avolio, Dan E. Berkowitz, Lakshmi Santhanam.
Institutions: Johns Hopkins University, Macquarie University.
We present a protocol for measuring in vivo aortic stiffness in mice using high-resolution ultrasound imaging. Aortic diameter is measured by ultrasound and aortic blood pressure is measured invasively with a solid-state pressure catheter. Blood pressure is raised and then lowered incrementally by intravenous infusion of the vasoactive drugs phenylephrine and sodium nitroprusside. Aortic diameter is measured at each pressure step to characterize the pressure-diameter relationship of the ascending aorta. Stiffness indices derived from the pressure-diameter relationship can be calculated from the data collected; the calculation of arterial compliance is described in this protocol. This technique can be used to investigate mechanisms underlying the increased aortic stiffness associated with cardiovascular disease and aging. The technique produces a physiologically relevant measure of stiffness compared to ex vivo approaches because physiological influences on aortic stiffness are incorporated into the measurement. The primary limitation of this technique is the measurement error introduced by the movement of the aorta during the cardiac cycle. This motion can be compensated for by adjusting the location of the probe with the aortic movement, as well as by making multiple measurements of the aortic pressure-diameter relationship and expanding the experimental group size.
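The compliance calculation referred to above reduces to the slope of the diameter-pressure relationship over the imposed pressure steps. Below is a minimal sketch with hypothetical pressure and diameter values, not data from the article.

# Minimal sketch: arterial compliance as the slope of a linear fit of aortic
# diameter versus pressure across the drug-induced pressure steps.
import numpy as np

pressure_mmHg = np.array([60, 80, 100, 120, 140])        # pressure steps (phenylephrine/SNP)
diameter_mm = np.array([1.30, 1.38, 1.44, 1.48, 1.51])   # ascending aortic diameter by ultrasound

compliance, intercept = np.polyfit(pressure_mmHg, diameter_mm, 1)
print(f"compliance ≈ {compliance*1000:.1f} µm/mmHg")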
Medicine, Issue 94, Aortic stiffness, ultrasound, in vivo, aortic compliance, elastic modulus, mouse model, cardiovascular disease
52200
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and provides both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species which regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
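The FMD endpoint itself is a simple percent change in diameter relative to baseline. A minimal sketch with hypothetical diameters follows.

# Minimal sketch: flow-mediated vasodilation as percent change in brachial artery
# diameter during reactive hyperemia relative to baseline. Diameters are hypothetical.
def fmd_percent(baseline_mm, peak_mm):
    return 100.0 * (peak_mm - baseline_mm) / baseline_mm

print(f"FMD ≈ {fmd_percent(4.10, 4.38):.1f} %")   # -> ~6.8 %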
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
52070
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
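One common way to summarize in vitro replication kinetics, not necessarily the exact metric used in this protocol, is the slope of log-transformed virus output (e.g., reverse transcriptase activity or p24) during exponential spread, normalized to the parental MJ4 control. A minimal sketch with hypothetical values:

# Minimal sketch: replication capacity as the ratio of log-growth slopes
# (chimeric virus vs. parental MJ4). All values are hypothetical.
import numpy as np

days = np.array([2, 4, 6, 8])
rt_chimera = np.array([1.2e3, 8.5e3, 6.1e4, 3.9e5])   # virus output, arbitrary units
rt_mj4 = np.array([1.5e3, 1.4e4, 1.3e5, 1.1e6])

slope = lambda y: np.polyfit(days, np.log10(y), 1)[0]  # log10 increase per day
replication_capacity = slope(rt_chimera) / slope(rt_mj4)
print(f"normalized replication capacity ≈ {replication_capacity:.2f}")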
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
51506
Dual-mode Imaging of Cutaneous Tissue Oxygenation and Vascular Function
Authors: Ronald X. Xu, Kun Huang, Ruogu Qin, Jiwei Huang, Jeff S. Xu, Liya Ding, Urmila S. Gnyawali, Gayle M. Gordillo, Surya C. Gnyawali, Chandan K. Sen.
Institutions: The Ohio State University.
Accurate assessment of cutaneous tissue oxygenation and vascular function is important for the appropriate detection, staging, and treatment of many health disorders such as chronic wounds. We report the development of a dual-mode imaging system for non-invasive and non-contact imaging of cutaneous tissue oxygenation and vascular function. The imaging system integrated an infrared camera, a CCD camera, a liquid crystal tunable filter and a high-intensity fiber light source. A LabVIEW interface was programmed for equipment control, synchronization, image acquisition, processing, and visualization. Multispectral images captured by the CCD camera were used to reconstruct the tissue oxygenation map. Dynamic thermographic images captured by the infrared camera were used to reconstruct the vascular function map. Cutaneous tissue oxygenation and vascular function images were co-registered through fiduciary markers. The performance characteristics of the dual-mode imaging system were tested in humans.
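One common way to reconstruct a tissue oxygenation (StO2) map from multispectral reflectance images is per-pixel least-squares unmixing of absorbance into oxy- and deoxyhemoglobin contributions. The sketch below illustrates the idea only; the wavelengths, extinction coefficients, and image stack are placeholders, not the calibration used by the authors' system.

# Minimal sketch: per-pixel spectral unmixing of absorbance into HbO2 and Hb,
# then StO2 = HbO2 / (HbO2 + Hb). All inputs are placeholders.
import numpy as np

wavelengths = [544, 560, 576, 600]                     # nm, hypothetical filter settings
# columns: [HbO2, Hb] extinction coefficients (placeholder values; use a published table)
E = np.array([[1.00, 0.80],
              [0.75, 1.05],
              [1.10, 0.85],
              [0.10, 0.25]])

rng = np.random.default_rng(0)
reflectance = rng.uniform(0.2, 0.8, size=(len(wavelengths), 64, 64))  # fake image stack
absorbance = -np.log(reflectance)                                     # modified Beer-Lambert

# Solve E @ [HbO2, Hb] ≈ absorbance for every pixel at once
pix = absorbance.reshape(len(wavelengths), -1)
hb, *_ = np.linalg.lstsq(E, pix, rcond=None)
hbo2, dhb = hb
sto2 = (hbo2 / (hbo2 + dhb)).reshape(64, 64)           # oxygen saturation map
print(sto2.shape, float(sto2.mean()))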
Medicine, Issue 46, Dual-mode, multispectral imaging, infrared imaging, cutaneous tissue oxygenation, vascular function, co-registration, wound healing
2095
Setting-up an In Vitro Model of Rat Blood-brain Barrier (BBB): A Focus on BBB Impermeability and Receptor-mediated Transport
Authors: Yves Molino, Françoise Jabès, Emmanuelle Lacassagne, Nicolas Gaudin, Michel Khrestchatisky.
Institutions: VECT-HORUS SAS, CNRS, NICN UMR 7259.
The blood-brain barrier (BBB) specifically regulates molecular and cellular flux between the blood and the nervous tissue. Our aim was to develop and characterize a highly reproducible rat syngeneic in vitro model of the BBB using co-cultures of primary rat brain endothelial cells (RBEC) and astrocytes to study receptors involved in transcytosis across the endothelial cell monolayer. Astrocytes were isolated by mechanical dissection following trypsin digestion and were frozen for later co-culture. RBEC were isolated from 5-week-old rat cortices. The brains were cleaned of meninges and white matter, and mechanically dissociated following enzymatic digestion. Thereafter, the tissue homogenate was centrifuged in bovine serum albumin to separate vessel fragments from nervous tissue. The vessel fragments underwent a second enzymatic digestion to free endothelial cells from their extracellular matrix. The remaining contaminating cells, such as pericytes, were further eliminated by plating the microvessel fragments in puromycin-containing medium. They were then passaged onto filters for co-culture with astrocytes grown on the bottom of the wells. RBEC expressed high levels of tight junction (TJ) proteins such as occludin, claudin-5 and ZO-1, with a typical localization at the cell borders. The transendothelial electrical resistance (TEER) of the brain endothelial monolayers, indicating the tightness of the TJs, reached 300 ohm·cm2 on average. The endothelial permeability coefficient (Pe) for lucifer yellow (LY) was highly reproducible, with an average of 0.26 ± 0.11 x 10-3 cm/min. Brain endothelial cells organized in monolayers expressed the efflux transporter P-glycoprotein (P-gp), showed polarized transport of rhodamine 123, a ligand for P-gp, and showed specific transport of transferrin-Cy3 and DiI-LDL across the endothelial cell monolayer. In conclusion, we provide a protocol for setting up an in vitro BBB model that is highly reproducible due to the quality assurance methods, and that is suitable for research on BBB transporters and receptors.
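One standard way to obtain the endothelial permeability coefficient (Pe) for lucifer yellow in such Transwell co-cultures is from cleared-volume slopes, correcting for the empty filter and normalizing by insert area; the sketch below illustrates that calculation. The time points and cleared volumes are hypothetical, chosen so that the result lands near the average Pe reported above, and the exact procedure used by the authors may differ.

# Minimal sketch: Pe for lucifer yellow from cleared volume vs. time,
# with an empty-filter correction. All numbers are hypothetical.
import numpy as np

t_min = np.array([20, 40, 60])                     # sampling times
cleared = np.array([4.8, 9.8, 14.5])               # cleared volume (µl), cells + filter
cleared_filter = np.array([30.0, 60.0, 91.0])      # cleared volume (µl), empty filter

PSt = np.polyfit(t_min, cleared, 1)[0]             # µl/min, cells + filter
PSf = np.polyfit(t_min, cleared_filter, 1)[0]      # µl/min, filter alone
PSe = 1.0 / (1.0 / PSt - 1.0 / PSf)                # µl/min, endothelium only
area_cm2 = 1.12                                    # insert growth area
Pe = PSe / 1000.0 / area_cm2                       # cm/min (1 µl = 1e-3 cm3)
print(f"Pe(LY) ≈ {Pe*1e3:.2f} x 10^-3 cm/min")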
Medicine, Issue 88, rat brain endothelial cells (RBEC), mouse, spinal cord, tight junction (TJ), receptor-mediated transport (RMT), low density lipoprotein (LDL), LDLR, transferrin, TfR, P-glycoprotein (P-gp), transendothelial electrical resistance (TEER)
51278
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and in immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells.3-5 The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells,6 can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies.1 We have adapted this degranulation assay, originally published by Naal et al.,1 for the screening of drugs and toxicants, and we demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease,7-11 although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function.2 In an effort to avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resulting concentration is confirmed by UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm).12 This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
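Confirming the triclosan stock concentration from its A280 reading is a direct Beer-Lambert calculation using the molar absorptivity cited in the abstract (ε280 = 4,200 L/M/cm). A minimal sketch with a hypothetical absorbance value:

# Minimal sketch: Beer-Lambert back-calculation of triclosan concentration.
# The extinction coefficient is taken from the abstract; the A280 reading is hypothetical.
EPSILON_280 = 4200.0   # L mol^-1 cm^-1
PATH_CM = 1.0          # standard cuvette path length

def triclosan_molarity(a280, dilution=1.0):
    return a280 * dilution / (EPSILON_280 * PATH_CM)

print(f"{triclosan_molarity(0.42)*1e6:.0f} µM")   # A280 = 0.42 -> ~100 µM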
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
50671
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
4375
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
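The core SSM steps described above (logarithmic conversion, mean centering, PCA, subject scores) can be sketched schematically as follows. This is not the authors' in-house software, and the subject-by-voxel matrix here is random placeholder data.

# Schematic sketch of the SSM pipeline: log transform, remove subject global means
# and the group mean image, then PCA; rows of vt are spatial covariance patterns (GIS)
# and u*s gives each subject's expression score for every pattern. Placeholder data only.
import numpy as np

rng = np.random.default_rng(0)
data = rng.lognormal(mean=2.0, sigma=0.3, size=(40, 5000))   # subjects x voxels

logged = np.log(data)
centered = logged - logged.mean(axis=1, keepdims=True)       # remove subject global means
srp = centered - centered.mean(axis=0, keepdims=True)        # remove group mean image

u, s, vt = np.linalg.svd(srp, full_matrices=False)
scores = u * s                                               # subject scores per component
pattern1 = vt[0]                                             # first covariance pattern
print(scores[:, 0].shape, pattern1.shape)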
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
50319
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation whereby inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low abundance molecules with a resulting increase in sensitivity is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl.  Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
51670
Hydrogel Nanoparticle Harvesting of Plasma or Urine for Detecting Low Abundance Proteins
Authors: Ruben Magni, Benjamin H. Espina, Lance A. Liotta, Alessandra Luchini, Virginia Espina.
Institutions: George Mason University, Ceres Nanosciences.
Novel biomarker discovery plays a crucial role in providing more sensitive and specific disease detection. Unfortunately, many low-abundance biomarkers that exist in biological fluids cannot be easily detected with mass spectrometry or immunoassays because they are present in very low concentrations, are labile, and are often masked by high-abundance proteins such as albumin or immunoglobulin. Bait-containing, poly(N-isopropylacrylamide) (NIPAm)-based nanoparticles are able to overcome these physiological barriers. In one step they are able to capture, concentrate and preserve biomarkers from body fluids. Low-molecular weight analytes enter the core of the nanoparticle and are captured by different organic chemical dyes, which act as high-affinity protein baits. The nanoparticles are able to concentrate the proteins of interest by several orders of magnitude. This concentration factor is sufficient to increase the protein level such that the proteins are within the detection limit of current mass spectrometers, western blotting, and immunoassays. Nanoparticles can be incubated with a plethora of biological fluids, and they are able to greatly enrich the concentration of low-molecular weight proteins and peptides while excluding albumin and other high-molecular weight proteins. Our data show that a 10,000-fold amplification in the concentration of a particular analyte can be achieved, enabling mass spectrometry and immunoassays to detect previously undetectable biomarkers.
Bioengineering, Issue 90, biomarker, hydrogel, low abundance, mass spectrometry, nanoparticle, plasma, protein, urine
51789
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Authors: Subarna Bhattacharya, Paul W. Burridge, Erin M. Kropp, Sandra L. Chuppa, Wai-Meng Kwok, Joseph C. Wu, Kenneth R. Boheler, Rebekah L. Gundry.
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Hong Kong University, Johns Hopkins University School of Medicine.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
52010
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction to multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
1988
A Technique for Serial Collection of Cerebrospinal Fluid from the Cisterna Magna in Mouse
Authors: Li Liu, Karen Duff.
Institutions: Columbia University.
Alzheimer's disease (AD) is a progressive neurodegenerative disease that is pathologically characterized by extracellular deposition of β-amyloid peptide (Aβ) and intraneuronal accumulation of hyperphosphorylated tau protein. Because cerebrospinal fluid (CSF) is in direct contact with the extracellular space of the brain, it provides a reflection of the biochemical changes in the brain in response to pathological processes. CSF from AD patients shows a decrease in the 42 amino-acid form of Aβ (Aβ42), and increases in total tau and hyperphosphorylated tau, though the mechanisms responsible for these changes are still not fully understood. Transgenic (Tg) mouse models of AD provide an excellent opportunity to investigate how and why Aβ or tau levels in CSF change as the disease progresses. Here, we demonstrate a refined cisterna magna puncture technique for CSF sampling from the mouse. This extremely gentle sampling technique allows serial CSF samples to be obtained from the same mouse at 2-3 month intervals which greatly minimizes the confounding effect of between-mouse variability in Aβ or tau levels, making it possible to detect subtle alterations over time. In combination with Aβ and tau ELISA, this technique will be useful for studies designed to investigate the relationship between the levels of CSF Aβ42 and tau, and their metabolism in the brain in AD mouse models. Studies in Tg mice could provide important validation as to the potential of CSF Aβ or tau levels to be used as biological markers for monitoring disease progression, and to monitor the effect of therapeutic interventions. As the mice can be sacrificed and the brains can be examined for biochemical or histological changes, the mechanisms underlying the CSF changes can be better assessed. These data are likely to be informative for interpretation of human AD CSF changes.
Neuroscience, Issue 21, Cerebrospinal fluid, Alzheimer's disease, Transgenic mouse, β-amyloid, tau
960

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.