 
PubMed Article
Socio-economic position and type 2 diabetes risk factors: patterns in UK children of South Asian, black African-Caribbean and white European origin.
PLoS ONE
Socio-economic position (SEP) and ethnicity influence type 2 diabetes mellitus (T2DM) risk in adults. However, the influence of SEP on emerging T2DM risks in different ethnic groups and the contribution of SEP to ethnic differences in T2DM risk in young people have been little studied. We examined the relationships between SEP and T2DM risk factors in UK children of South Asian, black African-Caribbean and white European origin, using the official UK National Statistics Socio-economic Classification (NS-SEC) and assessed the extent to which NS-SEC explained ethnic differences in T2DM risk factors.
Authors: Patrick De Boever, Tijs Louwies, Eline Provost, Luc Int Panis, Tim S. Nawrot.
Published: 10-22-2014
ABSTRACT
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina is a tissue that lines the interior of the eye, and it is the only tissue that allows for a non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive for epidemiological studies and for monitoring cardiovascular health from early to old age. Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the vessel calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. The vessel widths are derived using image analysis software, and the widths of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). The latter features have been shown to be useful for studying the impact of modifiable lifestyle and environmental cardiovascular disease risk factors. The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2% and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient and instrumental tool for epidemiological studies to study microvascular responses to cardiovascular disease risk factors.
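The abstract does not spell out how the six largest vessel widths are combined into CRAE and CRVE. A minimal sketch of the iterative pairing procedure commonly used for these equivalents is shown below; the branching coefficients (0.88 for arterioles, 0.95 for venules) and the widest-with-narrowest pairing scheme follow the published revised Parr-Hubbard (Knudtson) formulas, and should be treated as assumptions rather than this protocol's exact implementation:

```python
import math

def summarize_calibers(widths, coeff):
    """Iteratively pair the narrowest with the widest vessel and combine
    them with the branching coefficient until one summary value remains."""
    w = sorted(widths)
    while len(w) > 1:
        narrow, wide = w.pop(0), w.pop(-1)
        w.append(coeff * math.sqrt(narrow ** 2 + wide ** 2))
        w.sort()
    return w[0]

def crae(arteriole_widths_um):
    """Central Retinal Arteriolar Equivalent from the six largest arterioles."""
    return summarize_calibers(arteriole_widths_um, 0.88)

def crve(venule_widths_um):
    """Central Retinal Venular Equivalent from the six largest venules."""
    return summarize_calibers(venule_widths_um, 0.95)
```

With equal inputs, CRVE exceeds CRAE because of the larger venular coefficient, matching the expectation that venules are wider than arterioles.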
23 Related JoVE Articles!
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial, subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective value that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes remain stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject.
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
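As an illustration of how such subjective values can be parameterized and fitted to choice data, here is a hypothetical sketch. The value function (with risk attitude alpha and ambiguity attitude beta) follows one common model from this literature, and the logistic choice rule is a standard fitting device; the function names and exact form are illustrative assumptions, not the authors' specification:

```python
import math

def subjective_value(amount, win_prob, ambiguity, alpha, beta):
    """One common parameterization: SV = (p - beta * A / 2) * v**alpha,
    where A in [0, 1] is the ambiguity level around the stated probability,
    alpha captures risk attitude, and beta captures ambiguity attitude."""
    return (win_prob - beta * ambiguity / 2.0) * amount ** alpha

def p_choose_lottery(sv_lottery, sv_reference, slope=1.0):
    """Logistic (softmax) choice rule used to fit alpha and beta by
    maximum likelihood from the observed choices."""
    return 1.0 / (1.0 + math.exp(-slope * (sv_lottery - sv_reference)))
```

A risk-neutral, ambiguity-neutral subject (alpha = 1, beta = 0) values a 50% chance of $100 at exactly $50, and is indifferent between two equally valued options.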
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Rapid Point-of-Care Assay of Enoxaparin Anticoagulant Efficacy in Whole Blood
Authors: Mario A. Inchiosa Jr., Suryanarayana Pothula, Keshar Kubal, Vajubhai T. Sanchala, Iris Navarro.
Institutions: New York Medical College.
There is a need for a clinical assay to determine the extent to which a patient's blood is effectively anticoagulated by the low-molecular-weight heparin (LMWH) enoxaparin. There are also urgent clinical situations where it would be important to determine this rapidly. The present assay is designed to accomplish this. We only assayed human blood samples that were spiked with known concentrations of enoxaparin. The essential feature of the present assay is the quantification of the efficacy of enoxaparin in a patient's blood sample by degrading it to complete inactivity with heparinase. Two blood samples were drawn into Vacutainer tubes (Becton Dickinson; Franklin Lakes, NJ) that were spiked with enoxaparin; one sample was digested with heparinase for 5 min at 37 °C, while the other represented the patient's baseline anticoagulated status. The percent shortening of clotting time in the heparinase-treated sample, as compared to the baseline state, yielded the anticoagulant contribution of enoxaparin. We used the portable, battery-operated Hemochron 801 apparatus for measurements of clotting times (International Technidyne Corp., Edison, NJ). The apparatus has two thermostatically controlled (37 °C) assay tube wells. We conducted the assays in two types of assay cartridges that are available from the manufacturer of the instrument. One cartridge was modified to increase its sensitivity: we removed the kaolin from the FTK-ACT cartridge by extensive rinsing with distilled water, leaving only the glass surface of the tube, and perhaps the detection magnet, as activators. We called this our minimally activated assay (MAA). The use of a minimally activated assay has been studied by us and others.2-4 The second cartridge studied was an activated partial thromboplastin time (aPTT) assay (A104), used as supplied by the manufacturer. The thermostated wells of the instrument were used for both the heparinase digestion and the coagulation assays.
The assay can be completed within 10 min. The MAA assay showed robust changes in clotting time after heparinase digestion of enoxaparin over a typical clinical concentration range. At 0.2 anti-Xa I.U. of enoxaparin per ml of blood sample, heparinase digestion caused an average decrease of 9.8% (20.4 sec) in clotting time; at 1.0 I.U. per ml of enoxaparin there was a 41.4% decrease (148.8 sec). This report only presents the experimental application of the assay; its value in a clinical setting must still be established.
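The percent-shortening computation itself is a single subtraction and division; a sketch follows, with the function name chosen here for illustration. The usage note plugs in numbers consistent with the reported averages:

```python
def enoxaparin_contribution(baseline_sec, digested_sec):
    """Anticoagulant contribution of enoxaparin, expressed as the percent
    shortening of clotting time after heparinase digestion, along with the
    absolute shortening in seconds."""
    shortening_sec = baseline_sec - digested_sec
    percent = 100.0 * shortening_sec / baseline_sec
    return percent, shortening_sec
```

For example, a baseline clotting time of about 208 s with a heparinase-digested time of about 188 s reproduces the reported average for 0.2 anti-Xa I.U./ml (a 9.8% / 20.4 s shortening).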
Medicine, Issue 68, Immunology, Physiology, Pharmacology, low-molecular-weight-heparin, low-molecular-weight-heparin assay, LMWH point-of-care assay, anti-Factor-Xa activity, enoxaparin, heparinase, whole blood, assay
Localization, Identification, and Excision of Murine Adipose Depots
Authors: Adrien Mann, Allie Thompson, Nathan Robbins, Andra L. Blomkalns.
Institutions: University of Cincinnati College of Medicine.
Obesity has increased dramatically in the last few decades and affects over one third of the adult US population. The economic effect of obesity in 2005 reached a staggering $190.2 billion in direct medical costs alone. Obesity is a major risk factor for a wide range of diseases. Historically, little was known about adipose tissue and its major and essential functions in the body. Brown and white adipose are the two main types of adipose, but current literature has identified a new type of fat called brite or beige adipose. Research has shown that adipose depots have specific metabolic profiles and that certain depots confer a propensity for obesity and other related disorders. The goal of this protocol is to give researchers the ability to identify and excise adipose depots, allowing for the analysis of different factorial effects on adipose as well as the beneficial or detrimental roles adipose plays in disease and overall health. Isolation and excision of adipose depots allow investigators to look at gross morphological changes as well as histological changes. The isolated adipose can also be used for molecular studies to evaluate transcriptional and translational changes, or for in vitro experimentation to discover targets of interest and mechanisms of action. This technique is superior to other published techniques because its design allows for isolation of multiple depots with simplicity and minimal contamination.
Medicine, Issue 94, adipose, surgical, excision, subcutaneous adipose tissue (SQ), perivascular adipose tissue (PVAT), visceral adipose tissue (VAT), brown adipose tissue (BAT), white adipose tissue (WAT)
Nerve Excitability Assessment in Chemotherapy-induced Neurotoxicity
Authors: Susanna B. Park, Cindy S-Y. Lin, Matthew C. Kiernan.
Institutions: University of New South Wales.
Chemotherapy-induced neurotoxicity is a serious consequence of cancer treatment, which occurs with some of the most commonly used chemotherapies1,2. Chemotherapy-induced peripheral neuropathy produces symptoms of numbness and paraesthesia in the limbs and may progress to difficulties with fine motor skills and walking, leading to functional impairment. In addition to producing troubling symptoms, chemotherapy-induced neuropathy may limit treatment success, leading to dose reduction or early cessation of treatment. Neuropathic symptoms may persist long-term, leaving permanent nerve damage in patients with an otherwise good prognosis3. As chemotherapy is utilised more often as a preventative measure, and survival rates increase, the importance of long-lasting and significant neurotoxicity will increase. There are no established neuroprotective or treatment options, and sensitive assessment methods are lacking. Appropriate assessment of neurotoxicity will be critical both as a prognostic factor and as a suitable endpoint for future trials of neuroprotective agents. Current methods to assess the severity of chemotherapy-induced neuropathy utilise clinician-based grading scales, which have been demonstrated to lack sensitivity to change and inter-observer objectivity4. Conventional nerve conduction studies provide information about compound action potential amplitude and conduction velocity, which are relatively non-specific measures and do not provide insight into ion channel function or resting membrane potential. Accordingly, prior studies have demonstrated that conventional nerve conduction studies are not sensitive to early change in chemotherapy-induced neurotoxicity4-6. In comparison, nerve excitability studies utilize threshold tracking techniques, which have been developed to enable assessment of ion channels, pumps and exchangers in vivo in large myelinated human axons7-9.
Nerve excitability techniques have been established as a tool to examine the development and severity of chemotherapy-induced neurotoxicity10-13. Comprising a number of excitability parameters, nerve excitability studies can be used to assess acute neurotoxicity arising immediately following infusion and the development of chronic, cumulative neurotoxicity. Nerve excitability techniques are feasible in the clinical setting, with each test requiring only 5-10 minutes to complete. Nerve excitability equipment is readily available commercially, and a portable system has been devised so that patients can be tested in situ in the infusion centre setting. In addition, these techniques can be adapted for use with multiple chemotherapies. In patients treated with the chemotherapy oxaliplatin, primarily utilised for colorectal cancer, nerve excitability techniques provide a method to identify patients at risk of neurotoxicity prior to the onset of chronic neuropathy. Nerve excitability studies have revealed the development of an acute Na+ channelopathy in motor and sensory axons10-13. Importantly, patients who demonstrated changes in excitability early in treatment were subsequently more likely to develop moderate to severe neurotoxicity11. However, across treatment, striking longitudinal changes were identified only in sensory axons, and these were able to predict clinical neurological outcome in 80% of patients10. These changes demonstrated a different pattern to those seen acutely following oxaliplatin infusion, and most likely reflect the development of significant axonal damage and membrane potential change in sensory nerves that develops longitudinally during oxaliplatin treatment10. Significant abnormalities developed during early treatment, prior to any reduction in conventional measures of nerve function, suggesting that excitability parameters may provide a sensitive biomarker.
Neuroscience, Issue 62, Chemotherapy, Neurotoxicity, Neuropathy, Nerve excitability, Ion channel function, Oxaliplatin, oncology, medicine
Multimodal Optical Microscopy Methods Reveal Polyp Tissue Morphology and Structure in Caribbean Reef Building Corals
Authors: Mayandi Sivaguru, Glenn A. Fried, Carly A. H. Miller, Bruce W. Fouke.
Institutions: University of Illinois at Urbana-Champaign.
An integrated suite of imaging techniques has been applied to determine the three-dimensional (3D) morphology and cellular structure of polyp tissues comprising the Caribbean reef building corals Montastraea annularis and M. faveolata. These approaches include fluorescence microscopy (FM), serial block face imaging (SBFI), and two-photon confocal laser scanning microscopy (TPLSM). SBFI provides deep tissue imaging after physical sectioning; it details the tissue surface texture and 3D visualization to tissue depths of more than 2 mm. Complementary FM and TPLSM yield ultra-high resolution images of tissue cellular structure. Results have: (1) identified previously unreported lobate tissue morphologies on the outer wall of individual coral polyps and (2) created the first surface maps of the 3D distribution and tissue density of chromatophores and algae-like dinoflagellate zooxanthellae endosymbionts. Spectral absorption peaks of 500 nm and 675 nm, respectively, suggest that M. annularis and M. faveolata contain similar types of chlorophyll and chromatophores. However, M. annularis and M. faveolata exhibit significant differences in the tissue density and 3D distribution of these key cellular components. This study focusing on imaging methods indicates that SBFI is extremely useful for analysis of large mm-scale samples of decalcified coral tissues. Complementary FM and TPLSM reveal subtle submillimeter scale changes in cellular distribution and density in nondecalcified coral tissue samples. The TPLSM technique affords: (1) minimally invasive sample preparation, (2) superior optical sectioning ability, and (3) minimal light absorption and scattering, while still permitting deep tissue imaging.
Environmental Sciences, Issue 91, Serial block face imaging, two-photon fluorescence microscopy, Montastraea annularis, Montastraea faveolata, 3D coral tissue morphology and structure, zooxanthellae, chromatophore, autofluorescence, light harvesting optimization, environmental change
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
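To illustrate the first analysis step, here is a minimal sketch of orientation estimation with a small bank of Gabor filters. The kernel parameters, the patch-correlation scoring, and the function names are illustrative assumptions; the published method additionally uses phase portrait modelling, which is not shown:

```python
import numpy as np

def gabor_kernel(theta, wavelength=8.0, sigma=4.0, size=21):
    """Zero-mean real Gabor kernel tuned to orientation theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Coordinate along the normal of the stripes for this orientation.
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2)) \
        * np.cos(2.0 * np.pi * xr / wavelength)
    return g - g.mean()  # zero mean so flat regions give no response

def dominant_orientation(patch, n_orientations=12):
    """Correlate a patch with the filter bank; return the angle (degrees)
    of the strongest absolute response."""
    thetas = np.linspace(0.0, np.pi, n_orientations, endpoint=False)
    scores = [abs(float(np.sum(patch * gabor_kernel(t)))) for t in thetas]
    return float(np.degrees(thetas[int(np.argmax(scores))]))
```

Applied to a synthetic patch of vertical stripes, the bank correctly reports an orientation near 0 degrees; in the full method such per-site responses feed the phase-portrait and node-map analysis.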
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
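As a minimal illustration of the design-of-experiments idea, the sketch below enumerates a small two-level full factorial design. The factor names and levels are invented for illustration; the study itself used software-guided optimal (and augmented) designs rather than a brute-force full factorial:

```python
from itertools import product

def full_factorial(factors):
    """Return one experimental run (dict of factor -> level) per
    combination of factor levels."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

# Hypothetical factors for a transient-expression experiment.
factors = {
    "promoter": ["35S", "nos"],
    "leaf_age": ["young", "old"],
    "incubation_temp_C": [22, 25],
}
runs = full_factorial(factors)  # 2 x 2 x 2 = 8 runs
```

Full factorials grow exponentially with the number of factors, which is exactly why DoE software selects an optimal subset of combinations for larger problems, as described above.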
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
An Experimental and Bioinformatics Protocol for RNA-seq Analyses of Photoperiodic Diapause in the Asian Tiger Mosquito, Aedes albopictus
Authors: Monica F. Poelchau, Xin Huang, Allison Goff, Julie Reynolds, Peter Armbruster.
Institutions: Georgetown University, The Ohio State University.
Photoperiodic diapause is an important adaptation that allows individuals to escape harsh seasonal environments via a series of physiological changes, most notably developmental arrest and reduced metabolism. Global gene expression profiling via RNA-Seq can provide important insights into the transcriptional mechanisms of photoperiodic diapause. The Asian tiger mosquito, Aedes albopictus, is an outstanding organism for studying the transcriptional bases of diapause due to its ease of rearing, easily induced diapause, and the genomic resources available. This manuscript presents a general experimental workflow for identifying diapause-induced transcriptional differences in A. albopictus. Rearing techniques, conditions necessary to induce diapause and non-diapause development, methods to estimate percent diapause in a population, and RNA extraction and integrity assessment for mosquitoes are documented. A workflow to process RNA-Seq data from Illumina sequencers culminates in a list of differentially expressed genes. The representative results demonstrate that this protocol can be used to effectively identify genes differentially regulated at the transcriptional level in A. albopictus due to photoperiodic differences. With modest adjustments, this workflow can be readily adapted to study the transcriptional bases of diapause or other important life history traits in other mosquitoes.
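The actual pipeline uses dedicated statistical tools, but its end-point, a list of differentially expressed genes, can be caricatured with a naive fold-change screen. The function, thresholds, and pseudocount below are illustrative assumptions, not the published workflow:

```python
def fold_change_screen(counts_a, counts_b, min_fold=2.0, pseudocount=1.0):
    """Flag genes whose (pseudocounted) expression ratio between two
    conditions exceeds min_fold in either direction. Real RNA-Seq
    analyses instead fit count models (e.g. negative binomial) with
    biological replication."""
    hits = []
    for gene in counts_a:
        fc = (counts_b[gene] + pseudocount) / (counts_a[gene] + pseudocount)
        if fc >= min_fold or fc <= 1.0 / min_fold:
            hits.append(gene)
    return hits
```

A gene going from 10 to 50 mean counts is flagged, while one going from 100 to 110 is not; proper tools additionally control the false discovery rate across thousands of such tests.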
Genetics, Issue 93, Aedes albopictus, Asian tiger mosquito, photoperiodic diapause, RNA-Seq, de novo transcriptome assembly, mosquito husbandry
Generation of High Quality Chromatin Immunoprecipitation DNA Template for High-throughput Sequencing (ChIP-seq)
Authors: Sandra Deliard, Jianhua Zhao, Qianghua Xia, Struan F.A. Grant.
Institutions: Children's Hospital of Philadelphia Research Institute, University of Pennsylvania .
ChIP-sequencing (ChIP-seq) combines chromatin immunoprecipitation (ChIP) with massively parallel sequencing to identify the repertoire of mammalian DNA sequences bound by transcription factors in vivo. "Next-generation" genome sequencing technologies provide a 1-2 order-of-magnitude increase in the amount of sequence that can be cost-effectively generated over older technologies, allowing ChIP-seq to directly provide whole-genome coverage for effective profiling of mammalian protein-DNA interactions. For a successful ChIP-seq approach, one must generate high-quality ChIP DNA template to obtain the best sequencing outcomes. The description here is based on experience with the protein product of the gene most strongly implicated in the pathogenesis of type 2 diabetes, namely transcription factor 7-like 2 (TCF7L2). This factor has also been implicated in various cancers. We outline how to generate high-quality ChIP DNA template from the colorectal carcinoma cell line HCT116 in order to build, through sequencing, a high-resolution map of the genes bound by TCF7L2, giving further insight into its key role in the pathogenesis of complex traits.
Molecular Biology, Issue 74, Genetics, Biochemistry, Microbiology, Medicine, Proteins, DNA-Binding Proteins, Transcription Factors, Chromatin Immunoprecipitation, Genes, chromatin, immunoprecipitation, ChIP, DNA, PCR, sequencing, antibody, cross-link, cell culture, assay
Quantification of Atherosclerotic Plaque Activity and Vascular Inflammation using [18-F] Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography (FDG-PET/CT)
Authors: Nehal N. Mehta, Drew A. Torigian, Joel M. Gelfand, Babak Saboury, Abass Alavi.
Institutions: University of Pennsylvania, Perelman School of Medicine.
Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC)1 and carotid intimal medial thickness (C-IMT)2 provide information about the burden of disease. However, despite multiple validation studies of CAC3-5 and C-IMT2,6, these modalities do not accurately assess plaque characteristics7,8, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events9-13. [18F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism14,15. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity16, an important source of cellular inflammation in vessel walls. More recently, we17,18 and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries9,16,19,20. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors21,22 and is also highly associated with overall burden of atherosclerosis23. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12 week) statin therapy24 as well as longer-term therapeutic lifestyle changes (16 months)25.
The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool, and calculation of a target-to-background ratio (TBR) by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability26. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion, as we have shown in several prior publications.9,20,27,28
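The TBR computation described above is a single division; a sketch follows. The MVP definition shown (mean SUV times segmented volume) is a plausible reading of "metabolic volumetric product" and should be treated as an assumption, since the abstract does not define it:

```python
def target_to_background_ratio(arterial_suv, venous_blood_pool_suv):
    """TBR = arterial SUV / venous blood-pool SUV, per the text above."""
    return arterial_suv / venous_blood_pool_suv

def metabolic_volumetric_product(mean_suv, lesion_volume_ml):
    """Assumed global measure: mean SUV times the segmented lesion volume."""
    return mean_suv * lesion_volume_ml
```

For example, an arterial SUV of 2.4 against a blood-pool SUV of 1.2 gives a TBR of 2.0; normalizing by the blood pool is what makes the measure comparable across patients and time points.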
Medicine, Issue 63, FDG-PET/CT, atherosclerosis, vascular inflammation, quantitative radiology, imaging
Transient Expression of Proteins by Hydrodynamic Gene Delivery in Mice
Authors: Daniella Kovacsics, Jayne Raper.
Institutions: Hunter College, CUNY.
Efficient expression of transgenes in vivo is of critical importance in studying gene function and developing treatments for diseases. Over the past years, hydrodynamic gene delivery (HGD) has emerged as a simple, fast, safe and effective method for delivering transgenes into rodents. This technique relies on the force generated by the rapid injection of a large volume of physiological solution to increase the permeability of cell membranes of perfused organs and thus deliver DNA into cells. One of the main advantages of HGD is the ability to introduce transgenes into mammalian cells using naked plasmid DNA (pDNA). Introducing an exogenous gene using a plasmid is minimally laborious, highly efficient and, contrary to viral carriers, remarkably safe. Although HGD was initially used to deliver genes into mice, it is now used to deliver a wide range of substances, including oligonucleotides, artificial chromosomes, RNA, proteins and small molecules into mice, rats and, to a limited degree, other animals. This protocol describes HGD in mice and focuses on three key aspects of the method that are critical to performing the procedure successfully: correct insertion of the needle into the vein, the volume of injection and the speed of delivery. Examples are given to show the application of this method to the transient expression of two genes that encode secreted, primate-specific proteins, apolipoprotein L-I (APOL-I) and haptoglobin-related protein (HPR).
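As a numerical aside on the "large volume" requirement: hydrodynamic tail-vein delivery in mice conventionally uses a volume of roughly 8-10% of body weight injected within a few seconds. The helper below simply encodes that rule of thumb (treating 1 g of body weight as 1 ml); the exact fraction and injection time for any experiment should be taken from the protocol itself:

```python
def hgd_injection_volume_ml(body_weight_g, fraction=0.10):
    """Rule of thumb for hydrodynamic delivery: injected volume is
    approximately 8-10% of body weight (1 g of body weight ~ 1 ml)."""
    return body_weight_g * fraction
```

For a typical 20 g mouse this gives a 2 ml injection, which makes clear why the injection speed and correct venous placement of the needle are the critical parameters the protocol emphasizes.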
Genetics, Issue 87, hydrodynamic gene delivery, hydrodynamics-based transfection, mouse, gene therapy, plasmid DNA, transient gene expression, tail vein injection
Lateral Diffusion and Exocytosis of Membrane Proteins in Cultured Neurons Assessed using Fluorescence Recovery and Fluorescence-loss Photobleaching
Authors: Keri L. Hildick, Inmaculada M. González-González, Frédéric Jaskolski, Jeremy. M. Henley.
Institutions: University of Bristol.
Membrane proteins such as receptors and ion channels undergo active trafficking in neurons, which are highly polarised and morphologically complex. This directed trafficking is of fundamental importance to deliver, maintain or remove synaptic proteins. Super-ecliptic pHluorin (SEP) is a pH-sensitive derivative of eGFP that has been extensively used for live cell imaging of plasma membrane proteins1-2. At low pH, protonation of SEP decreases photon absorption and eliminates fluorescence emission. As most intracellular trafficking events occur in compartments with low pH, where SEP fluorescence is eclipsed, the fluorescence signal from SEP-tagged proteins is predominantly from the plasma membrane, where the SEP is exposed to a neutral pH extracellular environment. When illuminated at high intensity, SEP, like every fluorescent dye, is irreversibly photodamaged (photobleached)3-5. Importantly, because low pH quenches photon absorption, only surface-expressed SEP can be photobleached, whereas intracellular SEP is unaffected by the high intensity illumination6-10. FRAP (fluorescence recovery after photobleaching) of SEP-tagged proteins is a convenient and powerful technique for assessing protein dynamics at the plasma membrane. When fluorescently tagged proteins are photobleached in a region of interest (ROI), the recovery in fluorescence occurs due to the movement of unbleached SEP-tagged proteins into the bleached region. This can occur via lateral diffusion and/or from exocytosis of non-photobleached receptors supplied either by de novo synthesis or recycling (see Fig. 1). The fraction of immobile and mobile protein can be determined, and the mobility and kinetics of the diffusible fraction can be interrogated under basal and stimulated conditions, such as agonist application or neuronal activation stimuli such as NMDA or KCl application8,10. We describe photobleaching techniques designed to selectively visualize the recovery of fluorescence attributable to exocytosis.
Briefly, an ROI is photobleached once, as in standard FRAP protocols, followed, after a brief recovery, by repetitive bleaching of the flanking regions. This 'FRAP-FLIP' protocol, developed in our lab, has been used to characterize AMPA receptor trafficking at dendritic spines10, and is applicable to a wide range of studies evaluating intracellular trafficking and exocytosis.
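The immobile and mobile fractions mentioned in the abstract reduce to a simple normalization of ROI fluorescence. The sketch below is purely illustrative, assuming a single pre-bleach value, an immediate post-bleach value, and a recovery plateau; the function names are ours and not part of the authors' protocol:

```python
def frap_mobile_fraction(f_pre, f_post, f_plateau):
    # Mobile fraction: how much of the bleached signal recovers,
    # scaled so 0 = immediately post-bleach and 1 = pre-bleach level.
    return (f_plateau - f_post) / (f_pre - f_post)

def normalize_trace(trace, f_pre, f_post):
    # Express every timepoint of a recovery trace as fractional recovery.
    return [(f - f_post) / (f_pre - f_post) for f in trace]
```

For example, a recovery plateau of 60 a.u. with pre- and post-bleach values of 100 and 20 a.u. corresponds to a mobile fraction of 0.5 (and hence an immobile fraction of 0.5).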
Neuroscience, Issue 60, Fluorescence Recovery After Photobleaching, FRAP, Confocal imaging, fluorophore, GFP, Super-ecliptic pHluorin, SEP, fluorescence loss in photobleach, FLIP, neuron, protein traffic, synapse
3747
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
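The keywords name minimum-norm estimation, which in its simplest regularized form computes source amplitudes as s = Lᵀ(LLᵀ + λI)⁻¹x from a leadfield matrix L and sensor data x. The toy two-sensor sketch below (made-up leadfield values; real pipelines use dedicated packages and realistic head models built from the MRI) only illustrates the linear algebra:

```python
def mne_inverse(leadfield, data, lam):
    # Minimum-norm estimate s = L^T (L L^T + lam*I)^-1 x
    # for a toy 2-sensor case; leadfield is 2 x n_sources.
    n_src = len(leadfield[0])
    # Gram matrix G = L L^T + lam*I (2x2)
    G = [[sum(leadfield[i][k] * leadfield[j][k] for k in range(n_src))
          + (lam if i == j else 0.0)
          for j in range(2)] for i in range(2)]
    # Explicit 2x2 inverse
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    Ginv = [[ G[1][1] / det, -G[0][1] / det],
            [-G[1][0] / det,  G[0][0] / det]]
    # Sensor weights w = G^-1 x, then back-project: s = L^T w
    w = [sum(Ginv[i][j] * data[j] for j in range(2)) for i in range(2)]
    return [sum(leadfield[i][k] * w[i] for i in range(2)) for k in range(n_src)]
```

With an identity leadfield and no regularization the sources simply reproduce the sensor data; increasing λ shrinks the estimate toward zero, trading fidelity for noise robustness.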
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
51705
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that lines the interior of blood vessels and serves both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species which regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve.
This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
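The percent diameter change that defines FMD is a one-line calculation; the sketch below is a generic illustration of that formula, not the authors' analysis software:

```python
def fmd_percent(baseline_diameter_mm, peak_diameter_mm):
    # FMD (%) = (peak - baseline) / baseline * 100, where the peak
    # diameter is measured during reactive hyperemia after cuff release.
    return (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm * 100.0
```

A brachial artery dilating from 4.0 mm at baseline to 4.5 mm during hyperemia corresponds to an FMD of 12.5%.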
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
52070
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
51242
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
4375
Polysome Fractionation and Analysis of Mammalian Translatomes on a Genome-wide Scale
Authors: Valentina Gandin, Kristina Sikström, Tommy Alain, Masahiro Morita, Shannon McLaughlan, Ola Larsson, Ivan Topisirovic.
Institutions: McGill University, Karolinska Institutet.
mRNA translation plays a central role in the regulation of gene expression and represents the most energy consuming process in mammalian cells. Accordingly, dysregulation of mRNA translation is considered to play a major role in a variety of pathological states including cancer. Ribosomes also host chaperones, which facilitate folding of nascent polypeptides, thereby modulating function and stability of newly synthesized polypeptides. In addition, emerging data indicate that ribosomes serve as a platform for a repertoire of signaling molecules, which are implicated in a variety of post-translational modifications of newly synthesized polypeptides as they emerge from the ribosome, and/or components of translational machinery. Herein, a well-established method of ribosome fractionation using sucrose density gradient centrifugation is described. In conjunction with the in-house developed “anota” algorithm this method allows direct determination of differential translation of individual mRNAs on a genome-wide scale. Moreover, this versatile protocol can be used for a variety of biochemical studies aiming to dissect the function of ribosome-associated protein complexes, including those that play a central role in folding and degradation of newly synthesized polypeptides.
Biochemistry, Issue 87, Cells, Eukaryota, Nutritional and Metabolic Diseases, Neoplasms, Metabolic Phenomena, Cell Physiological Phenomena, mRNA translation, ribosomes, protein synthesis, genome-wide analysis, translatome, mTOR, eIF4E, 4E-BP1
51455
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
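The time-stamped event records described above are analyzed with the authors' MATLAB-based language; purely for illustration, the same kind of harvesting can be sketched in Python. The record layout and event codes here are invented, not the system's actual format:

```python
# Hypothetical event log: (timestamp, subject id, event code)
events = [
    ("2024-01-01 06:02:11", "m01", "head_entry"),
    ("2024-01-01 06:02:15", "m01", "pellet_delivered"),
    ("2024-01-01 18:40:03", "m01", "head_entry"),
]

def count_events(records, subject, code):
    # Tally one event type for one subject across the full record,
    # preserving the raw event list as the single source of truth.
    return sum(1 for _, s, c in records if s == subject and c == code)
```

Because every analysis reads from the same raw event list, summaries like daily head-entry counts can be regenerated at any time without an intermediate, mutated data file.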
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
51047
Assessment of Morphine-induced Hyperalgesia and Analgesic Tolerance in Mice Using Thermal and Mechanical Nociceptive Modalities
Authors: Khadija Elhabazi, Safia Ayachi, Brigitte Ilien, Frédéric Simonin.
Institutions: Université de Strasbourg.
Opioid-induced hyperalgesia and tolerance severely impact the clinical efficacy of opiates as pain relievers in animals and humans. The molecular mechanisms underlying both phenomena are not well understood and their elucidation should benefit from the study of animal models and from the design of appropriate experimental protocols. We describe here a methodological approach for inducing, recording and quantifying morphine-induced hyperalgesia as well as for evidencing analgesic tolerance, using the tail-immersion and tail pressure tests in wild-type mice. As shown in the video, the protocol is divided into five sequential steps. Handling and habituation phases allow a safe determination of the basal nociceptive response of the animals. Chronic morphine administration induces significant hyperalgesia as shown by an increase in both thermal and mechanical sensitivity, whereas the comparison of analgesia time-courses after acute or repeated morphine treatment clearly indicates the development of tolerance manifested by a decline in analgesic response amplitude. This protocol may be similarly adapted to genetically modified mice in order to evaluate the role of individual genes in the modulation of nociception and morphine analgesia. It also provides a model system to investigate the effectiveness of potential therapeutic agents to improve opiate analgesic efficacy.
Neuroscience, Issue 89, mice, nociception, tail immersion test, tail pressure test, morphine, analgesia, opioid-induced hyperalgesia, tolerance
51264
Dried Blood Spot Collection of Health Biomarkers to Maximize Participation in Population Studies
Authors: Michael W. Ostler, James H. Porter, Orfeu M. Buxton.
Institutions: Harvard School of Public Health, Brigham and Women's Hospital, Harvard Medical School, Pennsylvania State University.
Biomarkers are directly-measured biological indicators of disease, health, exposures, or other biological information. In population and social sciences, biomarkers need to be easy to obtain, transport, and analyze. Dried Blood Spots meet this need, and can be collected in the field with high response rates. These elements are particularly important in longitudinal study designs, including interventions, where attrition is critical to avoid and high response rates improve the interpretation of results. Dried Blood Spot sample collection is simple, quick, relatively painless, less invasive than venipuncture, and has minimal field storage requirements (i.e., samples do not need to be immediately frozen and can be stored for a long period of time in a stable freezer environment before assay). The samples can be analyzed for a variety of different analytes, including cholesterol, C-reactive protein, glycosylated hemoglobin, and numerous cytokines, as well as provide genetic material. DBS collection is depicted as employed in several recent studies.
Medicine, Issue 83, dried blood spots (DBS), Biomarkers, cardiometabolic risk, Inflammation, standard precautions, blood collection
50973
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system.
By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
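The outcome measures reported here follow directly from the thresholds given in the abstract (40, 70, and 180 mg/dL). The sketch below is a minimal illustration of how such prevalence figures are computed from a series of readings, not WMC's actual system:

```python
def glucose_metrics(readings_mg_dl):
    # Fraction of readings in each category, using the abstract's
    # thresholds: <40 severe hypo, <70 hypo, 70-180 target, >180 hyper.
    n = len(readings_mg_dl)
    return {
        "in_range":    sum(70 <= g <= 180 for g in readings_mg_dl) / n,
        "hypo":        sum(g < 70 for g in readings_mg_dl) / n,
        "severe_hypo": sum(g < 40 for g in readings_mg_dl) / n,
        "hyper":       sum(g > 180 for g in readings_mg_dl) / n,
    }
```

For instance, readings of 50, 100, 150, and 200 mg/dL give 50% in range, 25% hypoglycemic, and 25% hyperglycemic.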
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
3705
Enrichment of NK Cells from Human Blood with the RosetteSep Kit from StemCell Technologies
Authors: Christine Beeton, K. George Chandy.
Institutions: University of California, Irvine (UCI).
Natural killer (NK) cells are large granular cytotoxic lymphocytes that belong to the innate immune system and play major roles in fighting cancer and infections, but are also implicated in the early stages of pregnancy and transplant rejection. These cells are present in peripheral blood, from which they can be isolated. Cells can be isolated using either positive or negative selection. For positive selection, we use antibodies directed to a surface marker present only on the cells of interest, whereas for negative selection we use cocktails of antibodies targeted to surface markers present on all cells but the cells of interest. This latter technique has the advantage of leaving the cells of interest free of antibodies, thereby reducing the risk of unwanted cell activation or differentiation. In this video-protocol we demonstrate how to separate NK cells from human blood by negative selection, using the RosetteSep kit from StemCell Technologies. The procedure involves obtaining human peripheral blood (under an institutional review board-approved protocol to protect the human subjects) and mixing it with a cocktail of antibodies that will bind to markers absent on NK cells, but present on all other mononuclear cells in peripheral blood (e.g., T lymphocytes, monocytes...). The antibodies in the cocktail are conjugated to antibodies directed to glycophorin A on erythrocytes. All unwanted cells and red blood cells will therefore be trapped in complexes. The mix of blood and antibody cocktail is then diluted, overlaid on a Histopaque gradient, and centrifuged. NK cells (>80% pure) can be collected at the interface between the Histopaque and the diluted plasma. Similar cocktails are available for enrichment of other cell populations, such as human T lymphocytes.
Immunology, issue 8, blood, cell isolation, natural killer, lymphocyte, primary cells, negative selection, PBMC, Ficoll gradient, cell separation
326
Brain Imaging Investigation of the Neural Correlates of Observing Virtual Social Interactions
Authors: Keen Sung, Sanda Dolcos, Sophie Flor-Henry, Crystal Zhou, Claudia Gasior, Jennifer Argo, Florin Dolcos.
Institutions: University of Alberta, University of Illinois at Urbana-Champaign.
The ability to gauge social interactions is crucial in the assessment of others' intentions. Factors such as facial expressions and body language affect our decisions in personal and professional life alike1. These "friend or foe" judgements are often based on first impressions, which in turn may affect our decisions to "approach or avoid". Previous studies investigating the neural correlates of social cognition tended to use static facial stimuli2. Here, we illustrate an experimental design in which whole-body animated characters were used in conjunction with functional magnetic resonance imaging (fMRI) recordings. Fifteen participants were presented with short movie-clips of guest-host interactions in a business setting, while fMRI data were recorded; at the end of each movie, participants also provided ratings of the host behaviour. This design mimics real-life situations more closely, and hence may contribute to a better understanding of the neural mechanisms of social interactions in healthy behaviour, and to gaining insight into possible causes of deficits in social behaviour in clinical conditions such as social anxiety and autism3.
Neuroscience, Issue 53, Social Perception, Social Knowledge, Social Cognition Network, Non-Verbal Communication, Decision-Making, Event-Related fMRI
2379
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.