JoVE Visualize
PubMed Article
Semi-automatic quantification of subsolid pulmonary nodules: comparison with manual measurements.
PLoS ONE
PUBLISHED: 01-01-2013
Accurate measurement of subsolid pulmonary nodules (SSNs) is becoming increasingly important in the management of these nodules. SSNs were previously quantified with time-consuming manual measurements. The aim of the present study is to test the feasibility of semi-automatic SSN measurements and to compare the results with manual measurements.
ABSTRACT (Related JoVE Video)
Most techniques used to study small molecules, such as pharmaceutical drugs or endogenous metabolites, employ tissue extracts, which require homogenization of the tissue of interest and could therefore alter the metabolic pathways being studied1. Mass spectrometric imaging (MSI) is a powerful analytical tool that can provide spatial information on analytes within intact slices of biological tissue samples1-5. This technique has been used extensively to study various types of compounds, including proteins, peptides, lipids, and small molecules such as endogenous metabolites. With matrix-assisted laser desorption/ionization (MALDI)-MSI, spatial distributions of multiple metabolites can be detected simultaneously. Herein, a method developed specifically for conducting untargeted metabolomics MSI experiments on legume roots and root nodules is presented, which could reveal insights into the biological processes taking place. The method presented here follows a typical MSI workflow, from sample preparation to image acquisition, and focuses on the matrix application step, demonstrating several matrix application techniques that are useful for detecting small molecules. Once the MS images are generated, the analysis and identification of metabolites of interest is discussed and demonstrated. The standard workflow presented here can be easily modified for different tissue types, molecular species, and instrumentation.
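To make the image-generation step concrete: a common way to build an ion image from MSI data is to integrate a narrow m/z window at every pixel and normalize to the total ion current. The Python sketch below is a minimal illustration under the assumption that the data have already been loaded as a (rows x columns x m/z) intensity cube with a common m/z axis; the function name and demo values are hypothetical, not part of the published protocol.

```python
import numpy as np

def ion_image(cube, mz_axis, mz_target, tol=0.25, tic_normalize=True):
    """Extract a single-ion image from an MSI data cube.

    cube      : ndarray, shape (rows, cols, n_mz) -- one spectrum per pixel
    mz_axis   : ndarray, shape (n_mz,)            -- common m/z axis
    mz_target : m/z value of the metabolite of interest
    tol       : half-width of the integration window (Da)
    """
    window = (mz_axis >= mz_target - tol) & (mz_axis <= mz_target + tol)
    img = cube[:, :, window].sum(axis=2)
    if tic_normalize:
        tic = cube.sum(axis=2)  # total ion current per pixel
        img = np.divide(img, tic, out=np.zeros_like(img, dtype=float),
                        where=tic > 0)
    return img

# Demo on random data; a hypothetical metabolite ion at m/z 175.119
cube = np.random.rand(64, 64, 500)
mz_axis = np.linspace(100, 600, 500)
img = ion_image(cube, mz_axis, 175.119)
```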
22 Related JoVE Articles!
High-throughput Image Analysis of Tumor Spheroids: A User-friendly Software Application to Measure the Size of Spheroids Automatically and Accurately
Authors: Wenjin Chen, Chung Wong, Evan Vosburgh, Arnold J. Levine, David J. Foran, Eugenia Y. Xu.
Institutions: Raymond and Beverly Sackler Foundation, New Jersey, Rutgers University, Rutgers University, Institute for Advanced Study, New Jersey.
The increasing use of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use, free image analysis software that meets this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different spreadsheet forms for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis engine, adapted for large numbers of images; it provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with the uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software markedly reduces labor and speeds up the analysis process. Implementing this software should help 3D tumor spheroids become a routine in vitro model for drug screens in industry and academia.
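The volume calculation mentioned above can be reproduced from the two axial lengths alone. A minimal sketch, assuming (a common choice for spheroids, though the paper's exact formula is not quoted here) that the spheroid is modeled as a prolate ellipsoid rotationally symmetric about its major axis:

```python
import math

def spheroid_volume(major_um, minor_um):
    """Volume of a prolate spheroid from its major and minor axial lengths.

    With axial lengths L (major) and W (minor), the semi-axes are L/2 and
    W/2, so V = (4/3)*pi*(L/2)*(W/2)^2 = (pi/6)*L*W^2.
    """
    return math.pi / 6.0 * major_um * minor_um ** 2

# A spheroid measuring 500 x 400 um -> volume in cubic micrometers
print(f"{spheroid_volume(500, 400):.3e} um^3")
```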
Cancer Biology, Issue 89, computer programming, high-throughput, image analysis, tumor spheroids, 3D, software application, cancer therapy, drug screen, neuroendocrine tumor cell line, BON-1, cancer research
Experimental Metastasis and CTL Adoptive Transfer Immunotherapy Mouse Model
Authors: Mary Zimmerman, Xiaolin Hu, Kebin Liu.
Institutions: Medical College of Georgia.
The experimental metastasis mouse model is a simple yet physiologically relevant metastasis model. Tumor cells are injected intravenously (i.v.) into the mouse tail vein and colonize the lungs, thereby resembling the last steps of spontaneous tumor cell metastasis: survival in the circulation, extravasation, and colonization of distal organs. From a therapeutic point of view, the experimental metastasis model is the simplest and most suitable model, since the target of therapy is often the end point of metastasis: an established metastatic tumor in a distal organ. In this model, tumor cells are injected i.v. into the mouse tail vein and allowed to colonize and grow in the lungs. Tumor-specific CTLs are then injected i.v. into the metastases-bearing mouse. The number and size of the lung metastases can be controlled by the number of tumor cells injected and the duration of tumor growth. Therefore, various stages of metastasis, from minimal to extensive, can be modeled. Lung metastases are analyzed by inflation with ink, allowing easier visual observation and quantification.
Immunology, Issue 45, Metastasis, CTL adoptive transfer, Lung, Tumor Immunology
Doppler Optical Coherence Tomography of Retinal Circulation
Authors: Ou Tan, Yimin Wang, Ranjith K. Konduru, Xinbo Zhang, SriniVas R. Sadda, David Huang.
Institutions: Oregon Health and Science University , University of Southern California.
Noncontact retinal blood flow measurements are performed with a Fourier-domain optical coherence tomography (OCT) system using a circumpapillary double circular scan (CDCS) that scans around the optic nerve head at 3.40 mm and 3.75 mm diameters. The double concentric circular scan is repeated 6 times consecutively over 2 sec. The CDCS scan is saved with Doppler shift information from which flow can be calculated. The standard clinical protocol calls for 3 CDCS scans made with the OCT beam passing through the superonasal edge of the pupil and 3 CDCS scans through the inferonasal pupil. This double-angle protocol ensures that an acceptable Doppler angle is obtained on each retinal branch vessel in at least 1 scan. The CDCS scan data, a 3-dimensional volumetric OCT scan of the optic disc, and a color photograph of the optic disc are used together to obtain retinal blood flow measurements on an eye. We have developed blood flow measurement software called "Doppler optical coherence tomography of retinal circulation" (DOCTORC). This semi-automated software is used to measure total retinal blood flow, vessel cross-sectional area, and average blood velocity. The flow in each vessel is calculated from the Doppler shift in the vessel cross-sectional area and the Doppler angle between the vessel and the OCT beam. Total retinal blood flow is summed from the veins around the optic disc. The results obtained at our Doppler OCT reading center showed good reproducibility between graders and methods (<10%). Total retinal blood flow could be useful in the management of glaucoma and other retinal diseases. In glaucoma patients, OCT retinal blood flow measurement was highly correlated with visual field loss (R2>0.57 with visual field pattern deviation). Doppler OCT is a new method for rapid, noncontact, and repeatable measurement of total retinal blood flow using widely available Fourier-domain OCT instrumentation. This new technology may improve the practicality of making these measurements in clinical studies and routine clinical practice.
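For readers who want the arithmetic behind the per-vessel calculation, the sketch below applies the standard Doppler OCT relation, axial velocity v = f_D * lambda0 / (2 * n * cos(theta)), and multiplies by the lumen cross-section; the wavelength, tissue refractive index, and example vein values are illustrative assumptions, not values taken from the DOCTORC software.

```python
import math

def vessel_flow_ul_per_min(doppler_shift_hz, doppler_angle_deg,
                           area_mm2, wavelength_m=840e-9, n_tissue=1.38):
    """Blood flow in one retinal vessel from Doppler OCT measurements.

    theta is the angle between the vessel and the OCT beam; flow is the
    mean axial-corrected velocity times the lumen cross-sectional area.
    """
    cos_t = math.cos(math.radians(doppler_angle_deg))
    v_m_per_s = doppler_shift_hz * wavelength_m / (2 * n_tissue * cos_t)
    flow_m3_per_s = v_m_per_s * area_mm2 * 1e-6   # mm^2 -> m^2
    return flow_m3_per_s * 1e9 * 60               # m^3/s -> uL/min

# Total retinal flow: sum over the veins around the optic disc
# (hypothetical per-vein values: Doppler shift Hz, angle deg, area mm^2)
veins = [(1200.0, 80.0, 0.012), (950.0, 78.0, 0.010)]
total = sum(vessel_flow_ul_per_min(f, a, s) for f, a, s in veins)
print(f"total venous flow ~ {total:.1f} uL/min")
```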
Medicine, Issue 67, Ophthalmology, Physics, Doppler optical coherence tomography, total retinal blood flow, dual circular scan pattern, image analysis, semi-automated grading software, optic disc
Microvolume Protein Concentration Determination using the NanoDrop 2000c Spectrophotometer
Authors: Philippe Desjardins, Joel B. Hansen, Michael Allen.
Institutions: Thermo Scientific NanoDrop Products.
Traditional spectrophotometry requires placing samples into cuvettes or capillaries. This is often impractical because of the limited sample volumes typically available for protein analysis. The Thermo Scientific NanoDrop 2000c Spectrophotometer solves this issue with an innovative sample-retention system that holds microvolume samples between two measurement surfaces using the surface tension properties of liquids, enabling the quantification of samples in volumes as low as 0.5-2 μL. The elimination of cuvettes or capillaries allows real-time changes in path length, which reduces the measurement time while greatly increasing the dynamic range of protein concentrations that can be measured. The need for dilutions is also eliminated, and preparation for sample quantification is simple, as the measurement surfaces can merely be wiped with a laboratory wipe. This video article presents modifications to traditional protein concentration determination methods for quantification of microvolume amounts of protein using A280 absorbance readings or the BCA colorimetric assay.
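The A280 method reduces to the Beer-Lambert law with a path-length normalization, which is what makes the variable-path microvolume measurement work. A minimal sketch, where the generic 1% extinction coefficient and the example reading are placeholders to be replaced with protein-specific values:

```python
def protein_conc_mg_per_ml(a280, path_mm, e_percent=10.0):
    """Protein concentration via the Beer-Lambert law.

    a280      : absorbance at 280 nm at the instrument's path length
    path_mm   : optical path length used for the reading (mm)
    e_percent : extinction coefficient of a 1% (10 mg/mL) solution at
                280 nm and a 10 mm path (protein-specific; 10.0 is a
                generic placeholder)
    """
    a280_10mm = a280 * (10.0 / path_mm)   # normalize to a 10 mm path
    return a280_10mm * 10.0 / e_percent   # 1% solution = 10 mg/mL

# A reading of 0.25 A at a 1 mm microvolume path -> 2.5 mg/mL
print(protein_conc_mg_per_ml(0.25, path_mm=1.0))
```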
Basic Protocols, Issue 33, NanoDrop, protein measurement, protein concentration, spectrophotometer, A280, UV/Vis, BCA, microvolume, microsample, proteomics
Echocardiographic Assessment of the Right Heart in Mice
Authors: Evan Brittain, Niki L. Penner, James West, Anna Hemnes.
Institutions: Vanderbilt University Medical Center, Vanderbilt University Medical Center.
Transgenic and toxin-induced models of pulmonary arterial hypertension (PAH) are widely used to study the pathophysiology of PAH and to investigate potential therapies. Given the expense and time involved in creating animal models of disease, it is critical that researchers have tools to accurately assess phenotypic expression of disease. Right ventricular (RV) dysfunction is the major manifestation of pulmonary hypertension. Echocardiography is the mainstay of the noninvasive assessment of right ventricular function in rodent models and has the advantage of clear translation to humans, in whom the same tool is used. Published echocardiography protocols in murine models of PAH are lacking. In this article, we describe a protocol for assessing RV and pulmonary vascular function in a mouse model of PAH with a dominant-negative BMPRII mutation; however, this protocol is applicable to any disease affecting the pulmonary vasculature or right heart. We provide a detailed description of animal preparation, image acquisition, and hemodynamic calculations of stroke volume, cardiac output, and an estimate of pulmonary artery pressure.
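The hemodynamic calculations named above follow standard pulsed-wave Doppler relations: stroke volume is the velocity-time integral (VTI) times the pulmonary artery cross-sectional area, and cardiac output is stroke volume times heart rate. A minimal sketch with illustrative murine values (the numbers are assumptions, not reference values from this protocol):

```python
import math

def stroke_volume_ul(vti_mm, pa_diameter_mm):
    """Stroke volume = VTI x pulmonary artery cross-sectional area.
    With VTI in mm and area in mm^2, the result is in mm^3, i.e. uL."""
    area_mm2 = math.pi * (pa_diameter_mm / 2.0) ** 2
    return vti_mm * area_mm2

def cardiac_output_ml_per_min(vti_mm, pa_diameter_mm, heart_rate_bpm):
    """Cardiac output = stroke volume x heart rate (uL/min -> mL/min)."""
    return stroke_volume_ul(vti_mm, pa_diameter_mm) * heart_rate_bpm / 1000.0

# Illustrative mouse values: VTI 20 mm, PA diameter 1.3 mm, HR 450 bpm
sv = stroke_volume_ul(20.0, 1.3)
co = cardiac_output_ml_per_min(20.0, 1.3, 450)
print(f"SV ~ {sv:.1f} uL, CO ~ {co:.1f} mL/min")
```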
Medicine, Issue 81, Anatomy, Physiology, Biomedical Engineering, Cardiology, Cardiac Imaging Techniques, Echocardiography, Echocardiography, Doppler, Cardiovascular Physiological Processes, Cardiovascular System, Cardiovascular Diseases, Echocardiography, right ventricle, right ventricular function, pulmonary hypertension, Pulmonary Arterial Hypertension, transgenic models, hemodynamics, animal model
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes, a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite for a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality-control steps and direct membrane-staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and the derivation of complex physiological properties of TATS membrane networks in living myocytes, with high throughput and open-access software tools. In summary, the combined protocol strategy can be readily applied to quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
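As an illustration of the binarization/skeletonization step, the sketch below uses scikit-image (an assumption; the abstract names open-access tools only generically) to threshold a TATS fluorescence image, extract a one-pixel-wide skeleton, and report a crude network-density metric. The pixel size and demo image are placeholders.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def tats_skeleton_stats(image, pixel_size_um=0.1):
    """Binarize a TATS image and skeletonize the tubule network.

    Returns the binary mask, the skeleton, and a simple density metric:
    estimated skeleton length per image area (um per um^2).
    """
    mask = image > threshold_otsu(image)      # global Otsu threshold
    skel = skeletonize(mask)                  # 1-pixel-wide network
    length_um = skel.sum() * pixel_size_um    # crude length estimate
    area_um2 = mask.size * pixel_size_um ** 2
    return mask, skel, length_um / area_um2

# Demo on synthetic data; replace with a confocal image of a stained myocyte
img = np.random.rand(512, 512)
mask, skel, density = tats_skeleton_stats(img)
```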
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Videomorphometric Analysis of Hypoxic Pulmonary Vasoconstriction of Intra-pulmonary Arteries Using Murine Precision Cut Lung Slices
Authors: Renate Paddenberg, Petra Mermer, Anna Goldenberg, Wolfgang Kummer.
Institutions: Justus-Liebig-University.
Acute alveolar hypoxia causes hypoxic pulmonary vasoconstriction (HPV) - also known as the von Euler-Liljestrand mechanism - which serves to match lung perfusion to ventilation. To date, the underlying mechanisms are not fully understood. The major vascular segment contributing to HPV is the intra-acinar artery. This vessel section is responsible for the blood supply of an individual acinus, which is defined as the portion of lung distal to a terminal bronchiole. Intra-acinar arteries are mostly located in the part of the lung that cannot be selectively reached by a number of commonly used techniques, such as measurement of the pulmonary artery pressure in isolated perfused lungs or force recordings from dissected proximal pulmonary artery segments1,2. The analysis of subpleural vessels by real-time confocal laser scanning luminescence microscopy is limited to vessels up to 50 µm in diameter3. We provide a technique to study HPV of murine intra-pulmonary arteries with inner diameters of 20-100 µm. It is based on the videomorphometric analysis of cross-sectioned arteries in precision-cut lung slices (PCLS). This method allows the quantitative measurement of vasoreactivity of small intra-acinar arteries with inner diameters of 20-40 µm, which are located at gussets of alveolar septa next to alveolar ducts, and of larger pre-acinar arteries with inner diameters of 40-100 µm, which run adjacent to bronchi and bronchioles. In contrast to real-time imaging of subpleural vessels in anesthetized and ventilated mice, videomorphometric analysis of PCLS occurs under conditions free of shear stress. In our experimental model, both arterial segments exhibit a monophasic HPV when exposed to medium gassed with 1% O2, and the response fades after 30-40 min of hypoxia.
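Videomorphometry ultimately reduces each video frame to a luminal cross-sectional area, and vasoreactivity is reported relative to the pre-hypoxic baseline. A minimal sketch of that normalization (the area trace is hypothetical):

```python
import numpy as np

def luminal_area_timecourse(areas_um2):
    """Express vessel luminal areas as percent of the initial (normoxic)
    value, e.g. from frame-by-frame outlining of one artery in a PCLS video.
    """
    areas = np.asarray(areas_um2, dtype=float)
    return 100.0 * areas / areas[0]

# Hypothetical intra-acinar artery constricting under 1% O2, then fading
trace = luminal_area_timecourse([900, 840, 760, 700, 720, 810, 880])
print(trace.round(1))  # percent of baseline area per frame
```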
Medicine, Issue 83, Hypoxic pulmonary vasoconstriction, murine lungs, precision cut lung slices, intra-pulmonary, pre- and intra-acinar arteries, videomorphometry
A Comprehensive Protocol for Manual Segmentation of the Medial Temporal Lobe Structures
Authors: Matthew Moore, Yifan Hu, Sarah Woo, Dylan O'Hearn, Alexandru D. Iordan, Sanda Dolcos, Florin Dolcos.
Institutions: University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign.
The present paper describes a comprehensive protocol for manual tracing of the set of brain regions comprising the medial temporal lobe (MTL): the amygdala, hippocampus, and associated parahippocampal regions (perirhinal, entorhinal, and parahippocampal proper). Unlike most other available tracing protocols, which typically focus on certain MTL areas (e.g., amygdala and/or hippocampus), the integrative perspective adopted by the present tracing guidelines allows for clear localization of all MTL subregions. By integrating information from a variety of sources, including extant tracing protocols separately targeting various MTL structures, histological reports, and brain atlases, and complemented by illustrative visual materials, the present protocol provides an accurate, intuitive, and convenient guide for understanding MTL anatomy. The need for such tracing guidelines is also emphasized by illustrating possible differences between automatic and manual segmentation protocols. This knowledge can be applied toward research involving not only structural MRI investigations but also structural-functional colocalization and fMRI signal extraction from anatomically defined ROIs, in healthy and clinical groups alike.
Neuroscience, Issue 89, Anatomy, Segmentation, Medial Temporal Lobe, MRI, Manual Tracing, Amygdala, Hippocampus, Perirhinal Cortex, Entorhinal Cortex, Parahippocampal Cortex
Electrochemotherapy of Tumours
Authors: Gregor Sersa, Damijan Miklavcic.
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of electric pulse application. Drug uptake by delivery of electric pulses is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is impeded. Among the many drugs that have been tested so far, bleomycin and cisplatin have found their way from preclinical testing to clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of the treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated every few weeks with equal effectiveness each time. The treatment results in effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions
High Resolution 3D Imaging of Ex-Vivo Biological Samples by Micro CT
Authors: Amnon Sharir, Gregory Ramniceanu, Vlad Brumfeld.
Institutions: Weizmann Institute of Science, Weizmann Institute of Science, Weizmann Institute of Science.
Non-destructive volume visualization can be achieved only by tomographic techniques, of which the most efficient is x-ray micro-computed tomography (μCT). High-resolution μCT is a very versatile yet accurate (1-2 μm resolution) technique for 3D examination of ex-vivo biological samples1, 2. As opposed to electron tomography, μCT allows the examination of samples up to 4 cm thick. The technique requires only a few hours of measurement, compared to weeks for histology. In addition, μCT does not rely on 2D stereologic models, so it may complement, and in some cases even replace, histological methods3, 4, which are both time-consuming and destructive. Sample conditioning and positioning in μCT are straightforward and do not require high vacuum or low temperatures, which may adversely affect the structure. The sample is positioned and rotated 180° or 360° between a microfocused x-ray source and a detector, which includes a scintillator and an accurate CCD camera. For each angle a 2D image is taken, and then the entire volume is reconstructed using one of the different available algorithms5-7. The 3D resolution improves as the rotation step decreases. The present video protocol shows the main steps in preparation, immobilization and positioning of the sample, followed by imaging at high resolution.
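The reconstruction step can be illustrated in a few lines with scikit-image's Radon-transform tools (an assumption for illustration; commercial μCT scanners ship their own reconstruction algorithms5-7): simulate projections of a test slice at a chosen rotation step, then recover the slice by filtered back-projection.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

# Simulate a uCT acquisition of one slice: projections over 180 degrees
phantom = shepp_logan_phantom()              # 400 x 400 test slice
angles = np.arange(0.0, 180.0, 0.5)          # 0.5 degree rotation step
sinogram = radon(phantom, theta=angles)      # one column per angle

# Filtered back-projection; a finer rotation step sharpens the result
slice_fbp = iradon(sinogram, theta=angles, filter_name='ramp')
```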
Bioengineering, Issue 52, 3D imaging, tomography, x-ray, non invasive, ex-vivo
Measuring Changes in Tactile Sensitivity in the Hind Paw of Mice Using an Electronic von Frey Apparatus
Authors: Tijana Martinov, Madison Mack, Akilah Sykes, Devavani Chatterjea.
Institutions: Macalester College.
Measuring inflammation-induced changes in thresholds of hind paw withdrawal from mechanical pressure is a useful technique to assess changes in pain perception in rodents. Withdrawal thresholds can be measured first at baseline and then following drug, venom, injury, allergen, or otherwise evoked inflammation by applying an accurate force on very specific areas of the skin. An electronic von Frey apparatus allows precise assessment of mouse hind paw withdrawal thresholds that, in contrast to classical von Frey measurements, are not limited by the available filament sizes. The ease and rapidity of measurement allow assessment of tactile sensitivity outcomes to be incorporated into diverse models of rapid-onset inflammatory and neuropathic pain, as multiple measurements can be taken within a short time period. Experimental measurements for individual rodent subjects can be internally controlled against individual baseline responses, and exclusion criteria can easily be established to standardize baseline responses within and across experimental groups. Thus, measurements using an electronic von Frey apparatus represent a useful modification of the well-established classical von Frey filament-based assays for rodent mechanical allodynia that may also be applied to other nonhuman mammalian models.
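Because each animal serves as its own control, analysis is typically a percent change from that animal's baseline threshold. A minimal sketch (the threshold values in grams are hypothetical):

```python
import numpy as np

def withdrawal_change(baseline_g, post_g):
    """Percent change in hind paw withdrawal threshold, computed against
    each animal's own baseline measurement."""
    baseline = np.asarray(baseline_g, dtype=float)
    post = np.asarray(post_g, dtype=float)
    return 100.0 * (post - baseline) / baseline

# Hypothetical thresholds (grams of force) before and after inflammation
print(withdrawal_change([8.1, 7.6, 8.4], [4.0, 3.9, 4.6]).round(1))
```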
Neuroscience, Issue 82, Natural Science Disciplines, Life Sciences (General), Behavioral Sciences, mechanical hyperalgesia, mice, electronic pressure meter, inflammation, snake venom
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests1,2. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training in the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
Single-plant, Sterile Microcosms for Nodulation and Growth of the Legume Plant Medicago truncatula with the Rhizobial Symbiont Sinorhizobium meliloti
Authors: Kathryn M. Jones, Hajeewaka C. Mendis, Clothilde Queiroux.
Institutions: Florida State University.
Rhizobial bacteria form symbiotic, nitrogen-fixing nodules on the roots of compatible host legume plants. One of the most well-developed model systems for studying these interactions is the plant Medicago truncatula cv. Jemalong A17 and the rhizobial bacterium Sinorhizobium meliloti 1021. Repeated imaging of plant roots and scoring of symbiotic phenotypes requires methods that are non-destructive to either plants or bacteria. The symbiotic phenotypes of some plant and bacterial mutants become apparent after relatively short periods of growth, and do not require long-term observation of the host/symbiont interaction. However, subtle differences in symbiotic efficiency and nodule senescence phenotypes that are not apparent in the early stages of the nodulation process require relatively long growth periods before they can be scored. Several methods have been developed for long-term growth and observation of this host/symbiont pair. However, many of these methods require repeated watering, which increases the possibility of contamination by other microbes. Other methods require a relatively large space for growth of large numbers of plants. The method described here, symbiotic growth of M. truncatula/S. meliloti in sterile, single-plant microcosms, has several advantages. Plants in these microcosms have sufficient moisture and nutrients to ensure that watering is not required for up to 9 weeks, preventing cross-contamination during watering. This allows phenotypes to be quantified that might be missed in short-term growth systems, such as subtle delays in nodule development and early nodule senescence. Also, the roots and nodules in the microcosm are easily viewed through the plate lid, so uprooting of the plants for observation is not required.
Environmental Sciences, Issue 80, Plant Roots, Medicago, Gram-Negative Bacteria, Nitrogen, Microbiological Techniques, Bacterial Processes, Symbiosis, botany, microbiology, Medicago truncatula, Sinorhizobium meliloti, nodule, nitrogen fixation, legume, rhizobia, bacteria
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
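Point (ii) above, quantitation against a single reference compound, rests on the proportionality of NMR signal area to concentration times the number of contributing protons. A minimal sketch of that calculation (the peak areas and TSP reference values are illustrative assumptions):

```python
def metabolite_conc_mM(peak_area, n_protons, ref_area, ref_n_protons,
                       ref_conc_mM):
    """Absolute metabolite concentration from a 1H NMR spectrum.

    Relies on the linear NMR response: signal area is proportional to
    (concentration x number of contributing protons), so one internal
    reference of known concentration calibrates every metabolite.
    """
    return ref_conc_mM * (peak_area / ref_area) * (ref_n_protons / n_protons)

# Lactate methyl doublet (3 protons) against a TSP reference (9 protons)
print(metabolite_conc_mM(peak_area=4.2e6, n_protons=3,
                         ref_area=9.0e6, ref_n_protons=9,
                         ref_conc_mM=0.5))  # -> 0.7 mM
```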
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
RNA-seq Analysis of Transcriptomes in Thrombin-treated and Control Human Pulmonary Microvascular Endothelial Cells
Authors: Dilyara Cheranova, Margaret Gibson, Suman Chaudhary, Li Qin Zhang, Daniel P. Heruth, Dmitry N. Grigoryev, Shui Qing Ye.
Institutions: Children's Mercy Hospital and Clinics, School of Medicine, University of Missouri-Kansas City.
The characterization of gene expression in cells via measurement of mRNA levels is a useful tool in determining how the transcriptional machinery of the cell is affected by external signals (e.g. drug treatment), or how cells differ between a healthy state and a diseased state. With the advent and continuous refinement of next-generation DNA sequencing technology, RNA-sequencing (RNA-seq) has become an increasingly popular method of transcriptome analysis to catalog all species of transcripts, to determine the transcriptional structure of all expressed genes, and to quantify the changing expression levels of the total set of transcripts in a given cell, tissue or organism1,2. RNA-seq is gradually replacing DNA microarrays as a preferred method for transcriptome analysis because it has the advantages of profiling a complete transcriptome, providing digital-type data (copy number of any transcript) and not relying on any known genomic sequence3. Here, we present a complete and detailed protocol to apply RNA-seq to profile transcriptomes in human pulmonary microvascular endothelial cells with or without thrombin treatment. This protocol is based on our recently published study entitled "RNA-seq Reveals Novel Transcriptome of Genes and Their Isoforms in Human Pulmonary Microvascular Endothelial Cells Treated with Thrombin,"4 in which we successfully performed the first complete transcriptome analysis of human pulmonary microvascular endothelial cells treated with thrombin using RNA-seq. That study yielded unprecedented resources for further experimentation to gain insights into molecular mechanisms underlying thrombin-mediated endothelial dysfunction in the pathogenesis of inflammatory conditions, cancer, diabetes, and coronary heart disease, and provides potential new leads for therapeutic targets in those diseases. The descriptive text of this protocol is divided into four parts. The first part describes the treatment of human pulmonary microvascular endothelial cells with thrombin and RNA isolation, quality analysis and quantification. The second part describes library construction and sequencing. The third part describes the data analysis. The fourth part describes an RT-PCR validation assay. Representative results of several key steps are displayed. Useful tips or precautions to boost success in key steps are provided in the Discussion section. Although this protocol uses human pulmonary microvascular endothelial cells treated with thrombin, it can be generalized to profile transcriptomes in both mammalian and non-mammalian cells and in tissues treated with different stimuli or inhibitors, or to compare transcriptomes in cells or tissues between a healthy state and a disease state.
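The third part, data analysis, typically normalizes raw read counts for transcript length and sequencing depth; one common unit for this (an assumption here, since the protocol's exact pipeline is not quoted) is FPKM. A minimal sketch:

```python
def fpkm(fragment_count, transcript_length_bp, total_mapped_fragments):
    """Fragments per kilobase of transcript per million mapped fragments:
    normalizes a raw count for transcript length and library depth."""
    return fragment_count * 1e9 / (transcript_length_bp * total_mapped_fragments)

# A 2.5 kb transcript with 1,800 fragments in a 40 M-fragment library
print(f"{fpkm(1800, 2500, 40_000_000):.2f} FPKM")  # -> 18.00
```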
Genetics, Issue 72, Molecular Biology, Immunology, Medicine, Genomics, Proteins, RNA-seq, Next Generation DNA Sequencing, Transcriptome, Transcription, Thrombin, Endothelial cells, high-throughput, DNA, genomic DNA, RT-PCR, PCR
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule-associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
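The linearity claim above can be checked per assay by fitting signal against the number of cells plated and inspecting the fit. A minimal sketch with illustrative placeholder numbers (not data from the paper):

```python
import numpy as np

# Hypothetical standard curve: cells plated per well vs. assay signal
cells_plated = np.array([0, 5_000, 10_000, 20_000, 40_000])
signal = np.array([120.0, 2_050.0, 4_180.0, 8_020.0, 16_300.0])

# Linear fit and coefficient of determination
slope, intercept = np.polyfit(cells_plated, signal, 1)
predicted = slope * cells_plated + intercept
ss_res = np.sum((signal - predicted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"slope = {slope:.3f} signal units/cell, R^2 = {r_squared:.4f}")
```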
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Investigating the Three-dimensional Flow Separation Induced by a Model Vocal Fold Polyp
Authors: Kelley C. Stewart, Byron D. Erath, Michael W. Plesniak.
Institutions: The George Washington University, Clarkson University.
The fluid-structure energy exchange process for normal speech has been studied extensively, but it is not well understood for pathological conditions. Polyps and nodules, which are geometric abnormalities that form on the medial surface of the vocal folds, can disrupt vocal fold dynamics and thus can have devastating consequences on a patient's ability to communicate. Our laboratory has reported particle image velocimetry (PIV) measurements, within an investigation of a model polyp located on the medial surface of an in vitro driven vocal fold model, which show that such a geometric abnormality considerably disrupts the glottal jet behavior. This flow field adjustment is a likely reason for the severe degradation of vocal quality in patients with polyps. A more complete understanding of the formation and propagation of vortical structures from a geometric protuberance, such as a vocal fold polyp, and the resulting influence on the aerodynamic loadings that drive vocal fold dynamics, is necessary for advancing the treatment of this pathological condition. The present investigation concerns the three-dimensional flow separation induced by a wall-mounted prolate hemispheroid with a 2:1 aspect ratio in cross flow, i.e. a model vocal fold polyp, using an oil-film visualization technique. Unsteady, three-dimensional flow separation and its impact on the wall pressure loading are examined using skin friction line visualization and wall pressure measurements.
Bioengineering, Issue 84, oil-flow visualization, vocal fold polyp, three-dimensional flow separation, aerodynamic pressure loadings
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen so that the buffer's acid dissociation constant (pKa) best matches the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and from the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
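Converting the fluorescence increase into a potential activity requires a fluorophore standard curve read in the same soil slurry (to correct for quenching), plus the incubation time and the dry soil mass represented in each well. A minimal sketch of that unit conversion (the numbers are illustrative assumptions):

```python
def enzyme_activity_nmol_per_g_per_h(net_fluorescence, standard_slope,
                                     incubation_h, dry_soil_g):
    """Potential activity from a fluorometric soil enzyme assay.

    net_fluorescence : sample fluorescence minus soil and substrate controls
    standard_slope   : fluorescence units per nmol fluorophore, from a MUB
                       (or MUC) standard curve read in the same soil slurry
    incubation_h     : assay incubation time (hours)
    dry_soil_g       : grams of dry-weight soil represented in the well
    """
    nmol_released = net_fluorescence / standard_slope
    return nmol_released / (incubation_h * dry_soil_g)

# Hypothetical: 1,080 net units, 120 units/nmol, 3 h, 0.0029 g soil per well
print(f"{enzyme_activity_nmol_per_g_per_h(1080, 120, 3.0, 0.0029):.0f} "
      "nmol g^-1 h^-1")
```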
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Preparation of Artificial Bilayers for Electrophysiology Experiments
Authors: Ruchi Kapoor, Jung H. Kim, Helgi Ingolfson, Olaf Sparre Andersen.
Institutions: Weill Cornell Medical College of Cornell University.
Planar lipid bilayers, also called artificial lipid bilayers, allow you to study ion-conducting channels in a well-defined environment. These bilayers can be used for many different studies, such as the characterization of membrane-active peptides, the reconstitution of ion channels, or investigations of how changes in lipid bilayer properties alter the function of bilayer-spanning channels. Here, we show how to form a planar bilayer and how to isolate small patches from the bilayer; in a second video, we also demonstrate a procedure for using gramicidin channels to determine changes in lipid bilayer elastic properties. We also demonstrate the individual steps needed to prepare the bilayer chamber and the electrodes, and how to test that the bilayer is suitable for single-channel measurements.
Cellular Biology, Issue 20, Springer Protocols, Artificial Bilayers, Bilayer Patch Experiments, Lipid Bilayers, Bilayer Punch Electrodes, Electrophysiology
Concentration Determination of Nucleic Acids and Proteins Using the Micro-volume Bio-spec Nano Spectrophotometer
Authors: Suja Sukumaran.
Institutions: Scientific Instruments.
Nucleic acid quantitation procedures have advanced significantly in the last three decades. More and more, molecular biologists require consistent small-volume analysis of nucleic acid samples for their experiments. The BioSpec-nano provides a potential solution to the problems of inaccurate, non-reproducible results inherent in current DNA quantitation methods, via specialized optics and a sensitive photodiode array (PDA) detector. The BioSpec-nano also has automated functionality such that mounting, measurement, and cleaning are done by the instrument, thereby eliminating tedious, repetitive, and inconsistent placement of the fiber optic element and manual cleaning. In this study, data are presented on the quantification of DNA and protein, as well as on measurement reproducibility and accuracy. Automated sample contact and rapid scanning allow measurement in three seconds, resulting in excellent throughput. Data analysis is carried out using the built-in features of the software. The formula used for calculating DNA concentration is:

Sample Concentration = DF × (OD260 − OD320) × NACF (1)

where DF is the sample dilution factor and NACF is the nucleic acid concentration factor. The NACF is set in accordance with the analyte selected1. Protein concentration results can be expressed as μg/mL or as mol/L by entering the e280 and molecular weight values, respectively. When residue values for Trp, Tyr, and cysteine (S-S bond) are entered in the e280Calc tab, the extinction coefficient is calculated as e280 = 5500 × (Trp residues) + 1490 × (Tyr residues) + 125 × (cysteine S-S bonds). The e280 value is then used by the software for the concentration calculation. In addition to concentration determination of nucleic acids and protein, the BioSpec-nano can be used as an ultra-micro-volume spectrophotometer for many other analytes, or as a standard spectrophotometer using 5 mm pathlength cells.
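Since the abstract states both formulas explicitly, they are easy to reproduce. The sketch below implements equation (1) and the e280 calculation as written; the NACF defaults and demo values are illustrative assumptions.

```python
def dna_conc_ng_per_ul(od260, od320, dilution_factor=1.0, nacf=50.0):
    """Nucleic acid concentration per equation (1):
    concentration = DF * (OD260 - OD320) * NACF.

    nacf: nucleic acid concentration factor, set per the selected analyte
    (commonly 50 for dsDNA, 40 for RNA, 33 for ssDNA in ng/uL per A260).
    """
    return dilution_factor * (od260 - od320) * nacf

def e280(trp_residues, tyr_residues, cystine_bonds):
    """Molar extinction coefficient at 280 nm from residue counts, as in
    the e280Calc tab: 5500*Trp + 1490*Tyr + 125*(cysteine S-S bonds)."""
    return 5500 * trp_residues + 1490 * tyr_residues + 125 * cystine_bonds

print(dna_conc_ng_per_ul(0.52, 0.02))       # -> 25.0 ng/uL dsDNA
print(e280(trp_residues=2, tyr_residues=8,
           cystine_bonds=4))                # -> 23420 M^-1 cm^-1
```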
Molecular Biology, Issue 48, Nucleic acid quantitation, protein quantitation, micro-volume analysis, label quantitation
Tomato Analyzer: A Useful Software Application to Collect Accurate and Detailed Morphological and Colorimetric Data from Two-dimensional Objects
Authors: Gustavo R. Rodríguez, Jennifer B. Moyseenko, Matthew D. Robbins, Nancy Huarachi Morejón, David M. Francis, Esther van der Knaap.
Institutions: The Ohio State University.
Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner1,2. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, which was designed to collect color measurements from scanned images and allows scanning devices to be calibrated using color standards3. TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or for individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or output that displays the attribute values for each object in the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape quantitative trait loci (QTL), as well as for performing in-depth analyses of the effect of key fruit shape genes on plant morphology. TA can also be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species, as well as other plant organs such as leaves and seeds, can be evaluated with TA.
Plant Biology, Issue 37, morphology, color, image processing, quantitative trait loci, software

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithm displays the most relevant videos available, which can sometimes result in matched videos with only a slight relation.