JoVE Visualize

PubMed Article
Clustering consumers based on trust, confidence and giving behaviour: data-driven model building for charitable involvement in the Australian not-for-profit sector.
PLoS ONE
PUBLISHED: 04-08-2015
Organisations in the not-for-profit and charity sector face increasing competition to win time, money and effort from a common donor base. Consequently, these organisations need to be more proactive than ever. The increased level of communication between individuals and organisations today heightens the need to investigate the drivers of charitable giving and to understand the various consumer groups, or donor segments, within a population. It is contended that 'trust' is the cornerstone of the not-for-profit sector's survival, making it an unavoidable topic for research in this context. It has become imperative for charities and not-for-profit organisations to adopt the for-profit sector's research, marketing and targeting strategies. This study provides the not-for-profit sector with an easily interpretable segmentation method based on a novel unsupervised clustering technique (MST-kNN) followed by a feature saliency method (the CM1 score). A sample of 1,562 respondents from a survey conducted by the Australian Charities and Not-for-profits Commission is analysed to reveal donor segments. Each cluster's most salient features are identified using the CM1 score. Furthermore, symbolic regression modelling is employed to find cluster-specific models that predict 'low' or 'high' involvement. The MST-kNN method found seven clusters. Based on their salient features they were labelled: the 'non-institutionalist charities supporters', the 'resource allocation critics', the 'information-seeking financial sceptics', the 'non-questioning charity supporters', the 'non-trusting sceptics', the 'charity management believers' and the 'institutionalist charity believers'. Each cluster exhibits its own characteristics as well as different drivers of 'involvement'. The method in this study provides the not-for-profit sector with a guideline for clustering, segmenting, understanding and potentially better targeting their donor base. If charities and not-for-profit organisations adopt these strategies, they will be more successful in today's competitive environment.
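The abstract does not spell out the MST-kNN procedure itself. As a rough illustration of the idea, the Python sketch below (the choice of k and all names are our assumptions, not the authors' implementation) builds a minimum spanning tree over pairwise distances, keeps only edges whose endpoints are mutual k-nearest neighbours, and reads off the connected components as clusters; feature saliency scoring such as CM1 and the symbolic regression step would then operate on the resulting cluster labels.

```python
# Hypothetical sketch of MST-kNN clustering: keep only minimum-spanning-tree
# edges whose endpoints are mutual k-nearest neighbours, then treat connected
# components as clusters. The parameter k = 5 is illustrative.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import squareform, pdist

def mst_knn_clusters(X, k=5):
    D = squareform(pdist(X))                      # pairwise distance matrix
    mst = minimum_spanning_tree(D).toarray()      # MST as a dense edge matrix
    nn = np.argsort(D, axis=1)[:, 1:k + 1]        # k nearest neighbours of each point
    keep = np.zeros_like(mst, dtype=bool)
    for i, j in zip(*np.nonzero(mst)):
        # retain the edge only if i and j are mutual k-nearest neighbours
        if j in nn[i] and i in nn[j]:
            keep[i, j] = True
    n_clusters, labels = connected_components(keep, directed=False)
    return n_clusters, labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(6, 1, (30, 4))])
n, labels = mst_knn_clusters(X)
print(n, "clusters; sizes:", np.bincount(labels))
```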
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Published: 11-03-2011
ABSTRACT
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which estimates how accurate the predictions are in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be collected by users from experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was ranked as the best program for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
25 Related JoVE Articles!
Heterotopic Heart Transplantation in Mice
Authors: Fengchun Liu, Sang Mo Kang.
Institutions: University of California, San Francisco - UCSF.
The mouse heterotopic heart transplantation model has been used widely since it was introduced by Drs. Corry and Russell in 1973. It is particularly valuable for studying rejection and the immune response now that newer transgenic and gene-knockout mice are available and a large number of immunologic reagents have been developed. The heart transplant model is less stringent than the skin transplant models, although technically more challenging. We have developed a modified technique and have completed over 1,000 successful cases of heterotopic heart transplantation in mice. When making the anastomosis of the ascending aorta and abdominal aorta, two stay sutures are placed at the proximal and distal apexes of the recipient's abdominal aorta with the donor's ascending aorta; 11-0 suture is then used for continuous anastomosis on both sides of the aorta. The stay sutures make the anastomosis easier, and 11-0 is an ideal suture size to avoid bleeding and thrombosis. When making the anastomosis of the pulmonary artery and inferior vena cava, two stay sutures are made at the proximal and distal apexes of the recipient's inferior vena cava with the donor's pulmonary artery. After one knot with the proximal apex stay suture, the left wall of the inferior vena cava and the donor's pulmonary artery are closed with continuous sutures from the inside of the inferior vena cava; the right wall of the inferior vena cava and the donor's pulmonary artery are then closed with continuous sutures outside the inferior vena cava, using 10-0 sutures. This method is easier to perform because the anastomosis is made on only one side of the inferior vena cava, and 10-0 is the right suture size to avoid bleeding and thrombosis. In this article, we provide details of the technique to supplement the video.
Developmental Biology, Issue 6, Microsurgical Techniques, Heart Transplant, Allograft Rejection Model
A Modified Heterotopic Swine Hind Limb Transplant Model for Translational Vascularized Composite Allotransplantation (VCA) Research
Authors: Zuhaib Ibrahim, Damon S. Cooney, Jaimie T. Shores, Justin M. Sacks, Eric G. Wimmers, Steven C. Bonawitz, Chad Gordon, Dawn Ruben, Stefan Schneeberger, W. P. Andrew Lee, Gerald Brandacher.
Institutions: Johns Hopkins University School of Medicine.
Vascularized Composite Allotransplantation (VCA), such as hand and face transplants, represents a viable treatment option for complex musculoskeletal trauma and devastating tissue loss. Despite favorable and highly encouraging early and intermediate functional outcomes, rejection of the highly immunogenic skin component of a VCA and the potential adverse effects of chronic multi-drug immunosuppression continue to hamper widespread clinical application of VCA. Therefore, research in this novel field needs to focus on translational studies of the unique immunologic features of VCA and on developing novel strategies for immunomodulation and tolerance induction following VCA without the need for long-term immunosuppression. This article describes a reliable and reproducible translational large-animal model of VCA comprising an osteomyocutaneous flap in MHC-defined swine heterotopic hind limb allotransplantation. Briefly, a well-vascularized skin paddle is identified in the anteromedial thigh region using near-infrared laser angiography. The underlying muscles, knee joint, distal femur, and proximal tibia are harvested on a femoral vascular pedicle. This allograft can be considered both a VCA and a vascularized bone marrow transplant, with its unique immune-privileged features. The graft is transplanted to a subcutaneous abdominal pocket in the recipient animal with the skin component exteriorized to the dorsolateral region for immune monitoring. Three surgical teams work simultaneously in a well-coordinated manner to reduce anesthesia and ischemia times, thereby improving the efficiency of this model and reducing potential confounders in experimental protocols. This model serves as the groundwork for future therapeutic strategies aimed at reducing and potentially eliminating the need for chronic multi-drug immunosuppression in VCA.
Medicine, Issue 80, Upper Extremity, Swine, Microsurgery, Tissue Transplantation, Transplantation Immunology, Surgical Procedures, Operative, Vascularized Composite Allografts, reconstructive transplantation, translational research, swine, hind limb allotransplantation, bone marrow, osteomyocutaneous, microvascular anastomosis, immunomodulation
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way from normal, by building up an image molecule by molecule, there are some significant challenges for users in trying to optimize their image acquisition. To aid this process and gain more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
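As a hedged illustration of the kind of quality metrics such software reports, the following sketch estimates a Nyquist-limited resolution from localization density and combines it with the median localization precision. The column layout and numbers are invented; this is not rainSTORM's actual output format or API.

```python
# Illustrative quality metrics for a STORM localization table (assumed columns:
# x, y in nm plus a per-molecule localization precision in nm).
import numpy as np

def storm_quality(locs_xy_nm, precision_nm):
    n = len(locs_xy_nm)
    area = np.ptp(locs_xy_nm[:, 0]) * np.ptp(locs_xy_nm[:, 1])  # bounding-box area, nm^2
    density = n / area                    # localizations per nm^2
    nyquist = 2.0 / np.sqrt(density)      # Nyquist-limited resolution, nm (2D)
    # overall resolution is limited by both precision and sampling density
    effective = np.hypot(np.median(precision_nm), nyquist)
    return dict(n=n, nyquist_nm=nyquist,
                median_precision_nm=float(np.median(precision_nm)),
                effective_resolution_nm=float(effective))

rng = np.random.default_rng(1)
locs = rng.uniform(0, 5000, size=(20000, 2))        # simulated 5 x 5 um field
prec = rng.normal(12, 3, size=20000).clip(5, 30)    # ~12 nm localization precision
print(storm_quality(locs, prec))
```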
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
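As a minimal sketch of the DoE idea, assuming hypothetical factors and a simulated response (none of this is the authors' actual design), the following code builds a two-level full factorial design and fits a main-effects-plus-interactions model by least squares. In the study itself, dedicated DoE software guided the optimal experiment combinations and step-wise design augmentation.

```python
# Two-level full factorial in three hypothetical factors, with a linear model
# containing main effects and all two-way interactions fitted by least squares.
import itertools
import numpy as np

factors = ["temperature", "time", "promoter"]                  # illustrative names
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # coded factor levels

rng = np.random.default_rng(2)
# simulated response: temperature and a temperature x time interaction matter
y = 10 + 3 * design[:, 0] + 1.5 * design[:, 0] * design[:, 1] + rng.normal(0, 0.3, 8)

# model matrix: intercept, main effects, two-way interactions
cols = [np.ones(len(design))] + [design[:, i] for i in range(3)] + \
       [design[:, i] * design[:, j] for i, j in itertools.combinations(range(3), 2)]
X = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
names = ["intercept"] + factors + [f"{a}x{b}" for a, b in itertools.combinations(factors, 2)]
for name, c in zip(names, coef):
    print(f"{name:16s} {c:+.2f}")       # large coefficients flag influential factors
```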
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track the transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the ViroSeq genotyping method. Limitations of the method described here include the fact that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Identification of Protein Interaction Partners in Mammalian Cells Using SILAC-immunoprecipitation Quantitative Proteomics
Authors: Edward Emmott, Ian Goodfellow.
Institutions: University of Cambridge.
Quantitative proteomics combined with immuno-affinity purification, known as SILAC immunoprecipitation, represents a powerful means for the discovery of novel protein:protein interactions. By allowing the accurate relative quantification of protein abundance in both control and test samples, true interactions may be easily distinguished from experimental contaminants. Low-affinity interactions can be preserved through the use of less stringent buffer conditions and remain readily identifiable. This protocol discusses the labeling of tissue culture cells with stable isotope labeled amino acids, transfection and immunoprecipitation of an affinity tagged protein of interest, followed by preparation for submission to a mass spectrometry facility. The protocol then discusses how to analyze and interpret the data returned from the mass spectrometer in order to identify cellular partners interacting with a protein of interest. As an example, this technique is applied to identify proteins binding to the eukaryotic translation initiation factors eIF4AI and eIF4AII.
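The discriminating step can be illustrated with a toy calculation: true partners are enriched in the bait (heavy) pull-down, while contaminants show heavy/light ratios near one. Protein names, intensities, and the two-fold cutoff below are illustrative assumptions, not values from the protocol.

```python
# Toy SILAC-IP readout: log2(heavy/light) separates interactors from background.
import math

intensities = {                      # protein: (heavy, light) summed intensities
    "bait-GFP":  (9.5e8, 4.0e6),
    "partner_A": (2.1e7, 3.0e6),
    "keratin":   (5.0e7, 5.2e7),     # classic contaminant: ratio ~1
    "tubulin":   (8.0e7, 7.5e7),
}

for protein, (heavy, light) in intensities.items():
    log2_ratio = math.log2(heavy / light)
    call = "interactor" if log2_ratio > 1.0 else "background"   # two-fold cutoff
    print(f"{protein:10s} log2(H/L) = {log2_ratio:+.2f} -> {call}")
```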
Biochemistry, Issue 89, mass spectrometry, tissue culture techniques, isotope labeling, SILAC, Stable Isotope Labeling of Amino Acids in Cell Culture, proteomics, Interactomics, immunoprecipitation, pulldown, eIF4A, GFP, nanotrap, orbitrap
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented here were obtained by three different imaging approaches: electron tomography of stained, resin-embedded samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
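As a toy illustration of approach (4), the sketch below thresholds a synthetic volume, labels connected components in 3D, and discards components below a size cutoff. Real EM volumes would require denoising and far more careful parameter choices; all thresholds here are invented.

```python
# Minimal automated segmentation: global threshold -> 3D connected components
# -> size filter to drop small noise blobs.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
vol = rng.normal(0, 1, (64, 64, 64))      # synthetic noisy volume
vol[20:40, 20:40, 20:40] += 4.0           # one bright synthetic "organelle"
vol[5:12, 50:60, 5:12] += 4.0             # a second, smaller feature

mask = vol > 2.0                          # global intensity threshold
labels, n = ndimage.label(mask)           # 3D connected-component labelling
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))   # voxels per component
kept = [i + 1 for i, s in enumerate(sizes) if s > 100]     # size filter
print(f"{n} raw components, {len(kept)} kept; voxel counts:",
      [int(sizes[i - 1]) for i in kept])
```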
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
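This pipeline (individual or age-specific head model, forward solution, minimum-norm inverse) maps naturally onto open tools such as MNE-Python. The sketch below shows that mapping schematically; the file names are placeholders and this is not the London Baby Lab's actual analysis script.

```python
# Schematic minimum-norm source reconstruction with MNE-Python.
import mne
from mne.minimum_norm import make_inverse_operator, apply_inverse

# Placeholder file names: substitute your own recordings and forward model.
raw = mne.io.read_raw_fif("child_eeg_raw.fif", preload=True)
raw.filter(1.0, 40.0)                                  # band-pass for ERP analysis
raw.set_eeg_reference("average", projection=True)      # average reference for the inverse
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, event_id={"stimulus": 1},
                    tmin=-0.2, tmax=0.8, baseline=(None, 0))
evoked = epochs.average()

# Forward solution built beforehand from an individual or age-appropriate MRI.
fwd = mne.read_forward_solution("child-fwd.fif")
noise_cov = mne.compute_covariance(epochs, tmax=0.0)   # pre-stimulus noise covariance
inv = make_inverse_operator(evoked.info, fwd, noise_cov)
stc = apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="MNE")  # minimum-norm estimate
print(stc)
```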
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Enhanced Reduced Representation Bisulfite Sequencing for Assessment of DNA Methylation at Base Pair Resolution
Authors: Francine E. Garrett-Bakelman, Caroline K. Sheridan, Thadeous J. Kacmarczyk, Jennifer Ishii, Doron Betel, Alicia Alonso, Christopher E. Mason, Maria E. Figueroa, Ari M. Melnick.
Institutions: Weill Cornell Medical College, University of Michigan.
DNA methylation pattern mapping is heavily studied in normal and diseased tissues. A variety of methods have been established to interrogate the cytosine methylation patterns in cells. Reduced representation bisulfite sequencing was developed to detect quantitative, base pair resolution cytosine methylation patterns at GC-rich genomic loci. This is accomplished by combining the use of a restriction enzyme with subsequent bisulfite conversion. Enhanced Reduced Representation Bisulfite Sequencing (ERRBS) increases the biologically relevant genomic loci covered and has been used to profile cytosine methylation in DNA from human, mouse and other organisms. ERRBS initiates with restriction enzyme digestion of DNA to generate low molecular weight fragments for use in library preparation. These fragments are subjected to standard library construction for next generation sequencing. Bisulfite conversion of unmethylated cytosines prior to the final amplification step allows for quantitative base resolution of cytosine methylation levels in covered genomic loci. The protocol can be completed within four days. Despite low complexity in the first three bases sequenced, ERRBS libraries yield high quality data when using a designated sequencing control lane. Mapping and bioinformatics analysis is then performed, yielding data that can be easily integrated with a variety of genome-wide platforms. ERRBS can utilize small input material quantities, making it feasible to process human clinical samples and applicable to a range of research applications. The video produced demonstrates critical steps of the ERRBS protocol.
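Because bisulfite conversion leaves methylated cytosines as C while unmethylated cytosines read as T, the per-CpG methylation level is simply the fraction of C reads at each covered position. The toy counts below are invented, purely to illustrate the quantification.

```python
# Per-CpG methylation from bisulfite read counts: methylation = C / (C + T).
cpg_counts = {                     # position: (methylated C reads, converted T reads)
    ("chr1", 10468): (27, 3),
    ("chr1", 10471): (5, 25),
    ("chr1", 10484): (14, 16),
}

for (chrom, pos), (c_reads, t_reads) in cpg_counts.items():
    coverage = c_reads + t_reads
    methylation = 100.0 * c_reads / coverage
    print(f"{chrom}:{pos}  coverage={coverage:2d}  methylation={methylation:5.1f}%")
```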
Genetics, Issue 96, Epigenetics, bisulfite sequencing, DNA methylation, genomic DNA, 5-methylcytosine, high-throughput
Modulating Cognition Using Transcranial Direct Current Stimulation of the Cerebellum
Authors: Paul A. Pope.
Institutions: University of Birmingham.
Numerous studies have emerged recently that demonstrate the possibility of modulating, and in some cases enhancing, cognitive processes by exciting brain regions involved in working memory and attention using transcranial electrical brain stimulation. Some researchers now believe the cerebellum supports cognition, possibly via a remote neuromodulatory effect on the prefrontal cortex. This paper describes a procedure for investigating a role for the cerebellum in cognition using transcranial direct current stimulation (tDCS) and a selection of information-processing tasks of varying difficulty, which have previously been shown to involve working memory, attention and cerebellar functioning. One task is called the Paced Auditory Serial Addition Task (PASAT) and the other is a novel variant of this task called the Paced Auditory Serial Subtraction Task (PASST). A verb generation task and its two controls (noun and verb reading) were also investigated. All five tasks were performed by three separate groups of participants, before and after the modulation of cortico-cerebellar connectivity using anodal, cathodal or sham tDCS over the right cerebellar cortex. The procedure demonstrates how performance (accuracy, verbal response latency and variability) could be selectively improved after cathodal stimulation, but only during tasks that the participants rated as difficult, and not easy. Performance was unchanged by anodal or sham stimulation. These findings demonstrate a role for the cerebellum in cognition, whereby activity in the left prefrontal cortex is likely disinhibited by cathodal tDCS over the right cerebellar cortex. Transcranial brain stimulation is growing in popularity in various labs and clinics. However, the after-effects of tDCS are inconsistent between individuals and not always polarity-specific, and may even be task- or load-specific, all of which require further study. Future efforts might also be guided towards neuro-enhancement in cerebellar patients presenting with cognitive impairment once a better understanding of brain stimulation mechanisms has emerged.
Behavior, Issue 96, Cognition, working memory, tDCS, cerebellum, brain stimulation, neuro-modulation, neuro-enhancement
A Comparative Approach to Characterize the Landscape of Host-Pathogen Protein-Protein Interactions
Authors: Mandy Muller, Patricia Cassonnet, Michel Favre, Yves Jacob, Caroline Demeret.
Institutions: Institut Pasteur, Université Sorbonne Paris Cité, Dana Farber Cancer Institute.
Significant efforts have been made to generate large-scale comprehensive protein-protein interaction network maps. This is instrumental in understanding pathogen-host relationships and was essentially performed by genetic screenings in yeast two-hybrid systems. The recent improvement of protein-protein interaction detection by a Gaussia luciferase-based fragment complementation assay now offers the opportunity to develop the integrative comparative interactomic approaches necessary to rigorously compare interaction profiles of proteins from different pathogen strain variants against a common set of cellular factors. This paper specifically focuses on the utility of combining two orthogonal methods to generate protein-protein interaction datasets: yeast two-hybrid (Y2H) and a new assay, the high-throughput Gaussia princeps protein complementation assay (HT-GPCA), performed in mammalian cells. A large-scale identification of the cellular partners of a pathogen protein is performed by mating-based yeast two-hybrid screenings of cDNA libraries using multiple pathogen strain variants. A subset of interacting partners, selected on a high-confidence statistical scoring, is further validated in mammalian cells for pair-wise interactions with the whole set of pathogen variant proteins using HT-GPCA. This combination of two complementary methods improves the robustness of the interaction dataset and allows a stringent comparative interaction analysis. Such comparative interactomics constitutes a reliable and powerful strategy to decipher any pathogen-host interplay.
Immunology, Issue 77, Genetics, Microbiology, Biochemistry, Molecular Biology, Cellular Biology, Biomedical Engineering, Infection, Cancer Biology, Virology, Medicine, Host-Pathogen Interactions, Protein-protein interaction, High-throughput screening, Luminescence, Yeast two-hybrid, HT-GPCA, Network, protein, yeast, cell, culture
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Authors: Feng Qi, Fei Du.
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapped activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposures to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and collaboration from the participants greatly affects the quality of data4. Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, are not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, that could be useful in public health studies such as infectious disease transmission modeling. The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5 involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius. Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping6 and density volume rendering7. We also include a couple of other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical as well as visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
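As a bare-bones illustration of two of these steps, the sketch below computes an activity radius and a kernel density surface from a simulated GPS track. It stands in for, and is much simpler than, the ArcGIS-based tools the paper presents; all coordinates are invented.

```python
# Activity-space radius (RMS distance from the track centroid) and a 2D kernel
# density estimate for hot-spot mapping, on a toy two-anchor GPS track.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
home = rng.normal([0, 0], 50, (300, 2))        # metres around one anchor point
work = rng.normal([800, 300], 80, (200, 2))    # second anchor point
track = np.vstack([home, work])

centroid = track.mean(axis=0)
activity_radius = np.sqrt(((track - centroid) ** 2).sum(axis=1).mean())
print(f"activity radius ~ {activity_radius:.0f} m")

kde = gaussian_kde(track.T)                    # kernel density over the plane
grid_x, grid_y = np.mgrid[-200:1000:60j, -200:600:60j]
density = kde(np.vstack([grid_x.ravel(), grid_y.ravel()])).reshape(grid_x.shape)
print("density peak at grid cell:", np.unravel_index(density.argmax(), density.shape))
```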
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
RNA-seq Analysis of Transcriptomes in Thrombin-treated and Control Human Pulmonary Microvascular Endothelial Cells
Authors: Dilyara Cheranova, Margaret Gibson, Suman Chaudhary, Li Qin Zhang, Daniel P. Heruth, Dmitry N. Grigoryev, Shui Qing Ye.
Institutions: Children's Mercy Hospital and Clinics, School of Medicine, University of Missouri-Kansas City.
The characterization of gene expression in cells via measurement of mRNA levels is a useful tool in determining how the transcriptional machinery of the cell is affected by external signals (e.g., drug treatment), or how cells differ between a healthy state and a diseased state. With the advent and continuous refinement of next-generation DNA sequencing technology, RNA-sequencing (RNA-seq) has become an increasingly popular method of transcriptome analysis to catalog all species of transcripts, to determine the transcriptional structure of all expressed genes and to quantify the changing expression levels of the total set of transcripts in a given cell, tissue or organism1,2. RNA-seq is gradually replacing DNA microarrays as a preferred method for transcriptome analysis because it has the advantages of profiling a complete transcriptome, providing digital-type data (the copy number of any transcript) and not relying on any known genomic sequence3. Here, we present a complete and detailed protocol to apply RNA-seq to profile transcriptomes in human pulmonary microvascular endothelial cells with or without thrombin treatment. This protocol is based on our recently published study entitled "RNA-seq Reveals Novel Transcriptome of Genes and Their Isoforms in Human Pulmonary Microvascular Endothelial Cells Treated with Thrombin,"4 in which we successfully performed the first complete transcriptome analysis of human pulmonary microvascular endothelial cells treated with thrombin using RNA-seq. The study yielded unprecedented resources for further experimentation to gain insights into the molecular mechanisms underlying thrombin-mediated endothelial dysfunction in the pathogenesis of inflammatory conditions, cancer, diabetes, and coronary heart disease, and provides potential new leads for therapeutic targets for those diseases. The descriptive text of this protocol is divided into four parts. The first part describes the treatment of human pulmonary microvascular endothelial cells with thrombin and RNA isolation, quality analysis and quantification. The second part describes library construction and sequencing. The third part describes the data analysis. The fourth part describes an RT-PCR validation assay. Representative results of several key steps are displayed. Useful tips or precautions to boost success in key steps are provided in the Discussion section. Although this protocol uses human pulmonary microvascular endothelial cells treated with thrombin, it can be generalized to profile transcriptomes in both mammalian and non-mammalian cells and in tissues treated with different stimuli or inhibitors, or to compare transcriptomes in cells or tissues between a healthy state and a disease state.
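The study's statistics were produced with dedicated RNA-seq tools; purely as an illustration of the underlying idea, the sketch below normalizes raw counts by library size (counts per million) and reports log2 fold changes for treated versus control samples. Gene names and counts are invented.

```python
# Library-size normalization and log2 fold change for a toy count table.
import math

counts = {               # gene: (control reads, thrombin-treated reads)
    "F2R":   (1500, 5200),   # thrombin receptor PAR-1: expected to respond
    "IL8":   (300, 2100),
    "GAPDH": (9000, 9400),   # housekeeping gene: expected to be stable
}
lib_control = sum(c for c, _ in counts.values())
lib_treated = sum(t for _, t in counts.values())

for gene, (ctrl, trt) in counts.items():
    cpm_ctrl = 1e6 * ctrl / lib_control       # counts per million, control
    cpm_trt = 1e6 * trt / lib_treated         # counts per million, treated
    print(f"{gene:6s} log2 fold change = {math.log2(cpm_trt / cpm_ctrl):+.2f}")
```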
Genetics, Issue 72, Molecular Biology, Immunology, Medicine, Genomics, Proteins, RNA-seq, Next Generation DNA Sequencing, Transcriptome, Transcription, Thrombin, Endothelial cells, high-throughput, DNA, genomic DNA, RT-PCR, PCR
Small Bowel Transplantation in Mice
Authors: Fengchun Liu, Sang-Mo Kang.
Institutions: University of California, San Francisco - UCSF.
Since 1990, the development of tacrolimus-based immunosuppression, improved surgical techniques, an increased array of potent immunosuppressive medications, infection prophylaxis, and suitable patient selection have helped improve actuarial graft and patient survival rates for all types of intestine transplantation. Patients with irreversible intestinal failure and complications of parenteral nutrition should now be routinely considered for small intestine transplantation. However, survival rates for small intestine transplantation have been slow to improve and remain unsatisfactory compared with renal, liver, heart and lung transplantation. Further progress may depend on a better understanding of the immunology and physiology of the graft, and can be greatly facilitated by animal models. Wider use of the mouse small bowel transplantation model is needed in the study of the immunology and physiology of the transplanted gut, as well as of efficient methods for diagnosing early rejection. However, use of this model has been limited because the techniques involved are extremely challenging. We have developed a modified technique. When making the anastomosis of the portal vein and inferior vena cava, two stay sutures are made at the proximal and distal apexes of the recipient's inferior vena cava with the donor's portal vein. After one knot with the proximal apex stay suture, the left wall of the inferior vena cava and the donor's portal vein are closed with continuous sutures from the inside of the inferior vena cava; the right wall of the inferior vena cava and the donor's portal vein are then closed with continuous sutures outside the inferior vena cava, using 10-0 sutures. This method is easier to perform because the anastomosis is made on only one side of the inferior vena cava, and 10-0 is the right suture size to avoid bleeding and thrombosis. In this article, we provide details of the technique to supplement the video.
Issue 7, Immunology, Transplantation, Transplant Rejection, Small Bowel
Murine Skin Transplantation
Authors: Kym R. Garrod, Michael D. Cahalan.
Institutions: University of California, Irvine (UCI).
As one of the most stringent and least technically challenging models, skin transplantation is a standard method to assay host T cell responses to MHC-disparate donor antigens. The aim of this video-article is to provide the viewer with a step-by-step visual demonstration of skin transplantation using the mouse model. The protocol is divided into 5 main components: 1) harvesting donor skin; 2) preparing recipient for transplant; 3) skin transplant; 4) bandage removal and monitoring graft rejection; 5) helpful hints. Once proficient, the procedure itself should take <10 min to perform.
Immunology, Issue 11, allograft rejection, skin transplant, mouse
Assembly, Loading, and Alignment of an Analytical Ultracentrifuge Sample Cell
Authors: Andrea Balbo, Huaying Zhao, Patrick H. Brown, Peter Schuck.
Institutions: Dynamics of Macromolecular Assembly, Laboratory of Bioengineering and Physical Science.
The analytical ultracentrifuge (AUC) is a powerful biophysical tool that allows us to record macromolecular sedimentation profiles during high speed centrifugation. When properly planned and executed, an AUC sedimentation velocity or sedimentation equilibrium experiment can reveal a great deal about a protein in regards to size and shape, sample purity, sedimentation coefficient, oligomerization states and protein-protein interactions. This technique, however, requires a rigorous level of technical attention. Sample cells hold a sectored centerpiece sandwiched between two window assemblies. They are sealed with a torque of around 120-140 in-lb. Reference buffer and sample are loaded into the centerpiece sectors and then, after sealing, the cells are precisely aligned in a titanium rotor so that the optical detection systems scan both sample and reference buffer in the same radial path, midline through each centerpiece sector, while rotating at speeds of up to 60,000 rpm and under very high vacuum. Not only is proper sample cell assembly critical; sample cell components are very expensive and must be properly cared for to ensure they are in optimum working condition in order to avoid leaks and breakage during experiments. Handle windows carefully, for even the slightest crack or scratch can lead to breakage in the centrifuge. The contact between centerpiece and windows must be as tight as possible; i.e., no Newton's rings should be visible after torque pressure is applied. Dust, lint, scratches and oils on either the windows or the centerpiece all compromise this contact and can very easily lead to leaking of solutions from one sector to another, or leaking out of the centerpiece altogether. Not only are precious samples lost; leaking of solutions during an experiment will cause an imbalance of pressure in the cell that often leads to broken windows and centerpieces. In addition, plug gaskets and housing plugs must be securely in place to avoid solutions being pulled out of the centerpiece sector through the loading holes by the high vacuum in the centrifuge chamber. Window liners and gaskets must be free of breaks and cracks that could cause movement resulting in broken windows. This video will demonstrate our procedures for sample cell assembly, torque, loading and rotor alignment to help minimize component damage, solution leaking and breakage during the perfect AUC experiment.
Basic Protocols, Issue 33, analytical ultracentrifugation, sedimentation velocity, sedimentation equilibrium, protein characterization, sedimentation coefficient
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
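A covariance-pattern analysis of this kind can be caricatured in a few lines: derive a principal component across subjects' images and correlate each subject's expression of that pattern with a diagnostic variable. The simulation below is ours, with tiny dimensions, and omits the cross-validation and permutation testing a real analysis would need.

```python
# Toy multivariate covariance-pattern analysis via PCA (SVD) on simulated data.
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_voxels = 40, 500
pattern = rng.normal(0, 1, n_voxels)               # a latent covariance pattern
diagnosis = np.repeat([0, 1], n_subjects // 2)     # e.g. control vs patient
data = (np.outer(diagnosis * 2.0 + rng.normal(0, 0.5, n_subjects), pattern)
        + rng.normal(0, 1, (n_subjects, n_voxels)))

data -= data.mean(axis=0)                          # center each voxel
U, S, Vt = np.linalg.svd(data, full_matrices=False)
scores = U[:, 0] * S[0]                            # subject expression of PC1
r = np.corrcoef(scores, diagnosis)[0, 1]
print(f"correlation of PC1 expression with diagnosis: |r| = {abs(r):.2f}")
```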
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
One-step Metabolomics: Carbohydrates, Organic and Amino Acids Quantified in a Single Procedure
Authors: James D. Shoemaker.
Institutions: Saint Louis University School of Medicine.
Every infant born in the US is now screened for up to 42 rare genetic disorders called "inborn errors of metabolism". The screening method is based on tandem mass spectrometry and quantifies acylcarnitines as a screen for organic acidemias, and also measures amino acids. All states also perform enzymatic testing for carbohydrate disorders such as galactosemia. Because the results can be non-specific, follow-up testing of positive results is required using a more definitive method. The present report describes the "urease" method of sample preparation for inborn error screening. Crystalline urease enzyme is used to remove urea from body fluids, which permits most other water-soluble metabolites to be dehydrated and derivatized for gas chromatography in a single procedure. Dehydration by evaporation in a nitrogen stream is facilitated by adding acetonitrile and methylene chloride. Then, trimethylsilylation takes place in the presence of a unique catalyst, triethylammonium trifluoroacetate. Automated injection and chromatography is followed by macro-driven custom quantification of 192 metabolites and semi-quantification of every major component using specialized libraries of mass spectra of TMS-derivatized biological compounds. The analysis may be performed on the widely used Chemstation platform using the macros and libraries available from the author. In our laboratory, over 16,000 patient samples have been analyzed using the method with a diagnostic yield of about 17%; that is, 17% of the sample results reveal findings that should be acted upon by the ordering physician. Included in these are over 180 confirmed inborn errors, of which about 38% could not have been diagnosed using previous methods.
Biochemistry, Issue 40, metabolomics, gas chromatography/mass spectrometry, GC/MS, inborn errors, vitamin deficiency, BNA analyses, carbohydrate, amino acid, organic acid, urease
Single Port Donor Nephrectomy
Authors: David B Leeser, James Wysock, S Elena Gimenez, Sandip Kapur, Joseph Del Pizzo.
Institutions: Weill Cornell Medical College of Cornell University.
In 2007, Rane presented the first single port nephrectomy, for a small non-functioning kidney, at the World Congress of Endourology. Over the next two years the technique was adopted for many other types of nephrectomy, including donor nephrectomy. We present our technique for single port donor nephrectomy using the GelPOINT device. We have successfully performed this surgery in over 100 patients and add this experience to our experience of over 1,000 laparoscopic nephrectomies. With the proper equipment and technique, single port donor nephrectomy can be performed safely and effectively in the majority of live donors. We have found that our operative times and, most importantly, our transplant outcomes have not changed significantly with the adoption of single port donor nephrectomy. We believe that single port donor nephrectomy represents a step forward in the care of living donors.
Medicine, Issue 49, Single Port, Laparoscopic, Donor Nephrectomy, Transplant
Behavioral Determination of Stimulus Pair Discrimination of Auditory Acoustic and Electrical Stimuli Using a Classical Conditioning and Heart-rate Approach
Authors: Simeon J. Morgan, Antonio G. Paolini.
Institutions: La Trobe University.
Acute animal preparations have been used in research prospectively investigating electrode designs and stimulation techniques for integration into neural auditory prostheses, such as auditory brainstem implants1-3 and auditory midbrain implants4,5. While acute experiments can give initial insight into the effectiveness of the implant, testing chronically implanted, awake animals provides the advantage of examining the psychophysical properties of the sensations induced using implanted devices6,7. Several techniques, such as reward-based operant conditioning6-8, conditioned avoidance9-11, or classical fear conditioning12, have been used to provide behavioral confirmation of detection of a relevant stimulus attribute. Selection of a technique involves balancing aspects including time efficiency (often poor in reward-based approaches), the ability to test a plurality of stimulus attributes simultaneously (limited in conditioned avoidance), and measurement reliability over repeated stimuli (a potential constraint when physiological measures are employed). Here, a classical fear conditioning behavioral method is presented which may be used to simultaneously test both detection of a stimulus and discrimination between two stimuli. Heart rate is used as a measure of the fear response, which reduces or eliminates the requirement for time-consuming video coding of freeze behaviour or other such measures (although such measures could be included to provide convergent evidence). Animals were conditioned using these techniques in three 2-hour conditioning sessions, each providing 48 stimulus trials. Subsequent 48-trial testing sessions were then used to test for detection of each stimulus in presented pairs, and to test discrimination between the member stimuli of each pair. This behavioral method is presented in the context of its utilisation in auditory prosthetic research. The implantation of electrocardiogram telemetry devices is shown. Subsequent implantation of brain electrodes into the cochlear nucleus, guided by the monitoring of neural responses to acoustic stimuli, and the fixation of the electrode into place for chronic use are likewise shown.
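The heart-rate readout reduces to comparing beats per minute before and after stimulus onset. The sketch below illustrates that computation on invented R-peak times showing a conditioned deceleration; real ECG telemetry data would first be peak-detected.

```python
# Heart rate from R-R intervals, pre- vs post-stimulus, as a fear-response measure.
import numpy as np

r_peaks = np.array([0.00, 0.20, 0.41, 0.61, 0.82,        # pre-stimulus beats (s)
                    1.02, 1.28, 1.56, 1.85, 2.15, 2.46]) # slowing after stimulus
stimulus_onset = 1.0

def mean_bpm(peaks):
    rr = np.diff(peaks)              # R-R intervals in seconds
    return 60.0 / rr.mean()

pre = mean_bpm(r_peaks[r_peaks <= stimulus_onset])
post = mean_bpm(r_peaks[r_peaks > stimulus_onset])
print(f"pre {pre:.0f} bpm, post {post:.0f} bpm, change {post - pre:+.0f} bpm")
```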
Neuroscience, Issue 64, Physiology, auditory, hearing, brainstem, stimulation, rat, abi
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform a quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure at genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (High frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics. After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2, as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
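The scoring logic can be illustrated independently of the robotics: under a multiplicative null model, the expected double-mutant fitness is the product of the single-mutant fitnesses, and the deviation (often called epsilon) flags aggravating or alleviating interactions. Gene names, fitness values, and the cutoff below are invented.

```python
# Genetic-interaction scoring under a multiplicative null model:
# epsilon = observed double-mutant fitness - expected (product of singles).
single_fitness = {"geneA": 0.90, "geneB": 0.85, "geneC": 0.95}
double_fitness = {
    ("geneA", "geneB"): 0.30,   # much worse than expected -> aggravating
    ("geneA", "geneC"): 0.86,   # about as expected -> no interaction
    ("geneB", "geneC"): 0.95,   # better than expected -> alleviating
}

for (g1, g2), observed in double_fitness.items():
    expected = single_fitness[g1] * single_fitness[g2]
    eps = observed - expected          # sign of epsilon gives interaction type
    kind = ("aggravating" if eps < -0.1 else
            "alleviating" if eps > 0.1 else "none")
    print(f"{g1} x {g2}: observed {observed:.2f}, expected {expected:.2f}, "
          f"epsilon {eps:+.2f} -> {kind}")
```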
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Comprehensive Profiling of Dopamine Regulation in Substantia Nigra and Ventral Tegmental Area
Authors: Michael F. Salvatore, Brandon S. Pruett, Charles Dempsey, Victoria Fields.
Institutions: Louisiana State University Health Sciences Center.
Dopamine is a vigorously studied neurotransmitter in the CNS. Indeed, its involvement in locomotor activity and reward-related behaviour has fostered five decades of inquiry into the molecular deficiencies associated with dopamine regulation. The majority of these inquiries into dopamine regulation in the brain focus upon the molecular basis for its regulation in the terminal field regions of the nigrostriatal and mesoaccumbens pathways: the striatum and nucleus accumbens. Furthermore, such studies have concentrated on analysis of dopamine tissue content with normalization only to wet tissue weight. Investigations of the proteins that regulate dopamine, such as tyrosine hydroxylase (TH) protein, TH phosphorylation, the dopamine transporter (DAT), and the vesicular monoamine transporter 2 (VMAT2), often do not include analysis of dopamine tissue content in the same sample. The ability to analyze both dopamine tissue content and its regulating proteins (including post-translational modifications) not only gives inherent power to interpreting the relationship of dopamine with the protein level and function of TH, DAT, or VMAT2, but also extends sample economy. This translates into less cost, and yet produces insights into the molecular regulation of dopamine in virtually any paradigm of the investigators' choice. We focus the analyses on the midbrain. Although the SN and VTA are typically neglected in most studies of dopamine regulation, these nuclei are easily dissected with practice. A comprehensive readout of dopamine tissue content and TH, DAT, or VMAT2 can be conducted. There is a burgeoning literature on the impact of dopamine function in the SN and VTA on behavior, and the impingements of exogenous substances or disease processes therein1-5. Furthermore, compounds such as growth factors have a profound effect on dopamine and dopamine-regulating proteins, to a comparatively greater extent in the SN or VTA6-8. Therefore, this methodology is presented for reference to laboratories that want to extend their inquiries into how specific treatments modulate behaviour and dopamine regulation. Here, a multi-step method is presented for the analyses of dopamine tissue content, the protein levels of TH, DAT, or VMAT2, and TH phosphorylation in the substantia nigra and VTA of rodent midbrain. The analysis of TH phosphorylation can yield significant insights into not only how TH activity is regulated, but also the signaling cascades affected in the somatodendritic nuclei in a given paradigm. We will illustrate the dissection technique to segregate these two nuclei and the sample processing of dissected tissue that produces a profile revealing molecular mechanisms of dopamine regulation in vivo, specific for each nucleus (Figure 1).
Neuroscience, Issue 66, Medicine, Physiology, midbrain, substantia nigra, ventral tegmental area, tyrosine hydroxylase, phosphorylation, nigrostriatal, mesoaccumbens, dopamine transporter
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone
Authors: Sandra Zehentmeier, Zoltan Cseresnyes, Juan Escribano Navarro, Raluca A. Niesner, Anja E. Hauser.
Institutions: German Rheumatism Research Center, a Leibniz Institute, Max-Delbrück Center for Molecular Medicine, Wimasis GmbH, Charité - University of Medicine.
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data.
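The random-positioning test can be sketched as a simple Monte Carlo simulation: compare the observed number of A-B contacts against a null distribution generated by uniformly re-scattering the same numbers of cells over the section. Cell counts, the contact distance, and the field size below are illustrative assumptions, not the tool's actual parameters.

```python
# Monte Carlo test for cell-cell co-localization vs random positioning.
import numpy as np

rng = np.random.default_rng(6)
contact_dist = 10.0                       # max centre-to-centre distance, um
field = 1000.0                            # section side length, um

def n_contacts(a, b):
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # all pairwise distances
    return int((d < contact_dist).sum())

# "observed" data: B cells clustered around a subset of A cells
a_cells = rng.uniform(0, field, (50, 2))
b_cells = a_cells[:30] + rng.normal(0, 5, (30, 2))
observed = n_contacts(a_cells, b_cells)

# null distribution: same cell numbers scattered uniformly at random
null = [n_contacts(rng.uniform(0, field, (50, 2)),
                   rng.uniform(0, field, (30, 2))) for _ in range(1000)]
p = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)
print(f"observed contacts: {observed}, null mean: {np.mean(null):.1f}, p ~ {p:.3f}")
```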
Developmental Biology, Issue 98, Image analysis, neighborhood analysis, bone marrow, stromal cells, bone marrow niches, simulation, bone cryosectioning, bone histology

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.