Using prior information from the medical literature in GWAS of oral cancer identifies novel susceptibility variant on chromosome 4--the AdAPT method.
Genome-wide association studies (GWAS) require large sample sizes to obtain adequate statistical power, but it may be possible to increase the power by incorporating complementary data. In this study we investigated the feasibility of automatically retrieving information from the medical literature and leveraging this information in GWAS.
Authors: Francesco Vallania, Enrique Ramos, Sharon Cresci, Robi D. Mitra, Todd E. Druley.
Published: 06-23-2012
As DNA sequencing technology has markedly advanced in recent years [2], it has become increasingly evident that the amount of genetic variation between any two individuals is greater than previously thought [3]. In contrast, array-based genotyping has failed to identify a significant contribution of common sequence variants to the phenotypic variability of common disease [4,5]. Taken together, these observations have led to the evolution of the Common Disease / Rare Variant hypothesis, suggesting that the majority of the "missing heritability" in common and complex phenotypes is instead due to an individual's personal profile of rare or private DNA variants [6-8]. However, characterizing how rare variation impacts complex phenotypes requires the analysis of many affected individuals at many genomic loci, ideally compared against a similar survey in an unaffected cohort. Despite the sequencing power offered by today's platforms, a population-based survey of many genomic loci and the subsequent computational analysis required remain prohibitive for many investigators. To address this need, we have developed a pooled sequencing approach [1,9] and a novel software package [1] for highly accurate rare variant detection from the resulting data. The ability to pool genomes from entire populations of affected individuals and survey the degree of genetic variation at multiple targeted regions in a single sequencing library provides excellent cost and time savings over traditional single-sample sequencing methodology. With a mean sequencing coverage per allele of 25-fold, our custom algorithm, SPLINTER, uses an internal variant-calling control strategy to call insertions, deletions, and substitutions up to four base pairs in length with high sensitivity and specificity from pools of up to 1 mutant allele in 500 individuals.
Here we describe the method for preparing the pooled sequencing library, followed by step-by-step instructions on how to use the SPLINTER package for pooled sequencing analysis. We show a comparison between pooled sequencing of 947 individuals, all of whom also underwent genome-wide array genotyping, at over 20 kb of sequence per person. Concordance between array genotyping and both tagged and novel variants called in the pooled sample was excellent. This method can be easily scaled up to any number of genomic loci and any number of individuals. By incorporating internal positive and negative amplicon controls at ratios that mimic the population under study, the algorithm can be calibrated for optimal performance. This strategy can also be modified for use with hybridization capture or individual-specific barcodes, and can be applied to the sequencing of naturally heterogeneous samples, such as tumor DNA.
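The detection limit quoted above can be sanity-checked with a simple binomial model: 500 diploid individuals contribute 1,000 chromosomes, so a single mutant allele sits at a frequency of 1/1,000, and 25-fold coverage per allele yields 25,000 reads at a site. The sketch below is an illustrative power calculation under a uniform-sampling assumption, not the SPLINTER algorithm itself; the `min_reads` cutoff is an invented placeholder.

```python
from math import comb

def pooled_read_probability(pool_alleles, mutant_alleles, coverage_per_allele, min_reads):
    """P(seeing >= min_reads mutant reads at one site), assuming reads sample
    pooled alleles uniformly (simple binomial model, not the SPLINTER caller)."""
    total_reads = pool_alleles * coverage_per_allele
    p = mutant_alleles / pool_alleles  # expected mutant read fraction
    # P(X >= min_reads) for X ~ Binomial(total_reads, p)
    return 1.0 - sum(comb(total_reads, k) * p**k * (1.0 - p)**(total_reads - k)
                     for k in range(min_reads))

# 1 mutant allele among 500 diploid individuals, 25x coverage per allele:
prob = pooled_read_probability(pool_alleles=1000, mutant_alleles=1,
                               coverage_per_allele=25, min_reads=10)
```

Under these assumptions the expected mutant read count is 25, so even a conservative 10-read cutoff retains nearly all true singleton alleles; a real pool must also model sequencing error, which is what SPLINTER's internal amplicon controls calibrate.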
Applications of EEG Neuroimaging Data: Event-related Potentials, Spectral Power, and Multiscale Entropy
Authors: Jennifer J. Heisz, Anthony R. McIntosh.
Institutions: Baycrest.
When considering human neuroimaging data, an appreciation of signal variability represents a fundamental innovation in the way we think about brain signal. Typically, researchers represent the brain's response as the mean across repeated experimental trials and disregard signal fluctuations over time as "noise". However, it is becoming clear that brain signal variability conveys meaningful functional information about neural network dynamics. This article describes the novel method of multiscale entropy (MSE) for quantifying brain signal variability. MSE may be particularly informative of neural network dynamics because it shows timescale dependence and sensitivity to linear and nonlinear dynamics in the data.
Neuroscience, Issue 76, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Electroencephalography, EEG, electroencephalogram, Multiscale entropy, sample entropy, MEG, neuroimaging, variability, noise, timescale, non-linear, brain signal, information theory, brain, imaging
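The MSE computation described in the abstract has two steps: coarse-grain the signal at each timescale by averaging non-overlapping windows, then compute sample entropy on each coarse-grained series. Below is a minimal NumPy sketch using the standard SampEn definition with common defaults (m = 2, r = 0.2 × SD); these parameter choices, and recomputing r at each scale, are our assumptions rather than the article's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): negative log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def match_count(length):
        # count template pairs whose Chebyshev distance is <= r
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5)):
    """Coarse-grain by non-overlapping means at each scale, then apply SampEn.
    Note: r is re-derived from each coarse-grained series here; some
    implementations instead fix r from the original signal."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out
```

For EEG-length recordings this naive O(n²) pair count is adequate; production analyses typically use optimized SampEn implementations.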
Microwave Assisted Rapid Diagnosis of Plant Virus Diseases by Transmission Electron Microscopy
Authors: Bernd Zechmann, Gerhard Graggaber, Günther Zellnig.
Institutions: University of Graz, Graz University of Technology.
Investigations of ultrastructural changes induced by viruses are often necessary to clearly identify viral diseases in plants. With conventional sample preparation for transmission electron microscopy (TEM), such investigations can take several days [1,2] and are therefore not suited for rapid diagnosis of plant virus diseases. Microwave fixation can be used to drastically reduce sample preparation time for TEM investigations, with ultrastructural results similar to those observed after conventional sample preparation [3-5]. Many different custom-made microwave devices are currently available that can be used for the successful fixation and embedding of biological samples for TEM investigations [5-8]. In this study we demonstrate, using Tobacco Mosaic Virus (TMV)-infected Nicotiana tabacum plants, that it is possible to diagnose ultrastructural alterations in leaves in about half a day by using microwave-assisted sample preparation for TEM. We chose to perform this study with a commercially available microwave device because it performs sample preparation almost fully automatically [5], in contrast to the other available devices, where many steps still have to be performed manually [6-8] and which are therefore more time- and labor-consuming. Because sample preparation is performed fully automatically, negative staining of viral particles in the sap of the remaining TMV-infected leaves, and the subsequent examination of their ultrastructure and size, can be carried out during fixation and embedding.
Immunology, Issue 56, diagnostics, electron microscopy, microwave, Nicotiana, negative staining, phytopathology, TMV, ultrastructure
Isothermal Titration Calorimetry for Measuring Macromolecule-Ligand Affinity
Authors: Michael R. Duff, Jr., Jordan Grubbs, Elizabeth E. Howell.
Institutions: University of Tennessee.
Isothermal titration calorimetry (ITC) is a useful tool for understanding the complete thermodynamic picture of a binding reaction. In the biological sciences, macromolecular interactions are essential to understanding the machinery of the cell. Experimental conditions, such as buffer and temperature, can be tailored to the particular binding system being studied. However, careful planning is needed, since certain ligand and macromolecule concentration ranges are necessary to obtain useful data. Concentrations of the macromolecule and ligand need to be accurately determined for reliable results. Care also needs to be taken when preparing the samples, as impurities can significantly affect the experiment. When ITC experiments, along with controls, are performed properly, useful binding information, such as the stoichiometry, affinity, and enthalpy, is obtained. By running additional experiments under different buffer or temperature conditions, more detailed information can be obtained about the system. A protocol for the basic setup of an ITC experiment is given.
Molecular Biology, Issue 55, Isothermal titration calorimetry, thermodynamics, binding affinity, enthalpy, entropy, free energy
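The careful planning of concentration ranges mentioned above is often formalized through the Wiseman "c" parameter, c = n·Ka·[M]: a sigmoidal, fittable isotherm is usually obtained when c falls roughly between 1 and 1000. The sketch below uses that general-ITC-literature guideline, not anything specific to this protocol; the example Ka and concentrations are invented.

```python
def wiseman_c(cell_conc_M, Ka_per_M, n_sites=1.0):
    """Wiseman 'c' parameter: c = n * Ka * [M], where [M] is the macromolecule
    concentration in the cell. Roughly 1 < c < 1000 gives a usable isotherm."""
    return n_sites * Ka_per_M * cell_conc_M

def cell_conc_for_c(Ka_per_M, target_c=50.0, n_sites=1.0):
    """Invert c = n * Ka * [M] to choose a macromolecule concentration."""
    return target_c / (n_sites * Ka_per_M)

c = wiseman_c(2e-5, 1e6)     # 20 uM macromolecule, Ka = 1e6 M^-1 -> c = 20
conc = cell_conc_for_c(1e6)  # concentration giving c = 50 -> 50 uM
```

For very tight or very weak binders that push c outside this window, displacement titrations or altered conditions are the usual workarounds.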
A Mouse Model for Pathogen-induced Chronic Inflammation at Local and Systemic Sites
Authors: George Papadopoulos, Carolyn D. Kramer, Connie S. Slocum, Ellen O. Weinberg, Ning Hua, Cynthia V. Gudino, James A. Hamilton, Caroline A. Genco.
Institutions: Boston University School of Medicine.
Chronic inflammation is a major driver of pathological tissue damage and a unifying characteristic of many chronic diseases in humans, including neoplastic, autoimmune, and chronic inflammatory diseases. Emerging evidence implicates pathogen-induced chronic inflammation in the development and progression of chronic diseases with a wide variety of clinical manifestations. Due to the complex and multifactorial etiology of chronic disease, designing experiments for proof of causality and the establishment of mechanistic links is nearly impossible in humans. An advantage of using animal models is that both genetic and environmental factors that may influence the course of a particular disease can be controlled. Thus, designing relevant animal models of infection represents a key step in identifying host- and pathogen-specific mechanisms that contribute to chronic inflammation. Here we describe a mouse model of pathogen-induced chronic inflammation at local and systemic sites following infection with the oral pathogen Porphyromonas gingivalis, a bacterium closely associated with human periodontal disease. Oral infection of specific-pathogen-free mice induces a local inflammatory response resulting in destruction of tooth-supporting alveolar bone, a hallmark of periodontal disease. In an established mouse model of atherosclerosis, infection with P. gingivalis accelerates inflammatory plaque deposition within the aortic sinus and innominate artery, accompanied by activation of the vascular endothelium, an increased immune cell infiltrate, and elevated expression of inflammatory mediators within lesions. We detail methodologies for the assessment of inflammation at local and systemic sites. The use of transgenic mice and defined bacterial mutants makes this model particularly suitable for identifying both host and microbial factors involved in the initiation, progression, and outcome of disease.
Additionally, the model can be used to screen for novel therapeutic strategies, including vaccination and pharmacological intervention.
Immunology, Issue 90, Pathogen-Induced Chronic Inflammation; Porphyromonas gingivalis; Oral Bone Loss; Periodontal Disease; Atherosclerosis; Chronic Inflammation; Host-Pathogen Interaction; microCT; MRI
2D and 3D Chromosome Painting in Malaria Mosquitoes
Authors: Phillip George, Atashi Sharma, Igor V. Sharakhov.
Institutions: Virginia Tech.
Fluorescence in situ hybridization (FISH) with whole-arm chromosome probes is a robust technique for mapping genomic regions of interest, detecting chromosomal rearrangements, and studying the three-dimensional (3D) organization of chromosomes in the cell nucleus. The advent of laser capture microdissection (LCM) and whole genome amplification (WGA) makes it possible to obtain large quantities of DNA from single cells. The increased sensitivity of WGA kits prompted us to develop chromosome paints and to use them for exploring chromosome organization and evolution in non-model organisms. Here, we present a simple method for isolating and amplifying the euchromatic segments of single polytene chromosome arms from ovarian nurse cells of the African malaria mosquito Anopheles gambiae. This procedure provides an efficient platform for obtaining chromosome paints, while reducing the overall risk of introducing foreign DNA into the sample. The use of WGA allows for several rounds of re-amplification, resulting in high quantities of DNA that can be utilized for multiple experiments, including 2D and 3D FISH. We demonstrate that the developed chromosome paints can be successfully used to establish the correspondence between euchromatic portions of polytene and mitotic chromosome arms in An. gambiae. Overall, the union of LCM and single-chromosome WGA provides an efficient tool for creating significant amounts of target DNA for future cytogenetic and genomic studies.
Immunology, Issue 83, Microdissection, whole genome amplification, malaria mosquito, polytene chromosome, mitotic chromosomes, fluorescence in situ hybridization, chromosome painting
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
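The ascending forced-choice triangle procedure walks up a concentration series until the participant reliably picks the odd sample out. The stopping rule below (three consecutive correct identifications) is a common convention used purely for illustration; the published protocol's exact criterion and concentration series may differ, and the example concentrations are invented.

```python
def detection_threshold(concentrations, correct, runs_required=3):
    """Ascending forced-choice sketch: return the concentration at which the
    participant first achieves `runs_required` consecutive correct triangle
    trials, or None if the criterion is never reached.
    `concentrations` and `correct` are parallel, trial-ordered sequences."""
    streak = 0
    for conc, ok in zip(concentrations, correct):
        streak = streak + 1 if ok else 0
        if streak >= runs_required:
            return conc
    return None

# illustrative C18:1 series (mM); a correct answer is re-tested at the same
# concentration before the series moves on, hence the repeats
threshold = detection_threshold([0.02, 1.0, 1.0, 1.0],
                                [False, True, True, True])
```

Averaging such thresholds across repeat visits is how test-retest reliability of the measure would typically be assessed.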
gDNA Enrichment by a Transposase-based Technology for NGS Analysis of the Whole Sequence of BRCA1, BRCA2, and 9 Genes Involved in DNA Damage Repair
Authors: Sandy Chevrier, Romain Boidot.
Institutions: Centre Georges-François Leclerc.
The widespread use of Next Generation Sequencing (NGS) has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially on cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to reliably study 11 genes (ATM, BARD1, BRCA1, BRCA2, BRIP1, CHEK2, PALB2, RAD50, RAD51C, RAD80, and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, offers a great time advantage for genetic diagnosis thanks to sample multiplexing. This protocol can be used reliably with blood gDNA.
Genetics, Issue 92, gDNA enrichment, Nextera, NGS, DNA damage, BRCA1, BRCA2
Sampling Human Indigenous Saliva Peptidome Using a Lollipop-Like Ultrafiltration Probe: Simplify and Enhance Peptide Detection for Clinical Mass Spectrometry
Authors: Wenhong Zhu, Richard L. Gallo, Chun-Ming Huang.
Institutions: Sanford-Burnham Medical Research Institute, University of California, San Diego, VA San Diego Healthcare Center.
Although the human saliva proteome and peptidome have been revealed [1,2], they were mainly identified from tryptic digests of saliva proteins. Identification of the indigenous peptidome of human saliva without prior digestion with exogenous enzymes has become imperative, since native peptides in human saliva are potentially valuable for diagnosing disease, predicting disease progression, and monitoring therapeutic efficacy. Appropriate sampling is a critical step for enhancing identification of the human indigenous saliva peptidome. Traditional methods of sampling human saliva involve centrifugation to remove debris [3,4], which may be too time-consuming for clinical use. Furthermore, debris removal by centrifugation may fail to clear most of the infecting pathogens and to remove the high-abundance proteins that often hinder identification of the low-abundance peptidome. Conventional proteomic approaches that primarily utilize two-dimensional gel electrophoresis (2-DE) in conjunction with in-gel digestion are capable of identifying many saliva proteins [5,6]. However, this approach is generally not sufficiently sensitive to detect low-abundance peptides/proteins. Liquid chromatography-mass spectrometry (LC-MS)-based proteomics is an alternative that can identify proteins without prior 2-DE separation. Although this approach provides higher sensitivity, it generally requires prior sample pre-fractionation [7] and pre-digestion with trypsin, which makes it difficult for clinical use. To circumvent these sample-preparation hindrances to mass spectrometry, we developed a technique called capillary ultrafiltration (CUF) probes [8-11]. Data from our laboratory demonstrated that CUF probes are capable of capturing proteins in vivo from various microenvironments in animals in a dynamic and minimally invasive manner [8-11]. No centrifugation is needed, since negative pressure is created simply by withdrawing a syringe during sample collection.
CUF probes combined with LC-MS have successfully identified tryptic-digested proteins [8-11]. In this study, we upgraded the ultrafiltration sampling technique by creating a lollipop-like ultrafiltration (LLUF) probe that easily fits in the human oral cavity. Direct analysis by LC-MS without trypsin digestion showed that human saliva indigenously contains many peptide fragments derived from various proteins. Sampling saliva with LLUF probes avoided centrifugation yet effectively removed many of the larger, high-abundance proteins. Our mass spectrometric results illustrated that many low-abundance peptides became detectable once the larger proteins were filtered out by the LLUF probes. Detection of low-abundance saliva peptides did not require multi-step sample separation by chromatography. For clinical application, LLUF probes coupled with LC-MS could potentially be used in the future to monitor disease progression from saliva.
Medicine, Issue 66, Molecular Biology, Genetics, Sampling, Saliva, Peptidome, Ultrafiltration, Mass spectrometry
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Authors: Adrienne R. Niederriter, Erica E. Davis, Christelle Golzio, Edwin C. Oh, I-Chun Tsai, Nicholas Katsanis.
Institutions: Duke University Medical Center, Duke University.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated [1]. These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects of pediatric disease and, in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through injection of mutant human mRNA to induce dominant-negative or gain-of-function alleles, or through the use of morpholino (MO) antisense oligonucleotides to suppress genes and thereby mimic loss-of-function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, followed by phenotyping up to seven days post-fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
Telomere Length and Telomerase Activity; A Yin and Yang of Cell Senescence
Authors: Mary Derasmo Axelrad, Temuri Budagov, Gil Atzmon.
Institutions: Albert Einstein College of Medicine.
Telomeres are repeating DNA sequences at the ends of chromosomes; they vary in length and in humans can reach 15,000 base pairs. The telomere serves as a bioprotective mechanism against chromosome attrition at each cell division. At a certain length, telomeres become too short to allow replication, a process that may lead to chromosome instability or cell death. Telomere length is regulated by two opposing mechanisms: attrition and elongation. Attrition occurs as each cell divides. In contrast, elongation is partially modulated by the enzyme telomerase, which adds repeating sequences to the ends of the chromosomes. In this way, telomerase can potentially reverse an aging mechanism and rejuvenate cell viability. These are crucial elements in maintaining cell life and are used to assess cellular aging. In this manuscript we describe an accurate, rapid, and inexpensive method to assess telomere length in multiple tissues and species. This method takes advantage of two key elements: the tandem repeat of the telomere sequence, and the sensitivity of qRT-PCR in detecting differential copy numbers among tested samples. In addition, we describe a simple assay to assess telomerase activity as a complementary test for telomere length.
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biomedical Engineering, Genomics, Telomere length, telomerase activity, telomerase, telomeres, telomere, DNA, PCR, polymerase chain reaction, qRT-PCR, sequencing, aging, telomerase assay
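The qRT-PCR comparison of telomere copy number described above is commonly reported as a T/S ratio: telomere (T) signal relative to a single-copy gene (S), normalized to a reference DNA, via a delta-delta-Ct calculation. The sketch below assumes 100% amplification efficiency (template doubles every cycle); the Ct values in the example are invented.

```python
def ts_ratio(ct_telomere, ct_single_copy, ref_ct_telomere, ref_ct_single_copy):
    """Relative telomere length (T/S) by delta-delta-Ct, assuming the product
    doubles each cycle (100% primer efficiency)."""
    d_sample = ct_telomere - ct_single_copy        # delta-Ct for the sample
    d_ref = ref_ct_telomere - ref_ct_single_copy   # delta-Ct for reference DNA
    return 2.0 ** -(d_sample - d_ref)

# sample crosses threshold one cycle earlier on the telomere assay than the
# reference does -> about twice the relative telomere content
r = ts_ratio(14.0, 20.0, 15.0, 20.0)  # -> 2.0
```

When primer efficiencies depart from 100%, the base 2 is replaced by the measured per-cycle amplification factor for each assay.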
Lensless On-chip Imaging of Cells Provides a New Tool for High-throughput Cell-Biology and Medical Diagnostics
Authors: Onur Mudanyali, Anthony Erlinger, Sungkyu Seo, Ting-Wei Su, Derek Tseng, Aydogan Ozcan.
Institutions: University of California, Los Angeles.
Conventional optical microscopes image cells using objective lenses that work together with other lenses and optical components. While quite effective, this classical approach has certain limitations when it comes to miniaturizing the imaging platform to make it compatible with the advanced state of the art in microfluidics. In this report, we introduce experimental details of a lensless on-chip imaging concept termed LUCAS (Lensless Ultra-wide field-of-view Cell monitoring Array platform based on Shadow imaging) that does not require any microscope objectives or other bulky optical components to image a heterogeneous cell solution over an ultra-wide field of view that can span as large as ~18 cm2. Moreover, unlike conventional microscopes, LUCAS can image a heterogeneous cell solution of interest over a depth of field of ~5 mm without the need for refocusing, which corresponds to a sample volume of up to ~9 mL. This imaging platform records the shadows (i.e., lensless digital holograms) of each cell of interest within its field of view, and automated digital processing of these cell shadows can determine the type, count, and relative positions of cells within the solution. Because it does not require any bulky optical components or mechanical scanning stages, it offers a significantly miniaturized platform that at the same time reduces cost, which is especially important for point-of-care diagnostic tools. Furthermore, the imaging throughput of this platform is orders of magnitude better than that of conventional optical microscopes, which could be exceedingly valuable for high-throughput cell-biology experiments.
Cellular Biology, Issue 34, LUCAS, lensfree imaging, on-chip imaging, point-of-care diagnostics, global health, cell-biology, telemedicine, wireless health, microscopy, red blood cells
Small and Wide Angle X-Ray Scattering Studies of Biological Macromolecules in Solution
Authors: Li Liu, Lauren Boldon, Melissa Urquhart, Xiangyu Wang.
Institutions: Rensselaer Polytechnic Institute.
In this paper, Small and Wide Angle X-ray Scattering (SWAXS) analysis of macromolecules is demonstrated through experimentation. SWAXS is a technique in which X-rays are elastically scattered by a sample that is inhomogeneous in the nm range, at small angles (typically 0.1 - 5°) and wide angles (typically > 5°). This technique provides information about the shape, size, and distribution of macromolecules, characteristic distances of partially ordered materials, pore sizes, and surface-to-volume ratio. Small Angle X-ray Scattering (SAXS) is capable of delivering structural information on macromolecules between 1 and 200 nm, whereas Wide Angle X-ray Scattering (WAXS) can resolve even smaller Bragg spacings, between 0.33 nm and 0.49 nm depending on the specific system setup and detector. The spacing is determined from Bragg's law and depends on the wavelength and incident angle. In a SWAXS experiment, the materials can be solid or liquid and may contain solid, liquid, or gaseous domains (so-called particles) of the same or another material in any combination. SWAXS applications are very broad and include colloids of all types: metals, composites, cement, oil, polymers, plastics, proteins, foods, and pharmaceuticals. For solid samples, the thickness is limited to approximately 5 mm. Usage of a lab-based SWAXS instrument is detailed in this paper. With the software available for the SWAXS system (e.g., GNOM from the ATSAS 2.3 package by D. Svergun, EMBL Hamburg, and the EasySWAXS software), an experiment can be conducted to determine certain parameters of interest for a given sample. One example of a biological macromolecule experiment is the analysis of 2 wt% lysozyme in an aqueous buffer, which can be chosen and prepared through numerous methods. The preparation of the sample follows the guidelines in the Preparation of the Sample section below. Through SWAXS experimentation, important structural parameters of lysozyme, e.g., the radius of gyration, can be determined.
Bioengineering, Issue 71, Biophysics, Structural Biology, Physics, Molecular Biology, Mechanical Engineering, Nanotechnology, Small angle X-ray scattering, wide angle X-ray scattering, X-ray, biological macromolecules
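The Bragg-spacing statement above follows directly from Bragg's law, d = λ / (2 sin θ), where 2θ is the full scattering angle. A quick check of the quoted angular ranges, assuming a Cu Kα source (λ ≈ 0.154 nm; the source wavelength is our assumption, not stated in the abstract):

```python
from math import radians, sin

def bragg_spacing(wavelength_nm, two_theta_deg):
    """Bragg's law, d = lambda / (2 sin(theta)); SWAXS angular ranges are
    usually quoted as the full scattering angle 2*theta."""
    theta = radians(two_theta_deg / 2.0)
    return wavelength_nm / (2.0 * sin(theta))

WAVELENGTH_CU_KALPHA_NM = 0.154  # assumed source wavelength

d_small = bragg_spacing(WAVELENGTH_CU_KALPHA_NM, 0.1)   # ~88 nm at 2-theta = 0.1 deg
d_wide = bragg_spacing(WAVELENGTH_CU_KALPHA_NM, 18.0)   # ~0.49 nm at 2-theta = 18 deg
```

So the small-angle end of the 0.1 - 5° window probes spacings near the upper SAXS limit quoted above, while a wide angle of about 18° reaches the 0.49 nm end of the quoted WAXS range.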
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent the spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken on average 15 months before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
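The first step of the method above, estimating local tissue orientation with a bank of Gabor filters, can be sketched in a few lines of NumPy. The kernel parameters (size, sigma, wavelength) and the eight-angle bank are illustrative placeholders, not the values used in the study, and real mammograms would also require preprocessing:

```python
import numpy as np

def gabor_kernel(theta, ksize=15, sigma=3.0, wavelength=8.0):
    """Real-valued Gabor kernel oriented at `theta` radians (illustrative
    parameter defaults, not those of the mammography study)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)  # coordinate along the wave
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * xr / wavelength)

def _filter_same(image, kernel):
    """Zero-padded, same-size correlation; the Gabor kernels above are
    symmetric under 180-degree rotation, so this equals convolution."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros(image.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + image.shape[0], j:j + image.shape[1]]
    return out

def orientation_field(image, n_angles=8):
    """Per-pixel dominant orientation: the bank angle with the strongest
    absolute Gabor response."""
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    responses = np.stack([_filter_same(np.asarray(image, dtype=float),
                                       gabor_kernel(a)) for a in angles])
    return angles[np.argmax(np.abs(responses), axis=0)]
```

The published method goes further, fitting phase portraits to this orientation field to find node-like convergence points; the field itself is only the raw material for that analysis.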
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Authors: William R. Brant, Siegbert Schmid, Guodong Du, Helen E. A. Brand, Wei Kong Pang, Vanessa K. Peterson, Zaiping Guo, Neeraj Sharma.
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles [1,2]. However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications [3]. This research is challenging, and we outline a method to address these challenges using in situ neutron powder diffraction (NPD) to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries. We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the 'roll-over' cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of stained, resin-embedded samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
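As a concrete illustration of the semi-automated category, a threshold-based segmentation can be sketched in a few lines. The synthetic volume, the threshold value, and the use of SciPy's connected-component labeling are illustrative assumptions, not the custom algorithms applied to the data sets above.

```python
import numpy as np
from scipy import ndimage

def threshold_segment(volume, threshold):
    """Binarize a grayscale volume, then label connected components
    so each putative feature receives its own integer ID."""
    binary = volume > threshold
    labels, n_features = ndimage.label(binary)
    return labels, n_features

# Synthetic 3D volume with two bright "organelles" on a dark background.
vol = np.zeros((20, 20, 20))
vol[2:5, 2:5, 2:5] = 1.0
vol[10:14, 10:14, 10:14] = 1.0

labels, n = threshold_segment(vol, 0.5)
print(n)  # 2 connected components
```

In practice the threshold must be tuned per data set (e.g., from the intensity histogram), which is why characteristics such as signal-to-noise ratio and staining contrast dominate the choice of segmentation approach.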
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
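To make the "software-guided setup of optimal experiment combinations" concrete, the sketch below enumerates a full-factorial design. The factor names and levels are hypothetical, and real DoE software would typically prune this grid to a fractional or optimal subset.

```python
from itertools import product

# Full-factorial design: one run per combination of factor levels.
factors = {
    "promoter": ["35S", "nos"],        # regulatory element (hypothetical levels)
    "incubation_temp_C": [22, 25],     # incubation condition
    "plant_age_days": [35, 42],        # growth/development parameter
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 2 * 2 * 2 = 8 runs
```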
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
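The binarize-and-skeletonize step described above can be sketched with scikit-image. The synthetic image, the global threshold, and the library choice are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np
from skimage.morphology import skeletonize

# Simulated 2D TATS image: one thick horizontal "tubule".
img = np.zeros((30, 30))
img[10:13, 5:25] = 1.0

binary = img > 0.5              # binarization (simple global threshold)
skeleton = skeletonize(binary)  # reduce tubules to 1-pixel-wide centerlines

# The skeleton's pixel count approximates total network length,
# a typical quantitative readout for TATS analysis.
print(int(skeleton.sum()))
```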
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
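For reference, the fractional anisotropy compared voxelwise in these analyses follows the standard formula over the three diffusion-tensor eigenvalues; a minimal sketch:

```python
import numpy as np

def fractional_anisotropy(eigenvalues):
    """FA from the three diffusion-tensor eigenvalues:
    FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||, where MD is the mean
    diffusivity (the average of the eigenvalues)."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()  # mean diffusivity
    return np.sqrt(1.5) * np.sqrt(((lam - md) ** 2).sum()) / np.sqrt((lam ** 2).sum())

print(fractional_anisotropy([1.0, 1.0, 1.0]))            # isotropic diffusion: 0.0
print(round(fractional_anisotropy([1.0, 0.0, 0.0]), 6))  # maximally anisotropic: 1.0
```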
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform a quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure at genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
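The "better or worse than expected" comparison is commonly quantified with a multiplicative fitness model; the sketch below uses hypothetical fitness values and is a simplification for illustration, not the authors' in-house scoring system.

```python
def gi_score(w_a, w_b, w_ab):
    """Genetic-interaction score under the multiplicative model:
    the expected double-mutant fitness is w_a * w_b, and the score is the
    deviation of the observed double-mutant fitness from that expectation.
    Negative -> aggravating (synthetic sick/lethal); positive -> alleviating."""
    return w_ab - w_a * w_b

# Hypothetical colony-size-derived fitness values (wild type = 1.0).
print(round(gi_score(0.8, 0.9, 0.30), 2))  # -0.42: aggravating interaction
print(round(gi_score(0.8, 0.9, 0.85), 2))  # 0.13: alleviating interaction
```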
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
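In its simplest L2-regularized form, the source reconstruction step (minimum-norm estimation, as listed in the keywords below) reduces to a single linear solve. The toy lead field and dimensions here are illustrative assumptions; real lead fields come from the head models discussed above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 8, 20
L = rng.standard_normal((n_sensors, n_sources))  # hypothetical lead-field matrix

s_true = np.zeros(n_sources)
s_true[4] = 1.0                                          # one active cortical source
d = L @ s_true + 0.01 * rng.standard_normal(n_sensors)   # sensor data plus noise

# Regularized L2 minimum-norm estimate: s_hat = L.T (L L.T + lam I)^-1 d
lam = 0.1
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), d)
print(s_hat.shape)  # one amplitude estimate per source location
```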
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Infinium Assay for Large-scale SNP Genotyping Applications
Authors: Adam J. Adler, Graham B. Wiley, Patrick M. Gaffney.
Institutions: Oklahoma Medical Research Foundation.
Genotyping variants in the human genome has proven to be an efficient method to identify genetic associations with phenotypes. The distribution of variants within families or populations can facilitate identification of the genetic factors of disease. Illumina's panel of genotyping BeadChips allows investigators to genotype thousands or millions of single nucleotide polymorphisms (SNPs) or to analyze other genomic variants, such as copy number, across a large number of DNA samples. These SNPs can be spread throughout the genome or targeted in specific regions in order to maximize potential discovery. The Infinium assay has been optimized to yield high-quality, accurate results quickly. With proper setup, a single technician can process from a few hundred to over a thousand DNA samples per week, depending on the type of array. This assay guides users through every step, starting with genomic DNA and ending with the scanning of the array. Using proprietary reagents, samples are amplified, fragmented, precipitated, resuspended, hybridized to the chip, extended by a single base, stained, and scanned on either an iScan or HiScan high-resolution optical imaging system. One overnight step is required to amplify the DNA. The DNA is denatured and isothermally amplified by whole-genome amplification; therefore, no PCR is required. Samples are hybridized to the arrays during a second overnight step. By the third day, the samples are ready to be scanned and analyzed. Amplified DNA may be stockpiled in large quantities, allowing bead arrays to be processed every day of the week, thereby maximizing throughput.
Basic Protocol, Issue 81, genomics, SNP, Genotyping, Infinium, iScan, HiScan, Illumina
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. 
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
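To give a taste of the covariance-based approach, the core of many multivariate techniques (e.g., principal component analysis) is a singular value decomposition of the centered subjects-by-voxels data matrix. The toy matrix below is an illustrative stand-in for real neuroimaging data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 5))   # 10 subjects x 5 voxels/regions (toy data)
Xc = X - X.mean(axis=0)            # center each voxel/region across subjects

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
patterns = Vt                      # covariance patterns across regions
scores = Xc @ Vt.T                 # each subject's expression of each pattern
explained = s**2 / (s**2).sum()    # fraction of variance per pattern

# Subject scores on the leading patterns can then be correlated with
# behavior or used to classify diagnostic groups.
print(patterns.shape, scores.shape)
```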
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Semi-automated Optical Heartbeat Analysis of Small Hearts
Authors: Karen Ocorr, Martin Fink, Anthony Cammarato, Sanford I. Bernstein, Rolf Bodmer.
Institutions: The Sanford Burnham Institute for Medical Research, San Diego State University.
We have developed a method for analyzing high speed optical recordings from Drosophila, zebrafish and embryonic mouse hearts (Fink et al., 2009). Our Semi-automatic Optical Heartbeat Analysis (SOHA) uses a novel movement detection algorithm that is able to detect cardiac movements associated with individual contractile and relaxation events. The program provides a host of physiologically relevant readouts including systolic and diastolic intervals, heart rate, as well as qualitative and quantitative measures of heartbeat arrhythmicity. The program also calculates heart diameter measurements during both diastole and systole from which fractional shortening and fractional area changes are calculated. Output is provided as a digital file compatible with most spreadsheet programs. Measurements are made for every heartbeat in a record, increasing the statistical power of the output. We demonstrate each of the steps where user input is required and show the application of our methodology to the analysis of heart function in all three genetically tractable heart models.
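The fractional shortening mentioned above follows directly from the diameter measurements; a minimal sketch with hypothetical diameters (SOHA itself derives these from the movement-detection output):

```python
def fractional_shortening(edd, esd):
    """Fractional shortening from end-diastolic (edd) and
    end-systolic (esd) heart diameters."""
    return (edd - esd) / edd

# Hypothetical diameters in micrometers.
print(fractional_shortening(100.0, 60.0))  # 0.4
```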
Physiology, Issue 31, Drosophila, zebrafish, mouse, heart, myosin, dilated, restricted, cardiomyopathy, KCNQ, movement detection
Cross-Modal Multivariate Pattern Analysis
Authors: Kaspar Meyer, Jonas T. Kaplan.
Institutions: University of Southern California.
Multivariate pattern analysis (MVPA) is an increasingly popular method of analyzing functional magnetic resonance imaging (fMRI) data1-4. Typically, the method is used to identify a subject's perceptual experience from neural activity in certain regions of the brain. For instance, it has been employed to predict the orientation of visual gratings a subject perceives from activity in early visual cortices5 or, analogously, the content of speech from activity in early auditory cortices6. Here, we present an extension of the classical MVPA paradigm, according to which perceptual stimuli are not predicted within, but across sensory systems. Specifically, the method we describe addresses the question of whether stimuli that evoke memory associations in modalities other than the one through which they are presented induce content-specific activity patterns in the sensory cortices of those other modalities. For instance, seeing a muted video clip of a glass vase shattering on the ground automatically triggers in most observers an auditory image of the associated sound; is the experience of this image in the "mind's ear" correlated with a specific neural activity pattern in early auditory cortices? Furthermore, is this activity pattern distinct from the pattern that could be observed if the subject were, instead, watching a video clip of a howling dog? In two previous studies7,8, we were able to predict sound- and touch-implying video clips based on neural activity in early auditory and somatosensory cortices, respectively. Our results are in line with a neuroarchitectural framework proposed by Damasio9,10, according to which the experience of mental images that are based on memories - such as hearing the shattering sound of a vase in the "mind's ear" upon seeing the corresponding video clip - is supported by the re-construction of content-specific neural activity patterns in early sensory cortices.
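At its core, the prediction step can be as simple as a correlation-based pattern classifier. The sketch below (synthetic "voxel" patterns, NumPy only) is an illustrative stand-in for the classifiers actually used in fMRI MVPA.

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 50

# Hypothetical "true" activity patterns in early auditory cortex.
proto = {"shatter": rng.standard_normal(n_voxels),
         "bark": rng.standard_normal(n_voxels)}

# Training patterns (e.g., from heard sounds) and a cross-modal test
# pattern (e.g., evoked by a muted video implying the shattering sound).
train = {k: v + 0.1 * rng.standard_normal(n_voxels) for k, v in proto.items()}
test_pattern = proto["shatter"] + 0.1 * rng.standard_normal(n_voxels)

def classify(pattern, templates):
    """Assign the label whose template correlates best with the pattern."""
    corr = {k: np.corrcoef(pattern, v)[0, 1] for k, v in templates.items()}
    return max(corr, key=corr.get)

print(classify(test_pattern, train))  # shatter
```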
Neuroscience, Issue 57, perception, sensory, cross-modal, top-down, mental imagery, fMRI, MRI, neuroimaging, multivariate pattern analysis, MVPA
Measuring Diffusion Coefficients via Two-photon Fluorescence Recovery After Photobleaching
Authors: Kelley D. Sullivan, Edward B. Brown.
Institutions: University of Rochester.
Multiphoton fluorescence recovery after photobleaching (MP-FRAP) is a microscopy technique used to measure the diffusion coefficient (or analogous transport parameters) of macromolecules, and can be applied to both in vitro and in vivo biological systems. MP-FRAP is performed by photobleaching a region of interest within a fluorescent sample using an intense laser flash, then attenuating the beam and monitoring the fluorescence as still-fluorescent molecules from outside the region of interest diffuse in to replace the photobleached molecules. We will begin our demonstration by aligning the laser beam through the Pockels cell (laser modulator) and along the optical path through the laser scan box and objective lens to the sample. For simplicity, we will use a sample of aqueous fluorescent dye. We will then determine the proper experimental parameters for our sample, including monitor and bleach powers, bleach duration, bin widths (for photon counting), and fluorescence recovery time. Next, we will describe the procedure for taking recovery curves, a process that can be largely automated via LabVIEW (National Instruments, Austin, TX) for enhanced throughput. Finally, the diffusion coefficient is determined by fitting the recovery data to the appropriate mathematical model using a least-squares fitting algorithm, readily programmable using software such as MATLAB (The MathWorks, Natick, MA).
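The final least-squares fitting step can be sketched in Python rather than MATLAB. The single-exponential recovery model and the relation tau = w^2 / (8 D) for a two-photon bleach spot of 1/e^2 radius w are simplifications of the full MP-FRAP model, and all numbers below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, f0, f_inf, tau):
    """Simplified recovery model: F(t) = F_inf - (F_inf - F0) * exp(-t/tau)."""
    return f_inf - (f_inf - f0) * np.exp(-t / tau)

# Synthetic recovery curve with noise (seconds, arbitrary fluorescence units).
t = np.linspace(0.0, 1.0, 200)
data = recovery(t, 0.2, 1.0, 0.05) + 0.01 * np.random.default_rng(2).standard_normal(t.size)

# Least-squares fit, then convert the recovery time to a diffusion coefficient.
(f0, f_inf, tau), _ = curve_fit(recovery, t, data, p0=(0.1, 0.9, 0.1))
w = 0.5e-4            # bleach-spot radius in cm (hypothetical)
D = w**2 / (8 * tau)  # diffusion coefficient in cm^2/s
print(round(tau, 2))  # recovery time close to the simulated 0.05 s
```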
Cellular Biology, Issue 36, Diffusion, fluorescence recovery after photobleaching, MP-FRAP, FPR, multi-photon
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is no content in our video library relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.