Documenting Differences between Early Stone Age Flake Production Systems: An Experimental Model and Archaeological Verification.
PUBLISHED: 06-26-2015
This study investigates morphological differences between flakes produced via "core and flake" technologies and those resulting from bifacial shaping strategies. We investigate systematic variation between two technological groups of flakes using experimentally produced assemblages, and then apply the experimental model to the Cutting 10 Mid-Pleistocene archaeological collection from Elandsfontein, South Africa. We argue that a specific set of independent variables and their interactions, including external platform angle, platform depth, measures of thickness variance, and flake curvature, should distinguish between these two technological groups. The role of these variables in separating the technological groups was further investigated using the Generalized Linear Model as well as Linear Discriminant Analysis. The discriminant model was used to classify archaeological flakes from the Cutting 10 locality in terms of their probability of association with either experimentally developed technological group. The results indicate that the selected independent variables play a central role in separating core-and-flake from bifacial technologies. Thickness evenness and curvature had the greatest effect sizes in both the Generalized Linear and discriminant models. Interestingly, the interaction between thickness evenness and platform depth was significant and played an important role in influencing technological group membership. The identified interaction emphasizes the complexity of attempting to distinguish flake production strategies based on flake morphological attributes. The results of the discriminant function analysis demonstrate that the majority of flakes at the Cutting 10 locality were not associated with the production of the numerous Large Cutting Tools found at the site, which corresponds with previous suggestions regarding the technological behaviors reflected in this assemblage.
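For readers unfamiliar with the approach, the core of a two-group linear discriminant can be sketched in a few lines of Python. This is a minimal illustration only: the attribute values below are invented, and the study's actual model used more variables and their interactions than the two shown here.

```python
# Sketch of Fisher's two-group linear discriminant on two flake attributes
# (say, thickness evenness and platform depth). All numbers are invented
# illustrations, not data from the study.

def mean(xs):
    return sum(xs) / len(xs)

def fisher_direction(group_a, group_b):
    """Return (w, midpoint) with w = Sw^-1 (mA - mB) for 2-D samples."""
    ma = (mean([p[0] for p in group_a]), mean([p[1] for p in group_a]))
    mb = (mean([p[0] for p in group_b]), mean([p[1] for p in group_b]))
    # pooled within-group scatter matrix Sw
    s = [[0.0, 0.0], [0.0, 0.0]]
    for grp, m in ((group_a, ma), (group_b, mb)):
        for p in grp:
            d = (p[0] - m[0], p[1] - m[1])
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = (ma[0] - mb[0], ma[1] - mb[1])
    # w = Sw^-1 * (mA - mB), via the closed-form 2x2 inverse
    w = ((s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
         (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det)
    # decision threshold: midpoint of the projected group means
    mid = (w[0] * (ma[0] + mb[0]) + w[1] * (ma[1] + mb[1])) / 2.0
    return w, mid

def classify(flake, w, mid):
    """Assign a flake to group 'A' or 'B' by its discriminant score."""
    score = w[0] * flake[0] + w[1] * flake[1]
    return "A" if score > mid else "B"
```

In the study's workflow, the discriminant would be fit on the two experimental assemblages and then applied to each archaeological flake from Cutting 10.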
Authors: Robert D. Kirch, Richard C. Pinnell, Ulrich G. Hofmann, Jean-Christophe Cassel.
Published: 07-08-2015
Spatial cognition research in rodents typically employs maze tasks, which differ in behavioral flexibility and required memory duration, in the number of goals and pathways, and in overall task complexity. A confounding feature of many of these tasks is the lack of control over the strategy the rodents employ to reach the goal, e.g., allocentric (declarative-like) or egocentric (procedural) strategies. The double-H maze is a novel water-escape memory task that addresses this issue by allowing the experimenter to direct the type of strategy learned during the training period. The double-H maze is a transparent device consisting of a central alleyway with three arms protruding on each side, and an escape platform submerged at the extremity of one of these arms. Rats can be trained using an allocentric strategy by alternating the start position in the maze in an unpredictable manner (see protocol 1; §4.7), thus requiring them to learn the location of the platform based on the available allothetic cues. Alternatively, an egocentric learning strategy (protocol 2; §4.8) can be employed by releasing the rats from the same position during each trial, until they learn the procedural pattern required to reach the goal. This task has been proven to allow the formation of stable memory traces. Memory can be probed after the training period in a misleading probe trial, in which the starting position for the rats is changed. Following an egocentric learning paradigm, rats typically resort to an allocentric-based strategy, but only when their initial view of the extra-maze cues differs markedly from that at their original position. This task is ideally suited to exploring the effects of drugs/perturbations on allocentric/egocentric memory performance, as well as the interactions between these two memory systems.
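Protocol 1 requires alternating the start position "in an unpredictable manner". One simple way to script such a schedule is sketched below; the position labels and the no-immediate-repeat rule are hypothetical illustrations, not taken from the protocol itself.

```python
import random

def start_sequence(positions, n_trials, seed=0):
    """Pseudorandom start-position schedule with no immediate repeats.

    A hypothetical scheme for generating an unpredictable start order;
    seeding makes the schedule reproducible across experimenters.
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        # exclude the previous position so consecutive trials always differ
        choices = [p for p in positions if not seq or p != seq[-1]]
        seq.append(rng.choice(choices))
    return seq
```

A schedule for a 12-trial session could then be generated once per animal, e.g. `start_sequence(["N", "S", "NE", "SW"], 12, seed=animal_id)`.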
Morris Water Maze Experiment
Authors: Joseph Nunez.
Institutions: Michigan State University (MSU).
The Morris water maze is widely used to study spatial memory and learning. Animals are placed in a pool of water that is colored opaque with powdered non-fat milk or non-toxic tempera paint, where they must swim to a hidden escape platform. Because they are in opaque water, the animals cannot see the platform, and cannot rely on scent to find the escape route. Instead, they must rely on external/extra-maze cues. As the animals become more familiar with the task, they are able to find the platform more quickly. Developed by Richard G. Morris in 1984, this paradigm has become one of the "gold standards" of behavioral neuroscience.
Behavior, Issue 19, Declarative, Hippocampus, Memory, Procedural, Rodent, Spatial Learning
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
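The starting point of a DoE study is enumerating factor-level combinations. The sketch below generates a small full-factorial design; the factor names and levels are invented placeholders standing in for the study's actual parameters (expression construct elements, plant age, incubation conditions).

```python
from itertools import product

# Hypothetical screening factors and levels (not the study's actual ones).
factors = {
    "promoter": ["35S", "nos"],
    "plant_age_weeks": [6, 7, 8],
    "incubation_temp_C": [22, 25],
}

names = list(factors)
# Full factorial: every combination of every level -> 2 * 3 * 2 = 12 runs.
full_factorial = [dict(zip(names, combo)) for combo in product(*factors.values())]
```

In practice, the full factorial grows multiplicatively with each added factor, which is why the abstract's "software-guided setup of optimal experiment combinations" (e.g., D-optimal subsets) is used instead of running every combination.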
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
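The most automated end of the spectrum can be illustrated with a toy example: global thresholding followed by connected-component counting, a minimal stand-in for feature extraction. The tiny 2D "image" below is invented; real EM volumes are 3D and vastly larger, and the keyword list's "thresholding" refers to the general technique, not this exact code.

```python
# Toy semi-automated segmentation: threshold, then count 4-connected regions.

def threshold(image, t):
    """Binary mask: 1 where intensity exceeds t, else 0."""
    return [[1 if px > t else 0 for px in row] for row in image]

def count_components(mask):
    """Count 4-connected foreground regions via iterative flood fill."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    n = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                n += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return n

# Invented intensities: two bright "features" on a dark background.
image = [
    [10, 90, 10, 10],
    [10, 95, 10, 80],
    [10, 10, 10, 85],
]
mask = threshold(image, 50)
```

Each extracted component would then be surface-rendered and quantified, as in approaches (3) and (4) above.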
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
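The keyword list mentions minimum-norm estimation; its numerical core can be sketched as the regularized inverse s = Lᵀ(LLᵀ + λI)⁻¹y, where L is the lead field, y the sensor data, and λ the regularization parameter. The toy two-sensor, three-source lead field below is invented for illustration and is unrelated to the pipeline's actual head models.

```python
# Minimal numeric sketch of a Tikhonov-regularized minimum-norm estimate,
# hard-coded for a toy two-sensor lead field (all numbers invented).

def mne_estimate(L, y, lam):
    """Return s_hat = L^T (L L^T + lam*I)^-1 y for a 2-sensor lead field L."""
    n_src = len(L[0])
    # G = L L^T + lam * I  (2 x 2 because there are two sensors)
    g = [[sum(L[i][k] * L[j][k] for k in range(n_src)) + (lam if i == j else 0.0)
          for j in range(2)] for i in range(2)]
    # closed-form 2x2 inverse applied to the sensor data y
    det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
    w = [(g[1][1] * y[0] - g[0][1] * y[1]) / det,
         (-g[1][0] * y[0] + g[0][0] * y[1]) / det]
    # back-project through L^T into source space
    return [L[0][k] * w[0] + L[1][k] * w[1] for k in range(n_src)]
```

Real source analysis replaces this toy L with a lead field computed from the individual or age-appropriate head model, which is precisely where the pediatric MRI processing described above enters the pipeline.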
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Construction of Vapor Chambers Used to Expose Mice to Alcohol During the Equivalent of all Three Trimesters of Human Development
Authors: Russell A. Morton, Marvin R. Diaz, Lauren A. Topper, C. Fernando Valenzuela.
Institutions: University of New Mexico Health Sciences Center.
Exposure to alcohol during development can result in a constellation of morphological and behavioral abnormalities that are collectively known as Fetal Alcohol Spectrum Disorders (FASDs). At the most severe end of the spectrum is Fetal Alcohol Syndrome (FAS), characterized by growth retardation, craniofacial dysmorphology, and neurobehavioral deficits. Studies with animal models, including rodents, have elucidated many molecular and cellular mechanisms involved in the pathophysiology of FASDs. Ethanol administration to pregnant rodents has been used to model human exposure during the first and second trimesters of pregnancy. Third trimester ethanol consumption in humans has been modeled using neonatal rodents. However, few rodent studies have characterized the effect of ethanol exposure during the equivalent of all three trimesters of human pregnancy, a pattern of exposure that is common in pregnant women. Here, we show how to build vapor chambers from readily obtainable materials that can each accommodate up to six standard mouse cages. We describe a vapor chamber paradigm that can be used to model exposure to ethanol, with minimal handling, during all three trimesters. Our studies demonstrate that pregnant dams developed significant metabolic tolerance to ethanol. However, neonatal mice did not develop metabolic tolerance, and the number of fetuses, fetus weight, placenta weight, number of pups/litter, number of dead pups/litter, and pup weight were not significantly affected by ethanol exposure. An important advantage of this paradigm is its applicability to studies with genetically modified mice. Additionally, this paradigm minimizes handling of animals, a major confound in fetal alcohol research.
Medicine, Issue 89, fetal, ethanol, exposure, paradigm, vapor, development, alcoholism, teratogenic, animal, mouse, model
Combining Magnetic Sorting of Mother Cells and Fluctuation Tests to Analyze Genome Instability During Mitotic Cell Aging in Saccharomyces cerevisiae
Authors: Melissa N. Patterson, Patrick H. Maxwell.
Institutions: Rensselaer Polytechnic Institute.
Saccharomyces cerevisiae has been an excellent model system for examining the mechanisms and consequences of genome instability. Information gained from this yeast model is relevant to many organisms, including humans, since DNA repair and DNA damage response factors are well conserved across diverse species. However, due to technical constraints, S. cerevisiae has not yet been used to fully address whether the rate of accumulating mutations changes with increasing replicative (mitotic) age. For instance, measurements of yeast replicative lifespan through micromanipulation involve very small populations of cells, which prohibit detection of rare mutations. Genetic methods to enrich for mother cells in populations by inducing death of daughter cells have been developed, but population sizes are still limited by the frequency with which random mutations that compromise the selection systems occur. The current protocol takes advantage of magnetic sorting of surface-labeled yeast mother cells to obtain large enough populations of aging mother cells to quantify rare mutations through phenotypic selections. Mutation rates, measured through fluctuation tests, and mutation frequencies are first established for young cells and used to predict the frequency of mutations in mother cells of various replicative ages. Mutation frequencies are then determined for sorted mother cells, and the age of the mother cells is determined using flow cytometry by staining with a fluorescent reagent that detects bud scars formed on their cell surfaces during cell division. Comparison of predicted mutation frequencies based on the number of cell divisions to the frequencies experimentally observed for mother cells of a given replicative age can then identify whether there are age-related changes in the rate of accumulating mutations.
Variations of this basic protocol provide the means to investigate the influence of alterations in specific gene functions or specific environmental conditions on mutation accumulation to address mechanisms underlying genome instability during replicative aging.
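Fluctuation tests estimate mutation rates from the distribution of mutant counts across parallel cultures. One standard estimator, the Luria-Delbrück p0 method, is sketched below; the protocol does not state which estimator it uses, and the numbers in the test are invented.

```python
import math

def p0_mutation_rate(n_cultures_no_mutants, n_cultures, cells_per_culture):
    """Luria-Delbruck p0 estimator for a fluctuation test.

    If mutation events per culture are Poisson-distributed with mean m,
    the fraction of cultures with zero mutants is p0 = exp(-m), so
    m = -ln(p0), and the per-cell mutation rate is m / N.
    """
    p0 = n_cultures_no_mutants / n_cultures
    m = -math.log(p0)               # expected mutation events per culture
    return m / cells_per_culture    # mutations per cell per division
```

In the protocol's logic, a rate estimated this way for young cells predicts the mutation frequency expected in mother cells after a given number of divisions; a measured excess in sorted aged mothers would indicate an age-related increase in the rate.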
Microbiology, Issue 92, Aging, mutations, genome instability, Saccharomyces cerevisiae, fluctuation test, magnetic sorting, mother cell, replicative aging
Rapid Genotyping of Animals Followed by Establishing Primary Cultures of Brain Neurons
Authors: Jin-Young Koh, Sadahiro Iwabuchi, Zhengmin Huang, N. Charles Harata.
Institutions: University of Iowa Carver College of Medicine, EZ BioResearch LLC.
High-resolution analysis of the morphology and function of mammalian neurons often requires the genotyping of individual animals followed by the analysis of primary cultures of neurons. We describe a set of procedures for: labeling newborn mice to be genotyped, rapid genotyping, and establishing low-density cultures of brain neurons from these mice. Individual mice are labeled by tattooing, which allows for long-term identification lasting into adulthood. Genotyping by the described protocol is fast and efficient, and allows for automated extraction of nucleic acid with good reliability. This is useful under circumstances where sufficient time for conventional genotyping is not available, e.g., in mice that suffer from neonatal lethality. Primary neuronal cultures are generated at low density, which enables imaging experiments at high spatial resolution. This culture method requires the preparation of glial feeder layers prior to neuronal plating. The protocol is applied in its entirety to a mouse model of the movement disorder DYT1 dystonia (ΔE-torsinA knock-in mice), and neuronal cultures are prepared from the hippocampus, cerebral cortex and striatum of these mice. This protocol can be applied to mice with other genetic mutations, as well as to animals of other species. Furthermore, individual components of the protocol can be used for isolated sub-projects. Thus this protocol will have wide applications, not only in neuroscience but also in other fields of biological and medical sciences.
Neuroscience, Issue 95, AP2, genotyping, glial feeder layer, mouse tail, neuronal culture, nucleic-acid extraction, PCR, tattoo, torsinA
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Authors: Shahram Jevin Poureetezadi, Eric K. Donahue, Rebecca A. Wingert.
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism for investigating the mechanisms that underlie developmental biology and for studying human disease pathology, owing to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process, and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are especially appealing for chemical genetics because of their ability to produce large clutches of transparent embryos, which are externally fertilized. Furthermore, zebrafish embryos can be easily drug-treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, chemical genetics and WISH make the zebrafish a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are not accessible or remain cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires basic resources and can be accomplished by a single individual or small team in a short period of time. Thus, this protocol provides a feasible strategy that research groups can implement to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways with medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Moderate Prenatal Alcohol Exposure and Quantification of Social Behavior in Adult Rats
Authors: Derek A. Hamilton, Christy M. Magcalas, Daniel Barto, Clark W. Bird, Carlos I. Rodriguez, Brandi C. Fink, Sergio M. Pellis, Suzy Davies, Daniel D. Savage.
Institutions: University of New Mexico, University of Lethbridge.
Alterations in social behavior are among the major negative consequences observed in children with Fetal Alcohol Spectrum Disorders (FASDs). Several independent laboratories have demonstrated robust alterations in the social behavior of rodents exposed to alcohol during brain development across a wide range of exposure durations, timing, doses, and ages at the time of behavioral quantification. Prior work from this laboratory has identified reliable alterations in specific forms of social interaction following moderate prenatal alcohol exposure (PAE) in the rat that persist well into adulthood, including increased wrestling and decreased investigation. These behavioral alterations have been useful in identifying neural circuits altered by moderate PAE1, and may hold importance for progressing toward a more complete understanding of the neural bases of PAE-related alterations in social behavior. This paper describes procedures for performing moderate PAE in which rat dams voluntarily consume ethanol or saccharin (control) throughout gestation, and measurement of social behaviors in adult offspring.
Neuroscience, Issue 94, Aggression, Alcohol Teratogenesis, Alcohol-related Neurodevelopmental Disorders, ARND, Fetal Alcohol Spectrum Disorders, FASD, Fetal Alcohol Syndrome, FAS, Social interaction
Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone
Authors: Sandra Zehentmeier, Zoltan Cseresnyes, Juan Escribano Navarro, Raluca A. Niesner, Anja E. Hauser.
Institutions: German Rheumatism Research Center, a Leibniz Institute, Max-Delbrück Center for Molecular Medicine, Wimasis GmbH, Charité - University of Medicine.
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool suitable for testing this hypothesis for hematopoietic as well as stromal cells is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data.
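The logic of such a simulation can be sketched as a Monte Carlo test: place the two cell types at random many times, count contacts in each random placement, and compare the observed contact count against that null distribution. This is a generic sketch of the idea, not the actual simulation tool used in the protocol; the field dimensions and contact radius are arbitrary.

```python
import math
import random

def contact_count(a_pts, b_pts, radius):
    """Number of A-B pairs closer than `radius` (a crude contact criterion)."""
    return sum(1 for a in a_pts for b in b_pts if math.dist(a, b) < radius)

def random_points(n, width, height, rng):
    return [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n)]

def colocalization_p_value(observed, n_a, n_b, width, height, radius,
                           trials=1000, seed=0):
    """Empirical p-value: fraction of random placements with at least as
    many A-B contacts as observed in the real image."""
    rng = random.Random(seed)
    null = [contact_count(random_points(n_a, width, height, rng),
                          random_points(n_b, width, height, rng), radius)
            for _ in range(trials)]
    return sum(c >= observed for c in null) / trials
```

A small p-value would indicate more contacts than random positioning explains, i.e. a preferential association between the two cell types.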
Developmental Biology, Issue 98, Image analysis, neighborhood analysis, bone marrow, stromal cells, bone marrow niches, simulation, bone cryosectioning, bone histology
Barnes Maze Testing Strategies with Small and Large Rodent Models
Authors: Cheryl S. Rosenfeld, Sherry A. Ferguson.
Institutions: University of Missouri, Food and Drug Administration.
Spatial learning and memory of laboratory rodents is often assessed via navigational ability in mazes, the most popular of which are the water and dry-land (Barnes) mazes. Improved performance over sessions or trials is thought to reflect learning and memory of the escape cage/platform location. Considered less stressful than water mazes, the Barnes maze is a relatively simple design: a circular platform top with several holes equally spaced around the perimeter edge. All but one of the holes are false-bottomed or blind-ending, while one leads to an escape cage. Mildly aversive stimuli (e.g., bright overhead lights) provide motivation to locate the escape cage. Latency to locate the escape cage can be measured during the session; however, additional endpoints typically require video recording. From those video recordings, automated tracking software can generate a variety of endpoints similar to those produced in water mazes (e.g., distance traveled, velocity/speed, time spent in the correct quadrant, time spent moving/resting, and confirmation of latency). The type of search strategy (i.e., random, serial, or direct) can be categorized as well. Barnes maze construction and testing methodologies can differ for small rodents, such as mice, and large rodents, such as rats. For example, while extra-maze cues are effective for rats, smaller wild rodents may require intra-maze cues with a visual barrier around the maze. Appropriate stimuli must be identified that motivate the rodent to locate the escape cage. Both Barnes and water mazes can be time-consuming, as 4-7 test trials are typically required to detect improved learning and memory performance (e.g., shorter latencies or path lengths to locate the escape platform or cage) and/or differences between experimental groups.
Even so, the Barnes maze is a widely employed behavioral assessment measuring spatial navigational abilities and their potential disruption by genetic or neurobehavioral manipulations, or drug/toxicant exposure.
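The tracking-derived endpoints mentioned above reduce to simple arithmetic on the tracked coordinates. The sketch below computes distance traveled, mean speed, and latency from a list of (x, y) positions; it is a simplified stand-in for what commercial tracking software computes, with invented sample data.

```python
import math

def path_metrics(track, fps):
    """Distance traveled, mean speed, and latency from a tracked 2-D path.

    `track` is a list of (x, y) positions sampled at `fps` frames/second.
    """
    # total path length: sum of distances between consecutive samples
    distance = sum(math.dist(track[i], track[i + 1])
                   for i in range(len(track) - 1))
    # latency: elapsed time from the first to the last tracked frame
    latency = (len(track) - 1) / fps
    speed = distance / latency if latency else 0.0
    return distance, speed, latency
```

Shorter latencies and path lengths over successive trials would then index the improved performance described above.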
Behavior, Issue 84, spatial navigation, rats, Peromyscus, mice, intra- and extra-maze cues, learning, memory, latency, search strategy, escape motivation
PLGA Nanoparticles Formed by Single- or Double-emulsion with Vitamin E-TPGS
Authors: Rebecca L. McCall, Rachael W. Sirianni.
Institutions: Barrow Neurological Institute.
Poly(lactic-co-glycolic acid) (PLGA) is a biocompatible member of the aliphatic polyester family of biodegradable polymers. PLGA has long been a popular choice for drug delivery applications, particularly since it is already FDA-approved for use in humans in the form of resorbable sutures. Hydrophobic and hydrophilic drugs are encapsulated in PLGA particles via single- or double-emulsion. Briefly, the drug is dissolved with polymer or emulsified with polymer in an organic phase that is then emulsified with the aqueous phase. After the solvent has evaporated, particles are washed and collected via centrifugation for lyophilization and long-term storage. PLGA degrades slowly via hydrolysis in aqueous environments, and encapsulated agents are released over a period of weeks to months. Although PLGA possesses many advantages for drug delivery, reproducible formation of nanoparticles can be challenging; considerable variability is introduced by the use of different equipment, reagent batches, and the precise method of emulsification. Here, we describe in detail the formation and characterization of microparticles and nanoparticles formed by single- or double-emulsion using the emulsifying agent vitamin E-TPGS. Particle morphology and size are determined with scanning electron microscopy (SEM). We provide representative SEM images for nanoparticles produced with varying emulsifier concentrations, as well as examples of imaging artifacts and failed emulsifications. This protocol can be readily adapted to use alternative emulsifiers (e.g., poly(vinyl alcohol), PVA) or solvents (e.g., dichloromethane, DCM).
Chemistry, Issue 82, Nanoparticles, Microparticles, PLGA, TPGS, drug delivery, scanning electron microscopy, emulsion, polymers
Using an Automated 3D-tracking System to Record Individual and Shoals of Adult Zebrafish
Authors: Hans Maaswinkel, Liqun Zhu, Wei Weng.
Institutions: xyZfish.
Like many aquatic animals, zebrafish (Danio rerio) move in 3D space, so it is preferable to use a 3D recording system to study their behavior. The presented automatic video tracking system accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced by the transition of light from water to air. With this system it is possible to record both individual adult zebrafish and groups. Before use, the system has to be calibrated. The system consists of three modules: Recording, Path Reconstruction, and Data Processing. Step-by-step protocols for calibration and for using the three modules are presented. Depending on the experimental setup, the system can be used for testing neophobia, white aversion, social cohesion, motor impairments, novel object exploration, etc. It is especially promising as a first-step tool for studying the effects of drugs or mutations on basic behavioral patterns. The system provides information about the vertical and horizontal distribution of the zebrafish and about the xyz-components of kinematic parameters (such as locomotion, velocity, acceleration, and turning angle), and it provides the data necessary to calculate parameters for social cohesion when testing shoals.
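One of the kinematic parameters named above, the turning angle, can be computed directly from three consecutive xyz samples of a reconstructed path. This is a generic sketch of that computation, not the system's own code.

```python
import math

def turning_angle(p0, p1, p2):
    """Angle (degrees) between successive displacement vectors of a 3D track.

    p0, p1, p2 are consecutive (x, y, z) positions; 0 deg means the fish
    continued straight, 90 deg a right-angle turn.
    """
    u = [b - a for a, b in zip(p0, p1)]   # displacement p0 -> p1
    v = [b - a for a, b in zip(p1, p2)]   # displacement p1 -> p2
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    # clamp to [-1, 1] to guard against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))
```

Applied frame by frame over the reconstructed path, this yields the turning-angle time series alongside velocity and acceleration.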
Behavior, Issue 82, neuroscience, Zebrafish, Danio rerio, anxiety, Shoaling, Pharmacology, 3D-tracking, MK801
Tomato Analyzer: A Useful Software Application to Collect Accurate and Detailed Morphological and Colorimetric Data from Two-dimensional Objects
Authors: Gustavo R. Rodríguez, Jennifer B. Moyseenko, Matthew D. Robbins, Nancy Huarachi Morejón, David M. Francis, Esther van der Knaap.
Institutions: The Ohio State University.
Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner1,2. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, which was designed to collect color measurements from scanned images and allows scanning devices to be calibrated using color standards3. TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported as individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or output that displays the attribute values for each object on the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as performing in-depth analyses of the effect of key fruit shape genes on plant morphology. Also, TA can be used to objectively classify fruit into various shape categories.
Lastly, fruit shape and color traits in other plant species as well as other plant organs such as leaves and seeds can be evaluated with TA.
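The averaged batch output described above amounts to a simple aggregation: per-object attribute values grouped by image and reduced to a mean and standard deviation. A minimal sketch of that aggregation follows; the tuple layout and attribute name are illustrative, not TA's actual export schema.

```python
from collections import defaultdict
from statistics import mean, stdev

def summarize_by_image(rows):
    """Aggregate per-object attribute values into per-image mean and
    standard deviation, mirroring the averaged batch-export option.
    `rows` is a list of (image_name, attribute, value) tuples; these
    field names are hypothetical, not TA's export format."""
    groups = defaultdict(list)
    for image, attribute, value in rows:
        groups[(image, attribute)].append(value)
    return {
        key: (mean(vals), stdev(vals) if len(vals) > 1 else 0.0)
        for key, vals in groups.items()
    }

# Two fruit objects detected in img01, one in img02 (values invented)
rows = [
    ("img01.jpg", "fruit_shape_index", 1.10),
    ("img01.jpg", "fruit_shape_index", 1.30),
    ("img02.jpg", "fruit_shape_index", 0.95),
]
summary = summarize_by_image(rows)
```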
Plant Biology, Issue 37, morphology, color, image processing, quantitative trait loci, software
A Mouse Model of in Utero Transplantation
Authors: Amar Nijagal, Tom Le, Marta Wegorzewska, Tippi C. MacKenzie.
Institutions: University of California, University of California, University of California.
The transplantation of stem cells and viruses in utero has tremendous potential for treating congenital disorders in the human fetus. For example, in utero transplantation (IUT) of hematopoietic stem cells has been used to successfully treat patients with severe combined immunodeficiency.1,2 In several other conditions, however, IUT has been attempted without success.3 Given these mixed results, the availability of an efficient non-human model to study the biological sequelae of stem cell transplantation and gene therapy is critical to advance this field. We and others have used the mouse model of IUT to study factors affecting successful engraftment of in utero transplanted hematopoietic stem cells in both wild-type mice4-7 and those with genetic diseases.8,9 The fetal environment also offers considerable advantages for the success of in utero gene therapy. For example, the delivery of adenoviral10, adeno-associated viral10, retroviral11, and lentiviral vectors12,13 into the fetus has resulted in the transduction of multiple organs distant from the site of injection with long-term gene expression. In utero gene therapy may therefore be considered as a possible treatment strategy for single gene disorders such as muscular dystrophy or cystic fibrosis. Another potential advantage of IUT is the ability to induce immune tolerance to a specific antigen. As seen in mice with hemophilia, the introduction of Factor IX early in development results in tolerance to this protein.14 In addition to its use in investigating potential human therapies, the mouse model of IUT can be a powerful tool to study basic questions in developmental and stem cell biology. For example, one can deliver various small molecules to induce or inhibit specific gene expression at defined gestational stages and manipulate developmental pathways. The impact of these alterations can be assessed at various timepoints after the initial transplantation.
Furthermore, one can transplant pluripotent or lineage specific progenitor cells into the fetal environment to study stem cell differentiation in a non-irradiated and unperturbed host environment. The mouse model of IUT has already provided numerous insights within the fields of immunology, and developmental and stem cell biology. In this video-based protocol, we describe a step-by-step approach to performing IUT in mouse fetuses and outline the critical steps and potential pitfalls of this technique.
Medicine, Issue 47, development, stem cells, transplantation, in utero
Oral Biofilm Analysis of Palatal Expanders by Fluorescence In-Situ Hybridization and Confocal Laser Scanning Microscopy
Authors: Barbara Klug, Claudia Rodler, Martin Koller, Gernot Wimmer, Harald H. Kessler, Martin Grube, Elisabeth Santigli.
Institutions: Medical University of Graz, Medical University of Graz, Medical University of Graz, Karl-Franzens-University Graz.
Confocal laser scanning microscopy (CLSM) of natural heterogeneous biofilm is today facilitated by a comprehensive range of staining techniques, one of them being fluorescence in situ hybridization (FISH).1,2 We performed a pilot study in which oral biofilm samples collected from fixed orthodontic appliances (palatal expanders) were stained by FISH, the objective being to assess the three-dimensional organization of natural biofilm and plaque accumulation.3,4 FISH creates an opportunity to stain cells in their native biofilm environment by the use of fluorescently labeled 16S rRNA-targeting probes.4-7,19 Compared to alternative techniques like immunofluorescent labeling, this is an inexpensive, precise and straightforward labeling technique to investigate different bacterial groups in mixed biofilm consortia.18,20 General probes were used that bind to Eubacteria (EUB338 + EUB338II + EUB338III; hereafter EUBmix),8-10 Firmicutes (LGC354 A-C; hereafter LGCmix),9,10 and Bacteroidetes (Bac303).11 In addition, specific probes binding to Streptococcus mutans (MUT590)12,13 and Porphyromonas gingivalis (POGI)13,14 were used. The extreme hardness of the surface materials involved (stainless steel and acrylic resin) compelled us to find new ways of preparing the biofilm. As these surface materials could not be readily cut with a cryotome, various sampling methods were explored to obtain intact oral biofilm. The most workable of these approaches is presented in this communication. Small flakes of the biofilm-carrying acrylic resin were scraped off with a sterile scalpel, taking care not to damage the biofilm structure. Forceps were used to collect biofilm from the steel surfaces. Once collected, the samples were fixed and placed directly on polysine coated glass slides. FISH was performed directly on these slides with the probes mentioned above. 
Various FISH protocols were combined and modified to create a new protocol that was easy to handle.5,10,14,15 Subsequently, the samples were analyzed by confocal laser scanning microscopy. Well-known configurations3,4,16,17 could be visualized, including mushroom-style formations and clusters of coccoid bacteria pervaded by channels. In addition, the bacterial composition of these typical biofilm structures was analyzed and 2D and 3D images were created.
Medicine, Issue 56, fluorescence in situ hybridization, FISH, confocal laser scanning microscopy, CLSM, orthodontic appliances, oral biofilm
Simultaneous Synthesis of Single-walled Carbon Nanotubes and Graphene in a Magnetically-enhanced Arc Plasma
Authors: Jian Li, Alexey Shashurin, Madhusudhan Kundrapu, Michael Keidar.
Institutions: The George Washington University.
Carbon nanostructures such as single-walled carbon nanotubes (SWCNT) and graphene have attracted great interest due to their promising applications in molecular sensors, field-effect transistors, and ultrathin, flexible electronic devices1-4. Anodic arc discharge supported by the erosion of the anode material is one of the most practical and efficient methods, as it provides specific non-equilibrium processes and a high influx of carbon material to the developing structures at relatively high temperature; consequently, the as-synthesized products have few structural defects and better crystallinity. To further improve the controllability and flexibility of carbon nanostructure synthesis in arc discharge, magnetic fields can be applied during the synthesis process, exploiting the strong magnetic response of arc plasmas. It has been demonstrated that the magnetically-enhanced arc discharge can increase the average length of SWCNTs5, narrow the diameter distribution of metallic catalyst particles and carbon nanotubes6, and change the ratio of metallic to semiconducting carbon nanotubes7, as well as lead to graphene synthesis8. Furthermore, it is worth remarking that introducing a non-uniform magnetic field with a component normal to the arc current allows the Lorentz force along the J×B direction to generate a plasma jet, effectively delivering carbon ions and heat flux to the samples. As a result, large-scale graphene flakes and high-purity single-walled carbon nanotubes were simultaneously generated by this new magnetically-enhanced anodic arc method. Arc imaging, scanning electron microscopy (SEM), transmission electron microscopy (TEM) and Raman spectroscopy were employed to characterize the carbon nanostructures.
These findings indicate a wide spectrum of opportunities to manipulate the properties of nanostructures produced in plasmas by controlling the arc conditions.
Bioengineering, Issue 60, Arc discharge, magnetic control, single-walled carbon nanotubes, graphene
Fruit Volatile Analysis Using an Electronic Nose
Authors: Simona Vallone, Nathan W. Lloyd, Susan E. Ebeler, Florence Zakharov.
Institutions: University of California, Davis, University of California, Davis, University of California, Davis.
Numerous and diverse physiological changes occur during fruit ripening, including the development of a specific volatile blend that characterizes fruit aroma. Maturity at harvest is one of the key factors influencing the flavor quality of fruits and vegetables1. The validation of robust methods that rapidly assess fruit maturity and aroma quality would allow improved management of advanced breeding programs, production practices and postharvest handling. Over the last three decades, much research has been conducted to develop so-called electronic noses, which are devices able to rapidly detect odors and flavors2-4. Currently there are several commercially available electronic noses able to perform volatile analysis, based on different technologies. The electronic nose used in our work (zNose, EST, Newbury Park, CA, USA), consists of ultra-fast gas chromatography coupled with a surface acoustic wave sensor (UFGC-SAW). This technology has already been tested for its ability to monitor quality of various commodities, including detection of deterioration in apple5; ripeness and rot evaluation in mango6; aroma profiling of thymus species7; C6 volatile compounds in grape berries8; characterization of vegetable oil9 and detection of adulterants in virgin coconut oil10. This system can perform the three major steps of aroma analysis: headspace sampling, separation of volatile compounds, and detection. In about one minute, the output, a chromatogram, is produced and, after a purging cycle, the instrument is ready for further analysis. The results obtained with the zNose can be compared to those of other gas-chromatographic systems by calculation of Kovats Indices (KI). Once the instrument has been tuned with an alkane standard solution, the retention times are automatically converted into KIs. However, slight changes in temperature and flow rate are expected to occur over time, causing retention times to drift. 
Also, depending on the polarity of the column stationary phase, the reproducibility of KI calculations can vary by several index units11. A series of programs and graphical interfaces were therefore developed to compare calculated KIs among samples in a semi-automated fashion. These programs reduce the time required for chromatogram analysis of large data sets and minimize the potential for misinterpretation of the data when chromatograms are not perfectly aligned. We present a method for rapid volatile compound analysis in fruit. Sample preparation, data acquisition and handling procedures are also discussed.
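The retention-time-to-KI conversion described above follows the standard linear (van den Dool and Kratz) interpolation between bracketing n-alkane standards. A minimal sketch, with hypothetical retention times for the alkane calibration:

```python
import bisect

def kovats_index(t_x, alkane_rts):
    """Linear (van den Dool and Kratz) retention index for
    temperature-programmed GC. `alkane_rts` maps n-alkane carbon
    number to retention time; values here are illustrative."""
    carbons = sorted(alkane_rts)
    rts = [alkane_rts[c] for c in carbons]
    i = bisect.bisect_right(rts, t_x) - 1
    i = max(0, min(i, len(rts) - 2))  # clamp to a bracketing alkane pair
    n, n1 = carbons[i], carbons[i + 1]
    t_n, t_n1 = rts[i], rts[i + 1]
    return 100 * (n + (n1 - n) * (t_x - t_n) / (t_n1 - t_n))

# Hypothetical calibration: C8 at 10.0 s, C9 at 14.0 s, C10 at 18.5 s
alkanes = {8: 10.0, 9: 14.0, 10: 18.5}
ki = kovats_index(12.0, alkanes)  # analyte eluting midway between C8 and C9
```

Because the interpolation is anchored to the alkane standards run on the same instrument, modest drifts in temperature and flow rate shift the analyte and alkane retention times together, which is why KIs transfer between systems better than raw retention times.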
Plant Biology, Issue 61, zNose, volatile profiling, aroma, Kovats Index, electronic nose, gas chromatography, retention time shift
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Measurement of Lifespan in Drosophila melanogaster
Authors: Nancy J. Linford, Ceyda Bilgir, Jennifer Ro, Scott D. Pletcher.
Institutions: University of Michigan , University of Michigan .
Aging is a phenomenon that results in steady physiological deterioration in nearly all organisms in which it has been examined, leading to reduced physical performance and increased risk of disease. Individual aging is manifest at the population level as an increase in age-dependent mortality, which is often measured in the laboratory by observing lifespan in large cohorts of age-matched individuals. Experiments that seek to quantify the extent to which genetic or environmental manipulations impact lifespan in simple model organisms have been remarkably successful for understanding the aspects of aging that are conserved across taxa and for inspiring new strategies for extending lifespan and preventing age-associated disease in mammals. The vinegar fly, Drosophila melanogaster, is an attractive model organism for studying the mechanisms of aging due to its relatively short lifespan, convenient husbandry, and facile genetics. However, demographic measures of aging, including age-specific survival and mortality, are extraordinarily susceptible to even minor variations in experimental design and environment, and the maintenance of strict laboratory practices for the duration of aging experiments is required. These considerations, together with the need to practice careful control of genetic background, are essential for generating robust measurements. Indeed, there are many notable controversies surrounding inference from longevity experiments in yeast, worms, flies and mice that have been traced to environmental or genetic artifacts1-4. In this protocol, we describe a set of procedures that have been optimized over many years of measuring longevity in Drosophila using laboratory vials. 
We also describe the use of the dLife software, which was developed by our laboratory and is available for download. dLife accelerates throughput and promotes good practices by incorporating optimal experimental design, simplifying fly handling and data collection, and standardizing data analysis. We also discuss the many potential pitfalls in the design, collection, and interpretation of lifespan data, and we provide steps to avoid these dangers.
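The demographic measures mentioned above, age-specific survival and mortality, can be computed from the death counts scored at each interval. A minimal sketch follows, using the common hazard estimator μt = -ln(Nt+1/Nt); this is our choice for illustration, not necessarily the estimator used internally by dLife.

```python
import math

def cohort_demography(deaths, n0):
    """Compute age-specific survivorship and mortality rate from counts
    of deaths scored at each interval in a cohort of initial size n0.
    Returns (survival, mortality); mortality uses mu_t = -ln(N_{t+1}/N_t),
    a standard estimator chosen here for illustration."""
    alive = [n0]
    for d in deaths:
        alive.append(alive[-1] - d)
    survival = [n / n0 for n in alive]
    mortality = [
        -math.log(alive[t + 1] / alive[t]) if alive[t + 1] > 0 else float("inf")
        for t in range(len(deaths))
    ]
    return survival, mortality

# Hypothetical cohort of 100 flies with deaths scored at three intervals
surv, mort = cohort_demography([10, 20, 30], 100)
```

Note how the mortality rate rises across intervals even though the death counts alone would not make that obvious: the same number of deaths in a smaller surviving cohort corresponds to a higher age-specific mortality, which is the population-level signature of aging described above.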
Developmental Biology, Issue 71, Cellular Biology, Molecular Biology, Anatomy, Physiology, Entomology, longevity, lifespan, aging, Drosophila melanogaster, fruit fly, Drosophila, mortality, animal model
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. 
We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
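The core SSM steps described above — logarithmic conversion, removal of subject and group mean effects, and PCA of the resulting residual profiles — can be sketched on synthetic data. This illustrates the algorithm's structure only, not the authors' in-house software:

```python
import numpy as np

def ssm_patterns(data, n_patterns=2):
    """Minimal SSM sketch: log conversion, double mean centering to
    remove global scalar effects, then PCA via SVD. Rows are subjects,
    columns are voxels. Returns the leading covariance patterns (GIS)
    and the scalar subject scores expressing each pattern."""
    logged = np.log(data)
    centered = logged - logged.mean(axis=1, keepdims=True)  # remove subject means
    srp = centered - centered.mean(axis=0, keepdims=True)   # remove group mean profile
    u, s, vt = np.linalg.svd(srp, full_matrices=False)
    patterns = vt[:n_patterns]        # orthogonal voxel-space patterns
    scores = srp @ patterns.T         # one scalar score per subject per pattern
    return patterns, scores

rng = np.random.default_rng(0)
data = rng.uniform(1.0, 2.0, size=(8, 50))  # 8 subjects x 50 voxels, synthetic
patterns, scores = ssm_patterns(data)
```

In the full method, subject scores for patient and control groups would then feed a logistic regression to combine components into a single disease-related pattern, and prospective subjects would be scored by the same projection against the fixed reference sample.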
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
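The first step above, estimating the local orientation of tissue patterns with a bank of Gabor filters, can be sketched as follows. The kernel parameters and the synthetic test patch are illustrative choices, not those of the authors' pipeline:

```python
import numpy as np

def gabor_kernel(theta, ksize=21, sigma=4.0, wavelength=8.0):
    """Real Gabor kernel oriented at angle theta (radians).
    Parameter values are illustrative."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)  # coordinate along theta
    g = (np.exp(-(x**2 + y**2) / (2 * sigma**2))
         * np.cos(2 * np.pi * xr / wavelength))
    return g - g.mean()  # zero mean: flat regions give no response

def dominant_orientation(patch, n_orient=8):
    """Return the bank angle with the largest absolute response on an
    image patch -- the basic operation behind an orientation field."""
    thetas = np.linspace(0, np.pi, n_orient, endpoint=False)
    responses = [abs(np.sum(patch * gabor_kernel(t, ksize=patch.shape[0])))
                 for t in thetas]
    return thetas[int(np.argmax(responses))]

# Synthetic patch: a sinusoidal grating oriented at 45 degrees
half = 10
y, x = np.mgrid[-half:half + 1, -half:half + 1]
t0 = np.pi / 4
patch = np.cos(2 * np.pi * (x * np.cos(t0) + y * np.sin(t0)) / 8.0)
theta_hat = dominant_orientation(patch)
```

A full pipeline would compute such an orientation estimate at every pixel and then fit phase portrait models to the resulting orientation field to locate node-like sites of radiating or intersecting patterns.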
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono, University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. To avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation, and more broadly, their allergic potential.
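The spectrophotometric concentration check described above is a direct Beer-Lambert calculation, c = A / (ε · l), using the cited extinction coefficient. A minimal sketch; the 1 cm cuvette path length is an assumption:

```python
def concentration_from_absorbance(a280, epsilon=4200.0, path_cm=1.0):
    """Beer-Lambert estimate of triclosan concentration in mol/L from
    UV absorbance at 280 nm, using the extinction coefficient cited
    above (4,200 L/mol/cm). The 1 cm path length is an assumption."""
    return a280 / (epsilon * path_cm)

# An absorbance reading of 0.42 corresponds to 1e-4 mol/L (100 uM)
c = concentration_from_absorbance(0.42)
```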
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Quantification of Heavy Metals and Other Inorganic Contaminants on the Productivity of Microalgae
Authors: Katerine Napan, Derek Hess, Brian McNeil, Jason C. Quinn.
Institutions: Utah State University.
Increasing demand for renewable fuels has researchers investigating the feasibility of alternative feedstocks, such as microalgae. Inherent advantages include high potential yield, use of non-arable land and integration with waste streams. The nutrient requirements of a large-scale microalgae production system will require the coupling of cultivation systems with industrial waste resources, such as carbon dioxide from flue gas and nutrients from wastewater. Inorganic contaminants present in these wastes can potentially bioaccumulate in microalgal biomass, negatively impacting productivity and limiting end use. This study focuses on the experimental evaluation of the impact and the fate of 14 inorganic contaminants (As, Cd, Co, Cr, Cu, Hg, Mn, Ni, Pb, Sb, Se, Sn, V and Zn) on Nannochloropsis salina growth. Microalgae were cultivated in photobioreactors illuminated at 984 µmol m-2 sec-1 and maintained at pH 7 in a growth medium polluted with inorganic contaminants at levels expected based on the composition found in commercial coal flue gas systems. Contaminants present in the biomass and the medium at the end of a 7 day growth period were analytically quantified through cold vapor atomic absorption spectrometry for Hg and through inductively coupled plasma mass spectrometry for As, Cd, Co, Cr, Cu, Mn, Ni, Pb, Sb, Se, Sn, V and Zn. Results show that N. salina is sensitive to this multi-metal environment, with a statistically significant decrease in biomass yield upon introduction of these contaminants. The techniques presented here are adequate for quantifying algal growth and determining the fate of inorganic contaminants.
Environmental Sciences, Issue 101, algae, heavy metals, Nannochloropsis salina, photobioreactor, flue gas, inductively coupled plasma mass spectrometry, ICPMS, cold vapor atomic absorption spectrometry, CVAAS
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.