The non-human primate is an important translational species for understanding the normal function and disease processes of the human brain. Unbiased stereology, the method accepted as state-of-the-art for quantification of biological objects in tissue sections [2], generates reliable structural data for biological features in the mammalian brain [3]. The key components of the approach are unbiased (systematic-random) sampling of anatomically defined structures (reference spaces), combined with quantification of cell numbers and size, fiber and capillary lengths, surface areas, regional volumes and spatial distributions of biological objects within the reference space [4]. Among the advantages of these stereological approaches over previous methods is the avoidance of all known sources of systematic (non-random) error arising from faulty assumptions and non-verifiable models. This study documents a biological application of computerized stereology to estimate the total neuronal population in the frontal cortex of the vervet monkey brain (Chlorocebus aethiops sabeus), with assistance from two commercially available stereology programs, BioQuant Life Sciences and Stereologer (Figure 1). In addition to contrast and comparison of results from both the BioQuant and Stereologer systems, this study provides a detailed protocol for the Stereologer system.
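As a worked illustration of the fractionator logic underlying such estimates (the sampling fractions and counts below are hypothetical, not taken from this study), the total object number follows from the raw disector counts and the reciprocals of the sampling fractions:

```python
# Sketch of the optical fractionator estimator used in unbiased stereology.
# N = sum(Q-) * (1/ssf) * (1/asf) * (1/tsf); all values here are illustrative.

def fractionator_estimate(q_counts, ssf, asf, tsf):
    """Estimate total object number from systematic-random disector counts.

    q_counts -- objects counted (Q-) in each sampled section
    ssf -- section sampling fraction (e.g. every 10th section -> 0.1)
    asf -- area sampling fraction (counting-frame area / x-y step area)
    tsf -- thickness sampling fraction (disector height / section thickness)
    """
    return sum(q_counts) * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)

# Hypothetical counts from three sampled sections:
estimate = fractionator_estimate([10, 12, 8], ssf=0.1, asf=0.05, tsf=0.5)
```

Multiplying the raw count by the reciprocal sampling fractions is what keeps the estimate free of assumptions about cell size, shape, or orientation.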
Test Samples for Optimizing STORM Super-Resolution Microscopy
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is built up molecule by molecule rather than acquired in a single exposure, users face significant challenges in optimizing their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
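A useful rule of thumb when optimizing acquisition settings is the approximate localization precision of a single-molecule fit (the widely used Thompson-Larson-Webb estimate); the parameter values below are illustrative, not measurements from these test samples:

```python
import math

# Rule-of-thumb localization precision for a single fitted molecule.
# s: PSF standard deviation (nm); a: pixel size (nm);
# b: background noise (photons per pixel); N: detected photons.
def localization_precision(s, a, b, N):
    return math.sqrt((s**2 + a**2 / 12.0) / N
                     + 8.0 * math.pi * s**4 * b**2 / (a**2 * N**2))

# More photons per switching event -> better precision:
bright = localization_precision(s=150.0, a=100.0, b=2.0, N=5000)
dim    = localization_precision(s=150.0, a=100.0, b=2.0, N=500)
```

This is why dye choice and buffer conditions matter so much: they set the photon yield N, which dominates the achievable precision.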
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Viability Assays for Cells in Culture
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
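The linearity claim can be checked with an ordinary least-squares fit of signal against cells plated; the cell numbers and signal values below are hypothetical calibration data, not results from this report:

```python
# Minimal linearity check for a plate-based viability assay:
# fit signal vs. number of cells plated and report slope, intercept, R^2.
def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot if ss_tot else 1.0
    return slope, intercept, r2

cells  = [0, 5000, 10000, 20000, 40000]      # cells plated per well
signal = [120, 1100, 2150, 4180, 8350]       # arbitrary fluorescence units
slope, intercept, r2 = linear_fit(cells, signal)
```

A high R² over the plating range is what licenses the assumption that signal strength tracks cell number; a treatment that changes cell size or ATP content per cell breaks this calibration, as the abstract cautions.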
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
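The quasi-real-time harvesting described above reduces, at its core, to querying a time-stamped event record. The sketch below (with made-up event codes and times, not the toolbox's actual data format) counts head entries per daily feeding period:

```python
# Toy analysis of a time-stamped behavioral event record: count head entries
# (IR-beam interruptions) within each feeding period. Codes are hypothetical.
def entries_per_period(events, periods, entry_code="head_entry"):
    """events: list of (time_in_hours, code); periods: list of (start, end)."""
    counts = []
    for start, end in periods:
        counts.append(sum(1 for t, code in events
                          if code == entry_code and start <= t < end))
    return counts

events = [(9.1, "head_entry"), (9.4, "pellet"), (9.5, "head_entry"),
          (20.2, "head_entry"), (20.9, "head_entry"), (23.5, "head_entry")]
feeding_periods = [(9.0, 10.0), (20.0, 21.0)]
counts = entries_per_period(events, feeding_periods)
```

Keeping raw events and derived counts in one structure, as the abstract describes, means every published statistic can be traced back to the underlying time stamps.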
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Transgenic Rodent Assay for Quantifying Male Germ Cell Mutant Frequency
Institutions: Environmental Health Centre.
De novo mutations arise mostly in the male germline and may contribute to adverse health outcomes in subsequent generations. Traditional methods for assessing the induction of germ cell mutations require the use of large numbers of animals, making them impractical. As such, germ cell mutagenicity is rarely assessed during chemical testing and risk assessment. Herein, we describe an in vivo male germ cell mutation assay using a transgenic rodent model that is based on a recently approved Organisation for Economic Co-operation and Development (OECD) test guideline. This method uses an in vitro positive selection assay to measure in vivo mutations induced in a transgenic λgt10 vector bearing a reporter gene directly in the germ cells of exposed males. We further describe how the detection of mutations in the transgene recovered from germ cells can be used to characterize the stage-specific sensitivity of the various spermatogenic cell types to mutagen exposure by controlling three experimental parameters: the duration of exposure (administration time), the time between exposure and sample collection (sampling time), and the cell population collected for analysis. Because a large number of germ cells can be assayed from a single male, this method has superior sensitivity compared with traditional methods, requires fewer animals, and therefore demands far less time and fewer resources.
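However the mutant plaques are selected, the summary statistic of such an assay is a simple ratio of mutants to total plaque-forming units screened; the counts below are hypothetical, not data from this protocol:

```python
# Mutant frequency in a transgenic reporter-gene assay: mutant plaques over
# total plaque-forming units (pfu) screened for one animal. Values are made up.
def mutant_frequency(mutant_plaques, total_pfu):
    if total_pfu <= 0:
        raise ValueError("total pfu screened must be positive")
    return mutant_plaques / total_pfu

mf = mutant_frequency(mutant_plaques=12, total_pfu=250000)
```

The assay's sensitivity advantage follows directly from the denominator: screening hundreds of thousands of pfu per male tightens the frequency estimate without adding animals.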
Genetics, Issue 90, sperm, spermatogonia, male germ cells, spermatogenesis, de novo mutation, OECD TG 488, transgenic rodent mutation assay, N-ethyl-N-nitrosourea, genetic toxicology
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as independent of community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly requires understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplication of soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, together with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will produce assembly outcomes different from those of heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded stained samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
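As a minimal sketch of the semi-automated route (a global threshold followed by connected-component labeling; real volumes would of course be processed with dedicated segmentation software), consider:

```python
# Threshold a small grayscale 2D slice and label its connected components
# (4-connectivity, iterative flood fill). Purely illustrative toy data.
def label_components(image, threshold):
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    n_features = 0
    for i in range(h):
        for j in range(w):
            if image[i][j] >= threshold and labels[i][j] == 0:
                n_features += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < h and 0 <= x < w
                            and image[y][x] >= threshold and labels[y][x] == 0):
                        labels[y][x] = n_features
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return labels, n_features

slice_ = [[0, 9, 0, 0],
          [0, 9, 0, 8],
          [0, 0, 0, 8]]
labels, n_features = label_components(slice_, threshold=5)
```

Whether such a simple intensity threshold suffices depends exactly on the data set characteristics listed above: it works for crisp, high-contrast, sparsely crowded features and fails for noisy or heterogeneous ones, which is where the triage scheme directs users toward manual or custom approaches.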
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, such as structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
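One elementary channel-level preprocessing step of the kind referred to above is re-referencing each sample to the common average of all channels, a standard preparation before distributed source estimation; the microvolt values below are made up:

```python
# Re-reference multichannel EEG samples to the common average:
# subtract the across-channel mean from every channel at each time point.
def average_reference(samples):
    """samples: list of per-timepoint lists of channel voltages (microvolts)."""
    referenced = []
    for frame in samples:
        mean = sum(frame) / len(frame)
        referenced.append([v - mean for v in frame])
    return referenced

eeg = [[10.0, 0.0, -4.0],   # timepoint 1, three channels
       [3.0, 3.0, 3.0]]     # timepoint 2
reref = average_reference(eeg)
```

After re-referencing, each time point sums to zero across channels, which matches the reference-free assumption made by many source-analysis formulations.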
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Soil Sampling and Isolation of Entomopathogenic Nematodes (Steinernematidae, Heterorhabditidae)
Institutions: University of Arizona.
Entomopathogenic nematodes (EPN) are a group of soil-inhabiting nematodes that parasitize a wide range of insects. These nematodes belong to two families: Steinernematidae and Heterorhabditidae. To date, more than 70 species have been described in the Steinernematidae and about 20 in the Heterorhabditidae. The nematodes have a mutualistic partnership with Enterobacteriaceae bacteria, and together they act as a potent insecticidal complex that kills a wide range of insect species.
Herein, we focus on the most common techniques considered for collecting EPN from soil. The second part of this presentation focuses on the insect-baiting technique, a widely used approach for the isolation of EPN from soil samples, and the modified White trap technique which is used for the recovery of these nematodes from infected insects. These methods and techniques are key steps for the successful establishment of EPN cultures in the laboratory and also form the basis for other bioassays that consider these nematodes as model organisms for research in other biological disciplines. The techniques shown in this presentation correspond to those performed and/or designed by members of S. P. Stock laboratory as well as those described by various authors.
Environmental Sciences, Issue 89, Entomology, Nematology, Steinernema, Heterorhabditis, nematodes, soil sampling, insect-bait, modified White-trap
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
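The flux calculation can be sketched as follows, assuming an ideal-gas conversion from a fitted headspace concentration slope (ppm per minute, e.g. from a linear fit of the gas chromatography time series) to an areal mass flux; the chamber dimensions and conditions below are illustrative, not from this protocol:

```python
# Convert a chamber headspace concentration slope into an areal flux.
# Uses the ideal gas law to turn ppm (micro-mole fraction) into mass.
R = 8.314  # J mol^-1 K^-1

def chamber_flux(slope_ppm_per_min, volume_m3, area_m2,
                 molar_mass_g, temp_k=298.15, pressure_pa=101325.0):
    """Return flux in mg m^-2 hr^-1."""
    mol_per_m3_per_ppm = 1e-6 * pressure_pa / (R * temp_k)
    g_per_m3_per_ppm = mol_per_m3_per_ppm * molar_mass_g
    flux_g_per_min = (slope_ppm_per_min * g_per_m3_per_ppm
                      * volume_m3 / area_m2)
    return flux_g_per_min * 60.0 * 1000.0  # g/min -> mg/hr

# N2O (44 g/mol) rising 0.02 ppm/min in a 0.01 m^3 chamber over 0.09 m^2:
flux = chamber_flux(0.02, volume_m3=0.01, area_m2=0.09, molar_mass_g=44.0)
```

This is also why a comprehensive experimental design matters: the fitted slope, not any single concentration reading, carries the flux, so the number and timing of headspace samples set the measurement's reliability.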
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients), and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a broad range of scientific disciplines, comprising both well-established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum of information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. It gives easy-to-follow guidelines to conduct comprehensive surveys and discusses critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
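At its core, a single-bin counting-experiment limit is a Poisson calculation. The sketch below scans for the 95% CL upper limit on the signal yield given an observed count and an expected background; real LHC analyses use the CLs method with systematic uncertainties, which this toy deliberately omits:

```python
import math

# Toy 95% CL upper limit for a single-bin counting experiment: the smallest
# signal yield s such that P(N <= n_obs | s + b) drops to 1 - CL.
def poisson_cdf(n, mu):
    return sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(n + 1))

def upper_limit(n_obs, background, cl=0.95, step=0.001):
    s = 0.0
    while poisson_cdf(n_obs, s + background) > 1.0 - cl:
        s += step
    return s

limit = upper_limit(n_obs=0, background=0.0)  # classic zero-observed case
```

Combined with the acceptance and efficiency tables mentioned above, such an event-count limit is what gets translated into a cross-section limit, and from there into excluded regions of a model's parameter space.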
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Peptide-based Identification of Functional Motifs and their Binding Partners
Institutions: Morehouse School of Medicine, Institute for Systems Biology, Universiti Sains Malaysia.
Specific short peptides derived from motifs found in full-length proteins, in our case HIV-1 Nef, not only retain their biological function, but can also competitively inhibit the function of the full-length protein. A set of 20 Nef scanning peptides, 20 amino acids in length with each overlapping 10 amino acids of its neighbor, were used to identify motifs in Nef responsible for its induction of apoptosis. Peptides containing these apoptotic motifs induced apoptosis at levels comparable to the full-length Nef protein. A second peptide, derived from the Secretion Modification Region (SMR) of Nef, retained the ability to interact with cellular proteins involved in Nef's secretion in exosomes (exNef). This SMRwt peptide was used as the "bait" protein in co-immunoprecipitation experiments to isolate cellular proteins that bind specifically to Nef's SMR motif. Protein transfection and antibody inhibition were used to physically disrupt the interaction between Nef and mortalin, one of the isolated SMR-binding proteins, and the effect was measured with a fluorescence-based exNef secretion assay. The SMRwt peptide's ability to outcompete full-length Nef for cellular proteins that bind the SMR motif makes it the first inhibitor of exNef secretion. Thus, by employing the techniques described here, which utilize the unique properties of specific short peptides derived from motifs found in full-length proteins, one may accelerate the identification of functional motifs in proteins and the development of peptide-based inhibitors of pathogenic functions.
Virology, Issue 76, Biochemistry, Immunology, Infection, Infectious Diseases, Molecular Biology, Medicine, Genetics, Microbiology, Genomics, Proteins, Exosomes, HIV, Peptides, Exocytosis, protein trafficking, secretion, HIV-1, Nef, Secretion Modification Region, SMR, peptide, AIDS, assay
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
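A single real-valued Gabor kernel of the kind used in the first analysis step can be built directly from its standard definition; the kernel size and parameter values below are illustrative, not those tuned for mammographic texture:

```python
import math

# Build one real Gabor filter kernel: a Gaussian envelope modulating a
# cosine carrier, rotated to orientation theta (radians).
def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xp**2 + gamma**2 * yp**2)
                                / (2.0 * sigma**2))
            row.append(envelope * math.cos(2.0 * math.pi * xp / wavelength))
        kernel.append(row)
    return kernel

k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
```

A bank of such kernels at many orientations yields, per pixel, the dominant response angle; those orientation fields are what the phase portrait analysis then searches for node-like radiating patterns.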
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Bioassays for Monitoring Insecticide Resistance
Institutions: University of Missouri, Delta Research Center, Louisiana State University Agricultural Center.
Pest resistance to pesticides is an increasing problem because pesticides are an integral part of high-yielding production agriculture. When few products are labeled for an individual pest within a particular crop system, chemical control options are limited. Therefore, the same product(s) are used repeatedly and continual selection pressure is placed on the target pest. There are both financial and environmental costs associated with the development of resistant populations. The cost of pesticide resistance has been estimated at approximately $1.5 billion annually in the United States. This paper describes protocols currently used to monitor arthropod (specifically insect) populations for the development of resistance. The adult vial test is used to measure the toxicity of contact insecticides, and a modification of this test is used for plant-systemic insecticides. In these bioassays, insects are exposed to technical-grade insecticide and responses (mortality) are recorded at a specific post-exposure interval. The mortality data are subjected to log-dose probit analysis to generate estimates of the lethal concentration that produces mortality in 50% (LC50) of the target population, along with a series of confidence limits (CLs) as estimates of data variability. When these data are collected for a range of insecticide-susceptible populations, the LC50 can be used as baseline data for future monitoring purposes. After populations have been exposed to products, the results can be compared to a previously determined LC50 using the same methodology.
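As an illustration of how an LC50 is recovered from dose-mortality data, the sketch below regresses logit-transformed mortality on log dose and solves for the 50% point; a logit regression stands in here for the probit analysis performed in dedicated software, and the doses and mortalities are invented:

```python
import math

# Fit logit(mortality) ~ log10(dose) by least squares and return the dose
# at which predicted mortality is 50% (the LC50). Toy data only.
def fit_lc50(doses, mortality):
    x = [math.log10(d) for d in doses]
    y = [math.log(p / (1.0 - p)) for p in mortality]   # logit transform
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    # logit = 0 (i.e. 50% mortality) at log10(LC50) = -intercept / slope
    return 10 ** (-intercept / slope)

doses     = [0.1, 0.3, 1.0, 3.0, 10.0]        # concentration units
mortality = [0.05, 0.20, 0.50, 0.80, 0.95]    # proportion dead
lc50 = fit_lc50(doses, mortality)
```

Resistance monitoring then compares a field population's LC50 against the susceptible baseline; a resistance ratio well above 1 flags a shifted dose-response curve.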
Microbiology, Issue 46, Resistance monitoring, Insecticide Resistance, Pesticide Resistance, glass-vial bioassay
One Dimensional Turing-Like Handshake Test for Motor Intelligence
Institutions: Ben-Gurion University.
In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced-choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it the "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG: (i) by calculating the proportion of subjects' answers that the model is more human-like than the human; (ii) by comparing two weighted sums of human and model handshakes, we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) by comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake.
Neuroscience, Issue 46, Turing test, Human Machine Interface, Haptics, Teleoperation, Motor Control, Motor Behavior, Diagnostics, Perception, handshake, telepresence
Aseptic Laboratory Techniques: Plating Methods
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories
(BMBL) as well as Material Safety Data Sheets
(MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection
(ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
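The pour- and spread-plating enumeration above reduces to scaling a colony count by the dilution and the volume plated; a minimal sketch (Python, with hypothetical counts):

```python
def cfu_per_ml(colonies: int, dilution_factor: float, volume_plated_ml: float) -> float:
    """Estimate viable cell density from a countable plate (typically 30-300 colonies).

    dilution_factor: total dilution of the plated sample, e.g. 1e-6
    volume_plated_ml: volume poured or spread, e.g. 0.1 mL
    """
    return colonies / (dilution_factor * volume_plated_ml)

# Example: 150 colonies on a 10^-6 dilution plate, 0.1 mL spread
print(cfu_per_ml(150, 1e-6, 0.1))  # ≈1.5e9 CFU/mL
```

Counts outside the countable range (confluent lawns or very few colonies) should be discarded rather than extrapolated.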
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Long-term Lethal Toxicity Test with the Crustacean Artemia franciscana
Institutions: Institute for Environmental Protection and Research, Regional Agency for Environmental Protection in Emilia-Romagna.
Our research activities target the use of biological methods for the evaluation of environmental quality, with particular reference to saltwater/brackish water and sediment. The choice of biological indicators must be based on reliable scientific knowledge and, where possible, on the availability of standardized procedures. In this article, we present a standardized protocol that uses the marine crustacean Artemia to evaluate the toxicity of chemicals and/or of marine environmental matrices. Scientists propose the brine shrimp (Artemia) as a suitable candidate for the development of a standard bioassay for worldwide use. A number of papers have been published on the toxic effects of various chemicals and toxicants on brine shrimp (Artemia). The major advantage of this crustacean for toxicity studies is the ready availability of dry cysts, which can be used immediately in testing without laborious cultivation1,2. Cyst-based toxicity assays are cheap, continuously available, simple and reliable, and are thus an important answer to the routine needs of toxicity screening, industrial monitoring requirements or regulatory purposes3. The proposed method uses mortality as the endpoint: the number of survivors is counted and the percentage of deaths is calculated. Larvae are considered dead if they do not exhibit any internal or external movement during several seconds of observation4. This procedure was standardized by testing a reference substance (sodium dodecyl sulfate); some results are reported in this work. This article accompanies a video that describes the performance of the toxicity testing procedure, showing all the steps of the protocol.
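The mortality endpoint described above is a simple proportion per test vessel; a minimal sketch (Python, with hypothetical counts, not data from this study):

```python
def percent_mortality(initial: int, survivors: int) -> float:
    """Percentage of dead larvae in one test vessel."""
    return 100.0 * (initial - survivors) / initial

# Hypothetical example: 3 replicates of 10 larvae each at one SDS concentration
replicates = [(10, 7), (10, 6), (10, 8)]  # (initial, survivors)
mean_mortality = sum(percent_mortality(n, s) for n, s in replicates) / len(replicates)
print(mean_mortality)  # 30.0
```

In practice, mean mortality at each test concentration is compared against the negative control before a substance is scored as toxic.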
Chemistry, Issue 62, Artemia franciscana, bioassays, chemical substances, crustaceans, marine environment
Sampling Human Indigenous Saliva Peptidome Using a Lollipop-Like Ultrafiltration Probe: Simplify and Enhance Peptide Detection for Clinical Mass Spectrometry
Institutions: Sanford-Burnham Medical Research Institute, University of California, San Diego , VA San Diego Healthcare Center, University of California, San Diego .
Although the human saliva proteome and peptidome have been revealed1-2, they were mostly identified from tryptic digests of saliva proteins. Identification of the indigenous peptidome of human saliva without prior digestion with exogenous enzymes has become imperative, since native peptides in human saliva provide potential value for diagnosing disease, predicting disease progression, and monitoring therapeutic efficacy. Appropriate sampling is a critical step for enhancing identification of the human indigenous saliva peptidome. Traditional methods of sampling human saliva involve centrifugation to remove debris3-4, which may be too time-consuming to be applicable for clinical use. Furthermore, debris removal by centrifugation may be unable to clear most of the infecting pathogens or remove the high-abundance proteins that often hinder the identification of the low-abundance peptidome.
Conventional proteomic approaches that primarily utilize two-dimensional gel electrophoresis (2-DE) gels in conjunction with in-gel digestion are capable of identifying many saliva proteins5-6. However, this approach is generally not sufficiently sensitive to detect low-abundance peptides/proteins. Liquid chromatography-mass spectrometry (LC-MS) based proteomics is an alternative that can identify proteins without prior 2-DE separation. Although this approach provides higher sensitivity, it generally needs prior sample pre-fractionation7 and pre-digestion with trypsin, which makes it difficult for clinical use.
To circumvent the hindrance in mass spectrometry due to sample preparation, we have developed a technique called capillary ultrafiltration (CUF) probes8-11. Data from our laboratory demonstrated that CUF probes are capable of capturing proteins in vivo from various microenvironments in animals in a dynamic and minimally invasive manner8-11. No centrifugation is needed, since negative pressure is created simply by syringe withdrawal during sample collection. CUF probes combined with LC-MS have successfully identified tryptic-digested proteins8-11. In this study, we upgraded the ultrafiltration sampling technique by creating a lollipop-like ultrafiltration (LLUF) probe that easily fits in the human oral cavity. Direct analysis by LC-MS without trypsin digestion showed that human saliva indigenously contains many peptide fragments derived from various proteins. Sampling saliva with LLUF probes avoided centrifugation yet effectively removed many larger, high-abundance proteins. Our mass spectrometric results illustrated that many low-abundance peptides became detectable after filtering out larger proteins with the LLUF probes. Detection of low-abundance saliva peptides did not depend on multiple-step sample separation by chromatography. For clinical application, LLUF probes incorporated with LC-MS could potentially be used in the future to monitor disease progression from saliva.
Medicine, Issue 66, Molecular Biology, Genetics, Sampling, Saliva, Peptidome, Ultrafiltration, Mass spectrometry
A Low Mortality Rat Model to Assess Delayed Cerebral Vasospasm After Experimental Subarachnoid Hemorrhage
Institutions: SUNY Upstate Medical University, SUNY Upstate Medical University.
Objective: To characterize and establish a reproducible model that demonstrates delayed cerebral vasospasm after aneurysmal subarachnoid hemorrhage (SAH) in rats, in order to identify the initiating events, pathophysiological changes and potential targets for treatment.
Methods: Twenty-eight male Sprague-Dawley rats (250 - 300 g) were arbitrarily assigned to one of two groups - SAH or saline control. Rat subarachnoid hemorrhage in the SAH group (n=15) was induced by double injection of autologous blood, 48 hr apart, into the cisterna magna. Similarly, normal saline (n=13) was injected into the cisterna magna of the saline control group. Rats were sacrificed on day five after the second blood injection and the brains were preserved for histological analysis. The degree of vasospasm was measured using sections of the basilar artery, by measuring the internal luminal cross sectional area using NIH Image-J software. The significance was tested using Tukey/Kramer's statistical analysis.
Results: After analysis of histological sections, basilar artery luminal cross-sectional areas were smaller in the SAH group than in the saline group, consistent with cerebral vasospasm in the former. In the SAH group, the basilar artery internal area (0.056 μm ± 3) was significantly smaller five days after the second blood injection (seven days after the initial blood injection) than in the saline control group (internal area 0.069 ± 3; p=0.004). There were no mortalities from cerebral vasospasm.
Conclusion: The rat double-SAH model induces a mild, survivable basilar artery vasospasm that can be used to study the pathophysiological mechanisms of cerebral vasospasm in a small animal model. A low and acceptable mortality rate is a significant criterion to be satisfied for an ideal SAH animal model, so that the mechanisms of vasospasm can be elucidated7,8. Further modifications of the model can be made to adjust for increased severity of vasospasm and neurological exams.
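The degree of vasospasm in this model comes down to comparing mean internal luminal cross-sectional areas between groups; a minimal sketch of that comparison (Python; the areas are illustrative placeholders, not the study's measurements):

```python
def percent_narrowing(mean_area_sah: float, mean_area_control: float) -> float:
    """Percent reduction of basilar artery luminal area relative to controls."""
    return 100.0 * (1.0 - mean_area_sah / mean_area_control)

# Illustrative per-animal areas (arbitrary units), e.g. as exported from ImageJ
sah_areas = [0.055, 0.057, 0.056]
ctl_areas = [0.068, 0.070, 0.069]
mean = lambda xs: sum(xs) / len(xs)
print(round(percent_narrowing(mean(sah_areas), mean(ctl_areas)), 1))  # ~18.8
```

Group differences would then be tested statistically (the study used Tukey/Kramer analysis); the sketch covers only the narrowing metric itself.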
Medicine, Issue 71, Anatomy, Physiology, Neurobiology, Neuroscience, Immunology, Surgery, Aneurysm, cerebral, hemorrhage, model, mortality, rat, rodent, subarachnoid, vasospasm, animal model
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determine the rate of decomposition by microbial communities as well as rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey, which in turn alters soil microbial functioning when the herbivore carcasses are decomposed6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9, because stress from predation risk increases herbivore basal energy demands, which in nutrient-limited systems forces herbivores to shift their consumption from N-rich resources that support growth and reproduction to C-rich carbohydrate resources that support heightened metabolism6. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer-quality resources for the soil microbial pool, likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11.
We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to: induce stress in herbivores from predation risk; measure those stress responses; and measure the consequences for microbial decomposition. We use insights from a model grassland ecosystem comprising a hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
Whole-Body Nanoparticle Aerosol Inhalation Exposures
Institutions: West Virginia University , West Virginia University , National Institute for Occupational Safety and Health.
Inhalation is the most likely exposure route for individuals working with aerosolizable engineered nanomaterials (ENM). To properly perform nanoparticle inhalation toxicology studies, the aerosols in a chamber housing the experimental animals must have: 1) a steady concentration maintained at a desired level for the entire exposure period; 2) a homogenous composition free of contaminants; and 3) a stable size distribution with a geometric mean diameter < 200 nm and a geometric standard deviation (σg) < 2.5⁵. The generation of aerosols containing nanoparticles is quite challenging because nanoparticles easily agglomerate, largely due to very strong inter-particle forces and the formation of large fractal structures tens or hundreds of microns in size6, which are difficult to break up. Several common aerosol generators, including nebulizers, fluidized beds, Venturi aspirators and the Wright dust feed, were tested; however, none were able to produce nanoparticle aerosols that satisfy all criteria5.
A whole-body nanoparticle aerosol inhalation exposure system was fabricated, validated and utilized for nano-TiO2 inhalation toxicology studies. Its critical components are: 1) a novel nano-TiO2 aerosol generator; 2) a 0.5 m³ whole-body inhalation exposure chamber; and 3) a monitoring and control system. Nano-TiO2 aerosols generated from bulk dry nano-TiO2 powders (primary diameter of 21 nm, bulk density of 3.8 g/cm³) were delivered into the exposure chamber at a flow rate of 90 LPM (10.8 air changes/hr). Particle size distribution and mass concentration profiles were measured continuously with a scanning mobility particle sizer (SMPS) and an electrical low pressure impactor (ELPI). The aerosol mass concentration (C) was verified gravimetrically (mg/m³). The mass (M) of the collected particles was determined as M = Mpost − Mpre, where Mpre and Mpost are the masses of the filter before and after sampling (mg). The mass concentration was calculated as C = M/(Q×t), where Q is the sampling flowrate (m³/min) and t is the sampling time (min). The chamber pressure, temperature, relative humidity (RH), and O2 concentrations were monitored and controlled continuously. Nano-TiO2 aerosols collected on Nuclepore filters were analyzed with a scanning electron microscope (SEM) and energy-dispersive X-ray (EDX) analysis.
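The gravimetric verification above is simple arithmetic; a minimal sketch (Python, with a hypothetical filter sample) that also checks the quoted air-change rate for the 0.5 m³ chamber:

```python
def mass_concentration(m_pre_mg: float, m_post_mg: float,
                       q_m3_per_min: float, t_min: float) -> float:
    """Gravimetric aerosol mass concentration C = M/(Q*t) in mg/m^3."""
    m = m_post_mg - m_pre_mg           # collected mass M (mg)
    return m / (q_m3_per_min * t_min)  # mg/m^3

# Hypothetical filter sample: 2 L/min (0.002 m^3/min) for 240 min, 4.8 mg gained
print(mass_concentration(10.0, 14.8, 0.002, 240.0))  # ≈10 mg/m^3

# Air changes per hour for the 0.5 m^3 chamber at 90 LPM
ach = 90 * 60 / 1000 / 0.5  # LPM -> m^3/hr, divided by chamber volume
print(ach)  # 10.8
```

The second calculation reproduces the 10.8 air changes/hr stated in the text, confirming the flow and volume figures are internally consistent.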
In summary, we report that the nano-particle aerosols generated and delivered to our exposure chamber have: 1) steady mass concentration; 2) homogenous composition free of contaminants; 3) stable particle size distributions with a count-median aerodynamic diameter of 157 nm during aerosol generation. This system reliably and repeatedly creates test atmospheres that simulate occupational, environmental or domestic ENM aerosol exposures.
Medicine, Issue 75, Physiology, Anatomy, Chemistry, Biomedical Engineering, Pharmacology, Titanium dioxide, engineered nanomaterials, nanoparticle, toxicology, inhalation exposure, aerosols, dry powder, animal model
The Rabbit Blood-shunt Model for the Study of Acute and Late Sequelae of Subarachnoid Hemorrhage: Technical Aspects
Institutions: University and Bern University Hospital (Inselspital), Kantonsspital Aarau, Boston Children's Hospital, Boston Children's Hospital, University and Bern University Hospital (Inselspital), University Hospital Cologne, Länggasse Bern.
Early brain injury and delayed cerebral vasospasm both contribute to unfavorable outcomes after subarachnoid hemorrhage (SAH). Reproducible and controllable animal models that simulate both conditions are presently uncommon. Therefore, new models are needed in order to mimic human pathophysiological conditions resulting from SAH.
This report describes the technical nuances of a rabbit blood-shunt SAH model that enables control of intracerebral pressure (ICP). An extracorporeal shunt is placed between the arterial system and the subarachnoid space, which enables examiner-independent SAH in a closed cranium. Step-by-step procedural instructions and necessary equipment are described, as well as technical considerations to produce the model with minimal mortality and morbidity. Important details required for successful surgical creation of this robust, simple and consistent ICP-controlled SAH rabbit model are described.
Medicine, Issue 92, Subarachnoid hemorrhage, animal models, rabbit, extracorporeal blood shunt, early brain injury, delayed cerebral vasospasm, microsurgery
Methods for ECG Evaluation of Indicators of Cardiac Risk, and Susceptibility to Aconitine-induced Arrhythmias in Rats Following Status Epilepticus
Institutions: University of Utah.
Lethal cardiac arrhythmias contribute to mortality in a number of pathological conditions. Several parameters obtained from a non-invasive, easily obtained electrocardiogram (ECG) are established, well-validated prognostic indicators of cardiac risk in patients suffering from a number of cardiomyopathies. Increased heart rate, decreased heart rate variability (HRV), and increased duration and variability of cardiac ventricular electrical activity (QT interval) are all indicative of enhanced cardiac risk1-4. In animal models, it is valuable to compare these ECG-derived variables with susceptibility to experimentally induced arrhythmias. Intravenous infusion of the arrhythmogenic agent aconitine has been widely used to evaluate susceptibility to arrhythmias in a range of experimental conditions, including animal models of depression5, following exercise7 and exposure to air pollutants8, as well as for determination of the antiarrhythmic efficacy of pharmacological agents.
It should be noted that QT dispersion in humans is a measure of QT interval variation across the full set of leads from a standard 12-lead ECG. Consequently, the measure of QT dispersion from the 2-lead ECG in the rat described in this protocol differs from that calculated from human ECG records. This represents a limitation in translating data obtained from rodents to human clinical medicine.
Status epilepticus (SE) is a single seizure or series of continuously recurring seizures lasting more than 30 min, and results in mortality in 20% of cases13. Many individuals survive the SE but die within 30 days14,15. The mechanism(s) of this delayed mortality is not fully understood. It has been suggested that lethal ventricular arrhythmias contribute to many of these deaths. In addition to SE, patients experiencing spontaneously recurring seizures, i.e. epilepsy, are at risk of premature sudden and unexpected death associated with epilepsy (SUDEP)18. As with SE, the precise mechanisms mediating SUDEP are not known. It has been proposed that ventricular abnormalities and resulting arrhythmias make a significant contribution18-22.
To investigate the mechanisms of seizure-related cardiac death, and the efficacy of cardioprotective therapies, it is necessary both to obtain ECG-derived indicators of risk and to evaluate susceptibility to cardiac arrhythmias in animal models of seizure disorders. Here we describe methods for implanting ECG electrodes in the Sprague-Dawley laboratory rat (Rattus norvegicus) following SE, collection and analysis of ECG recordings, and induction of arrhythmias during intravenous infusion of aconitine. These procedures can be used to directly determine the relationships between ECG-derived measures of cardiac electrical activity and susceptibility to ventricular arrhythmias in rat models of seizure disorders, or any pathology associated with increased risk of sudden cardiac death.
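The rate-corrected QT (QTc) and two-lead QT dispersion (QTd) used as risk indicators here can be sketched as follows. Bazett's formula is one common rate correction (the protocol's exact correction may differ), and all numbers are illustrative, not from the study:

```python
import math

def qtc_bazett(qt_ms: float, rr_ms: float) -> float:
    """Rate-corrected QT interval (Bazett): QTc = QT / sqrt(RR in seconds)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def qt_dispersion(qt_by_lead_ms: list) -> float:
    """QTd = max(QT) - min(QT) across available leads
    (only 2 leads in this rat preparation, vs. 12 clinically)."""
    return max(qt_by_lead_ms) - min(qt_by_lead_ms)

# Illustrative rat values: QT 70 ms at RR 160 ms (~375 beats/min)
print(round(qtc_bazett(70.0, 160.0), 1))  # 175.0
print(qt_dispersion([70.0, 64.0]))        # 6.0
```

As the text notes, a 2-lead QTd is not directly comparable to the 12-lead clinical measure; the sketch only makes the arithmetic of both indices explicit.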
Medicine, Issue 50, cardiac, seizure disorders, QTc, QTd, cardiac arrhythmias, rat
NanoDrop Microvolume Quantitation of Nucleic Acids
Institutions: Wilmington, Delaware.
Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation.
The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results.
The need for high-sensitivity fluorescent analysis of limited-mass samples has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with as little as 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer.
Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of sample.
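The direct A260 method rests on the Beer-Lambert relation, with the standard conversion factor of 50 ng/μL per absorbance unit for double-stranded DNA at a 10 mm path (RNA and ssDNA use different factors). A minimal sketch with hypothetical values, showing why the short microvolume path lengths extend the measurable range:

```python
FACTOR_DSDNA = 50.0  # ng/uL per A260 unit at a 10 mm path (dsDNA)

def dsdna_conc_ng_per_ul(a260: float, path_mm: float = 10.0) -> float:
    """Concentration from absorbance, normalized to a 10 mm path equivalent.
    Shorter paths (e.g. 1 mm) let more concentrated samples stay on-scale."""
    a260_10mm = a260 * (10.0 / path_mm)  # normalize to 10 mm
    return a260_10mm * FACTOR_DSDNA

# Hypothetical example: A260 = 0.75 measured over a 1 mm microvolume path
print(dsdna_conc_ng_per_ul(0.75, path_mm=1.0))  # 375.0 ng/uL
```

A sample reading 0.75 over 1 mm would read 7.5 over a 10 mm cuvette path, i.e. far beyond the linear range of most cuvette instruments, which is why path-length shortening can replace dilution.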
Basic Protocols, Issue 45, NanoDrop, Microvolume Quantitation, DNA Quantitation, Nucleic Acid Quantitation, DNA Quantification, RNA Quantification, Microvolume Spectrophotometer, Microvolume Fluorometer, DNA A260, Fluorescence PicoGreen
Microsurgical Clip Obliteration of Middle Cerebral Aneurysm Using Intraoperative Flow Assessment
Institutions: Harvard Medical School, Massachusetts General Hospital.
Cerebral aneurysms are abnormal widening or ballooning of a localized segment of an intracranial blood vessel. Surgical clipping is an important treatment for aneurysms which attempts to exclude blood from flowing into the aneurysmal segment of the vessel while preserving blood flow in a normal fashion. Improper clip placement may result in residual aneurysm with the potential for subsequent aneurysm rupture or partial or full occlusion of distal arteries resulting in cerebral infarction. Here we describe the use of an ultrasonic flow probe to provide quantitative evaluation of arterial flow before and after microsurgical clip placement at the base of a middle cerebral artery aneurysm. This information helps ensure adequate aneurysm reconstruction with preservation of normal distal blood flow.
Medicine, Issue 31, Aneurysm, intraoperative, brain, surgery, surgical clipping, blood flow, aneurysmal segment, ultrasonic flow probe