The efficacy of DNA extraction protocols can be highly dependent upon both the type of sample being investigated and the types of downstream analyses performed. The use of new bacterial community analysis techniques (e.g., microbiomics, metagenomics) is becoming more prevalent in the agricultural and environmental sciences, and many environmental samples within these disciplines are physicochemically and microbiologically unique (e.g., fecal and litter/bedding samples from the poultry production spectrum); appropriate and effective DNA extraction methods therefore need to be carefully chosen. Accordingly, a novel semi-automated hybrid DNA extraction method was developed specifically for use with environmental poultry production samples. This method combines the two major types of DNA extraction, mechanical and enzymatic: an intense two-step mechanical homogenization step (bead-beating specifically formulated for environmental samples) was added to the beginning of the “gold standard” enzymatic DNA extraction method for fecal samples to enhance the removal of bacteria and DNA from the sample matrix and to improve the recovery of Gram-positive bacterial community members. Once the enzymatic portion of the hybrid method was initiated, the remaining purification process was automated on a robotic workstation to increase sample throughput and decrease sample processing error. Compared with the purely mechanical and purely enzymatic DNA extraction methods, this hybrid method provided the best overall combined performance when considering quantitative (16S rRNA qPCR) and qualitative (microbiomics) estimates of the total bacterial communities in poultry feces and litter samples.
The ITS2 Database
Institutions: University of Würzburg, University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. Because ITS2 research mainly focused on the highly variable ITS2 sequence, the marker was long confined to low-level phylogenetics. However, combining the ITS2 sequence with its highly conserved secondary structure improves the phylogenetic resolution [1] and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation [2-8].
The ITS2 Database [9] presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank [11]. Following annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold [12] (direct fold) results in a correct four-helix conformation. If this is not the case, the structure is predicted by homology modeling [13], in which an already known secondary structure is transferred to another ITS2 sequence whose secondary structure could not be obtained by a direct fold.
The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST [14] search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE [15,16] for multiple sequence-structure alignment calculation and Neighbor Joining [18] tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure.
In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
Next-generation Sequencing of 16S Ribosomal RNA Gene Amplicons
Institutions: National Research Council Canada.
One of the major questions in microbial ecology is “who is there?” This question can be answered using various tools, but one of the long-standing gold standards is to sequence 16S ribosomal RNA (rRNA) gene amplicons generated by domain-level PCR reactions amplifying from genomic DNA. Traditionally, this was performed by cloning and Sanger (capillary electrophoresis) sequencing of PCR amplicons. The advent of next-generation sequencing has tremendously simplified and increased the sequencing depth for 16S rRNA gene sequencing. The introduction of benchtop sequencers now allows small labs to perform their 16S rRNA sequencing in-house in a matter of days. Here, an approach for 16S rRNA gene amplicon sequencing using a benchtop next-generation sequencer is detailed. The environmental DNA is first amplified by PCR using primers that contain sequencing adapters and barcodes. The amplicons are then coupled to spherical particles via emulsion PCR. The particles are loaded on a disposable chip, the chip is inserted in the sequencing machine, and the sequencing is performed. The sequences are retrieved in fastq format and filtered, and the barcodes are used to establish the sample membership of the reads. The filtered and binned reads are then further analyzed using publicly available tools. An example analysis is given in which the reads were classified with a taxonomy-finding algorithm within the software package Mothur. The method outlined here is simple, inexpensive and straightforward and should help smaller labs take advantage of the ongoing genomic revolution.
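The demultiplexing step described above (using barcodes to establish the sample membership of the reads) can be sketched in a few lines of Python. The barcode sequences, barcode length, and read strings below are invented for illustration; a real run would parse a fastq file and use the study's actual barcode map:

```python
# Minimal sketch of demultiplexing amplicon reads by barcode.
# Barcodes and reads are hypothetical; real data come from a fastq file.
barcodes = {
    "ACGT": "sample_A",
    "TGCA": "sample_B",
}

reads = [
    "ACGTAGGGAGGCAGCAG",   # barcode ACGT + amplicon sequence
    "TGCAAGGGAGGCAGCAG",   # barcode TGCA + amplicon sequence
    "CCCCAGGGAGGCAGCAG",   # unknown barcode -> left unassigned
]

def demultiplex(reads, barcodes, bc_len=4):
    """Bin reads by their leading barcode and trim the barcode off."""
    bins = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for seq in reads:
        sample = barcodes.get(seq[:bc_len])
        if sample is None:
            unassigned.append(seq)
        else:
            # Trim the barcode before downstream analysis (e.g., in Mothur)
            bins[sample].append(seq[bc_len:])
    return bins, unassigned

bins, unassigned = demultiplex(reads, barcodes)
```

In practice one would also allow a small number of barcode mismatches and apply quality filtering before binning, but the sample-assignment logic is the same.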
Molecular Biology, Issue 90, Metagenomics, Bacteria, 16S ribosomal RNA gene, Amplicon sequencing, Next-generation sequencing, benchtop sequencers
High Throughput Fluorometric Technique for Assessment of Macrophage Phagocytosis and Actin Polymerization
Institutions: University of Minnesota, University of Minnesota, 3M Corporate Research Laboratory.
The goal of fluorometric analysis is to serve as an efficient, cost-effective, high throughput method of analyzing phagocytosis and other cellular processes. This technique can be used on a variety of cell types, both adherent and non-adherent, to examine a variety of cellular properties. When studying phagocytosis, the fluorometric technique uses phagocytic cell types such as macrophages and fluorescently labeled opsonized particles whose fluorescence can be extinguished in the presence of trypan blue. Following plating of adherent macrophages in 96-well plates, fluorescent particles (green or red) are administered and cells are allowed to phagocytose for varied amounts of time. Following internalization of fluorescent particles, cells are washed with trypan blue, which extinguishes the fluorescent signal from particles that are not internalized or are merely adhering to the cell surface. Following the trypan wash, cells are washed with PBS, fixed, and stained with DAPI (a blue fluorescent nuclear label), which labels the nuclei of the cells. By simple fluorometric quantification, through plate reading of nuclear (blue) and particle (red/green) fluorescence, the ratio of relative fluorescence units of green:blue yields a phagocytic index indicative of the amount of fluorescent particles internalized per cell. The duration of the assay using a 96-well method and multichannel pipettes for washing, from end of phagocytosis to end of data acquisition, is less than 45 min. Flow cytometry could be used in a similar manner, but the advantage of fluorometry is its high throughput, rapid assessment with minimal manipulation of samples, and quick quantification of fluorescent intensity per cell. Similar strategies can be applied to non-adherent cells, live labeled bacteria, actin polymerization, and essentially any process utilizing fluorescence.
Therefore, fluorometry is a promising method for its low cost, high throughput capabilities in the study of cellular processes.
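The green:blue ratio described above reduces to a simple per-well calculation. The function and the RFU readings below are illustrative assumptions (background subtraction from no-cell wells is a common practice, not a step stated in the abstract):

```python
# Sketch of a phagocytic index from plate-reader relative fluorescence
# units (RFU). All numbers are invented for illustration.

def phagocytic_index(particle_rfu, dapi_rfu, particle_bg=0.0, dapi_bg=0.0):
    """Ratio of internalized-particle signal (green/red) to nuclear
    DAPI signal (blue), i.e., fluorescent particles per cell-equivalent."""
    return (particle_rfu - particle_bg) / (dapi_rfu - dapi_bg)

# Hypothetical readings for one treated and one control well
treated = phagocytic_index(particle_rfu=5200, dapi_rfu=1300,
                           particle_bg=200, dapi_bg=300)
control = phagocytic_index(particle_rfu=2700, dapi_rfu=1300,
                           particle_bg=200, dapi_bg=300)
```

Because the DAPI channel scales with cell number, the index normalizes out well-to-well differences in plating density.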
Immunology, Issue 93, Fluorometry, phagocytosis, high throughput assay, actin polymerization, immunology
Use of Animal Model of Sepsis to Evaluate Novel Herbal Therapies
Institutions: North Shore – LIJ Health System.
Sepsis refers to a systemic inflammatory response syndrome resulting from a microbial infection. It has been routinely simulated in animals by several techniques, including infusion of exogenous bacterial toxin (endotoxemia) or bacteria (bacteremia), as well as surgical perforation of the cecum by cecal ligation and puncture (CLP) [1-3]. CLP allows bacteria spillage and fecal contamination of the peritoneal cavity, mimicking the human clinical disease of perforated appendicitis or diverticulitis. The severity of sepsis, as reflected by the eventual mortality rates, can be controlled surgically by varying the size of the needle used for cecal puncture [2]. In animals, CLP induces similar biphasic hemodynamic, cardiovascular, metabolic, and immunological responses as observed during the clinical course of human sepsis [3]. Thus, the CLP model is considered one of the most clinically relevant models for experimental sepsis [1-3].
Various animal models have been used to elucidate the intricate mechanisms underlying the pathogenesis of experimental sepsis. The lethal consequence of sepsis is attributable partly to an excessive accumulation of early cytokines (such as TNF, IL-1 and IFN-γ) [4-6] and late proinflammatory mediators (e.g., HMGB1) [7]. Compared with early proinflammatory cytokines, late-acting mediators have a wider therapeutic window for clinical applications. For instance, delayed administration of HMGB1-neutralizing antibodies beginning 24 hours after CLP still rescued mice from lethality [8,9], establishing HMGB1 as a late mediator of lethal sepsis. The discovery of HMGB1 as a late-acting mediator has initiated a new field of investigation for the development of sepsis therapies using Traditional Chinese Herbal Medicine. In this paper, we describe a procedure for CLP-induced sepsis and its use in screening herbal medicine for HMGB1-targeting therapies.
Medicine, Issue 62, Herbal therapies, innate immune cells, cytokines, HMGB1, experimental animal model of sepsis, cecal ligation and puncture
High-definition Fourier Transform Infrared (FT-IR) Spectroscopic Imaging of Human Tissue Sections towards Improving Pathology
Institutions: University of Illinois at Chicago, University of Illinois at Chicago, University of Illinois at Chicago, University of Illinois at Chicago, University of Illinois at Chicago.
High-definition Fourier Transform Infrared (FT-IR) spectroscopic imaging is an emerging approach to obtain detailed images that carry associated biochemical information. FT-IR imaging of tissue is based on the principle that different chemical bonds (e.g., C=O, C-H, N-H) within cells or tissue absorb different regions of the mid-infrared, which can then be related to the presence and composition of biomolecules (e.g., lipids, DNA, glycogen, protein, collagen). In an FT-IR image, every pixel comprises an entire infrared (IR) spectrum that gives information on the biochemical status of the cells, which can then be exploited for cell-type or disease-type classification. In this paper, we show how to obtain IR images from human tissues using an FT-IR system, how to modify existing instrumentation to allow for high-definition imaging capabilities, and how to visualize FT-IR images. We then present some applications of FT-IR for pathology using the liver and kidney as examples. FT-IR imaging holds exciting promise in providing a novel, entirely label-free and non-perturbing route to obtaining biochemical information from cells and tissue, giving new insight into biomolecular changes as part of disease processes. Additionally, this biochemical information can potentially allow for objective and automated analysis of certain aspects of disease diagnosis.
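The per-pixel-spectrum structure described above can be made concrete with a toy example. The data cube, band indices, and threshold below are all invented; they stand in for a real mid-IR hyperspectral image and real absorbance bands:

```python
import numpy as np

# Toy hyperspectral cube: 4 x 4 pixels, 10 wavenumber bands per pixel.
# In a real FT-IR image each pixel holds a full mid-IR spectrum.
rng = np.random.default_rng(0)
cube = rng.random((4, 4, 10))

AMIDE_BAND = 3   # hypothetical index of a protein-associated band
LIPID_BAND = 7   # hypothetical index of a lipid-associated band

# Per-pixel ratio of the two absorbance bands
ratio = cube[:, :, AMIDE_BAND] / cube[:, :, LIPID_BAND]

# Crude two-class map: pixels with a high protein:lipid ratio vs the rest.
# Real pipelines use supervised classifiers over many bands, not a threshold.
class_map = (ratio > 1.0).astype(int)
```

This band-ratio thresholding is only a caricature of the classification step, but it shows how the third axis of the image (the spectrum) is what carries the biochemical contrast.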
Medicine, Issue 95, Spectroscopy, Imaging, Fourier Transform, Pathology, Cancer, Liver, Kidney, Hyperspectral, Biopsy, Infrared, Optics, Tissue
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several complementary ways: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods (i.e., differences in FA maps after stereotaxic alignment) in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
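The FA metric used throughout the analyses above is computed per voxel from the three eigenvalues of the diffusion tensor; a minimal sketch of the standard formula (the eigenvalue sets below are illustrative, not patient data):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three eigenvalues of a voxel's diffusion tensor:
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||,
    ranging from 0 (isotropic diffusion) to 1 (fully anisotropic)."""
    mean = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den

iso = fractional_anisotropy(1.0, 1.0, 1.0)    # equal eigenvalues -> FA = 0
fiber = fractional_anisotropy(1.7, 0.3, 0.3)  # elongated tensor -> high FA
```

Voxelwise FA maps are simply this scalar evaluated at every voxel, which is what the voxelwise and tractwise (TFAS) comparisons operate on.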
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Absolute Quantum Yield Measurement of Powder Samples
Institutions: Hitachi High Technologies America.
Measurement of fluorescence quantum yield has become an important tool in the search for new solutions in the development, evaluation, quality control and research of illumination, AV equipment, organic EL material, films, filters and fluorescent probes for bio-industry.
Quantum yield is calculated as the ratio of the number of photons emitted to the number of photons absorbed by a material. The higher the quantum yield, the better the efficiency of the fluorescent material.
For the measurements featured in this video, we will use the Hitachi F-7000 fluorescence spectrophotometer equipped with the Quantum Yield measuring accessory and Report Generator program. All the information provided applies to this system.
Measurement of quantum yield in powder samples is performed following these steps:
Generation of instrument correction factors for the excitation and emission monochromators. This is an important requirement for the correct measurement of quantum yield. It has been performed in advance for the full measurement range of the instrument and will not be shown in this video due to time limitations.
Measurement of integrating sphere correction factors. The purpose of this step is to take into consideration reflectivity characteristics of the integrating sphere used for the measurements.
Reference and Sample measurement using direct excitation and indirect excitation.
Quantum yield calculation using direct and indirect excitation. Direct excitation is when the sample directly faces the excitation beam, which is the normal measurement setup. However, because an integrating sphere is used, a portion of the photons emitted by the sample's fluorescence is reflected by the integrating sphere and re-excites the sample, so indirect excitation must also be taken into consideration. This is accomplished by measuring the sample placed in the port facing the emission monochromator, calculating the indirect quantum yield, and correcting the direct quantum yield calculation.
Corrected quantum yield calculation.
Chromaticity coordinates calculation using Report Generator program.
The Hitachi F-7000 Quantum Yield Measurement System offers the following advantages for this application:
High sensitivity (S/N ratio 800 or better RMS). Signal is the Raman band of water measured under the following conditions: Ex wavelength 350 nm, band pass Ex and Em 5 nm, response 2 sec; noise is measured at the maximum of the Raman peak. High sensitivity allows measurement of samples even with low quantum yield. Using this system we have measured quantum yields as low as 0.1 for a sample of salicylic acid and as high as 0.8 for a sample of magnesium tungstate.
Highly accurate measurement with a dynamic range of 6 orders of magnitude allows for measurements of both sharp scattering peaks with high intensity, as well as broad fluorescence peaks of low intensity under the same conditions.
High measuring throughput and reduced light exposure to the sample, due to a high scanning speed of up to 60,000 nm/minute and automatic shutter function.
Measurement of quantum yield over a wide wavelength range from 240 to 800 nm.
Accurate quantum yield measurements are the result of collecting instrument spectral response and integrating sphere correction factors before measuring the sample.
Large selection of calculated parameters provided by dedicated and easy to use software.
During this video we will measure sodium salicylate in powder form which is known to have a quantum yield value of 0.4 to 0.5.
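The core ratio behind the measurement can be sketched from integrated detector counts. The function and all numbers below are illustrative assumptions; the instrument software additionally applies the sphere and indirect-excitation corrections described above, which are omitted here:

```python
# Sketch of a direct quantum-yield calculation from integrated counts
# in an integrating sphere. All counts are invented for illustration.

def quantum_yield(ref_scatter, sample_scatter, sample_emission):
    """QY = photons emitted / photons absorbed.
    Absorbed photons = drop in scattered excitation light between a
    non-absorbing reference and the sample, integrated over the
    excitation band; emitted photons are integrated over the
    emission band."""
    absorbed = ref_scatter - sample_scatter
    return sample_emission / absorbed

# Hypothetical integrated counts for a powder sample
qy = quantum_yield(ref_scatter=1_000_000,
                   sample_scatter=400_000,
                   sample_emission=270_000)
```

With these invented counts the sample absorbs 600,000 photons and emits 270,000, giving a quantum yield of 0.45, i.e., in the range quoted for sodium salicylate.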
Molecular Biology, Issue 63, Powders, Quantum, Yield, F-7000, Quantum Yield, phosphor, chromaticity, Photo-luminescence
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Tracking the Mammary Architectural Features and Detecting Breast Cancer with Magnetic Resonance Diffusion Tensor Imaging
Institutions: Weizmann Institute of Science, Weizmann Institute of Science, Meir Medical Center, Meir Medical Center.
Breast cancer is the most common cancer among women worldwide. Early detection of breast cancer plays a critical role in improving the quality of life and survival of breast cancer patients. In this paper a new approach for the detection of breast cancer is described, based on tracking the mammary architectural elements using diffusion tensor imaging (DTI).
The paper focuses on the scanning protocols and image processing algorithms and software that were designed to fit the diffusion properties of the mammary fibroglandular tissue and its changes during malignant transformation. The final output yields pixel by pixel vector maps that track the architecture of the entire mammary ductal glandular trees and parametric maps of the diffusion tensor coefficients and anisotropy indices.
The efficiency of the method to detect breast cancer was tested by scanning women volunteers including 68 patients with breast cancer confirmed by histopathology findings. Regions with cancer cells exhibited a marked reduction in the diffusion coefficients and in the maximal anisotropy index as compared to the normal breast tissue, providing an intrinsic contrast for delineating the boundaries of malignant growth. Overall, the sensitivity of the DTI parameters to detect breast cancer was found to be high, particularly in dense breasts, and comparable to the current standard breast MRI method that requires injection of a contrast agent. Thus, this method offers a completely non-invasive, safe and sensitive tool for breast cancer detection.
Medicine, Issue 94, Magnetic Resonance Imaging, breast, breast cancer, diagnosis, water diffusion, diffusion tensor imaging
Measurement of the Pressure-volume Curve in Mouse Lungs
Institutions: Johns Hopkins University.
In recent decades the mouse has become the primary animal model of a variety of lung diseases. In models of emphysema or fibrosis, the essential phenotypic changes are best assessed by measurement of the changes in lung elasticity. To best understand specific mechanisms underlying such pathologies in mice, it is essential to make functional measurements that can reflect the developing pathology. Although there are many ways to measure elasticity, the classical method is that of the total lung pressure-volume (PV) curve done over the whole range of lung volumes. This measurement has been made on adult lungs from nearly all mammalian species dating back almost 100 years, and such PV curves also played a major role in the discovery and understanding of the function of pulmonary surfactant in fetal lung development. Unfortunately, such total PV curves have not been widely reported in the mouse, despite the fact that they can provide useful information on the macroscopic effects of structural changes in the lung. Although partial PV curves measuring just the changes in lung volume are sometimes reported, without a measure of absolute volume, the nonlinear nature of the total PV curve makes these partial ones very difficult to interpret. In the present study, we describe a standardized way to measure the total PV curve. We have then tested the ability of these curves to detect changes in mouse lung structure in two common lung pathologies, emphysema and fibrosis. Results showed significant changes in several variables consistent with expected structural changes with these pathologies. This measurement of the lung PV curve in mice thus provides a straightforward means to monitor the progression of the pathophysiologic changes over time and the potential effect of therapeutic procedures.
Medicine, Issue 95, Lung compliance, Lung hysteresis, Pulmonary surfactant, Lung elasticity, Quasistatic compliance, Fibrosis, Emphysema
Best Current Practice for Obtaining High Quality EEG Data During Simultaneous fMRI
Institutions: University of Nottingham , Brain Products GmbH.
Simultaneous EEG-fMRI allows the excellent temporal resolution of EEG to be combined with the high spatial accuracy of fMRI. The data from these two modalities can be combined in a number of ways, but all rely on the acquisition of high quality EEG and fMRI data. EEG data acquired during simultaneous fMRI are affected by several artifacts, including the gradient artifact (due to the changing magnetic field gradients required for fMRI), the pulse artifact (linked to the cardiac cycle) and movement artifacts (resulting from movements in the strong magnetic field of the scanner, and muscle activity). Post-processing methods for successfully correcting the gradient and pulse artifacts require a number of criteria to be satisfied during data acquisition. Minimizing head motion during EEG-fMRI is also imperative for limiting the generation of artifacts.
Interactions between the radio frequency (RF) pulses required for MRI and the EEG hardware may occur and can cause heating. This is only a significant risk if safety guidelines are not satisfied. Hardware design and set-up, as well as careful selection of which MR sequences are run with the EEG hardware present must therefore be considered.
The above issues highlight the importance of the choice of the experimental protocol employed when performing a simultaneous EEG-fMRI experiment. Based on previous research we describe an optimal experimental set-up. This provides high quality EEG data during simultaneous fMRI when using commercial EEG and fMRI systems, with safety risks to the subject minimized. We demonstrate this set-up in an EEG-fMRI experiment using a simple visual stimulus. However, much more complex stimuli can be used. Here we show the EEG-fMRI set-up using a Brain Products GmbH (Gilching, Germany) MRplus, 32 channel EEG system in conjunction with a Philips Achieva (Best, Netherlands) 3T MR scanner, although many of the techniques are transferable to other systems.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Biophysics, Medicine, Neuroimaging, Functional Neuroimaging, Investigative Techniques, neurosciences, EEG, functional magnetic resonance imaging, fMRI, magnetic resonance imaging, MRI, simultaneous, recording, imaging, clinical techniques
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (¹H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that ¹H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with ¹H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Acquisition of High-Quality Digital Video of Drosophila Larval and Adult Behaviors from a Lateral Perspective
Institutions: Willamette University.
Drosophila is a powerful experimental model system for studying the function of the nervous system. Gene mutations that cause dysfunction of the nervous system often produce viable larvae and adults with locomotion-defective phenotypes that are difficult to adequately describe with text or completely represent with a single photographic image. Current modes of scientific publishing, however, support the submission of digital video media as supplemental material to accompany a manuscript. Here we describe a simple and widely accessible microscopy technique for acquiring high-quality digital video of both Drosophila larval and adult phenotypes from a lateral perspective. Video of larval and adult locomotion from a side view is advantageous because it allows the observation and analysis of subtle distinctions and variations in aberrant locomotive behaviors. We have successfully used the technique to visualize and quantify aberrant crawling behaviors in third instar larvae, in addition to adult mutant phenotypes and behaviors including grooming.
Neuroscience, Issue 92, Drosophila, behavior, coordination, crawling, locomotion, nervous system, neurodegeneration, larva
C. elegans Tracking and Behavioral Measurement
Institutions: University of Toronto, Vrije Universiteit, Okinawa Institute of Science and Technology, University of Toronto.
We have developed instrumentation, image processing, and data analysis techniques to quantify the locomotory behavior of C. elegans as it crawls on the surface of an agar plate. For the study of the genetic, biochemical, and neuronal basis of behavior, C. elegans is an ideal organism because it is genetically tractable, amenable to microscopy, and shows a number of complex behaviors, including taxis, learning, and social interaction [1,2]. Behavioral analysis based on tracking the movements of worms as they crawl on agar plates has been particularly useful in the study of sensory behavior [3] and general mutational phenotyping [5]. Our system works by moving the camera and illumination system as the worm crawls on a stationary agar plate, which ensures that no mechanical stimulus is transmitted to the worm. Our tracking system is easy to use and includes a semi-automatic calibration feature. A challenge of all video tracking systems is that they generate an enormous amount of data that is intrinsically high dimensional. Our image processing and data analysis programs deal with this challenge by reducing the worm's shape to a set of independent components, which comprehensively reconstruct the worm's behavior as a function of only 3-4 dimensions [6,7]. As an example of the process, we show that the worm enters and exits its reversal state in a phase-specific manner.
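The dimensionality reduction described above (decomposing worm shape into a few independent components, often called "eigenworms") can be sketched with a PCA via SVD on synthetic posture data. The synthetic body-wave frames below stand in for skeleton angles measured from video; the specific construction is an illustrative assumption:

```python
import numpy as np

# Sketch of reducing worm posture to a few principal components.
rng = np.random.default_rng(1)
n_frames, n_angles = 500, 48  # video frames x angles along the skeleton

# Synthetic frames dominated by two sinusoidal body-wave modes plus noise
s = np.linspace(0, 2 * np.pi, n_angles)
phases = rng.uniform(0, 2 * np.pi, n_frames)
frames = (np.sin(s[None, :] + phases[:, None])
          + 0.3 * np.sin(2 * s[None, :] + phases[:, None])
          + 0.05 * rng.standard_normal((n_frames, n_angles)))

# PCA via SVD of the mean-centered data
centered = frames - frames.mean(axis=0)
_, svals, components = np.linalg.svd(centered, full_matrices=False)
explained = svals ** 2 / np.sum(svals ** 2)

# A handful of components should capture nearly all posture variance,
# mirroring the 3-4 dimensions reported for real worms.
top4 = explained[:4].sum()
```

Projecting each frame onto the leading components turns the 48-dimensional posture into a short time series of 3-4 mode amplitudes, which is the representation the phase analysis of reversals operates on.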
Neuroscience, Issue 69, Physics, Biophysics, Anatomy, Microscopy, Ethology, Behavior, Machine Vision, C. elegans, animal model
Viability Assays for Cells in Culture
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Medical College of Wisconsin, Hong Kong University, Johns Hopkins University School of Medicine, Medical College of Wisconsin.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
Rapid Genotyping of Animals Followed by Establishing Primary Cultures of Brain Neurons
Institutions: University of Iowa Carver College of Medicine, University of Iowa Carver College of Medicine, EZ BioResearch LLC.
High-resolution analysis of the morphology and function of mammalian neurons often requires the genotyping of individual animals followed by the analysis of primary cultures of neurons. We describe a set of procedures for: labeling newborn mice to be genotyped, rapid genotyping, and establishing low-density cultures of brain neurons from these mice. Individual mice are labeled by tattooing, which allows for long-term identification lasting into adulthood. Genotyping by the described protocol is fast and efficient, and allows for automated extraction of nucleic acid with good reliability. This is useful under circumstances where sufficient time for conventional genotyping is not available, e.g., in mice that suffer from neonatal lethality. Primary neuronal cultures are generated at low density, which enables imaging experiments at high spatial resolution. This culture method requires the preparation of glial feeder layers prior to neuronal plating. The protocol is applied in its entirety to a mouse model of the movement disorder DYT1 dystonia (ΔE-torsinA knock-in mice), and neuronal cultures are prepared from the hippocampus, cerebral cortex and striatum of these mice. This protocol can be applied to mice with other genetic mutations, as well as to animals of other species. Furthermore, individual components of the protocol can be used for isolated sub-projects. Thus this protocol will have wide applications, not only in neuroscience but also in other fields of biological and medical sciences.
Neuroscience, Issue 95, AP2, genotyping, glial feeder layer, mouse tail, neuronal culture, nucleic-acid extraction, PCR, tattoo, torsinA
Purifying the Impure: Sequencing Metagenomes and Metatranscriptomes from Complex Animal-associated Samples
Institutions: San Diego State University, DOE Joint Genome Institute, University of Colorado, University of Colorado.
The accessibility of high-throughput sequencing has revolutionized many fields of biology. In order to better understand host-associated viral and microbial communities, a comprehensive workflow for DNA and RNA extraction was developed. The workflow concurrently generates viral and microbial metagenomes, as well as metatranscriptomes, from a single sample for next-generation sequencing. The coupling of these approaches provides an overview of both the taxonomical characteristics and the community-encoded functions. The presented methods use Cystic Fibrosis (CF) sputum, a problematic sample type, because it is exceptionally viscous and contains high amounts of mucins, free neutrophil DNA, and other unknown contaminants. The protocols described here target these problems and successfully recover viral and microbial DNA with minimal human DNA contamination. To complement the metagenomics studies, a metatranscriptomics protocol was optimized to recover both microbial and host mRNA that contains relatively few ribosomal RNA (rRNA) sequences. An overview of the data characteristics is presented to serve as a reference for assessing the success of the methods. Additional CF sputum samples were also collected to (i) evaluate the consistency of the microbiome profiles across seven consecutive days within a single patient, and (ii) compare the consistency of the metagenomic approach to 16S ribosomal RNA gene-based sequencing. The results showed that daily fluctuation of microbial profiles without antibiotic perturbation was minimal, and the taxonomy profiles of the common CF-associated bacteria were highly similar between the 16S rDNA libraries and metagenomes generated from the hypotonic lysis (HL)-derived DNA. However, the differences between 16S rDNA taxonomical profiles generated from total DNA and HL-derived DNA suggest that hypotonic lysis and the washing steps help remove not only human-derived DNA but also microbial-derived extracellular DNA that may misrepresent the actual microbial profiles.
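One way to quantify the "minimal daily fluctuation" reported above is a pairwise dissimilarity measure such as Bray-Curtis; a value near 0 indicates a stable profile between days. The taxa abundances below are hypothetical illustrations, not data from the study:

```python
# Sketch: quantifying day-to-day stability of taxonomic profiles with
# Bray-Curtis dissimilarity (hypothetical relative abundances).
def bray_curtis(p, q):
    """Bray-Curtis dissimilarity between two relative-abundance profiles."""
    num = sum(abs(a - b) for a, b in zip(p, q))
    den = sum(a + b for a, b in zip(p, q))
    return num / den

# Relative abundances of the same five taxa on two consecutive days.
day1 = [0.55, 0.25, 0.10, 0.06, 0.04]
day2 = [0.52, 0.27, 0.11, 0.06, 0.04]

d = bray_curtis(day1, day2)
print(f"Bray-Curtis dissimilarity: {d:.3f}")  # near 0 => stable profile
```

The same function applied to total-DNA versus HL-derived profiles would give a single number for the divergence the washing steps introduce.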
Molecular Biology, Issue 94, virome, microbiome, metagenomics, metatranscriptomics, cystic fibrosis, mucosal-surface
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
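A common way to turn such culture data into a replication capacity score (assumed here for illustration; the article's exact readout may differ) is the slope of log-transformed virus output during exponential growth. The day/readout values below are hypothetical:

```python
# Sketch: scoring in vitro replication capacity as the slope of
# log10(virus output) vs. time over the exponential growth phase.
# Readout could be reverse transcriptase (RT) activity or p24 antigen;
# the values here are hypothetical, in arbitrary units.
import math

days = [2, 4, 6, 8]
rt_activity = [1.0e3, 4.1e3, 1.7e4, 6.5e4]

log_rt = [math.log10(v) for v in rt_activity]
n = len(days)
mean_x = sum(days) / n
mean_y = sum(log_rt) / n
# Ordinary least-squares slope in log10 units per day.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, log_rt)) / \
        sum((x - mean_x) ** 2 for x in days)

print(f"replication capacity score (log10 units/day): {slope:.3f}")
```

Scores computed this way can then be compared across Gag-MJ4 chimeras and related to clinical parameters such as set point viral load.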
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
High Yield Purification of Plasmodium falciparum Merozoites For Use in Opsonizing Antibody Assays
Institutions: Walter and Eliza Hall Institute of Medical Research, University of Melbourne.
Plasmodium falciparum merozoite antigens are under development as potential malaria vaccines. One aspect of immunity against malaria is the removal of free merozoites from the blood by phagocytic cells. However, assessing the functional efficacy of merozoite-specific opsonizing antibodies is challenging due to the short half-life of merozoites and the variability of primary phagocytic cells. Described in detail herein is a method for generating viable merozoites using the E64 protease inhibitor, and an assay of merozoite opsonin-dependent phagocytosis using the pro-monocytic cell line THP-1. E64 prevents schizont rupture while allowing the development of merozoites, which are released by filtration of treated schizonts. Ethidium bromide-labelled merozoites are opsonized with human plasma samples and added to THP-1 cells. Phagocytosis is assessed by a standardized high-throughput protocol. Viable merozoites are a valuable resource for assessing numerous aspects of P. falciparum biology, including assessment of immune function. Antibody levels measured by this assay are associated with clinical immunity to malaria in naturally exposed individuals. The assay may also be of use for assessing vaccine-induced antibodies.
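A standardized phagocytosis readout is often normalized against a positive-control plasma after subtracting a no-plasma background; the index formula and all event counts below are illustrative assumptions, not the study's specification:

```python
# Sketch: a relative opsonic phagocytosis index from flow cytometry counts
# of THP-1 cells that ingested labelled merozoites (hypothetical counts).
def percent_phagocytosis(positive_events, total_events):
    """Percentage of acquired cells scored merozoite-positive."""
    return 100.0 * positive_events / total_events

sample_pct = percent_phagocytosis(1_850, 10_000)   # test plasma
control_pct = percent_phagocytosis(3_700, 10_000)  # pooled immune plasma
background = percent_phagocytosis(150, 10_000)     # no-plasma control

# Index relative to the positive control, background-subtracted.
relative_index = 100.0 * (sample_pct - background) / (control_pct - background)
print(f"relative phagocytosis index: {relative_index:.1f}%")
```

Normalizing to a shared control plasma in this way helps compare plates and days despite run-to-run variation in the THP-1 cells.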
Immunology, Issue 89, Parasitic Diseases, malaria, Plasmodium falciparum, hemozoin, antibody, Fc Receptor, opsonization, merozoite, phagocytosis, THP-1
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a broad range of scientific disciplines, comprising well-established protocols as well as some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement, which has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed by these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. Easy-to-follow guidelines are provided for conducting comprehensive surveys, and critical steps and potential caveats pertinent to each technique are discussed.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Test Samples for Optimizing STORM Super-Resolution Microscopy
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way than normal, by building it up molecule-by-molecule, there are some significant challenges for users trying to optimize their image acquisition. To aid this process and gain more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics, it is then possible to optimize the imaging procedure from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
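Two standard rules of thumb approximate the quality metrics discussed above: localization precision scales as sigma_PSF/sqrt(N) with N detected photons, and label density limits resolution via a Nyquist-style criterion. All input values below are illustrative; real acquisitions include background and pixelation effects that degrade the simple photon-count estimate:

```python
# Sketch: back-of-envelope STORM quality metrics (illustrative inputs).
import math

sigma_psf_nm = 150.0     # standard deviation of the point-spread function
photons_per_event = 500  # photons detected per switching event

# Idealized localization precision (ignores background and pixel size).
loc_precision = sigma_psf_nm / math.sqrt(photons_per_event)
fwhm_nm = 2.355 * loc_precision  # precision expressed as a FWHM

# Nyquist-style density limit: resolution ~ 2 / sqrt(localization density).
density_per_um2 = 2500.0
nyquist_nm = 2.0 / math.sqrt(density_per_um2 / 1.0e6)  # convert to per nm^2

print(f"localization precision (sigma): {loc_precision:.1f} nm")
print(f"precision as FWHM: {fwhm_nm:.1f} nm")
print(f"Nyquist-limited resolution: {nyquist_nm:.1f} nm")
```

With these illustrative numbers the density term, not the photon budget, dominates, which is consistent with typical achieved resolutions in the 30-50 nm range.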
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
A cGMP-applicable Expansion Method for Aggregates of Human Neural Stem and Progenitor Cells Derived From Pluripotent Stem Cells or Fetal Brain Tissue
Institutions: Cedars-Sinai Medical Center.
A cell expansion technique to amass large numbers of cells from a single specimen for research experiments and clinical trials would greatly benefit the stem cell community. Many current expansion methods are laborious and costly, and those involving complete dissociation may cause several stem and progenitor cell types to undergo differentiation or early senescence. To overcome these problems, we have developed an automated mechanical passaging method referred to as “chopping” that is simple and inexpensive. This technique avoids chemical or enzymatic dissociation into single cells and instead allows for the large-scale expansion of suspended, spheroid cultures that maintain constant cell/cell contact. The chopping method has primarily been used for fetal brain-derived neural progenitor cells or neurospheres, and has recently been published for use with neural stem cells derived from embryonic and induced pluripotent stem cells. The procedure involves seeding neurospheres onto a tissue culture Petri dish and subsequently passing a sharp, sterile blade through the cells, effectively automating the tedious process of manually dissociating each sphere mechanically. Suspending cells in culture provides a favorable surface area-to-volume ratio, as over 500,000 cells can be grown within a single neurosphere of less than 0.5 mm in diameter. In one T175 flask, over 50 million cells can grow in suspension cultures, compared to only 15 million in adherent cultures. Importantly, the chopping procedure has been used under current good manufacturing practice (cGMP), permitting mass production of clinical-grade cell products.
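The flask yields quoted above imply the following back-of-envelope comparison; the 3-fold per-passage expansion used for the projection is a hypothetical assumption, not a figure from the protocol:

```python
# Sketch: yield comparison for suspension vs adherent culture in a T175
# flask, using the figures quoted in the abstract.
suspension_yield = 50_000_000  # cells per T175, suspension culture
adherent_yield = 15_000_000    # cells per T175, adherent culture

fold_advantage = suspension_yield / adherent_yield
print(f"suspension advantage: {fold_advantage:.2f}x")

# Projected cumulative cells from one flask after n passages,
# assuming a hypothetical 3-fold expansion per passage.
expansion_per_passage = 3
n_passages = 4
projected = suspension_yield * expansion_per_passage ** n_passages
print(f"projected after {n_passages} passages: {projected:,} cells")
```

Even modest per-passage expansion compounds quickly, which is why a scalable, non-dissociative passaging step matters for clinical-scale production.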
Neuroscience, Issue 88, neural progenitor cell, neural precursor cell, neural stem cell, passaging, neurosphere, chopping, stem cell, neuroscience, suspension culture, good manufacturing practice, GMP
Differentiation of Newborn Mouse Skin Derived Stem Cells into Germ-like Cells In vitro
Institutions: The University of Western Ontario, Children's Health Research Institute.
Studying germ cell formation and differentiation has traditionally been very difficult due to low cell numbers and their location deep within developing embryos. The availability of a "closed" in vitro-based system could prove invaluable for our understanding of gametogenesis. The formation of oocyte-like cells (OLCs) from somatic stem cells, isolated from newborn mouse skin, has been demonstrated and can be visualized in this video protocol. The resulting OLCs express various markers consistent with oocytes, such as Oct4, Vasa, Bmp15, and Scp3. However, they remain unable to undergo maturation or fertilization due to a failure to complete meiosis. This protocol will provide a system that is useful for studying the early-stage formation and differentiation of germ cells into more mature gametes. During early differentiation the number of cells expressing Oct4 (potential germ-like cells) reaches ~5%; however, currently the formation of OLCs remains relatively inefficient. The protocol is relatively straightforward, though special care should be taken to ensure the starting cell population is healthy and at an early passage.
Stem Cell Biology, Issue 77, Developmental Biology, Cellular Biology, Molecular Biology, Bioengineering, Biomedical Engineering, Medicine, Physiology, Adult Stem Cells, Pluripotent Stem Cells, Germ Cells, Oocytes, Reproductive Physiological Processes, Stem cell, skin, germ cell, oocyte, cell, differentiation, cell culture, mouse, animal model
Neonatal Subventricular Zone Electroporation
Institutions: Yale University School of Medicine .
Neural stem cells (NSCs) line the postnatal lateral ventricles and give rise to multiple cell types, including neurons, astrocytes, and ependymal cells [1]. Understanding the molecular pathways responsible for NSC self-renewal, commitment, and differentiation is critical for harnessing their unique potential to repair the brain and better understand central nervous system disorders. Previous methods for the manipulation of mammalian systems required the time-consuming and expensive endeavor of genetic engineering at the whole-animal level [2]. Thus, the vast majority of studies have explored the functions of NSC molecules in vitro or in invertebrates.
Here, we demonstrate a simple and rapid technique to manipulate neonatal NSCs that is referred to as neonatal subventricular zone (SVZ) electroporation. Similar techniques were developed a decade ago to study embryonic NSCs and have aided studies on cortical development [3,4]. More recently, this approach was applied to study the postnatal rodent forebrain [5-7]. This technique results in robust labeling of SVZ NSCs and their progeny. Thus, postnatal SVZ electroporation provides a cost- and time-effective alternative for mammalian NSC genetic engineering.
Neuroscience, Issue 72, Developmental Biology, Neurobiology, Molecular Biology, Cellular Biology, Physiology, Anatomy, Biomedical Engineering, Stem Cell Biology, Genetics, Neurogenesis, Growth and Development, Surgery, Subventricular Zone, Electroporation, Neural Stem Cells, NSC, subventricular zone, brain, DNA, injection, genetic engineering, neonatal pups, animal model
Depletion of Ribosomal RNA for Mosquito Gut Metagenomic RNA-seq
Institutions: New Mexico State University.
The mosquito gut accommodates dynamic microbial communities across different stages of the insect's life cycle. Characterization of the genetic capacity and functionality of the gut community will provide insight into the effects of gut microbiota on mosquito life traits. Metagenomic RNA-Seq has become an important tool to analyze transcriptomes from various microbes present in a microbial community. Messenger RNA usually comprises only 1-3% of total RNA, while rRNA constitutes approximately 90%. It is challenging to enrich messenger RNA from a metagenomic microbial RNA sample because most prokaryotic mRNA species lack stable poly(A) tails, which prevents oligo d(T)-mediated mRNA isolation. Here, we describe a protocol that employs sample-derived rRNA capture probes to remove rRNA from a metagenomic total RNA sample. To begin, both mosquito and microbial small and large subunit rRNA fragments are amplified from a metagenomic community DNA sample. Then, the community-specific biotinylated antisense ribosomal RNA probes are synthesized in vitro using T7 RNA polymerase. The biotinylated rRNA probes are hybridized to the total RNA. The hybrids are captured by streptavidin-coated beads and removed from the total RNA. This subtraction-based protocol efficiently removes both mosquito and microbial rRNA from the total RNA sample. The mRNA-enriched sample is further processed for RNA amplification and RNA-Seq.
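The arithmetic behind the enrichment can be sketched as follows; the 95% capture efficiency and the exact starting composition are illustrative assumptions, not figures from the protocol:

```python
# Sketch: how subtractive rRNA depletion shifts the mRNA fraction of a
# sample. Starting composition follows the rough figures above
# (~90% rRNA, ~2% mRNA, remainder other RNA species).
def mrna_fraction_after_depletion(rrna=0.90, mrna=0.02, capture_efficiency=0.95):
    """mRNA fraction of the sample after removing captured rRNA."""
    remaining_rrna = rrna * (1.0 - capture_efficiency)
    other = 1.0 - rrna - mrna
    total = remaining_rrna + mrna + other
    return mrna / total

before = 0.02
after = mrna_fraction_after_depletion()
print(f"mRNA fraction: {before:.1%} -> {after:.1%}")
```

Under these assumptions a single 95%-efficient subtraction raises the mRNA fraction roughly seven-fold, which is why depletion efficiency dominates the cost-effectiveness of downstream RNA-Seq.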
Genetics, Issue 74, Infection, Infectious Diseases, Molecular Biology, Cellular Biology, Microbiology, Genomics, biology (general), genetics (animal and plant), life sciences, Eukaryota, Bacteria, metagenomics, metatranscriptome, RNA-seq, rRNA depletion, mRNA enrichment, mosquito gut microbiome, RNA, DNA, sequencing
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD).
Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices, and medial temporal regions.
Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder