Current commercial PCR tests for identifying Salmonella target genes unique to this genus. However, there are two species, six subspecies, and over 2,500 different Salmonella serovars, and not all are equal in their significance to public health. For example, finding S. enterica subspecies IIIa Arizona on a table egg layer farm is insignificant compared to the isolation of S. enterica subspecies I serovar Enteritidis, the leading cause of salmonellosis linked to the consumption of table eggs. Serovars are identified based on antigenic differences in lipopolysaccharide (LPS) (O antigen) and flagellin (H1 and H2 antigens). These antigenic differences are the outward appearance of the diversity of genes and gene alleles associated with this phenotype.
We have developed an allelotyping, multiplex PCR that keys on genetic differences between four major S. enterica subspecies I serovars found in poultry and associated with significant human disease in the US. The PCR primer pairs were targeted to key genes or sequences unique to a specific Salmonella serovar and designed to produce an amplicon of a size specific for that gene or allele. A Salmonella serovar is assigned to an isolate based on the combination of PCR test results for specific LPS and flagellin gene alleles. The multiplex PCRs described in this article are specific for the detection of S. enterica subspecies I serovars Enteritidis, Hadar, Heidelberg, and Typhimurium.
Here we demonstrate how to use the multiplex PCRs to identify serovar for a Salmonella isolate.
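The assignment logic described above, in which a serovar is called from the combination of allele-specific amplicons, can be sketched as a simple lookup. The marker names and pattern-to-serovar mapping below are illustrative placeholders, not the published primer targets:

```python
# Hypothetical sketch: assigning a serovar from multiplex PCR band results.
# Marker names and patterns are invented for illustration; the actual assay
# targets specific LPS (O-antigen) and flagellin (H1/H2) gene alleles.

# Each serovar is defined by a combination of allele-specific amplicons.
SEROVAR_PATTERNS = {
    frozenset({"O9_lps", "fliC_gm"}): "Enteritidis",
    frozenset({"O4_lps", "fliC_i", "fljB_12"}): "Typhimurium",
    frozenset({"O8_lps", "fliC_z10"}): "Hadar",
    frozenset({"O4_lps", "fliC_r"}): "Heidelberg",
}

def assign_serovar(detected_amplicons):
    """Return the serovar whose full allele pattern matches the bands seen."""
    observed = frozenset(detected_amplicons)
    for pattern, serovar in SEROVAR_PATTERNS.items():
        if pattern == observed:
            return serovar
    return "not assigned (pattern does not match a targeted serovar)"

print(assign_serovar(["O9_lps", "fliC_gm"]))  # Enteritidis
```

Requiring an exact pattern match, rather than any single band, mirrors the article's point that serovar is assigned from the combination of test results.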
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) are significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Researchers across incredibly diverse fields are applying phylogenetics to their research questions. However, many are new to the topic, which presents inherent challenges. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article could serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
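As a minimal illustration of the tree-building end of such a pipeline, the toy example below computes pairwise p-distances from a small set of pre-aligned sequences and clusters them with UPGMA. This only sketches the underlying logic on hand-made data; real analyses should use the dedicated tools the article describes (similarity search, alignment, model selection, and maximum likelihood or Bayesian inference):

```python
# Toy distance-based tree building from a pre-aligned sequence set.
# Sequences and taxon names are invented for this sketch.

aligned = {
    "taxonA": "ATGCTAGCTA",
    "taxonB": "ATGCTAGCTT",
    "taxonC": "ATGGTAGCTT",
    "taxonD": "TTGGTAGGTT",
}

def p_distance(s1, s2):
    """Proportion of aligned sites that differ (a simple p-distance)."""
    diffs = sum(a != b for a, b in zip(s1, s2))
    return diffs / len(s1)

def upgma(seqs):
    """Return a nested-tuple tree built by average-linkage clustering."""
    clusters = {name: (name,) for name in seqs}  # leaf tips per cluster key
    dist = {
        (a, b): p_distance(seqs[a], seqs[b])
        for a in seqs for b in seqs if a < b
    }
    def d(x, y):
        # average pairwise distance between the tips of two clusters
        pairs = [(min(a, b), max(a, b)) for a in clusters[x] for b in clusters[y]]
        return sum(dist[p] for p in pairs) / len(pairs)
    nodes = {name: name for name in seqs}
    while len(nodes) > 1:
        names = sorted(nodes)
        # join the closest pair of clusters
        x, y = min(
            ((a, b) for a in names for b in names if a < b),
            key=lambda pair: d(*pair),
        )
        merged = (nodes.pop(x), nodes.pop(y))
        new = x + y
        clusters[new] = clusters[x] + clusters[y]
        nodes[new] = merged
    return next(iter(nodes.values()))

print(upgma(aligned))  # ((('taxonA', 'taxonB'), 'taxonC'), 'taxonD')
```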
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Scalable Nanohelices for Predictive Studies and Enhanced 3D Visualization
Institutions: University of California Merced, University of California Merced.
Spring-like materials are ubiquitous in nature and of interest in nanotechnology for energy harvesting, hydrogen storage, and biological sensing applications. For predictive simulations, it has become increasingly important to be able to model the structure of nanohelices accurately. To study the effect of local structure on the properties of these complex geometries one must develop realistic models. To date, software packages are rather limited in creating atomistic helical models. This work focuses on producing atomistic models of silica glass (SiO2) nanoribbons and nanosprings for molecular dynamics (MD) simulations. Using an MD model of "bulk" silica glass, two computational procedures to precisely create the shape of nanoribbons and nanosprings are presented. The first method employs the AWK programming language and open-source software to effectively carve various shapes of silica nanoribbons from the initial bulk model, using desired dimensions and parametric equations to define a helix. With this method, accurate atomistic silica nanoribbons can be generated for a range of pitch values and dimensions. The second method involves a more robust code which allows flexibility in modeling nanohelical structures. This approach utilizes a C++ code particularly written to implement pre-screening methods as well as the mathematical equations for a helix, resulting in greater precision and efficiency when creating nanospring models. Using these codes, well-defined and scalable nanoribbons and nanosprings suited for atomistic simulations can be effectively created. An added value in both open-source codes is that they can be adapted to reproduce different helical structures, independent of material. In addition, a MATLAB graphical user interface (GUI) is used to enhance learning through visualization and interaction for a general user with the atomistic helical structures. One application of these methods is the recent study of nanohelices via MD simulations for mechanical energy harvesting purposes.
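The carving idea behind the first method, keeping only those atoms of a bulk model that lie within a cutoff distance of a parametric helix, can be sketched as follows. The random "bulk" coordinates, dimensions, and cutoff are placeholders rather than a real silica glass model; the published AWK and C++ codes implement this far more carefully:

```python
# Sketch of carving a nanospring from a bulk atomistic model by keeping
# atoms near a parametric helix. All numbers here are placeholders.
import math
import random

def helix_point(t, radius, pitch):
    """Point on a helix: x,y trace a circle, z advances by pitch per turn."""
    return (radius * math.cos(t), radius * math.sin(t),
            pitch * t / (2 * math.pi))

def carve_helix(atoms, radius, pitch, turns, cutoff, samples=400):
    """Return the atoms lying within `cutoff` of the sampled helix curve."""
    ts = [turns * 2 * math.pi * i / samples for i in range(samples + 1)]
    curve = [helix_point(t, radius, pitch) for t in ts]
    kept = []
    for atom in atoms:
        if any(math.dist(atom, p) <= cutoff for p in curve):
            kept.append(atom)
    return kept

random.seed(0)  # stand-in "bulk" model: random points in a box
bulk = [(random.uniform(-15, 15), random.uniform(-15, 15),
         random.uniform(0, 30)) for _ in range(2000)]
nanospring = carve_helix(bulk, radius=10.0, pitch=15.0, turns=2, cutoff=2.0)
print(len(nanospring), "of", len(bulk), "atoms kept")
```

Varying `radius`, `pitch`, and `turns` reproduces the range of helix geometries the article describes; a production code would also handle bond connectivity and surface termination.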
Physics, Issue 93, helical atomistic models, open-source coding, graphical user interface, visualization software, molecular dynamics simulations, graphical processing unit accelerated simulations
Polysome Fractionation and Analysis of Mammalian Translatomes on a Genome-wide Scale
Institutions: McGill University, Karolinska Institutet, McGill University.
mRNA translation plays a central role in the regulation of gene expression and represents the most energy consuming process in mammalian cells. Accordingly, dysregulation of mRNA translation is considered to play a major role in a variety of pathological states including cancer. Ribosomes also host chaperones, which facilitate folding of nascent polypeptides, thereby modulating function and stability of newly synthesized polypeptides. In addition, emerging data indicate that ribosomes serve as a platform for a repertoire of signaling molecules, which are implicated in a variety of post-translational modifications of newly synthesized polypeptides as they emerge from the ribosome, and/or components of translational machinery. Herein, a well-established method of ribosome fractionation using sucrose density gradient centrifugation is described. In conjunction with the in-house developed “anota” algorithm this method allows direct determination of differential translation of individual mRNAs on a genome-wide scale. Moreover, this versatile protocol can be used for a variety of biochemical studies aiming to dissect the function of ribosome-associated protein complexes, including those that play a central role in folding and degradation of newly synthesized polypeptides.
Biochemistry, Issue 87, Cells, Eukaryota, Nutritional and Metabolic Diseases, Neoplasms, Metabolic Phenomena, Cell Physiological Phenomena, mRNA translation, ribosomes, protein synthesis, genome-wide analysis, translatome, mTOR, eIF4E, 4E-BP1
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
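A triage of this kind can be pictured as a small decision helper that maps a few objective data set characteristics to one of the four categorical approaches. The thresholds, parameter names, and rules below are invented for illustration; the article's actual scheme also weighs subjective personal criteria:

```python
# Hypothetical triage sketch: choose a segmentation strategy from rough
# data set traits. Thresholds and rules are illustrative only.

def suggest_approach(snr, has_characteristic_shapes, feature_fraction):
    """Map rough data set traits to one of the four categorical approaches.

    snr: approximate signal-to-noise ratio of the volume
    has_characteristic_shapes: whether features have easily identified shapes
    feature_fraction: fraction of the volume occupied by the region of interest
    """
    if snr >= 10 and has_characteristic_shapes:
        # clean data with recognizable features suits automated algorithms
        return "automated custom-designed segmentation"
    if snr >= 5:
        return "semi-automated segmentation + surface rendering"
    if feature_fraction < 0.05:
        # sparse features in noisy data: trace them by hand
        return "manual tracing + surface rendering"
    return "fully manual model building + visualization"

print(suggest_approach(snr=12, has_characteristic_shapes=True,
                       feature_fraction=0.2))
```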
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Culturing and Maintaining Clostridium difficile in an Anaerobic Environment
Institutions: Emory University School of Medicine.
Clostridium difficile is a Gram-positive, anaerobic, sporogenic bacterium that is primarily responsible for antibiotic-associated diarrhea (AAD) and is a significant nosocomial pathogen. C. difficile is notoriously difficult to isolate and cultivate and is extremely sensitive to even low levels of oxygen in the environment. Here, methods for isolating C. difficile from fecal samples and subsequently culturing C. difficile for preparation of glycerol stocks for long-term storage are presented. Techniques for preparing and enumerating spore stocks in the laboratory for a variety of downstream applications including microscopy and animal studies are also described. These techniques necessitate an anaerobic chamber, which maintains a consistent anaerobic environment to ensure proper conditions for optimal C. difficile growth. We provide protocols for transferring materials in and out of the chamber without causing significant oxygen contamination, along with suggestions for regular maintenance required to sustain the appropriate anaerobic environment for efficient and consistent C. difficile growth.
Immunology, Issue 79, Genetics, Bacteria, Anaerobic, Gram-Positive Endospore-Forming Rods, Spores, Bacterial, Gram-Positive Bacterial Infections, Clostridium Infections, Bacteriology, Clostridium difficile, Gram-positive, anaerobic chamber, spore, culturing, maintenance, cell culture
Multiplex PCR Assay for Typing of Staphylococcal Cassette Chromosome Mec Types I to V in Methicillin-resistant Staphylococcus aureus
Institutions: Alberta Health Services / Calgary Laboratory Services / University of Calgary, University of Calgary, University of Calgary, University of Calgary, University of Calgary.
Staphylococcal Cassette Chromosome mec (SCCmec) typing is a very important molecular tool for understanding the epidemiology and clonal strain relatedness of methicillin-resistant Staphylococcus aureus (MRSA), particularly with the emerging outbreaks of community-associated MRSA (CA-MRSA) occurring on a worldwide basis. Traditional PCR typing schemes classify SCCmec by targeting and identifying the individual mec gene complex types, but require the use of many primer sets and multiple individual PCR experiments. We designed and published a simple multiplex PCR assay for quick screening of major SCCmec types and subtypes I to V, and later updated it as new sequence information became available. This simple assay targets individual SCCmec types in a single reaction, is easy to interpret, and has been extensively used worldwide. However, due to the sophisticated nature of the assay and the large number of primers present in the reaction, there is the potential for difficulties while adapting this assay to individual laboratories. To facilitate the process of establishing an MRSA SCCmec assay, here we demonstrate how to set up our multiplex PCR assay and discuss some of the vital steps and procedural nuances that make it successful.
Infection, Issue 79, Microbiology, Genetics, Medicine, Cellular Biology, Molecular Biology, Biomedical Engineering, Bacteria, Bacterial Infections and Mycoses, Life Sciences (General), Methicillin-resistant Staphylococcus aureus (MRSA), Staphylococcal cassette chromosome mec (SCCmec), SCCmec typing, Multiplex PCR, PCR, sequencing
Enteric Bacterial Invasion Of Intestinal Epithelial Cells In Vitro Is Dramatically Enhanced Using a Vertical Diffusion Chamber Model
Institutions: London School of Hygiene & Tropical Medicine.
The interactions of bacterial pathogens with host cells have been investigated extensively using in vitro cell culture methods. However, as such cell culture assays are performed under aerobic conditions, these in vitro models may not accurately represent the in vivo environment in which the host-pathogen interactions take place. We have developed an in vitro model of infection that permits the coculture of bacteria and host cells under different medium and gas conditions. The Vertical Diffusion Chamber (VDC) model mimics the conditions in the human intestine, where bacteria will be under conditions of very low oxygen whilst tissue will be supplied with oxygen from the blood stream. Placing polarized intestinal epithelial cell (IEC) monolayers grown in Snapwell inserts into a VDC creates separate apical and basolateral compartments. The basolateral compartment is filled with cell culture medium, sealed and perfused with oxygen, whilst the apical compartment is filled with broth, kept open and incubated under microaerobic conditions. Both Caco-2 and T84 IECs can be maintained in the VDC under these conditions without any apparent detrimental effects on cell survival or monolayer integrity. Coculturing experiments performed with different C. jejuni wild-type strains and different IEC lines in the VDC model with microaerobic conditions in the apical compartment reproducibly result in an increase in the number of interacting (almost 10-fold) and intracellular (almost 100-fold) bacteria compared to aerobic culture conditions [1]. The environment created in the VDC model more closely mimics the environment encountered by C. jejuni in the human intestine and highlights the importance of performing in vitro infection assays under conditions that more closely mimic the in vivo reality. We propose that use of the VDC model will allow new interpretations of the interactions between bacterial pathogens and host cells.
Infection, Issue 80, Gram-Negative Bacteria, Bacterial Infections, Gastrointestinal Diseases, Campylobacter jejuni, bacterial invasion, intestinal epithelial cells, models of infection
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Use of Artificial Sputum Medium to Test Antibiotic Efficacy Against Pseudomonas aeruginosa in Conditions More Relevant to the Cystic Fibrosis Lung
Institutions: University of Liverpool , University of Liverpool .
There is growing concern about the relevance of in vitro antimicrobial susceptibility tests when applied to isolates of P. aeruginosa from cystic fibrosis (CF) patients. Existing methods rely on single or a few isolates grown aerobically and planktonically. Predetermined cut-offs are used to define whether the bacteria are sensitive or resistant to any given antibiotic [1]. However, during chronic lung infections in CF, P. aeruginosa populations exist in biofilms and there is evidence that the environment is largely microaerophilic [2]. The stark difference in conditions between bacteria in the lung and those during diagnostic testing has called into question the reliability and even relevance of these tests [3].
Artificial sputum medium (ASM) is a culture medium containing the components of CF patient sputum, including amino acids, mucin and free DNA. P. aeruginosa
growth in ASM mimics growth during CF infections, with the formation of self-aggregating biofilm structures and population divergence4,5,6
. The aim of this study was to develop a microtitre-plate assay to study antimicrobial susceptibility of P. aeruginosa
based on growth in ASM, which is applicable to both microaerophilic and aerobic conditions.
An ASM assay was developed in a microtitre plate format. P. aeruginosa
biofilms were allowed to develop for 3 days prior to incubation with antimicrobial agents at different concentrations for 24 hours. After biofilm disruption, cell viability was measured by staining with resazurin. This assay was used to ascertain the sessile cell minimum inhibitory concentration (SMIC) of tobramycin for 15 different P. aeruginosa
isolates under aerobic and microaerophilic conditions, and SMIC values were compared to those obtained with standard broth growth. Whilst there was some evidence of increased MIC values for isolates grown in ASM compared with their planktonic counterparts, the biggest differences were found with bacteria tested under microaerophilic conditions, which showed greatly increased resistance (up to >128-fold) to tobramycin in the ASM system compared with assays carried out under aerobic conditions.
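The fold-change comparison above can be sketched as a simple calculation over MIC values from the same doubling-dilution series; the function name and the concentrations below are hypothetical, for illustration only:

```python
# Minimal sketch of comparing a sessile-cell MIC (SMIC) from the ASM assay
# against a reference planktonic broth MIC. Values are hypothetical.
def fold_change(smic, reference_mic):
    """Fold difference between two MICs from the same dilution series."""
    return smic / reference_mic

# e.g. planktonic broth MIC of 1 mg/L vs. an ASM microaerophilic SMIC of 128 mg/L
print(fold_change(128, 1))  # 128.0, i.e. a 128-fold increase in resistance
```

Because MICs are read from doubling dilutions, fold changes come out as powers of two; a >128-fold value means the SMIC exceeded the highest concentration tested.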
The lack of association between current susceptibility testing methods and clinical outcome has called into question the validity of current methods3
. Several in vitro
models have been used previously to study P. aeruginosa
. However, these methods rely on surface-attached biofilms, whereas the ASM biofilms resemble those observed in the CF lung9
. In addition, reduced oxygen concentration in the mucus has been shown to alter the behavior of P. aeruginosa2
and affect antibiotic susceptibility10
. Therefore using ASM under microaerophilic conditions may provide a more realistic environment in which to study antimicrobial susceptibility.
Immunology, Issue 64, Microbiology, Pseudomonas aeruginosa, antimicrobial susceptibility, artificial sputum media, lung infection, cystic fibrosis, diagnostics, plankton
Capsular Serotyping of Streptococcus pneumoniae by Latex Agglutination
Institutions: Murdoch Childrens Research Institute, The University of Melbourne.
Latex agglutination reagents are widely used in microbial diagnosis, identification and serotyping. Streptococcus pneumoniae
(the pneumococcus) is a major cause of morbidity and mortality world-wide. Current vaccines target the pneumococcal capsule, and there are over 90 capsular serotypes. Serotyping pneumococcal isolates is therefore important for assessing the impact of vaccination programs and for epidemiological purposes. The World Health Organization has recommended latex agglutination as an alternative method to the ‘gold standard’ Quellung test for serotyping pneumococci. Latex agglutination is a relatively simple, quick and inexpensive method; and is therefore suitable for resource-poor settings as well as laboratories with high-volume workloads. Latex agglutination reagents can be prepared in-house utilizing commercially-sourced antibodies that are passively attached to latex particles. This manuscript describes a method of production and quality control of latex agglutination reagents, and details a sequential testing approach which is time- and cost-effective.
This method of production and quality control may also be suitable for other testing purposes.
Immunology, Issue 91, Antisera, pneumococci, polysaccharide capsule, slide agglutination
Multiplex PCR and Reverse Line Blot Hybridization Assay (mPCR/RLB)
Institutions: University of Sydney.
Multiplex PCR/Reverse Line Blot Hybridization assay allows the detection of up to 43 molecular targets in 43 samples using one multiplex PCR reaction followed by probe hybridization on a nylon membrane, which is re-usable. Probes are 5' amine modified to allow fixation to the membrane. Primers are 5' biotin modified which allows detection of hybridized PCR products using streptavidin-peroxidase and a chemiluminescent substrate via photosensitive film. With low setup and consumable costs, this technique is inexpensive (approximately US$2 per sample), high throughput (multiple membranes can be processed simultaneously) and has a short turnaround time (approximately 10 hours).
The technique can be utilized in a number of ways. Multiple probes can be designed to detect sequence variation within a single amplified product, or multiple products can be amplified
simultaneously, with one (or more) probes used for subsequent detection. A combination of both approaches can also be used within a single assay. The ability to include multiple probes for a single target sequence makes the assay highly specific.
Published applications of mPCR/RLB include detection of antibiotic resistance genes1,2
, typing of methicillin-resistant Staphylococcus aureus3-5
, molecular serotyping of Streptococcus pneumoniae7,8
, Streptococcus agalactiae9
, identification of Mycobacterium
, detection of genital13-15
and respiratory tract16
pathogens and detection and identification of mollicutes18
. However, the versatility of the technique means the applications are virtually limitless and not restricted to molecular analysis of micro-organisms.
The five steps in mPCR/RLB are a) Primer and Probe design, b) DNA extraction and PCR amplification c) Preparation of the membrane, d) Hybridization and detection, and e) Regeneration of the Membrane.
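The read-out in step d) above amounts to interpreting a grid of hybridization signals; a minimal sketch follows, in which the data layout and probe names are hypothetical illustrations, not part of the published assay:

```python
# Minimal sketch of interpreting a reverse line blot: rows are samples,
# columns are probes, and a cell is True where a chemiluminescent
# hybridization signal is seen on the film. Layout and names are hypothetical.
def call_targets(grid, probe_names):
    """Return, for each sample, the names of probes with a positive signal."""
    return [
        [name for name, positive in zip(probe_names, row) if positive]
        for row in grid
    ]

probes = ["probe_A", "probe_B", "probe_C"]
grid = [
    [True, False, False],  # sample 1: only probe_A positive
    [False, True, True],   # sample 2: probe_B and probe_C positive
]
print(call_targets(grid, probes))  # [['probe_A'], ['probe_B', 'probe_C']]
```

In the real assay the grid is up to 43 x 43, and including several probes per target lets a call be made only when the expected combination of probes is positive.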
Molecular Biology, Issue 54, Typing, MRSA, macroarray, molecular epidemiology
Particle Agglutination Method for Poliovirus Identification
Institutions: National Institute of Infectious Diseases, Fujirebio Inc..
In the Global Polio Eradication Initiative, laboratory diagnosis plays a critical role by isolating and identifying poliovirus (PV) from the stool samples of acute flaccid paralysis (AFP) cases. In the World Health Organization (WHO) Global Polio Laboratory Network, PV isolation and identification are currently performed by using a cell culture system and real-time RT-PCR, respectively. In the post-eradication era of PV, simple and rapid identification procedures would be helpful for rapid confirmation of polio cases at the national laboratories. In the present study, we show the procedure of a novel PA assay developed for PV identification. This PA assay utilizes the interaction between the PV receptor (PVR) molecule and the virion, which has specific and uniform affinity for all the serotypes of PV. The procedure is simple (a one-step procedure in reaction plates) and rapid (results can be obtained within 2 h of reaction), and the result is visually observed (agglutination of gelatin particles).
Immunology, Issue 50, Poliovirus, identification, particle agglutination, virus receptor
DNA Fingerprinting of Mycobacterium leprae Strains Using Variable Number Tandem Repeat (VNTR) - Fragment Length Analysis (FLA)
Institutions: Colorado State University.
The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae
, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO1
, leprosy remains endemic in many countries with approximately 250,000 new cases each year.2
The entire M. leprae
genome has been mapped3,4
and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites).5
Clinical strains of M. leprae
may vary in the number of tandem repeated segments (short tandem repeats, STR) at many of these loci.5,6,7
Variable number tandem repeat (VNTR)5
analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes in the same patient. While the variability of certain VNTRs has raised questions regarding their suitability for strain typing7,8,9
, the emerging data suggest that analyzing multiple loci, which are diverse in their stability, can be used as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA)10
has been used to study leprosy evolution and transmission in several countries including China11,12
, the Philippines10,13
, and Brazil14
. MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS).10
The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently-labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions.10
The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner
software and fragment length is converted to number of VNTR copies (allele). Finally, the VNTR haplotypes are analyzed for patterns, and when combined with patient clinical data can be used to track distribution of strain types.
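The conversion from fragment length to VNTR copy number described above can be sketched as follows; the locus parameters (combined flanking length and repeat-unit length) are hypothetical, chosen only for illustration:

```python
# Minimal sketch of converting an FLA-sized fragment length (bp) into a VNTR
# repeat-copy number (allele). The hypothetical locus has 80 bp of combined
# flanking sequence and a 21 bp repeat unit.
def vntr_copies(fragment_len, flank_len, repeat_len):
    """Number of whole tandem-repeat copies implied by a fragment length."""
    copies, remainder = divmod(fragment_len - flank_len, repeat_len)
    if remainder:
        raise ValueError("fragment length inconsistent with whole repeat copies")
    return copies

print(vntr_copies(143, 80, 21))  # 3 copies at this hypothetical locus
```

Raising an error on a non-integral copy number mirrors the role of the known-copy-number positive control: a sized fragment that does not fit whole repeats flags a sizing or locus-assignment problem rather than a new allele.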
Immunology, Issue 53, Mycobacterium leprae, leprosy, biopsy, STR, VNTR, PCR, fragment length analysis
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Institutions: University of Bern, MCL Laboratories Inc..
Tuberculosis (TB) due to Mycobacterium tuberculosis
(MTB) remains a major public health issue: the infection affects up to one third of the world population1
, and almost two million people are killed by TB each year.2
Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3
The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are a prerequisite for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2
Effective diagnosis of pulmonary TB requires the availability - on a global scale - of standardized, easy-to-use, and robust diagnostic tools that would allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR TB) and has been reported in > 95% of the MDR-TB isolates.4, 5
The rapid availability of reliable test results is likely to directly translate into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2
Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6, 7
meets the demands outlined above in a remarkable manner. It is a nucleic-acids amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB
gene. It is designed for use with Cepheid's GeneXpert Dx System that integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, personal computer, barcode scanner, and preloaded software for running tests and viewing the results.9
It employs single-use disposable Xpert MTB/RIF cartridges that hold PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6
Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3
, provided that such settings can afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
TransFLP — A Method to Genetically Modify Vibrio cholerae Based on Natural Transformation and FLP-recombination
Institutions: Ecole Polytechnique Fédérale de Lausanne (EPFL).
Several methods are available to manipulate bacterial chromosomes1-3
. Most of these protocols rely on the insertion of conditionally replicative plasmids (e.g.
harboring pir-dependent or temperature-sensitive replicons1,2
). These plasmids are integrated into bacterial chromosomes based on homology-mediated recombination. Such insertional mutants are often directly used in experimental settings. Alternatively, selection for plasmid excision followed by its loss can be performed, which for Gram-negative bacteria often relies on the counter-selectable levan sucrase enzyme encoded by the sacB
. The excision can either restore the pre-insertion genotype or result in an exchange between the chromosome and the plasmid-encoded copy of the modified gene. A disadvantage of this technique is that it is time-consuming. The plasmid has to be cloned first; it requires horizontal transfer into V. cholerae
(most notably by mating with an E. coli
donor strain) or artificial transformation of the latter; and the excision of the plasmid is random and can either restore the initial genotype or create the desired modification if no positive selection is exerted. Here, we present a method for rapid manipulation of the V. cholerae
chromosome. This TransFLP method is based on the recently discovered chitin-mediated induction of natural competence in this organism6
and other representatives of the genus Vibrio
such as V. fischeri7
. Natural competence allows the uptake of free DNA including PCR-generated DNA fragments. Once taken up, the DNA recombines with the chromosome given the presence of a minimum of 250-500 bp of flanking homologous region8
. Including a selection marker in-between these flanking regions allows easy detection of frequently occurring transformants.
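The structure of a transforming fragment (flanking homology around an FRT-flanked selection marker) can be sketched as simple sequence concatenation; all sequences below are placeholders, and only the 250-500 bp homology requirement comes from the text:

```python
# Minimal sketch of assembling a TransFLP-style transforming fragment:
# upstream homology + FRT site + selection marker + FRT site + downstream
# homology. All sequences are placeholders, not real V. cholerae or FRT DNA.
MIN_FLANK = 250  # bp; the lower bound of flanking homology reported to suffice

def build_fragment(up_flank, frt_site, marker, down_flank):
    """Concatenate the parts, checking the flanking-homology length."""
    if len(up_flank) < MIN_FLANK or len(down_flank) < MIN_FLANK:
        raise ValueError(f"flanking homology must be at least {MIN_FLANK} bp")
    return up_flank + frt_site + marker + frt_site + down_flank

# placeholder sequences: 250 bp flanks, 34 bp FRT sites, 1 kb marker
fragment = build_fragment("A" * 250, "F" * 34, "M" * 1000, "T" * 250)
print(len(fragment))  # 1568 bp
```

Because the marker sits between two identically oriented FRT sites, Flp recombinase can later excise it, leaving a single FRT scar, which is what makes the selection marker reusable.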
This method can be used for different genetic manipulations of V. cholerae
and potentially also other naturally competent bacteria. We provide three novel examples of what can be accomplished by this method in addition to our previously published study on single gene deletions and the addition of affinity-tag sequences5
. Several optimization steps concerning the initial protocol of chitin-induced natural transformation6
are incorporated in this TransFLP protocol. These include among others the replacement of crab shell fragments by commercially available chitin flakes8
, the donation of PCR-derived DNA as transforming material9
, and the addition of FLP-recombination target sites (FRT)5
. FRT sites allow site-directed excision of the selection marker mediated by the Flp recombinase10.
Immunology, Issue 68, Microbiology, Genetics, natural transformation, DNA uptake, FLP recombination, chitin, Vibrio cholerae
Capsular Serotyping of Streptococcus pneumoniae Using the Quellung Reaction
Institutions: Murdoch Childrens Research Institute, The University of Melbourne.
There are over 90 different capsular serotypes of Streptococcus pneumoniae
(the pneumococcus). As well as being a tool for understanding pneumococcal epidemiology, capsular serotyping can provide useful information for vaccine efficacy and impact studies. The Quellung reaction is the gold standard method for pneumococcal capsular serotyping. The method involves testing a pneumococcal cell suspension with pooled and specific antisera directed against the capsular polysaccharide. The antigen-antibody reactions are observed microscopically. The protocol has three main steps: 1) preparation of a bacterial cell suspension, 2) mixing of cells and antisera on a glass slide, and 3) reading the Quellung reaction using a microscope. The Quellung reaction is reasonably simple to perform and can be applied wherever a suitable microscope and antisera are available.
Immunology, Issue 84, Streptococcus pneumoniae, Quellung, serotyping, Neufeld, pneumococcus
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Universite de Montreal.
There are several lines of evidence supporting the role of de novo
mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo
mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo
mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1
and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo
mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2
. This is not what one would predict in diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age related increase in paternal de novo
mutations. This is the case for autism and schizophrenia3
. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo
mutations would more frequently come from males, particularly older males4
. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been identified for a mere 3% of genes in the human genome. Identification of de novo
mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining if the genetic basis of a disease may result in part from de novo
mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
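The mutational-load argument above can be made concrete with the textbook mutation-selection-balance approximation; this is standard population genetics rather than a result from this article, and the rates used below are hypothetical:

```python
# Textbook mutation-selection balance: a deleterious allele persists at an
# equilibrium frequency where its removal by selection is offset by its
# replenishment through de novo mutation. Numbers below are hypothetical.
def equilibrium_frequency(mu, s, dominant=True):
    """Approximate equilibrium allele frequency.

    mu: de novo mutation rate per generation
    s:  selection coefficient (reduction in reproductive fitness)
    """
    return mu / s if dominant else (mu / s) ** 0.5

# e.g. mu = 1e-4 per generation, s = 0.5 (halved reproductive fitness)
print(equilibrium_frequency(1e-4, 0.5))  # 0.0002
```

The point of the sketch is qualitative: even under strong negative selection (large s), a sufficiently high de novo mutation rate mu keeps the equilibrium frequency, and hence disease prevalence, stable across generations.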
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing