An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research, and allow for the development of many useful applications in vector and pest control for both medical and agricultural entomology. Given this, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach allows us to efficiently learn classification models that are very robust to over-fitting; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate these findings with large-scale experiments that dwarf all previous work combined, as measured by the number of insects and the number of species considered.
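The classification approach described above can be sketched as a naive Bayes model: each feature, intrinsic (e.g. wingbeat frequency) or extrinsic (e.g. time of day), contributes an independent likelihood term, so incorporating a new feature only means adding one more factor to the product. The species names and feature statistics below are purely illustrative, not data from the study.

```python
import math

def gaussian_log_pdf(x, mean, sd):
    """Log of the normal density, used as a per-feature likelihood."""
    return -0.5 * math.log(2 * math.pi * sd * sd) - (x - mean) ** 2 / (2 * sd * sd)

# Hypothetical per-species feature models as (mean, sd) pairs; the wingbeat
# and circadian flight-time values are illustrative, not measured data.
SPECIES_MODELS = {
    "Aedes aegypti":  {"wingbeat_hz": (480, 40), "flight_hour": (18, 3)},
    "Culex tarsalis": {"wingbeat_hz": (350, 35), "flight_hour": (22, 2)},
}

def classify(observation, priors=None):
    """Naive Bayes: sum per-feature log likelihoods plus a log prior.
    New features are added by extending the model dicts, nothing else changes."""
    priors = priors or {s: 1.0 / len(SPECIES_MODELS) for s in SPECIES_MODELS}
    scores = {}
    for species, model in SPECIES_MODELS.items():
        score = math.log(priors[species])
        for feature, value in observation.items():
            mean, sd = model[feature]
            score += gaussian_log_pdf(value, mean, sd)
        scores[species] = score
    return max(scores, key=scores.get)
```

Because the per-feature terms simply add in log space, robustness to over-fitting comes from keeping each feature model simple rather than fitting joint distributions.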
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Institutions: Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed.
In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. We demonstrate a number of applications of white light interferometry for the investigation of surface modifications in mass spectrometry, and of wear phenomena in tribology and lubrication. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering) and laser irradiation (ablation), as well as ex situ measurements of wear of tribological test specimens.
Specifically, we will discuss:
Aspects of traditional ion sputtering-based mass spectrometry, such as measurements of sputtering rates/yields on Si and Cu and the subsequent time-to-depth conversion.
Results of quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface. These results are important for applications such as ablation mass spectrometry, where the quantities of evaporated material can be studied and controlled via pulse duration and energy per pulse. Thus, by determining the crater geometry one can define depth and lateral resolution versus experimental setup conditions.
Measurements of surface roughness parameters in two dimensions, and quantitative measurements of the surface wear that occurs as a result of friction and wear tests.
Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach will be discussed and explained.
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
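The first analysis step relies on Gabor filters, which can be illustrated minimally: a sinusoid along a chosen orientation modulated by a Gaussian envelope. The parameter defaults below are illustrative, not the values used in the study.

```python
import math

def gabor(x, y, theta, wavelength=4.0, sigma=2.0, gamma=0.5):
    """Real part of a Gabor filter: a cosine along orientation theta
    under an elliptical Gaussian envelope (gamma sets the aspect ratio)."""
    xr = x * math.cos(theta) + y * math.sin(theta)
    yr = -x * math.sin(theta) + y * math.cos(theta)
    envelope = math.exp(-(xr * xr + gamma * gamma * yr * yr) / (2 * sigma * sigma))
    return envelope * math.cos(2 * math.pi * xr / wavelength)

def gabor_kernel(size, theta):
    """Build a (2*size+1) x (2*size+1) kernel for one orientation; a bank of
    such kernels over many theta values yields the orientation field."""
    return [[gabor(x, y, theta) for x in range(-size, size + 1)]
            for y in range(-size, size + 1)]
```

Convolving a mammogram with a bank of such kernels at many orientations, then taking the orientation of maximal response per pixel, produces the oriented-texture field on which the phase portrait analysis operates.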
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Assessing Differences in Sperm Competitive Ability in Drosophila
Institutions: University of California, Irvine.
Competition among conspecific males for fertilizing the ova is one of the mechanisms of sexual selection, i.e. selection that operates on maximizing the number of successful mating events rather than on maximizing survival and viability1. Sperm competition represents the competition between males after copulating with the same female2, in which their sperm are coincidental in time and space. This phenomenon has been reported in multiple species of plants and animals3. For example, wild-caught D. melanogaster females usually contain sperm from 2-3 males4. The sperm are stored in specialized organs with limited storage capacity, which might lead to the direct competition of the sperm from different males2,5.
Comparing the sperm competitive ability of different males of interest (experimental male types) has been performed through controlled double-mating experiments in the laboratory6,7. Briefly, a single female is exposed to two different males consecutively, one experimental male and one cross-mating reference male. The same mating scheme is then followed using other experimental male types, thus facilitating the indirect comparison of the competitive ability of their sperm through a common reference. The fraction of individuals fathered by the experimental and reference males is identified using markers, which allows one to estimate sperm competitive ability using simple mathematical expressions7,8. In addition, sperm competitive ability can be estimated in two different scenarios depending on whether the experimental male is second or first to mate (offense and defense assay, respectively)9, which is assumed to be reflective of different competence attributes.
Here, we describe an approach that helps to interrogate the role of different genetic factors that putatively underlie the phenomenon of sperm competitive ability in D. melanogaster.
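In the basic case, the "simple mathematical expressions" reduce to the proportion of scored offspring sired by the experimental male, conventionally called P2 when he mates second (offense assay) and P1 when he mates first (defense assay). A minimal sketch:

```python
def sired_proportion(offspring_by_experimental, offspring_by_reference):
    """Proportion of offspring sired by the experimental male.
    In an offense assay this is P2 (experimental male mated second);
    in a defense assay it is P1 (experimental male mated first)."""
    total = offspring_by_experimental + offspring_by_reference
    if total == 0:
        raise ValueError("no scored offspring")
    return offspring_by_experimental / total
```

Because every experimental male type competes against the same reference male, these proportions are directly comparable across genotypes even though the males never compete with each other.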
Developmental Biology, Issue 78, Molecular Biology, Cellular Biology, Genetics, Biochemistry, Spermatozoa, Drosophila melanogaster, Biological Evolution, Phenotype, genetics (animal and plant), animal biology, double-mating experiment, sperm competitive ability, male fertility, Drosophila, fruit fly, animal model
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in that the associations are learned implicitly, while the reader reads text as he or she normally would, without explicit computer-directed training. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color, and that these associations are similar in some respects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Assessment of Age-related Changes in Cognitive Functions Using EmoCogMeter, a Novel Tablet-computer Based Approach
Institutions: Freie Universität Berlin, Charité Berlin, Psychiatric University Hospital Zurich.
The main goal of this study was to assess the usability of a tablet-computer-based application (EmoCogMeter) in investigating the effects of age on cognitive functions across the lifespan in a sample of 378 healthy subjects (age range 18-89 years). Consistent with previous findings we found an age-related cognitive decline across a wide range of neuropsychological domains (memory, attention, executive functions), thereby proving the usability of our tablet-based application. Regardless of prior computer experience, subjects of all age groups were able to perform the tasks without instruction or feedback from an experimenter. Increased motivation and compliance proved to be beneficial for task performance, thereby potentially increasing the validity of the results. Our promising findings underline the great clinical and practical potential of a tablet-based application for detection and monitoring of cognitive dysfunction.
Behavior, Issue 84, Neuropsychological Testing, cognitive decline, age, tablet-computer, memory, attention, executive functions
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Many researchers, across incredibly diverse research areas, are applying phylogenetics to their research questions. However, many of these researchers are new to the topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. We envision this article serving as a practical training tool for researchers embarking on phylogenetic studies, and also as an educational resource that could be incorporated into a classroom or teaching lab.
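As a taste of the quantitative underpinnings of the pipeline, the simplest measure of divergence between two aligned sequences is the p-distance, the proportion of differing sites. This toy sketch is only a starting point; the protocols in the article use model-based maximum likelihood and Bayesian methods rather than raw p-distances.

```python
def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences;
    gap-containing columns are skipped."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    compared = diffs = 0
    for a, b in zip(seq_a, seq_b):
        if a == "-" or b == "-":
            continue
        compared += 1
        if a != b:
            diffs += 1
    return diffs / compared

def distance_matrix(seqs):
    """Pairwise p-distances for a dict of aligned sequences, keyed by name pair."""
    names = sorted(seqs)
    return {(i, j): p_distance(seqs[i], seqs[j])
            for i in names for j in names if i < j}
```

Such a matrix is the input to distance-based tree methods; model-based distances additionally correct for multiple substitutions at the same site, which is why choosing a best-fit model of evolution is a step in the pipeline.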
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
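The oscillation technique rests on the compound-pendulum relation: the prosthesis is swung about a pivot, its period T is timed, and the moment of inertia about the pivot follows from I_pivot = m g d T² / (4π²), with the parallel-axis theorem removing the pivot offset d. The sketch below follows that standard physics; it makes no claim to match the authors' exact implementation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def inertia_from_period(mass, pivot_to_cm, period):
    """Compound-pendulum estimate of the moment of inertia about the
    center of mass: I_pivot = m*g*d*T^2 / (4*pi^2), then subtract m*d^2
    (parallel-axis theorem)."""
    i_pivot = mass * G * pivot_to_cm * period ** 2 / (4 * math.pi ** 2)
    return i_pivot - mass * pivot_to_cm ** 2

def period_from_inertia(mass, pivot_to_cm, i_cm):
    """Inverse relation, useful for sanity-checking the oscillation rig."""
    i_pivot = i_cm + mass * pivot_to_cm ** 2
    return 2 * math.pi * math.sqrt(i_pivot / (mass * G * pivot_to_cm))
```

In practice the reaction board supplies the center-of-mass location d and the mass m, and the oscillation test supplies T; timing many oscillations and averaging reduces the error in T, which enters squared.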
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Computerized Dynamic Posturography for Postural Control Assessment in Patients with Intermittent Claudication
Institutions: University of Sydney, University of Hull, Hull and East Yorkshire Hospitals, Addenbrookes Hospital.
Computerized dynamic posturography with the EquiTest is an objective technique for measuring postural strategies under challenging static and dynamic conditions. As part of a diagnostic assessment, the early detection of postural deficits is important so that appropriate and targeted interventions can be prescribed. The Sensory Organization Test (SOT) on the EquiTest determines an individual's use of the sensory systems (somatosensory, visual, and vestibular) that are responsible for postural control. Somatosensory and visual input are altered by the calibrated sway-referenced support surface and visual surround, which move in the anterior-posterior direction in response to the individual's postural sway. This creates a conflicting sensory experience. The Motor Control Test (MCT) challenges postural control by creating unexpected postural disturbances in the form of backwards and forwards translations. The translations are graded in magnitude and the time to recover from the perturbation is computed.
Intermittent claudication, the most common symptom of peripheral arterial disease, is characterized by a cramping pain in the lower limbs and caused by muscle ischemia secondary to reduced blood flow to working muscles during physical exertion. Claudicants often display poor balance, making them susceptible to falls and activity avoidance. The Ankle Brachial Pressure Index (ABPI) is a noninvasive method for indicating the presence of peripheral arterial disease. ABPI is measured as the highest systolic pressure from either the dorsalis pedis or posterior tibial artery divided by the highest brachial artery systolic pressure from either arm. This paper will focus on the use of computerized dynamic posturography in the assessment of balance in claudicants.
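The ABPI calculation described above is a simple ratio, sketched below. The interpretation cut-offs (0.9 and 1.3) are commonly used clinical thresholds, not values taken from this article.

```python
def abpi(dorsalis_pedis, posterior_tibial, left_brachial, right_brachial):
    """Ankle Brachial Pressure Index for one leg: highest ankle systolic
    pressure divided by the highest brachial systolic pressure (mmHg)."""
    return max(dorsalis_pedis, posterior_tibial) / max(left_brachial, right_brachial)

def interpret(index):
    """Commonly used cut-offs (illustrative; local guidelines vary)."""
    if index < 0.9:
        return "suggests peripheral arterial disease"
    if index > 1.3:
        return "suggests non-compressible, calcified arteries"
    return "within the normal range"
```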
Medicine, Issue 82, Posture, Computerized dynamic posturography, Ankle brachial pressure index, Peripheral arterial disease, Intermittent claudication, Balance, Posture, EquiTest, Sensory Organization Test, Motor Control Test
Detection of the Genome and Transcripts of a Persistent DNA Virus in Neuronal Tissues by Fluorescent In situ Hybridization Combined with Immunostaining
Institutions: CNRS UMR 5534, Université de Lyon 1, LabEX DEVweCAN, CNRS UPR 3296, CNRS UMR 5286.
Single cell codetection of a gene, its RNA product and cellular regulatory proteins is critical to study gene expression regulation. This is a challenge in the field of virology; in particular for nuclear-replicating persistent DNA viruses that involve animal models for their study. Herpes simplex virus type 1 (HSV-1) establishes a life-long latent infection in peripheral neurons. Latent virus serves as reservoir, from which it reactivates and induces a new herpetic episode. The cell biology of HSV-1 latency remains poorly understood, in part due to the lack of methods to detect HSV-1 genomes in situ in animal models. We describe a DNA-fluorescent in situ hybridization (FISH) approach that efficiently detects low-copy viral genomes within sections of neuronal tissues from infected animal models. The method relies on heat-based antigen unmasking, and directly labeled home-made DNA probes, or commercially available probes. We developed a triple staining approach, combining DNA-FISH with RNA-FISH and immunofluorescence, using peroxidase based signal amplification to accommodate each staining requirement. A major improvement is the ability to obtain, within 10 µm tissue sections, low-background signals that can be imaged at high resolution by confocal microscopy and wide-field conventional epifluorescence. Additionally, the triple staining worked with a wide range of antibodies directed against cellular and viral proteins. The complete protocol takes 2.5 days to accommodate antibody and probe penetration within the tissue.
Neuroscience, Issue 83, Life Sciences (General), Virology, Herpes Simplex Virus (HSV), Latency, In situ hybridization, Nuclear organization, Gene expression, Microscopy
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
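The starting point for any DoE software is the space of factor-level combinations, i.e. the full-factorial grid, which the optimal-design step then prunes to a manageable subset. The factor names and levels below are assumptions loosely inspired by the study, not its actual settings.

```python
from itertools import product

def full_factorial(factors):
    """All combinations of the given factor levels, as a list of dicts.
    DoE software selects an optimal subset of these runs; the full grid
    shows the space being sampled from."""
    names = sorted(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Illustrative factors (names and levels are assumptions, not study values):
factors = {
    "promoter": ["CaMV35S", "nos"],
    "incubation_temp_C": [22, 25, 28],
    "plant_age_d": [35, 42],
}
design = full_factorial(factors)
```

Splitting the initial problem into smaller modules, as the article describes, keeps this grid tractable: a module with 3 factors of 2-3 levels has at most a few dozen runs, whereas the joint grid over all factors would be far too large to test.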
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Accuracy in Dental Medicine: A New Way to Measure Trueness and Precision
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify many procedures. The main interest is to verify impression methods, as they serve as the basis for dental restorations. The current limitation of many reference scanners is a lack of accuracy when scanning large objects such as full dental arches, or a limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus-variation scanning technique, was evaluated with regard to its local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions. Different model materials were also verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision in the case of full-arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects such as single tooth surfaces can be scanned with even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many fields of dental research. The different magnification levels, combined with high local and general accuracy, can be used to assess changes from single teeth or restorations up to full-arch changes.
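Trueness and precision can be made concrete with a small sketch: trueness as the mean deviation of test scans from the reference scan, precision as the mean pairwise deviation among repeated scans of the same object. Real comparisons operate on registered 3D meshes; the aligned 1D point lists here are a deliberate simplification.

```python
from itertools import combinations

def mean_abs_deviation(scan, reference):
    """Mean absolute point-to-point deviation between two scans
    (assumes the point lists are already aligned/registered)."""
    return sum(abs(s - r) for s, r in zip(scan, reference)) / len(scan)

def trueness(scans, reference):
    """Average deviation of each test scan from the reference scan."""
    return sum(mean_abs_deviation(s, reference) for s in scans) / len(scans)

def precision(scans):
    """Average pairwise deviation among repeated scans of the same object,
    independent of any reference."""
    pairs = list(combinations(scans, 2))
    return sum(mean_abs_deviation(a, b) for a, b in pairs) / len(pairs)
```

The distinction matters for the article's argument: a scanner can be precise (repeats agree closely) yet not true (all repeats share the same systematic offset from the reference), so both figures are reported separately.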
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Patient-specific Modeling of the Heart: Estimation of Ventricular Fiber Orientations
Institutions: Johns Hopkins University.
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of the patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4°. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
Bioengineering, Issue 71, Biomedical Engineering, Medicine, Anatomy, Physiology, Cardiology, Myocytes, Cardiac, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, MRI, Diffusion Magnetic Resonance Imaging, Cardiac Electrophysiology, computerized simulation (general), mathematical modeling (systems analysis), Cardiomyocyte, biomedical image processing, patient-specific modeling, Electrophysiology, simulation
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape depending on the target protein1. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment2. Reliably identifying these regions was the focus of our work.
Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics3-5 to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)6-8. We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and the simple, finite-state classifications that are often utilized.
Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse followed by subsequent application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model, which can capably address the entire spectrum of data types.
To meet these objectives, we first constructed a statistical framework that naturally models ChIPseq data structures using a cutting-edge advance in HMMs9, which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which in turn define segments of enrichment. Our analysis revealed that our Bayesian Change Point (BCP) algorithm had reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters. This illustrates both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor10,11 and epigenetic data12 to illustrate its usefulness.
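The idea of segmenting a read-density profile at change points can be illustrated with a toy least-squares version: choose the split that minimizes the combined within-segment squared error. The published BCP method instead infers change points under a Bayesian model with infinite hidden states; this sketch shares only the segmentation idea, not the statistics.

```python
def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_change_point(signal):
    """Least-squares single change point: the index k splitting the signal
    into [0:k] and [k:] with minimal combined within-segment squared error.
    A toy analogue of change-point segmentation, not the BCP algorithm."""
    best_k, best_cost = None, float("inf")
    for k in range(1, len(signal)):
        cost = sse(signal[:k]) + sse(signal[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k
```

Applied recursively, such splits carve a profile into piecewise-constant segments whose elevated means mark enriched regions; the Bayesian formulation additionally quantifies uncertainty in the change-point locations.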
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses.
Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed1 count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of errors in hemocytometry include: uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation of the hemocytometer filling rate2.
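The arithmetic behind hemocytometry is worth making explicit, since several of the error sources listed above surface as mistakes in this step: each large square holds 0.1 µl (1 mm x 1 mm x 0.1 mm), so the mean count per square scales to cells per ml by a factor of 10^4, times any dilution applied.

```python
def cells_per_ml(total_cells_counted, squares_counted, dilution_factor=1.0):
    """Standard hemocytometer formula: each large square holds 0.1 ul,
    so mean cells per square * 10^4 gives cells per ml, scaled by the
    dilution factor (e.g. 2.0 for a 1:1 trypan blue dilution)."""
    mean_per_square = total_cells_counted / squares_counted
    return mean_per_square * dilution_factor * 1e4
```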
To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, or flow cytometry1. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments1. The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection3 in a miniaturized format using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated, cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics graphically displayed as a histogram.
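Under the Coulter principle, each impedance pulse amplitude is proportional to the particle's volume, from which a sphere-equivalent diameter and a size histogram follow. A minimal sketch (the bin width and units are illustrative, not the instrument's actual binning):

```python
import math

def diameter_um(volume_fl):
    """Sphere-equivalent diameter from a measured volume; 1 fl = 1 um^3.
    In Coulter counting the pulse amplitude is proportional to volume."""
    return (6.0 * volume_fl / math.pi) ** (1.0 / 3.0)

def size_histogram(volumes_fl, bin_width_um=1.0):
    """Bin sphere-equivalent diameters, mimicking the counter's
    histogram display of the cell population."""
    hist = {}
    for v in volumes_fl:
        b = int(diameter_um(v) // bin_width_um)
        hist[b] = hist.get(b, 0) + 1
    return hist
```

Gating this histogram by diameter is how such counters separate intact cells from debris and aggregates before reporting a concentration.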
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
Isolation of Fidelity Variants of RNA Viruses and Characterization of Virus Mutation Frequency
Institutions: Institut Pasteur.
RNA viruses use RNA-dependent RNA polymerases to replicate their genomes. The intrinsically high error rate of these enzymes is a large contributor to the extreme population diversity that facilitates virus adaptation and evolution. Increasing evidence shows that the intrinsic error rates, and the resulting mutation frequencies, of RNA viruses can be modulated by subtle amino acid changes to the viral polymerase. Although biochemical assays exist for some viral RNA polymerases that permit quantitative measurement of incorporation fidelity, here we describe a simple method of measuring the mutation frequencies of RNA viruses that has proven to be as accurate as biochemical approaches in identifying fidelity-altering mutations. The approach uses conventional virological and sequencing techniques that can be performed in most biology laboratories. Based on our experience with a number of different viruses, we have identified the key steps that must be optimized to increase the likelihood of isolating fidelity variants and generating data of statistical significance. The isolation and characterization of fidelity-altering mutations can provide new insights into polymerase structure and function1-3. Furthermore, these fidelity variants can be useful tools in characterizing mechanisms of virus adaptation and evolution4-7.
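A common way to express the mutation frequency from such sequencing data is mutations per 10,000 nucleotides sequenced, where the total nucleotide count is the number of independent clones multiplied by the length of the sequenced region. A minimal sketch of the calculation (the clone count and region length below are hypothetical):

```python
def mutation_frequency(total_mutations, total_nt_sequenced, per=10_000):
    """Mutation frequency expressed as mutations per `per` nucleotides
    sequenced, counted over independent clones."""
    return total_mutations * per / total_nt_sequenced

# hypothetical: 24 mutations found across 96 clones of an 800-nt region
print(mutation_frequency(24, 96 * 800))  # mutations per 10,000 nt
```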
Immunology, Issue 52, Polymerase fidelity, RNA virus, mutation frequency, mutagen, RNA polymerase, viral evolution
A Protocol for Computer-Based Protein Structure and Function Prediction
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All predictions are tagged with a confidence score that indicates how accurate they are expected to be in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distances and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be collected by users from experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Measurement of γHV68 Infection in Mice
Institutions: University of Southern California, Los Angeles.
γ-Herpesviruses (γ-HVs) are notable for their ability to establish latent infections of lymphoid cells1. The narrow host range of human γ-HVs, such as EBV and KSHV, has severely hindered detailed pathogenic studies. Murine γ-herpesvirus 68 (γHV68) shares extensive genetic and biological similarities with human γ-HVs and is a natural pathogen of murid rodents2. As such, evaluation of γHV68 infection of inbred mouse strains at different stages of viral infection provides an important model for understanding the viral lifecycle and pathogenesis during γ-HV infection.
Upon intranasal inoculation, γHV68 infection results in acute viremia in the lung that is later resolved into a latent infection of splenocytes and other cells, which may be reactivated throughout the life of the host3,4. In this protocol, we describe how to use the plaque assay on Vero cell monolayers to assess infectious virus titer in lung homogenates at the early stage of infection (5 - 7 days post-intranasal infection, dpi). While acute infection is largely cleared 2 - 3 weeks post-infection, latent γHV68 infection is established around 14 dpi and maintained thereafter in the spleen of the mice. Latent infection usually affects a very small population of cells in the infected tissues, in which the virus stays dormant and shuts off most of its gene expression. Latently infected splenocytes spontaneously reactivate virus upon explanting into tissue culture, which can be recapitulated by an infectious center (IC) assay to determine the viral latent load. To further estimate the number of viral genome copies in acutely and/or latently infected tissues, quantitative real-time PCR (qPCR) is used for its maximal sensitivity and accuracy. The combined analyses of the results of qPCR, plaque assay, and/or IC assay reveal the spatiotemporal profiles of viral replication and infectivity in vivo.
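The quantities these assays report reduce to two standard calculations: infectious titer from plaque counts, and genome copies from a qPCR standard curve. A hedged sketch of both (the plaque count, dilution, inoculum volume, and standard-curve parameters are hypothetical):

```python
def plaque_titer_pfu_per_ml(plaque_count, dilution, volume_ml):
    """Infectious titer (PFU/mL) from one plaque assay well.

    `dilution` is the dilution factor of the plated sample (e.g. 1e-5);
    `volume_ml` is the inoculum volume per well.
    """
    return plaque_count / (dilution * volume_ml)

def genome_copies(ct, slope, intercept):
    """Viral genome copies from a qPCR Ct value, via a standard curve
    fit as Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical: 42 plaques at a 1e-5 dilution, 0.2 mL inoculum per well
print(plaque_titer_pfu_per_ml(42, 1e-5, 0.2))   # ~2.1e7 PFU/mL
# hypothetical standard curve: slope -3.32, intercept 38
print(round(genome_copies(24.72, -3.32, 38.0)))  # ~1e4 copies
```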
Immunology, Issue 57, γHV68, herpesvirus, viral infection, plaque assay, infectious center assay, PCR, qPCR, host-virus interaction
Detection of Invasive Pulmonary Aspergillosis in Haematological Malignancy Patients by using Lateral-flow Technology
Institutions: University of Exeter, Queen Mary University of London, St. Bartholomew's Hospital and The London NHS Trust.
Invasive pulmonary aspergillosis (IPA) is a leading cause of morbidity and mortality in haematological malignancy patients and hematopoietic stem cell transplant recipients1. Detection of IPA represents a formidable diagnostic challenge and, in the absence of a 'gold standard', relies on a combination of clinical data, microbiology, and histopathology where feasible. Diagnosis of IPA must conform to the European Organization for Research and Treatment of Cancer and the National Institute of Allergy and Infectious Diseases Mycology Study Group (EORTC/MSG) consensus definitions of "proven", "probable", and "possible" invasive fungal disease2. Currently, no nucleic acid-based tests have been externally validated for IPA detection, and so polymerase chain reaction (PCR) is not included in current EORTC/MSG diagnostic criteria.
Identification of Aspergillus in histological sections is problematic because of similarities in hyphal morphology with other invasive fungal pathogens3, and proven identification requires isolation of the etiologic agent in pure culture. Culture-based approaches rely on the availability of biopsy samples, but these are not always obtainable from sick patients, and do not always yield viable propagules for culture when obtained.
An important feature of the pathogenesis of Aspergillus is angio-invasion, a trait that provides opportunities to track the fungus immunologically using tests that detect characteristic antigenic signature molecules in serum and bronchoalveolar lavage (BAL) fluids. This has led to the development of the Platelia enzyme immunoassay (GM-EIA), which detects Aspergillus galactomannan, and a 'pan-fungal' assay (Fungitell test), which detects the conserved fungal cell wall component (1→3)-β-D-glucan, though not in the mucorales, which lack this component in their cell walls1,4. Issues surrounding the accuracy of these tests1,4-6 have led to the recent development of next-generation monoclonal antibody (MAb)-based assays that detect surrogate markers of infection1,5. A recent study5 described the generation of an Aspergillus-specific MAb (JF5) using hybridoma technology and its use to develop an immuno-chromatographic lateral-flow device (LFD) for the point-of-care (POC) diagnosis of IPA. A major advantage of the LFD is its ability to detect active infection, since MAb JF5 binds to an extracellular glycoprotein antigen that is secreted only during active growth of the fungus5. This is an important consideration when using fluids such as lung BAL for diagnosing IPA, since Aspergillus spores are a common component of inhaled air. The utility of the device in diagnosing IPA has been demonstrated in an animal model of infection, where the LFD displayed improved sensitivity and specificity compared to the Platelia GM and Fungitell (1→3)-β-D-glucan assays7.
Here, we present a simple LFD procedure to detect Aspergillus antigen in human serum and BAL fluids. Its speed and accuracy provide a novel adjunct point-of-care test for the diagnosis of IPA in haematological malignancy patients.
Immunology, Issue 61, Invasive pulmonary aspergillosis, acute myeloid leukemia, bone marrow transplant, diagnosis, monoclonal antibody, lateral-flow technology
A Primary Neuron Culture System for the Study of Herpes Simplex Virus Latency and Reactivation
Institutions: New York University School of Medicine.
Herpes simplex virus type 1 (HSV-1) establishes a life-long latent infection in peripheral neurons. This latent reservoir is the source of recurrent reactivation events that ensure transmission and contribute to clinical disease. Current antivirals do not impact the latent reservoir, and there are no vaccines. While the molecular details of lytic replication are well characterized, the mechanisms controlling latency in neurons remain elusive. Our present understanding of latency is derived from in vivo studies using small animal models, which have been indispensable for defining viral gene requirements and the role of immune responses. However, it is impossible to distinguish specific effects on the virus-neuron relationship from more general consequences of infection mediated by immune or non-neuronal support cells in live animals. In addition, animal experimentation is costly, time-consuming, and limited in terms of available options for manipulating host processes. To overcome these limitations, a neuron-only system is needed that reproduces the in vivo characteristics of latency and reactivation but offers the benefits of tissue culture in terms of homogeneity and accessibility.
Here we present an in vitro model utilizing cultured primary sympathetic neurons from rat superior cervical ganglia (SCG) (Figure 1) to study HSV-1 latency and reactivation that fits most, if not all, of the desired criteria. After eliminating non-neuronal cells, near-homogeneous TrkA+ neuron cultures are infected with HSV-1 in the presence of acyclovir (ACV) to suppress lytic replication. Following ACV removal, non-productive HSV-1 infections that faithfully exhibit the accepted hallmarks of latency are efficiently established. Notably, lytic mRNAs, proteins, and infectious virus become undetectable, even in the absence of selection, but latency-associated transcript (LAT) expression persists in neuronal nuclei. Viral genomes are maintained at an average copy number of 25 per neuron and can be induced to productively replicate by interfering with PI3-kinase/Akt signaling or by simple withdrawal of nerve growth factor1. A recombinant HSV-1 encoding EGFP fused to the viral lytic protein Us11 provides a functional, real-time marker for replication resulting from reactivation that is readily quantified. In addition to chemical treatments, genetic methodologies such as RNA interference or gene delivery via lentiviral vectors can be successfully applied to the system, permitting mechanistic studies that are very difficult, if not impossible, in animals. In summary, the SCG-based HSV-1 latency/reactivation system provides a powerful tool to unravel the molecular mechanisms controlling HSV-1 latency and reactivation in neurons, a long-standing puzzle in virology whose solution may offer fresh insights into developing new therapies that target the latent herpesvirus reservoir.
Immunology, Issue 62, neuron cell culture, Herpes Simplex Virus (HSV), molecular biology, virology
Multiplexed Fluorometric ImmunoAssay Testing Methodology and Troubleshooting
Institutions: Charles River.
To ensure the quality of animal models used in biomedical research, we have developed a number of diagnostic testing strategies and methods to determine whether animals have been exposed to adventitious infectious agents (viruses, mycoplasma, and other fastidious microorganisms). Infections of immunocompetent animals are generally transient, yet serum antibody responses to infection often can be detected within days to weeks and persist throughout the life of the host. Serology is the primary diagnostic methodology by which laboratory animals are monitored. Historically, the indirect enzyme-linked immunosorbent assay (ELISA) has been the main screening method for serosurveillance. The ELISA is performed as a singleplex assay, in which one microbial antigen-antibody reaction is measured per well. In comparison, the MFIA is performed as a multiplexed assay. Since the microspheres come in 100 distinct color sets, as many as 100 different assays can be performed simultaneously in a single microplate well. This innovation decreases the amount of serum, reagents, and disposables required for routine testing while increasing the amount of information obtained from a single test well. In addition, we are able to incorporate multiple internal control beads to verify sample and system suitability and thereby assure the accuracy of results. These include tissue-control and anti-test-serum-species immunoglobulin (αIg)-coated bead sets to evaluate sample suitability. As in the ELISA and IFA, the tissue control detects non-specific binding of serum immunoglobulin. The αIg control (serum control) confirms that serum has been added and contains a sufficient immunoglobulin concentration, while the IgG control bead (system suitability control), coated with serum-species immunoglobulin, demonstrates that the labeled reagents and Luminex reader are functioning properly.
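The control-bead logic described above amounts to a per-well validity check. A sketch of how such a check could look in software (the bead-set names and MFI cutoffs are hypothetical; real cutoffs would be derived from validated reference sera):

```python
def validate_mfia_well(mfi_by_bead, serum_cutoff=1000, system_cutoff=1000,
                       tissue_cutoff=500):
    """Check the internal control beads of one MFIA well.

    `mfi_by_bead` maps bead-set names to median fluorescence intensity (MFI).
    All names and cutoff values here are hypothetical placeholders.
    """
    checks = {
        # serum control: anti-Ig bead must bind sample immunoglobulin
        "serum_ok": mfi_by_bead["serum_control"] >= serum_cutoff,
        # system suitability: IgG-coated bead must react with labeled reagents
        "system_ok": mfi_by_bead["system_control"] >= system_cutoff,
        # tissue control: must stay low (no non-specific binding)
        "tissue_ok": mfi_by_bead["tissue_control"] <= tissue_cutoff,
    }
    checks["well_valid"] = all(checks.values())
    return checks

print(validate_mfia_well({"serum_control": 8200,
                          "system_control": 5400,
                          "tissue_control": 120}))
```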
Basic Protocols, Issue 58, Multiplexed Fluorometric ImmunoAssay, MFIA, bead, serum, BAG, SPE, aggregate, microarray
Functional Mapping with Simultaneous MEG and EEG
Institutions: MGH - Massachusetts General Hospital.
We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate brain areas involved in the processing of simple sensory stimuli and to determine the temporal evolution of their activity. We use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, and visual stimuli in the four quadrants of the visual field to locate the early visual areas. These types of experiments are used for functional mapping in epilepsy and brain tumor patients to locate eloquent cortices. In basic neuroscience, similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization, for deriving tissue boundaries for forward modeling, and for cortical location and orientation constraints for the minimum-norm estimates.
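The minimum-norm estimate mentioned above solves an underdetermined inverse problem with an L2 penalty: x̂ = Lᵀ(LLᵀ + λI)⁻¹y, where L is the forward-model leadfield mapping sources to sensors. A toy numerical sketch (the dimensions and regularization value are illustrative, not a replacement for a full MEG/EEG inverse pipeline):

```python
import numpy as np

def minimum_norm_estimate(leadfield, measurements, lam=0.1):
    """L2 minimum-norm source estimate: x = L^T (L L^T + lam*I)^{-1} y.

    `leadfield` (sensors x sources) comes from the forward model built on
    anatomical MRI; `lam` regularizes against measurement noise.
    """
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, measurements)

# tiny synthetic example: 3 sensors, 5 candidate sources
rng = np.random.default_rng(0)
L = rng.standard_normal((3, 5))
x_true = np.array([0.0, 1.0, 0.0, 0.0, 0.0])
y = L @ x_true
x_hat = minimum_norm_estimate(L, y, lam=1e-6)
print(np.allclose(L @ x_hat, y, atol=1e-3))  # estimate reproduces the data
```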
JoVE neuroscience, Issue 40, neuroscience, brain, MEG, EEG, functional imaging
Cell Block Preparation from Cytology Specimen with Predominance of Individually Scattered Cells
Institutions: University of Wisconsin - Milwaukee.
This video demonstrates Shidham's method for the preparation of cell blocks from liquid-based cervicovaginal cytology specimens containing individually scattered cells and small cell groups. The technique uses HistoGel (Thermo Scientific) with conventional laboratory equipment.
Cell block sections are a valuable ancillary tool for the evaluation of non-gynecologic cytology. They enable the cytopathologist to study additional morphologic detail of the specimen, including the architecture of the lesion. Most importantly, they allow for the evaluation of ancillary studies such as immunocytochemistry, in-situ hybridization tests (FISH/CISH), and in-situ polymerase chain reaction (PCR). Traditional cell block preparation techniques have mostly been applied to non-gynecologic cytology specimens, typically body fluid effusions and fine needle aspiration biopsies.
Liquid-based cervicovaginal specimens are less cellular than their non-gynecologic counterparts, with many individually scattered cells. Because of this, adequate cellularity within the cell block sections is difficult to achieve. In addition, the histotechnologist sectioning the block cannot visualize the level at which the cells are at their highest concentration. It is therefore difficult to select the appropriate level at which sections should be transferred to glass slides for testing. As a result, the area of the cell block with the cells of interest may be missed, either by cutting past it or by not cutting deep enough. The current protocol for Shidham's method addresses these issues. Although this protocol has been standardized and reported for gynecologic liquid-based cytology specimens, it can also be applied to non-gynecologic specimens such as effusion fluids, FNAs, brushings, cyst contents, etc., for improved quality of diagnostic material in cell block sections.
Cellular Biology, Issue 29, surgical pathology, cytopathology, FNA, cell blocks, SCIP, immunohistochemistry
Building a Better Mosquito: Identifying the Genes Enabling Malaria and Dengue Fever Resistance in A. gambiae and A. aegypti Mosquitoes
Institutions: Johns Hopkins University.
In this interview, George Dimopoulos focuses on the physiological mechanisms used by mosquitoes to combat Plasmodium falciparum and dengue virus infections. He explains how key refractory genes, those conferring resistance to pathogens in the vector, are identified in the mosquito, and how this knowledge can be used to generate transgenic mosquitoes that are unable to carry the malaria parasite or dengue virus.
Cellular Biology, Issue 5, Translational Research, mosquito, malaria, virus, dengue, genetics, injection, RNAi, transgenesis, transgenic
Preventing the Spread of Malaria and Dengue Fever Using Genetically Modified Mosquitoes
Institutions: University of California, Irvine (UCI).
In this candid interview, Anthony A. James explains how mosquito genetics can be exploited to control malaria and dengue transmission. Population replacement strategy, the idea that transgenic mosquitoes can be released into the wild to control disease transmission, is introduced, as well as the concept of genetic drive and the design criterion for an effective genetic drive system. The ethical considerations of releasing genetically-modified organisms into the wild are also discussed.
Cellular Biology, Issue 5, mosquito, malaria, dengue fever, genetics, infectious disease, Translational Research
Population Replacement Strategies for Controlling Vector Populations and the Use of Wolbachia pipientis for Genetic Drive
Institutions: Johns Hopkins University.
In this video, Jason Rasgon discusses population replacement strategies to control vector-borne diseases such as malaria and dengue. "Population replacement" is the replacement of wild vector populations (that are competent to transmit pathogens) with populations that are not competent to transmit pathogens. There are several theoretical strategies to accomplish this. One is to exploit the maternally inherited symbiotic bacterium Wolbachia pipientis. Wolbachia is a widespread reproductive parasite that spreads in a selfish manner at the expense of its host's fitness. Jason Rasgon discusses, in detail, the basic biology of this bacterial symbiont and various ways to use it for the control of vector-borne diseases.
Cellular Biology, Issue 5, mosquito, malaria, genetics, infectious disease, Wolbachia