Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by constant, unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point to the need for new, efficient treatment strategies. Together with cognitive-behavioral treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation of being difficult to learn. The INTREPID project aims to implement a new instrument to treat anxiety-related disorders and to test its clinical efficacy in reducing anxiety-related symptoms. The innovation of this approach is the combination of virtual reality and biofeedback, such that the virtual environment is directly modified by the biofeedback output. In this way, the patient is made aware of his or her reactions through real-time changes in features of the VR environment. Using mental exercises, the patient learns to control these physiological parameters and, using the feedback provided by the virtual environment, is able to gauge his or her success. The supplemental use of portable devices, such as PDAs or smartphones, allows the patient to perform at home, individually and autonomously, the same exercises experienced in the therapist's office. The goal is to anchor the learned protocol in a real-life context, thereby enhancing patients' ability to deal with their symptoms. The expected result is better and faster learning of relaxation techniques, and thus increased treatment effectiveness compared with traditional clinical protocols.
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
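A triage scheme of this kind can be caricatured as a small decision function mapping objective data-set characteristics to one of the four categorical approaches. The characteristics chosen and the thresholds below are illustrative placeholders, not values from the article:

```python
def choose_segmentation_approach(snr, crispness, shape_recognizable, crowded):
    """Toy triage: map objective data-set characteristics to one of the
    four categorical segmentation approaches named in the abstract.
    All thresholds are illustrative, not taken from the article."""
    if snr > 10 and crispness > 0.8:
        # clean, sharp data: a custom automated algorithm is feasible
        return "automated"
    if shape_recognizable and not crowded:
        # distinctive, well-separated shapes can seed semi-automated tools
        return "semi-automated"
    if snr > 3:
        # usable contrast but no shortcut: trace features by hand
        return "manual tracing"
    # very noisy, ambiguous data: expert model building is the fallback
    return "manual model building"
```

In practice, as the abstract notes, more than one branch may be viable for a given data set; a real triage would also weigh the subjective criteria (available time, expertise, software).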
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Using Insect Electroantennogram Sensors on Autonomous Robots for Olfactory Searches
Institutions: Centre National de la Recherche Scientifique (CNRS), Institut d'Ecologie et des Sciences de l'Environnement de Paris, Institut Pasteur.
Robots designed to track chemical leaks in hazardous industrial facilities [1] or explosive traces in landmine fields [2] face the same problem as insects foraging for food or searching for mates [3]: the olfactory search is constrained by the physics of turbulent transport [4]. The concentration landscape of wind-borne odors is discontinuous and consists of sporadically located patches. A prerequisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity [5-6], the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones [7] but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells [8] or toxic and illicit substances [9-11]. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole-insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors [12]. The efficiency of EAG sensors for olfactory searches is further demonstrated by driving the robot toward a source of pheromone. By using the same olfactory stimuli and sensors as real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies [13]. It may also prove beneficial for detecting other odorants of interest by combining EAGs from different insect species in a bioelectronic nose configuration [14] or by using nanostructured gas sensors that mimic insect antennae [15].
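Plume tracking of this kind is classically implemented as a surge-and-cast strategy (named in the keywords below): surge upwind while the sensor reports odor, cast crosswind after the plume is lost. A minimal sketch, with a controller whose parameters (fixed cast period, 90° cast angle) are illustrative rather than taken from the article:

```python
def surge_and_cast(detections, upwind_heading_deg=0.0, cast_period=4):
    """Toy surge-and-cast controller: given a sequence of boolean EAG
    detections (one per time step), return the heading commanded at each
    step. On a detection the robot surges upwind; after the detection is
    lost it casts crosswind, alternating sides every 'cast_period' steps.
    All parameters are illustrative placeholders."""
    headings = []
    steps_since_hit = 0
    for hit in detections:
        if hit:
            steps_since_hit = 0
            headings.append(upwind_heading_deg)           # surge upwind
        else:
            steps_since_hit += 1
            side = 1 if (steps_since_hit // cast_period) % 2 == 0 else -1
            headings.append(upwind_heading_deg + side * 90.0)  # cast crosswind
    return headings
```

A real controller would widen the casts over time and stop at the source; this sketch only shows the detection-driven switching that the fast EAG signal makes possible.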
Neuroscience, Issue 90, robotics, electroantennogram, EAG, gas sensor, electronic nose, olfactory search, surge and casting, moth, insect, olfaction, neuron
Next-generation Sequencing of 16S Ribosomal RNA Gene Amplicons
Institutions: National Research Council Canada.
One of the major questions in microbial ecology is “who is there?” This question can be answered using various tools, but one of the long-standing gold standards is to sequence 16S ribosomal RNA (rRNA) gene amplicons generated by domain-level PCR amplification of genomic DNA. Traditionally, this was performed by cloning and Sanger (capillary electrophoresis) sequencing of PCR amplicons. The advent of next-generation sequencing has tremendously simplified and increased the sequencing depth for 16S rRNA gene sequencing. The introduction of benchtop sequencers now allows small labs to perform their 16S rRNA sequencing in-house in a matter of days. Here, an approach for 16S rRNA gene amplicon sequencing using a benchtop next-generation sequencer is detailed. The environmental DNA is first amplified by PCR using primers that contain sequencing adapters and barcodes. The amplicons are then coupled to spherical particles via emulsion PCR. The particles are loaded on a disposable chip, and the chip is inserted in the sequencing machine, after which the sequencing is performed. The sequences are retrieved in fastq format and filtered, and the barcodes are used to establish the sample membership of the reads. The filtered and binned reads are then further analyzed using publicly available tools. An example analysis is given in which the reads were classified with a taxonomy-finding algorithm within the software package Mothur. The method outlined here is simple, inexpensive and straightforward, and should help smaller labs take advantage of the ongoing genomic revolution.
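The barcode-based binning step described above can be sketched as a toy demultiplexer. The barcode lengths and the read-length cut-off below are illustrative, not values from the protocol, and real pipelines would also use the quality scores:

```python
def demultiplex(reads, barcodes, min_len=200):
    """Toy demultiplexer: 'reads' is a list of (sequence, quality) tuples
    and 'barcodes' maps a barcode string to a sample name. Each read is
    assigned to the sample whose barcode it starts with; the barcode is
    trimmed off and reads shorter than 'min_len' after trimming are
    discarded. All parameters are illustrative placeholders."""
    binned = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for seq, qual in reads:
        for bc, sample in barcodes.items():
            if seq.startswith(bc):
                trimmed = seq[len(bc):]
                if len(trimmed) >= min_len:
                    binned[sample].append(trimmed)
                break
        else:
            # no barcode matched: read cannot be assigned to a sample
            unassigned.append(seq)
    return binned, unassigned
```

The binned reads would then go to a classifier such as Mothur's taxonomy-assignment step, as in the example analysis mentioned above.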
Molecular Biology, Issue 90, Metagenomics, Bacteria, 16S ribosomal RNA gene, Amplicon sequencing, Next-generation sequencing, benchtop sequencers
Reduced-gravity Environment Hardware Demonstrations of a Prototype Miniaturized Flow Cytometer and Companion Microfluidic Mixing Technology
Institutions: DNA Medicine Institute, Harvard Medical School, NASA Glenn Research Center, ZIN Technologies.
Until recently, astronaut blood samples were collected in-flight, transported to earth on the Space Shuttle, and analyzed in terrestrial laboratories. If humans are to travel beyond low Earth orbit, a transition towards space-ready, point-of-care (POC) testing is required. Such testing needs to be comprehensive, easy to perform in a reduced-gravity environment, and unaffected by the stresses of launch and spaceflight. Countless POC devices have been developed to mimic laboratory scale counterparts, but most have narrow applications and few have demonstrable use in an in-flight, reduced-gravity environment. In fact, demonstrations of biomedical diagnostics in reduced gravity are limited altogether, making component choice and certain logistical challenges difficult to approach when seeking to test new technology. To help fill the void, we are presenting a modular method for the construction and operation of a prototype blood diagnostic device and its associated parabolic flight test rig that meet the standards for flight-testing onboard a parabolic flight, reduced-gravity aircraft. The method first focuses on rig assembly for in-flight, reduced-gravity testing of a flow cytometer and a companion microfluidic mixing chip. Components are adaptable to other designs and some custom components, such as a microvolume sample loader and the micromixer may be of particular interest. The method then shifts focus to flight preparation, by offering guidelines and suggestions to prepare for a successful flight test with regard to user training, development of a standard operating procedure (SOP), and other issues. Finally, in-flight experimental procedures specific to our demonstrations are described.
Cellular Biology, Issue 93, Point-of-care, prototype, diagnostics, spaceflight, reduced gravity, parabolic flight, flow cytometry, fluorescence, cell counting, micromixing, spiral-vortex, blood mixing
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
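One simple way to estimate a dissociation constant from DSF data is to fit the ligand-concentration dependence of the melting temperature with a single-site saturation model, Tm = Tm0 + ΔTm_max·[L]/(Kd + [L]). This is a simplified empirical model, not the article's own analysis, and the grid search below stands in for a proper non-linear optimizer to keep the sketch dependency-free:

```python
def tm_model(conc, tm0, dtm_max, kd):
    """Single-site saturation model: Tm = Tm0 + dTm_max * [L] / (Kd + [L]).
    A simplified empirical description, not a full thermodynamic treatment."""
    return tm0 + dtm_max * conc / (kd + conc)

def fit_kd(concs, tms, kd_grid):
    """Brute-force fit: for each candidate Kd the remaining parameters
    (Tm0, dTm_max) are linear, so they are solved by least squares; the
    Kd with the smallest residual wins."""
    best = None
    for kd in kd_grid:
        x = [c / (kd + c) for c in concs]          # saturation fraction
        n = len(x)
        sx, sy = sum(x), sum(tms)
        sxx = sum(v * v for v in x)
        sxy = sum(v * t for v, t in zip(x, tms))
        denom = n * sxx - sx * sx
        if denom == 0:
            continue
        slope = (n * sxy - sx * sy) / denom        # dTm_max estimate
        intercept = (sy - slope * sx) / n          # Tm0 estimate
        resid = sum((intercept + slope * v - t) ** 2 for v, t in zip(x, tms))
        if best is None or resid < best[0]:
            best = (resid, kd, intercept, slope)
    return best[1], best[2], best[3]
```

Cooperative or multi-site binding, as discussed in the abstract, would require replacing the saturation term with the corresponding model (e.g., a Hill equation).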
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Topographical Estimation of Visual Population Receptive Fields by fMRI
Institutions: Baylor College of Medicine, Max Planck Institute for Biological Cybernetics, Bernstein Center for Computational Neuroscience.
Visual cortex is retinotopically organized, so that neighboring populations of cells map to neighboring parts of the visual field. Functional magnetic resonance imaging allows us to estimate voxel-based population receptive fields (pRF), i.e., the part of the visual field that activates the cells within each voxel. Prior, direct pRF estimation methods [1] suffer from certain limitations: 1) the pRF model is chosen a priori and may not fully capture the actual pRF shape, and 2) pRF centers are prone to mislocalization near the border of the stimulus space. Here a new topographical pRF estimation method [2] is proposed that largely circumvents these limitations. A linear model is used to predict the blood oxygen level-dependent (BOLD) signal by convolving the linear response of the pRF to the visual stimulus with the canonical hemodynamic response function. The pRF topography is represented as a weight vector whose components represent the strength of the aggregate response of a voxel's neurons to stimuli presented at different visual field locations. The resulting linear equations can be solved for the pRF weight vector using ridge regression [3], yielding the pRF topography. A pRF model that is matched to the estimated topography can then be chosen post hoc, thereby improving the estimates of pRF parameters such as pRF center location, pRF orientation, size, etc. Having the pRF topography available also allows visual verification of pRF parameter estimates, permitting the extraction of various pRF properties without making a priori assumptions about the pRF structure. This approach promises to be particularly useful for investigating the pRF organization of patients with disorders of the visual system.
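The core computation here is a regularized linear solve: with a design matrix X of HRF-convolved stimulus time courses (time points × visual field locations) and a voxel's BOLD time series y, ridge regression gives the weight vector w. A minimal sketch (the function name, regularizer value, and synthetic data are illustrative; in practice the penalty is chosen by cross-validation):

```python
import numpy as np

def prf_topography(X, y, lam=1.0):
    """Ridge-regression estimate of the pRF weight vector:
    solves (X'X + lam*I) w = X'y, where X is the (time points x
    stimulus locations) design matrix of HRF-convolved stimulus time
    courses and y is one voxel's BOLD time series. 'lam' is the ridge
    penalty (illustrative default)."""
    n_loc = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_loc), X.T @ y)
```

Each component of the returned vector is the estimated response strength at one visual field location; a parametric pRF model (e.g., a 2D Gaussian) can then be fitted to this topography post hoc, as the abstract describes.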
Behavior, Issue 96, population receptive field, vision, functional magnetic resonance imaging, retinotopy
Flat-floored Air-lifted Platform: A New Method for Combining Behavior with Microscopy or Electrophysiology on Awake Freely Moving Rodents
Institutions: University of Helsinki, Neurotar LTD, University of Eastern Finland.
It is widely acknowledged that the use of general anesthetics can undermine the relevance of electrophysiological or microscopical data obtained from a living animal’s brain. Moreover, the lengthy recovery from anesthesia limits the frequency of repeated recording/imaging episodes in longitudinal studies. Hence, new methods that would allow stable recordings from non-anesthetized, behaving mice are expected to advance the fields of cellular and cognitive neuroscience. Existing solutions range from mere physical restraint to more sophisticated approaches, such as linear and spherical treadmills used in combination with computer-generated virtual reality. Here, a novel method is described in which a head-fixed mouse can move around an air-lifted mobile homecage and explore its environment under stress-free conditions. This method allows researchers to perform behavioral tests (e.g., learning, habituation or novel object recognition) simultaneously with two-photon microscopic imaging and/or patch-clamp recordings, all combined in a single experiment. This video article describes the use of the awake-animal head-fixation device (mobile homecage), demonstrates the procedures of animal habituation, and exemplifies a number of possible applications of the method.
Issue 88, awake, in vivo two-photon microscopy, blood vessels, dendrites, dendritic spines, Ca2+ imaging, intrinsic optical imaging, patch-clamp
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles [1,2]. However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications [3]. This research is challenging, and we outline a method to address these challenges using in situ neutron powder diffraction (NPD) to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries.
We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the ‘roll-over’ cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ NPD data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
HPLC Measurement of the DNA Oxidation Biomarker, 8-oxo-7,8-dihydro-2’-deoxyguanosine, in Cultured Cells and Animal Tissues
Institutions: Health Canada.
Oxidative stress is associated with many physiological and pathological processes, as well as xenobiotic metabolism, leading to the oxidation of biomacromolecules, including DNA. Therefore, efficient detection of DNA oxidation is important for a variety of research disciplines, including medicine and toxicology. A common biomarker of oxidatively damaged DNA is 8-oxo-7,8-dihydro-2'-deoxyguanosine (8-oxo-dGuo; often erroneously referred to as 8-hydroxy-2'-deoxyguanosine (8-OH-dGuo or 8-oxo-dG)). Several protocols for 8-oxo-dGuo measurement by high pressure liquid chromatography with electrochemical detection (HPLC-ED) have been described. However, these were mainly applied to purified DNA treated with pro-oxidants. In addition, because of methodological differences between laboratories, mainly differences in analytical equipment, the adoption of published methods for detection of 8-oxo-dGuo by HPLC-ED requires careful optimization by each laboratory. A comprehensive protocol describing such an optimization process has been lacking. Here, a detailed protocol is described for the detection of 8-oxo-dGuo by HPLC-ED in DNA from cultured cells or animal tissues. It illustrates how DNA sample preparation can be easily and rapidly optimized to minimize the undesirable DNA oxidation that can occur during sample preparation. The protocol shows how to detect 8-oxo-dGuo in cultured human alveolar adenocarcinoma cells (i.e., A549 cells) treated with the oxidizing agent KBrO3, and in the spleen of mice exposed to the polycyclic aromatic hydrocarbon dibenzo(def,p)chrysene (DBC, formerly known as dibenzo(a,l)pyrene, DalP). Overall, this work illustrates how an HPLC-ED methodology can be readily optimized for the detection of 8-oxo-dGuo in biological samples.
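Results of this kind of assay are conventionally expressed as 8-oxo-dGuo residues per 10^6 dGuo, computed from the amounts quantified against the electrochemical and UV calibration curves respectively. A small unit-conversion sketch (the function name and example values are illustrative, not from the protocol):

```python
def oxo_dG_per_million(oxo_pmol, dG_nmol):
    """Express DNA oxidation as 8-oxo-dGuo residues per 10^6 dGuo:
    convert both measured amounts (pmol of 8-oxo-dGuo, nmol of dGuo)
    to moles before taking the ratio, then scale to 'per million'.
    Illustrative helper, not part of the published protocol."""
    return (oxo_pmol * 1e-12) / (dG_nmol * 1e-9) * 1e6
```

Normalizing to dGuo measured in the same injection compensates for differences in the amount of DNA digested per sample.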
Chemistry, Issue 102, Oxidative Stress, DNA Damage, 8-oxo-7,8-dihydro-2'-deoxyguanosine, 8-hydroxy-2'-deoxyguanosine, Xenobiotic Metabolism, Human Health
Removal of Trace Elements by Cupric Oxide Nanoparticles from Uranium In Situ Recovery Bleed Water and Its Effect on Cell Viability
Institutions: University of New Mexico, University of Wyoming, Colorado State University, California Northstate University.
In situ recovery (ISR) is the predominant method of uranium extraction in the United States. During ISR, uranium is leached from an ore body and extracted through ion exchange. The resultant production bleed water (PBW) contains contaminants such as arsenic and other heavy metals. Samples of PBW from an active ISR uranium facility were treated with cupric oxide nanoparticles (CuO-NPs). CuO-NP treatment of PBW reduced priority contaminants, including arsenic, selenium, uranium, and vanadium. Untreated and CuO-NP treated PBW was used as the liquid component of the cell growth media, and changes in viability were determined by the MTT (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide) assay in human embryonic kidney (HEK 293) and human hepatocellular carcinoma (Hep G2) cells. CuO-NP treatment was associated with improved HEK 293 and Hep G2 cell viability. Limitations of this method include dilution of the PBW by growth media components and during osmolality adjustment, as well as the necessary pH adjustment. The method is limited in its wider context by these dilution effects and by the change in the pH of the PBW, which is traditionally slightly acidic; however, it could have broader use in assessing CuO-NP treatment of more neutral waters.
Environmental Sciences, Issue 100, Energy production, uranium in situ recovery, water decontamination, nanoparticles, toxicity, cytotoxicity, in vitro cell culture
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
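The daily harvest described above reduces to summaries computed over time-stamped event records. A minimal illustration of that kind of analysis (the event code, bin width, and function name are illustrative placeholders, not part of the published system):

```python
def head_entry_rates(events, bin_s=3600.0):
    """Toy analysis of a time-stamped behavioral event record: 'events'
    is a list of (time_s, event_code) tuples; the function counts
    'head_entry' events per time bin of width 'bin_s' seconds. The event
    code and bin width are illustrative, not from the published system."""
    rates = {}
    for t, code in events:
        if code == "head_entry":
            b = int(t // bin_s)              # index of the time bin
            rates[b] = rates.get(b, 0) + 1
    return rates
```

Summaries like these, recomputed several times a day, are what let the experimenter visualize each mouse's progress and let the analysis code decide when to advance a subject to the next protocol.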
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Gene-environment Interaction Models to Unmask Susceptibility Mechanisms in Parkinson's Disease
Institutions: SRI International, University of California-Santa Cruz.
Lipoxygenase (LOX) activity has been implicated in neurodegenerative disorders such as Alzheimer's disease, but its effects in Parkinson's disease (PD) pathogenesis are less understood. Gene-environment interaction models have utility in unmasking the impact of specific cellular pathways in toxicity that may not be observed using a solely genetic or toxicant disease model alone. To evaluate whether distinct LOX isozymes selectively contribute to PD-related neurodegeneration, transgenic (i.e., 5-LOX and 12/15-LOX deficient) mice can be challenged with a toxin that mimics cell injury and death in the disorder. Here we describe the use of a neurotoxin, 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP), which produces a nigrostriatal lesion, to elucidate the distinct contributions of LOX isozymes to neurodegeneration related to PD. The use of MPTP in mouse, and nonhuman primate, is well established to recapitulate the nigrostriatal damage in PD. The extent of MPTP-induced lesioning is measured by HPLC analysis of dopamine and its metabolites and semi-quantitative Western blot analysis of striatum for tyrosine hydroxylase (TH), the rate-limiting enzyme for the synthesis of dopamine. To assess inflammatory markers, which may demonstrate LOX isozyme-selective sensitivity, glial fibrillary acidic protein (GFAP) and Iba-1 immunohistochemistry are performed on brain sections containing substantia nigra, and GFAP Western blot analysis is performed on striatal homogenates. This experimental approach can provide novel insights into gene-environment interactions underlying nigrostriatal degeneration and PD.
Medicine, Issue 83, MPTP, dopamine, Iba1, TH, GFAP, lipoxygenase, transgenic, gene-environment interactions, mouse, Parkinson's disease, neurodegeneration, neuroinflammation
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light.
We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that with conventional sensing and control techniques that are known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
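The flatter intensity distribution of higher-order LG modes follows directly from their defining formula: the radial intensity of an LG(p,l) mode is proportional to u^|l| [L_p^|l|(u)]^2 exp(-u) with u = 2r^2/w^2, where L_p^a is the generalized Laguerre polynomial. A dependency-free numerical sketch of this (unnormalized) profile:

```python
import math

def genlaguerre(p, a, x):
    """Generalized Laguerre polynomial L_p^a(x) via the standard
    three-term recurrence."""
    if p == 0:
        return 1.0
    prev, cur = 1.0, 1.0 + a - x
    for k in range(2, p + 1):
        prev, cur = cur, ((2 * k - 1 + a - x) * cur - (k - 1 + a) * prev) / k
    return cur

def lg_intensity(r, p, l, w=1.0):
    """Unnormalized radial intensity of an LG_{p,l} beam of radius w:
    I ~ u^|l| [L_p^|l|(u)]^2 exp(-u), with u = 2 r^2 / w^2. Higher-order
    modes (e.g., LG_{3,3}) spread the light over wider ring structures
    than the on-axis peak of the fundamental LG_{0,0}."""
    u = 2.0 * r * r / (w * w)
    return u ** abs(l) * genlaguerre(p, abs(l), u) ** 2 * math.exp(-u)
```

For l != 0 the intensity vanishes on the optical axis and is distributed over concentric rings, which is what makes these modes average more effectively over mirror-surface fluctuations.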
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
Toxin Induction and Protein Extraction from Fusarium spp. Cultures for Proteomic Studies
Institutions: Centre de Recherche Public-Gabriel Lippmann.
Fusarium spp. are filamentous fungi able to produce different toxins. Fusarium mycotoxins such as deoxynivalenol, nivalenol, T2, zearalenone, fusaric acid and moniliformin have adverse effects on both human and animal health, and some are considered pathogenicity factors. Proteomic studies have proven effective for deciphering toxin production mechanisms (Taylor et al., 2008) as well as for identifying potential pathogenicity factors (Paper et al., 2007; Houterman et al., 2007) in Fusaria. It is therefore fundamental to establish reliable methods for comparison between proteomic studies, in order to rely on true differences found in protein expression among experiments, strains and laboratories. The procedure described here should contribute to an increased level of standardization of proteomic procedures in two ways. First, the filmed protocol increases the level of detail that can be described precisely. Moreover, the availability of standardized procedures to process biological replicates should guarantee a higher robustness of the data, taking the human factor into account within the technical reproducibility of the extraction procedure.
The protocol described requires 16 days for its completion: fourteen days for cultures and two days for protein extraction (figure 1). Fusarium strains are grown on solid media for 4 days; they are then manually fragmented and transferred into a modified toxin-inducing medium (Jiao et al., 2008) for 10 days. Mycelium is collected by filtration through a Miracloth layer. Grinding is performed in a cold chamber. Different operators performed extraction replicates (n=3) in order to take into account the bias due to technical variations (figure 2). Extraction was based on an SDS/DTT buffer as described in Taylor et al. (2008) with slight modifications. Total protein extraction required precipitation of the proteins in acetone/TCA/DTT buffer overnight and acetone/DTT washing (figures 3a, 3b). Proteins were finally resolubilized in the protein-labelling buffer and quantified. Results of the extraction were visualized on a 1D gel (figure 4, SDS-PAGE) before proceeding to 2D gels (IEF/SDS-PAGE). The same procedure can be applied for proteomic analyses of other growing media and other filamentous fungi (Miles et al.).
Microbiology, Issue 36, MIAPE, Fusarium graminearum, toxin induction, fungal cultures, proteomics, sample processing, protein extraction
Video Bioinformatics Analysis of Human Embryonic Stem Cell Colony Growth
Institutions: University of California.
Because video data are complex and comprise many images, mining information from video material is difficult to do without the aid of computer software. Video bioinformatics is a powerful quantitative approach for extracting spatio-temporal data from video images using computer software to perform data mining and analysis. In this article, we introduce a video bioinformatics method for quantifying the growth of human embryonic stem cells (hESC) by analyzing time-lapse videos collected in a Nikon BioStation CT incubator equipped with a camera for video imaging. In our experiments, hESC colonies that were attached to Matrigel were filmed for 48 hours in the BioStation CT. To determine the growth rate of these colonies, recipes were developed using CL-Quant software, which enables users to extract various types of data from video images. To accurately evaluate colony growth, three recipes were created. The first segmented the image into the colony and background, the second enhanced the image to define colonies accurately throughout the video sequence, and the third measured the number of pixels in the colony over time. The three recipes were run in sequence on video data collected in a BioStation CT to analyze the growth rate of individual hESC colonies over 48 hours. To verify the accuracy of the CL-Quant recipes, the same data were analyzed manually using Adobe Photoshop software. The results obtained with the CL-Quant recipes and with Photoshop were virtually identical, indicating that the recipes are reliable. The method described here could be applied to any video data to measure growth rates of hESC or other cells that grow in colonies. In addition, other video bioinformatics recipes can be developed in the future for other cell processes such as migration, apoptosis, and cell adhesion.
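The pixel-counting step at the heart of the third recipe can be sketched with a fixed global intensity threshold standing in for the actual CL-Quant segmentation (an illustrative simplification; the function names and parameters below are not from the software):

```python
import numpy as np

def colony_area_series(frames, threshold):
    """Count above-threshold pixels in each frame of a time-lapse stack.
    A fixed global threshold is an illustrative stand-in for the
    segmentation/enhancement recipes described in the text."""
    return [int((frame > threshold).sum()) for frame in frames]

def growth_rate(areas, interval_h):
    """Average change in colony area (pixels per hour) across the series,
    given the time between consecutive frames in hours."""
    return (areas[-1] - areas[0]) / (interval_h * (len(areas) - 1))
```

Applied to a 48-hour stack, the resulting area-versus-time series is exactly the quantity the manual Photoshop analysis was used to verify.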
Cellular Biology, Issue 39, hESC, matrigel, stem cells, video bioinformatics, colony, growth
An Analytical Tool-box for Comprehensive Biochemical, Structural and Transcriptome Evaluation of Oral Biofilms Mediated by Mutans Streptococci
Institutions: University of Rochester Medical Center, Sichuan University, Glostrup Hospital, Glostrup, Denmark, University of Rochester Medical Center.
Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition1,2. In general, biofilms develop from initial microbial attachment on a surface, followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci3. The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases, using sucrose primarily as substrate3.
Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research.
The tool-box comprises 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and specific experimental conditions to demonstrate the usefulness and flexibility of the tool-box. The biofilm model is simple and reproducible, and multiple replicates of a single experiment can be run simultaneously4,5. Moreover, it allows temporal evaluation, inclusion of various microbial species5, and assessment of the effects of distinct experimental conditions (e.g. treatments6; comparison of knockout mutants vs. the parental strain5; carbohydrate availability7). Here, we describe two specific components of the tool-box: (i) new software for microarray data mining/organization (MDV) and fluorescence imaging analysis (DUOSTAT), and (ii) in situ EPS-labeling. We also provide an experimental case showing how the tool-box can assist with biofilm analysis, data organization, integration and interpretation.
Microbiology, Issue 47, Extracellular matrix, polysaccharides, biofilm, mutans streptococci, glucosyltransferases, confocal fluorescence, microarray
Aseptic Laboratory Techniques: Plating Methods
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
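The colony counts obtained by pour- or spread-plating are converted to the concentration of the original culture with the standard viable-count arithmetic. A minimal sketch (the 150-colony count, dilution factor, and plated volume are hypothetical example values):

```python
def cfu_per_ml(colony_count, dilution_factor, volume_plated_ml):
    # Standard viable-count arithmetic: CFU/ml of the original culture equals
    # colonies counted divided by (dilution factor x volume plated).
    return colony_count / (dilution_factor * volume_plated_ml)

# e.g. 150 colonies on a plate spread with 0.1 ml of a 10^-6 dilution
print(cfu_per_ml(150, 1e-6, 0.1))  # 1.5e9 CFU/ml in the original culture
```

Plates with roughly 30-300 colonies are typically considered reliable for counting; dilutions outside that range are usually discarded.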
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Development of a Unilaterally-lesioned 6-OHDA Mouse Model of Parkinson's Disease
Institutions: University of Toronto at Scarborough.
The unilaterally lesioned 6-hydroxydopamine (6-OHDA) rat model of Parkinson's disease (PD) has proved invaluable in advancing our understanding of the mechanisms underlying parkinsonian symptoms, since it recapitulates the changes in basal ganglia circuitry and pharmacology observed in parkinsonian patients1-4. However, the precise cellular and molecular changes occurring at cortico-striatal synapses of the output pathways within the striatum, the major input region of the basal ganglia, remain elusive; this is believed to be the site where the pathological abnormalities underlying parkinsonian symptoms arise3,5.
In PD, understanding of the mechanisms underlying changes in basal ganglia circuitry following degeneration of the nigro-striatal pathway has been greatly advanced by the development of bacterial artificial chromosome (BAC) mice over-expressing green fluorescent proteins driven by promoters specific for the two striatal output pathways (direct pathway: eGFP-D1; indirect pathway: eGFP-D2 and eGFP-A2a)8, allowing each pathway to be studied in isolation. For example, recent studies have suggested that there are pathological changes in synaptic plasticity in parkinsonian mice9,10. However, these studies utilised juvenile mice and acute models of parkinsonism, and it is unclear whether the changes described in adult rats with stable 6-OHDA lesions also occur in these models. Other groups have attempted to generate a stable unilaterally-lesioned 6-OHDA adult mouse model of PD by lesioning the medial forebrain bundle (MFB); unfortunately, the mortality rate in that study was extremely high, with only 14% of animals surviving for 21 days or longer after surgery11. More recent studies have generated intra-nigral lesions with both a low mortality rate and >80% loss of dopaminergic neurons; however, expression of L-DOPA-induced dyskinesia11,12,13,14 was variable in these studies. Another well-established mouse model of PD is the MPTP-lesioned mouse15. Whilst this model has proven useful in the assessment of potential neuroprotective agents16, it is less suitable for understanding the mechanisms underlying the symptoms of PD, as it often fails to induce motor deficits and shows wide variability in the extent of the lesion17,18.
Here we have developed a stable unilateral 6-OHDA-lesioned mouse model of PD by direct administration of 6-OHDA into the MFB, which consistently causes >95% loss of striatal dopamine (as measured by HPLC), as well as producing the behavioural imbalances observed in the well characterised unilateral 6-OHDA-lesioned rat model of PD. This newly developed mouse model of PD will prove a valuable tool in understanding the mechanisms underlying generation of parkinsonian symptoms.
Medicine, Issue 60, mouse, 6-OHDA, Parkinson’s disease, medial forebrain bundle, unilateral
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Institutions: Georgia Health Sciences University, Georgia Health Sciences University, Georgia Health Sciences University, Palo Alto Research Center, Palo Alto Research Center, University of Minnesota .
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2. Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings.
First, shape variations are generally imposed by the experimenter5,9,10, and may therefore differ from the variability in natural categories, or may be optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of externally imposed constraints.
Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases.
Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms.
Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper.
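As a rough, purely illustrative sketch of the VP idea (the published algorithm operates on simulated embryos, not on raw parameter vectors), 'descent with modification' on an abstract shape-parameter vector might look like:

```python
import random

def virtual_phylogenesis(ancestor, generations=3, offspring=2, sigma=0.1, seed=1):
    # Minimal sketch of VP: each generation, every object 'reproduces' with
    # small random mutations of its shape parameters; one lineage then forms
    # one naturalistic object category with a heritable family resemblance.
    rng = random.Random(seed)
    population = [list(ancestor)]
    for _ in range(generations):
        population = [
            [p + rng.gauss(0.0, sigma) for p in parent]
            for parent in population
            for _ in range(offspring)
        ]
    return population

# Three hypothetical shape parameters for the common ancestor.
category = virtual_phylogenesis([0.5, 1.2, -0.3])
print(len(category))  # 2 offspring per generation for 3 generations -> 8 members
```

Selection pressure (e.g. discarding descendants that fail some criterion) can be layered on top of this reproduction loop, which is where the analogy to natural selection comes in.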
We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have.
Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and intracranial pressure (ICP) pre-screening. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans, together with other recorded features such as age and injury severity score. Machine learning techniques, including feature selection and classification methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, e.g. to recommend for or against invasive ICP monitoring.
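A hedged sketch of the described prediction pipeline (feature selection followed by SVM classification), here using scikit-learn with synthetic stand-in features rather than RapidMiner and real CT-derived data:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for the paper's features (texture statistics, blood
# amount, midline shift, age, injury severity score); label = elevated ICP.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 2] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Feature selection followed by an SVM classifier, mirroring the described model.
model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=3), SVC())
model.fit(X, y)
print(model.score(X, y))  # training accuracy on the synthetic data
```

In practice the model would be evaluated with held-out data or cross-validation rather than training accuracy; the pipeline structure, not the numbers, is the point here.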
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Facilitating Drug Discovery: An Automated High-content Inflammation Assay in Zebrafish
Institutions: Karlsruhe Institute of Technology, Karlsruhe, Germany, Karlsruhe Institute of Technology, Karlsruhe, Germany.
Zebrafish larvae are particularly amenable to whole-animal small molecule screens1,2 due to their small size and relative ease of manipulation and observation, as well as the fact that compounds can simply be added to the bathing water and are readily absorbed when administered in a <1% DMSO solution. Due to the optical clarity of zebrafish larvae and the availability of transgenic lines expressing fluorescent proteins in leukocytes, zebrafish offer the unique advantage of monitoring an acute inflammatory response in vivo. Consequently, utilizing the zebrafish for high-content small molecule screens aimed at the identification of immune-modulatory compounds with high throughput has been proposed3-6, with suggested inflammation-induction scenarios including localized nicks in fin tissue, laser damage directed to the yolk surface of embryos7, and tailfin amputation3,5,6. The major drawback of these methods, however, was the requirement for manual larva manipulation to induce wounding, thus preventing high-throughput screening. Introduction of the chemically induced inflammation (ChIn) assay8 eliminated these obstacles. Since wounding is inflicted chemically, the number of embryos that can be treated simultaneously is virtually unlimited. Temporary treatment of zebrafish larvae with copper sulfate selectively induces cell death in hair cells of the lateral line system and results in rapid granulocyte recruitment to injured neuromasts. The inflammatory response can be followed in real time by using compound transgenic cldnB::GFP/lysC::DsRED26,9 zebrafish larvae that express a green fluorescent protein in neuromast cells, as well as a red fluorescent protein labeling granulocytes.
In order to devise a screening strategy that would allow both high-content and high-throughput analyses we introduced robotic liquid handling and combined automated microscopy with a custom developed software script. This script enables automated quantification of the inflammatory response by scoring the percent area occupied by red fluorescent leukocytes within an empirically defined area surrounding injured green fluorescent neuromasts. Furthermore, we automated data processing, handling, visualization, and storage all based on custom developed MATLAB and Python scripts.
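The scoring step of such a script can be sketched as follows; the square dilation, the margin value, and the toy masks are illustrative assumptions, not the published implementation:

```python
import numpy as np

def inflammation_score(green, red, margin=5):
    # Score = percent of the area surrounding injured neuromasts (green mask,
    # grown by `margin` pixels) occupied by red-fluorescent leukocytes.
    # The paper defines the surrounding area empirically; a naive square
    # dilation of the green mask stands in for it here.
    region = np.zeros_like(green, dtype=bool)
    ys, xs = np.nonzero(green)
    for y, x in zip(ys, xs):
        region[max(0, y - margin):y + margin + 1,
               max(0, x - margin):x + margin + 1] = True
    return 100.0 * np.logical_and(region, red).sum() / region.sum()

# Toy masks: one neuromast pixel, leukocytes partially recruited around it.
green = np.zeros((40, 40), dtype=bool); green[20, 20] = True
red = np.zeros((40, 40), dtype=bool); red[18:23, 18:23] = True
print(inflammation_score(green, red))  # 25 red pixels in a 121-pixel region
```

Run per time point, this percent-area readout yields the time course of leukocyte recruitment and resolution that the screen quantifies.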
In brief, we introduce an automated HC/HT screen that allows testing of chemical compounds for their effect on the initiation, progression or resolution of a granulocytic inflammatory response. This protocol serves as a good starting point for more in-depth analyses of drug mechanisms and pathways involved in the orchestration of an innate immune response. In the future, it may help identify intolerable toxic or off-target effects at earlier phases of drug discovery and thereby reduce procedural risks and costs for drug development.
Immunology, Issue 65, Molecular Biology, Genetics, Zebrafish, Inflammation, Drug discovery, HCS, High Content Screening, Automated Microscopy, high throughput
The Logic, Experimental Steps, and Potential of Heterologous Natural Product Biosynthesis Featuring the Complex Antibiotic Erythromycin A Produced Through E. coli
Institutions: State University of New York at Buffalo, Massachusetts Institute of Technology.
The heterologous production of complex natural products is an approach designed to address current limitations and future possibilities. It is particularly useful for those compounds which possess therapeutic value but cannot be sufficiently produced or would benefit from an improved form of production. The experimental procedures involved can be subdivided into three components: 1) genetic transfer; 2) heterologous reconstitution; and 3) product analysis. Each experimental component is under continual optimization to meet the challenges and anticipate the opportunities associated with this emerging approach.
Heterologous biosynthesis begins with the identification of a genetic sequence responsible for a valuable natural product. Transferring this sequence to a heterologous host is complicated by the complexity of the biosynthetic pathway responsible for product formation. The antibiotic erythromycin A is a good example. Twenty genes (totaling >50 kb) are required for eventual biosynthesis. In addition, three of these genes encode megasynthases, multi-domain enzymes each ~300 kDa in size. This genetic material must be designed and transferred to E. coli for reconstituted biosynthesis. The use of PCR isolation, operon construction, multi-cistronic plasmids, and electro-transformation will be described in transferring the erythromycin A genetic cluster to E. coli.
Once transferred, the E. coli cell must support eventual biosynthesis. This process is also challenging given the substantial differences between E. coli and most original hosts responsible for complex natural product formation. The cell must provide necessary substrates to support biosynthesis and coordinately express the transferred genetic cluster to produce active enzymes. In the case of erythromycin A, the E. coli cell had to be engineered to provide the two precursors (propionyl-CoA and (2S)-methylmalonyl-CoA) required for biosynthesis. In addition, gene sequence modifications, plasmid copy number, chaperonin co-expression, post-translational enzymatic modification, and process temperature were also required to allow final erythromycin A formation.
Finally, successful production must be assessed. For the erythromycin A case, we will present two methods. The first is liquid chromatography-mass spectrometry (LC-MS) to confirm and quantify production. The bioactivity of erythromycin A will also be confirmed through use of a bioassay in which the antibiotic activity is tested against Bacillus subtilis. The assessment assays establish erythromycin A biosynthesis from E. coli and set the stage for future engineering efforts to improve or diversify production and for the production of new complex natural compounds using this approach.
Biomedical Engineering, Issue 71, Chemical Engineering, Bioengineering, Molecular Biology, Cellular Biology, Microbiology, Basic Protocols, Biochemistry, Biotechnology, Heterologous biosynthesis, natural products, antibiotics, erythromycin A, metabolic engineering, E. coli
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy over the sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
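As a toy illustration of what a sequence-selection stage does (Protein WISDOM itself uses rigorous global optimization, not this heuristic), a point-mutation search that minimizes an arbitrary energy function over sequence space might look like:

```python
import random

def design_sequence(length, energy, alphabet="ACDEFGHIKLMNPQRSTVWY",
                    steps=2000, seed=0):
    # Toy stand-in for sequence selection: random point mutations are accepted
    # only if they do not raise the energy, driving the sequence toward a
    # minimum of the supplied energy function over sequence space.
    rng = random.Random(seed)
    seq = [rng.choice(alphabet) for _ in range(length)]
    best_e = energy(seq)
    for _ in range(steps):
        i = rng.randrange(length)
        old = seq[i]
        seq[i] = rng.choice(alphabet)
        e = energy(seq)
        if e <= best_e:
            best_e = e       # accept improving (or neutral) mutations
        else:
            seq[i] = old     # reject worsening mutations
    return "".join(seq), best_e

# Invented toy energy: hydrophobic residues favored at even positions.
hydrophobic = set("AILMFVWY")
toy_energy = lambda s: sum(0 if (c in hydrophobic) == (i % 2 == 0) else 1
                           for i, c in enumerate(s))
seq, e = design_sequence(10, toy_energy)
print(seq, e)  # the separable toy landscape is driven to energy 0
```

A physically meaningful energy function and a search with optimality guarantees are what distinguish the real sequence-selection stage from this sketch.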
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Quantifying Learning in Young Infants: Tracking Leg Actions During a Discovery-learning Task
Institutions: University of Southern California, Temple University, Niigata University of Health and Welfare.
Task-specific actions emerge from spontaneous movement during infancy. It has been proposed that task-specific actions emerge through a discovery-learning process. Here a method is described in which 3-4 month old infants learn a task by discovery and their leg movements are captured to quantify the learning process. This discovery-learning task uses an infant activated mobile that rotates and plays music based on specified leg action of infants. Supine infants activate the mobile by moving their feet vertically across a virtual threshold. This paradigm is unique in that as infants independently discover that their leg actions activate the mobile, the infants’ leg movements are tracked using a motion capture system allowing for the quantification of the learning process. Specifically, learning is quantified in terms of the duration of mobile activation, the position variance of the end effectors (feet) that activate the mobile, changes in hip-knee coordination patterns, and changes in hip and knee muscle torque. This information describes infant exploration and exploitation at the interplay of person and environmental constraints that support task-specific action. Subsequent research using this method can investigate how specific impairments of different populations of infants at risk for movement disorders influence the discovery-learning process for task-specific action.
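The virtual-threshold logic at the heart of the paradigm can be sketched in a few lines; the threshold value and the toy foot trajectory below are invented for illustration:

```python
import numpy as np

def mobile_activations(foot_z, threshold):
    # Find frames where the foot crosses the virtual threshold upward, and
    # count the total number of frames the mobile stays activated
    # (foot above threshold).
    above = foot_z > threshold
    onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
    duration = int(above.sum())
    return onsets, duration

# Toy vertical foot position over 10 frames (arbitrary units).
z = np.array([1.0, 2.0, 5.0, 6.0, 5.5, 2.0, 1.0, 4.5, 6.2, 3.0])
onsets, duration = mobile_activations(z, threshold=4.0)
print(onsets, duration)  # two upward crossings; 5 frames above threshold
```

The same per-frame masks extend naturally to the other reported measures, e.g. position variance of the feet (`np.var` over the tracked coordinates) within and outside activation periods.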
Behavior, Issue 100, infant, discovery-learning, motor learning, motor control, kinematics, kinetics