We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
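The Gabor-filter stage of the pipeline above can be sketched in a few lines. This is a minimal pure-Python illustration of oriented filtering; the kernel size, wavelength, and envelope width below are assumed values for the sketch, not the parameters used in the study.

```python
import math

def gabor_kernel(theta, size=9, wavelength=4.0, sigma=2.0):
    """Real-valued Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates so the sinusoidal carrier runs along theta.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
            row.append(envelope * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

def filter_response(image, kernel):
    """Total absolute correlation over all valid positions (crude energy)."""
    n, m, ks = len(image), len(image[0]), len(kernel)
    total = 0.0
    for i in range(n - ks + 1):
        for j in range(m - ks + 1):
            acc = 0.0
            for u in range(ks):
                for v in range(ks):
                    acc += image[i + u][j + v] * kernel[u][v]
            total += abs(acc)
    return total

# Synthetic "mammogram" patch: a single bright vertical line.
img = [[1.0 if j == 8 else 0.0 for j in range(16)] for _ in range(16)]

# A bank of orientations; the dominant local orientation is the one
# whose filter gives the strongest response.
angles = [i * math.pi / 8 for i in range(8)]
responses = [filter_response(img, gabor_kernel(a)) for a in angles]
best = angles[responses.index(max(responses))]
```

In the full method, the per-pixel orientation field obtained this way feeds the phase-portrait analysis that locates node-like sites.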
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. FMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions of interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
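The core of IC, correlating the time courses of multi-voxel discriminability between two regions, can be sketched as follows. The per-timepoint values here are synthetic stand-ins for classifier evidence (e.g., distance from a decision boundary) in each region; real analyses would derive them from cross-validated MVPA.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic per-timepoint MVP discriminability for three regions:
# region_b tracks region_a (informationally connected), region_c does not.
region_a = [0.9, 0.2, 0.8, 0.1, 0.7, 0.3, 0.9, 0.2]
region_b = [0.8, 0.3, 0.9, 0.2, 0.6, 0.2, 0.8, 0.1]
region_c = [0.5, 0.5, 0.4, 0.6, 0.5, 0.5, 0.6, 0.4]

ic_ab = pearson(region_a, region_b)  # high: discriminability rises and falls together
ic_ac = pearson(region_a, region_c)  # near zero: no shared information dynamics
```

Note the contrast with FC, which would correlate raw activation time courses rather than discriminability time courses.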
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
Assessment of Gastric Emptying in Non-obese Diabetic Mice Using a [13C]-octanoic Acid Breath Test
Institutions: Mayo Clinic.
Gastric emptying studies in mice have been limited by the inability to follow gastric emptying changes in the same animal, since the most commonly used techniques require killing the animals and postmortem recovery of the meal [1,2]. This approach prevents longitudinal studies to determine changes in gastric emptying with age and progression of disease. The [13C]-octanoic acid breath test commonly used for humans [3] has been modified for use in mice [4-6], and we previously showed that this test is reliable and responsive to changes in gastric emptying in response to drugs and during diabetic disease progression [8]. In this video presentation, the principle and practical implementation of this modified test are explained. As in the previous study, NOD LtJ mice are used, a model of type 1 diabetes [9]. A proportion of these mice develop the symptoms of gastroparesis, a complication of diabetes characterized by delayed gastric emptying without mechanical obstruction of the stomach [10]. This paper demonstrates how to train the mice for testing, how to prepare the test meal, how to obtain 4 hr gastric emptying data, and how to analyze the obtained data. The carbon isotope analyzer used in the present study is suitable for the automatic sampling of air samples from up to 12 mice at the same time. This technique allows the longitudinal follow-up of gastric emptying in larger groups of mice with diabetes or other long-standing diseases.
Medicine, Issue 73, Biomedical Engineering, Molecular Biology, Anatomy, Physiology, Neurobiology, Gastrointestinal Tract, Gastrointestinal Diseases, Ion Channels, Diagnostic Techniques and Procedures, Electrophysiology, Gastric emptying, [13C]-octanoic acid, breath test, in vivo, clinical, assay, mice, animal model
Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy
Institutions: University of Illinois at Chicago, Rehabilitation Institute of Chicago.
Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. The result can allow researchers across the globe to perform similar experiments using shared code, so that modular "switching out" of one robot for another does not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.
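The wrapper-class idea is language-independent even though H3DAPI itself is C++. The sketch below, with hypothetical class and method names (not H3DAPI's actual API), shows the essential pattern: environment code is written once against a common interface, so swapping robots means swapping one constructor call, not recoding the environment.

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Hypothetical wrapper interface: the commands common to all robots."""

    @abstractmethod
    def set_force(self, fx, fy, fz):
        """Command a force (N) at the end effector."""

    @abstractmethod
    def get_position(self):
        """Return the end-effector position (m) as (x, y, z)."""

class RobotA(HapticDevice):
    """Stand-in for one vendor's robot; a real wrapper would call the vendor SDK."""

    def __init__(self):
        self._pos = (0.0, 0.0, 0.0)
        self.last_force = None

    def set_force(self, fx, fy, fz):
        self.last_force = (fx, fy, fz)

    def get_position(self):
        return self._pos

def render_spring(device, stiffness=100.0):
    """Haptic environment code: a virtual spring pulling toward the origin.
    Written against the wrapper, so it works with any HapticDevice."""
    x, y, z = device.get_position()
    device.set_force(-stiffness * x, -stiffness * y, -stiffness * z)

robot = RobotA()      # swapping robots = changing only this line
render_spring(robot)
```

The design choice mirrors the paper's point: the environment (`render_spring`) has no robot-specific code, so development time no longer scales with the number of supported devices.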
Bioengineering, Issue 54, robotics, haptics, virtual reality, wrapper class, rehabilitation robotics, neural engineering, H3DAPI, C++
Identifying DNA Mutations in Purified Hematopoietic Stem/Progenitor Cells
Institutions: UT Health Science Center at San Antonio.
In recent years, it has become apparent that genomic instability is tightly related to many developmental disorders, cancers, and aging. Given that stem cells are responsible for ensuring tissue homeostasis and repair throughout life, it is reasonable to hypothesize that the stem cell population is critical for preserving genomic integrity of tissues. Therefore, significant interest has arisen in assessing the impact of endogenous and environmental factors on genomic integrity in stem cells and their progeny, aiming to understand the etiology of stem-cell based diseases.
LacI transgenic mice carry a recoverable λ phage vector encoding the LacI reporter system, in which the LacI gene serves as the mutation reporter. The result of a mutated LacI gene is the production of β-galactosidase, which cleaves a chromogenic substrate, turning it blue. The LacI reporter system is carried in all cells, including stem/progenitor cells, and can easily be recovered and used to subsequently infect E. coli. After incubating infected E. coli on agarose that contains the correct substrate, plaques can be scored; blue plaques indicate a mutant LacI gene, while clear plaques harbor wild-type LacI. The frequency of blue plaques among the total indicates the mutant frequency in the original cell population from which the DNA was extracted. Sequencing the mutant LacI gene reveals the location of the mutations in the gene and the type of mutation.
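The plaque-counting arithmetic above reduces to a one-line calculation; the counts in the example are invented for illustration.

```python
def mutant_frequency(blue, clear):
    """Mutant frequency = mutant (blue) plaques / total plaques scored."""
    total = blue + clear
    if total == 0:
        raise ValueError("no plaques scored")
    return blue / total

# Hypothetical counts from one plating: 12 blue among 240,000 clear plaques.
freq = mutant_frequency(blue=12, clear=240_000)
# i.e., roughly 5 mutants per 100,000 plaques
```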
The LacI transgenic mouse model is well-established as an in vivo mutagenesis assay. Moreover, the mice and the reagents for the assay are commercially available. Here we describe in detail how this model can be adapted to measure the frequency of spontaneously occurring DNA mutants in stem cell-enriched Lin- Sca-1+ c-Kit+ (LSK) cells and other subpopulations of the hematopoietic system.
Infection, Issue 84, In vivo mutagenesis, hematopoietic stem/progenitor cells, LacI mouse model, DNA mutations, E. coli
Inducing Plasticity of Astrocytic Receptors by Manipulation of Neuronal Firing Rates
Institutions: University of California Riverside.
Close to two decades of research has established that astrocytes in situ and in vivo express numerous G protein-coupled receptors (GPCRs) that can be stimulated by neuronally released transmitters. However, the ability of astrocytic receptors to exhibit plasticity in response to changes in neuronal activity has received little attention. Here we describe a model system that can be used to globally scale up or down astrocytic group I metabotropic glutamate receptors (mGluRs) in acute brain slices. Included are methods on how to prepare parasagittal hippocampal slices, construct chambers suitable for long-term slice incubation, bidirectionally manipulate neuronal action potential frequency, load astrocytes and astrocyte processes with fluorescent Ca2+ indicator, and measure changes in astrocytic Gq GPCR activity by recording spontaneous and evoked astrocyte Ca2+ events using confocal microscopy. In essence, a "calcium roadmap" is provided for how to measure plasticity of astrocytic Gq GPCRs. Applications of the technique for the study of astrocytes are discussed. Understanding how astrocytic receptor signaling is affected by changes in neuronal activity has important implications for normal synaptic function as well as for processes underlying neurological disorders and neurodegenerative disease.
Neuroscience, Issue 85, astrocyte, plasticity, mGluRs, neuronal Firing, electrophysiology, Gq GPCRs, Bolus-loading, calcium, microdomains, acute slices, Hippocampus, mouse
Combined DNA-RNA Fluorescent In situ Hybridization (FISH) to Study X Chromosome Inactivation in Differentiated Female Mouse Embryonic Stem Cells
Institutions: Erasmus MC - University Medical Center.
Fluorescent in situ hybridization (FISH) is a molecular technique that enables the detection of nucleic acids in cells. DNA FISH is often used in cytogenetics and cancer diagnostics and can detect aberrations of the genome, which often have important clinical implications. RNA FISH can be used to detect RNA molecules in cells and has provided important insights into the regulation of gene expression. Combining DNA and RNA FISH within the same cell is technically challenging, as conditions suitable for DNA FISH might be too harsh for fragile, single-stranded RNA molecules. We here present an easily applicable protocol that enables the combined, simultaneous detection of Xist RNA and DNA encoded by the X chromosomes. This combined DNA-RNA FISH protocol can likely be applied to other systems where both RNA and DNA need to be detected.
Biochemistry, Issue 88, Fluorescent in situ hybridization (FISH), combined DNA-RNA FISH, ES cell, cytogenetics, single cell analysis, X chromosome inactivation (XCI), Xist, Bacterial artificial chromosome (BAC), DNA-probe, Rnf12
Using Coculture to Detect Chemically Mediated Interspecies Interactions
Institutions: University of North Carolina at Chapel Hill.
In nature, bacteria rarely exist in isolation; they are instead surrounded by a diverse array of other microorganisms that alter the local environment by secreting metabolites. These metabolites have the potential to modulate the physiology and differentiation of their microbial neighbors and are likely important factors in the establishment and maintenance of complex microbial communities. We have developed a fluorescence-based coculture screen to identify such chemically mediated microbial interactions. The screen involves combining a fluorescent transcriptional reporter strain with environmental microbes on solid media and allowing the colonies to grow in coculture. The fluorescent transcriptional reporter is designed so that the chosen bacterial strain fluoresces when it is expressing a particular phenotype of interest (i.e., biofilm formation, sporulation, virulence factor production, etc.). Screening is performed under growth conditions where this phenotype is not expressed (and therefore the reporter strain is typically nonfluorescent). When an environmental microbe secretes a metabolite that activates this phenotype, the metabolite diffuses through the agar and activates the fluorescent reporter construct. This allows the inducing-metabolite-producing microbes to be detected: they are the nonfluorescent colonies most proximal to the fluorescent colonies. Thus, this screen allows the identification of environmental microbes that produce diffusible metabolites that activate a particular physiological response in a reporter strain. This publication discusses how to: a) select appropriate coculture screening conditions, b) prepare the reporter and environmental microbes for screening, c) perform the coculture screen, d) isolate putative inducing organisms, and e) confirm their activity in a secondary screen. We developed this method to screen for soil organisms that activate biofilm matrix production in Bacillus subtilis; however, we also discuss considerations for applying this approach to other genetically tractable bacteria.
Microbiology, Issue 80, High-Throughput Screening Assays, Genes, Reporter, Microbial Interactions, Soil Microbiology, Coculture, microbial interactions, screen, fluorescent transcriptional reporters, Bacillus subtilis
Detection of the Genome and Transcripts of a Persistent DNA Virus in Neuronal Tissues by Fluorescent In situ Hybridization Combined with Immunostaining
Institutions: CNRS UMR 5534, Université de Lyon 1, LabEX DEVweCAN, CNRS UPR 3296, CNRS UMR 5286.
Single-cell codetection of a gene, its RNA product, and cellular regulatory proteins is critical to the study of gene expression regulation. This is a challenge in the field of virology, in particular for nuclear-replicating persistent DNA viruses whose study involves animal models. Herpes simplex virus type 1 (HSV-1) establishes a life-long latent infection in peripheral neurons. The latent virus serves as a reservoir, from which it reactivates and induces a new herpetic episode. The cell biology of HSV-1 latency remains poorly understood, in part due to the lack of methods to detect HSV-1 genomes in situ in animal models. We describe a DNA fluorescent in situ hybridization (FISH) approach that efficiently detects low-copy viral genomes within sections of neuronal tissues from infected animal models. The method relies on heat-based antigen unmasking and on directly labeled homemade DNA probes or commercially available probes. We developed a triple-staining approach, combining DNA-FISH with RNA-FISH and immunofluorescence, using peroxidase-based signal amplification to accommodate each staining requirement. A major improvement is the ability to obtain, within 10 µm tissue sections, low-background signals that can be imaged at high resolution by confocal microscopy and wide-field conventional epifluorescence. Additionally, the triple staining works with a wide range of antibodies directed against cellular and viral proteins. The complete protocol takes 2.5 days to accommodate antibody and probe penetration within the tissue.
Neuroscience, Issue 83, Life Sciences (General), Virology, Herpes Simplex Virus (HSV), Latency, In situ hybridization, Nuclear organization, Gene expression, Microscopy
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g., by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells [3-5]. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells [6], can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies [1]. Originally published by Naal et al. [1], this degranulation assay has been adapted for the screening of drugs and toxicants, and we demonstrate its use here.
Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease [7-11], although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function [2]. To avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm) [12]. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
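The spectrophotometric concentration check is an application of the Beer-Lambert law, A = ε·c·l, rearranged for concentration. A minimal sketch using the stated extinction coefficient; the absorbance reading and path length are illustrative assumptions.

```python
def concentration_molar(absorbance, epsilon=4200.0, path_cm=1.0):
    """Beer-Lambert law: c = A / (epsilon * l).

    epsilon in L/(mol*cm) (the stated ε280 for triclosan),
    path_cm is the cuvette path length in cm; returns mol/L.
    """
    return absorbance / (epsilon * path_cm)

# Hypothetical reading: A280 = 0.21 in a standard 1 cm cuvette.
c = concentration_molar(0.21)   # mol/L
c_micromolar = c * 1e6          # 0.21 / 4200 = 5e-5 M = 50 µM
```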
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Visualizing Clathrin-mediated Endocytosis of G Protein-coupled Receptors at Single-event Resolution via TIRF Microscopy
Institutions: Carnegie Mellon University.
Many important signaling receptors are internalized through the well-studied process of clathrin-mediated endocytosis (CME). Traditional cell biological assays, measuring global changes in endocytosis, have identified over 30 known components participating in CME, and biochemical studies have generated an interaction map of many of these components. It is becoming increasingly clear, however, that CME is a highly dynamic process whose regulation is complex and delicate. In this manuscript, we describe the use of Total Internal Reflection Fluorescence (TIRF) microscopy to directly visualize the dynamics of components of the clathrin-mediated endocytic machinery, in real time in living cells, at the level of individual events that mediate this process. This approach is essential to elucidate the subtle changes that can alter endocytosis without globally blocking it, as is seen with physiological regulation. We will focus on using this technique to analyze an area of emerging interest, the role of cargo composition in modulating the dynamics of distinct clathrin-coated pits (CCPs). This protocol is compatible with a variety of widely available fluorescence probes, and may be applied to visualizing the dynamics of many cargo molecules that are internalized from the cell surface.
Cellular Biology, Issue 92, Endocytosis, TIRF, total internal reflection fluorescence microscopy, clathrin, arrestin, receptors, live-cell microscopy, clathrin-mediated endocytosis
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Institutions: The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects the electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems-biology aspects of migraine, since SD activates trigeminal nociceptive pathways. However, whole-animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations, where environmental conditions can be controlled, are necessary. Here, it is important to recognize the limitations of acute slices and the distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling, since preparing the slices alone triggers pro-inflammatory changes that last days, epileptiform behavior due to the high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers.
In contrast, we examine immune signaling in mature hippocampal slice cultures, since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and allow SD to be easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived, and SD can be induced on consecutive days without injury, making this preparation the sole means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, the use of cDNA preamplification for detection of ultra-low-level targets such as interferon-gamma, using whole, regional, or specific cell-enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Using an EEG-Based Brain-Computer Interface for Virtual Cursor Movement with BCI2000
Institutions: University of Wisconsin-Madison, New York State Dept. of Health.
A brain-computer interface (BCI) functions by translating a neural signal, such as the electroencephalogram (EEG), into a signal that can be used to control a computer or other device. The amplitudes of the EEG signals in selected frequency bins are measured and translated into a device command, in this case the horizontal and vertical velocity of a computer cursor. First, the EEG electrodes are applied to the user's scalp using a cap to record brain activity. Next, a calibration procedure is used to find the EEG electrodes and features that the user will learn to voluntarily modulate to use the BCI. In humans, the power in the mu (8-12 Hz) and beta (18-28 Hz) frequency bands decreases in amplitude during a real or imagined movement. These changes can be detected in the EEG in real time and used to control a BCI. Therefore, during a screening test, the user is asked to make several different imagined movements with their hands and feet to determine the unique EEG features that change with the imagined movements. The results from this calibration will show the best channels to use, which are configured so that amplitude changes in the mu and beta frequency bands move the cursor either horizontally or vertically. In this experiment, the general-purpose BCI system BCI2000 is used to control signal acquisition, signal processing, and feedback to the user.
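The mapping from band power to cursor velocity can be sketched as follows. The band-power computation via a direct DFT is standard; the synthetic EEG signal, baseline, and gain are made-up calibration values, and the linear control law is only illustrative of the velocity-proportional-to-amplitude scheme described above, not BCI2000's actual configuration.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in the band [f_lo, f_hi] Hz via a direct DFT
    (fine for short windows; an FFT would be used in practice)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coef = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            power += abs(coef) ** 2 / n
    return power

fs = 160                              # sampling rate in Hz
t = [i / fs for i in range(fs)]       # one 1-second analysis window
# Synthetic EEG: a strong 10 Hz mu rhythm plus a weaker 22 Hz beta rhythm.
eeg = [1.0 * math.sin(2 * math.pi * 10 * ti)
       + 0.2 * math.sin(2 * math.pi * 22 * ti) for ti in t]

mu = band_power(eeg, fs, 8, 12)       # mu band (8-12 Hz)
beta = band_power(eeg, fs, 18, 28)    # beta band (18-28 Hz)

# Illustrative control law: desynchronization (a drop in mu power below a
# calibrated baseline, as during imagined movement) moves the cursor.
baseline_mu, gain = 50.0, 0.1         # hypothetical calibration values
vx = gain * (baseline_mu - mu)
```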
Neuroscience, Issue 29, BCI, EEG, brain-computer interface, BCI2000
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
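A minimal sketch of the DoE setup step described above: enumerating a two-level full-factorial design over a few factors of the kinds the study considered (expression construct, plant growth, and incubation conditions). The factor names and levels are invented placeholders, and real DoE software would typically use fractional or optimal designs to reduce the run count.

```python
from itertools import product

# Hypothetical two-level factors; names and levels are placeholders only.
factors = {
    "promoter": ["35S", "double 35S"],
    "plant_age_days": [35, 49],
    "incubation_temp_C": [22, 25],
}

# Full-factorial design: one experimental run per combination of levels.
names = list(factors)
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]

# With k two-level factors this yields 2**k runs (here 2**3 = 8);
# design augmentation adds runs step-wise where the model needs them.
```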
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
High Throughput Quantitative Expression Screening and Purification Applied to Recombinant Disulfide-rich Venom Proteins Produced in E. coli
Institutions: Aix-Marseille Université, Commissariat à l'énergie atomique et aux énergies alternatives (CEA) Saclay, France.
Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, purifying proteins is sometimes challenging since many proteins are expressed in an insoluble form. When working with difficult or multiple targets it is therefore recommended to use high throughput (HTP) protein expression screening on a small scale (1-4 ml cultures) to quickly identify conditions for soluble expression. To cope with the various structural genomics programs of the lab, a quantitative (within a range of 0.1-100 mg/L culture of recombinant protein) and HTP protein expression screening protocol was implemented and validated on thousands of proteins. The protocols were automated with the use of a liquid handling robot but can also be performed manually without specialized equipment.
Disulfide-rich venom proteins are gaining increasing recognition for their potential as therapeutic drug leads. They can be highly potent and selective, but their complex disulfide bond networks make them challenging to produce. As a member of the FP7 European Venomics project (www.venomics.eu), our challenge is to develop successful production strategies with the aim of producing thousands of novel venom proteins for functional characterization. Aided by the redox properties of disulfide bond isomerase DsbC, we adapted our HTP production pipeline for the expression of oxidized, functional venom peptides in the E. coli cytoplasm. The protocols are also applicable to the production of diverse disulfide-rich proteins. Here we demonstrate our pipeline applied to the production of animal venom proteins. With the protocols described herein it is likely that soluble disulfide-rich proteins will be obtained in as little as a week. Even from a small scale, there is the potential to use the purified proteins for validating the oxidation state by mass spectrometry, for characterization in pilot studies, or for sensitive micro-assays.
Bioengineering, Issue 89, E. coli, expression, recombinant, high throughput (HTP), purification, auto-induction, immobilized metal affinity chromatography (IMAC), tobacco etch virus protease (TEV) cleavage, disulfide bond isomerase C (DsbC) fusion, disulfide bonds, animal venom proteins/peptides
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
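The recasting recipe in the last sentence, acceptance and efficiency tables combined with event counts in a signal region, reduces for a single counting experiment to a simple yield comparison. The sketch below assumes a model-independent 95% CL upper limit on signal events has already been published for the region; all numbers are purely illustrative and not taken from any ATLAS or CMS result.

```python
def expected_signal(lumi_fb, xsec_fb, acceptance, efficiency):
    """Expected signal yield in a signal region: N = L * sigma * A * eps."""
    return lumi_fb * xsec_fb * acceptance * efficiency

def is_excluded(n_expected, s95_observed):
    """A model point is excluded at 95% CL if its predicted yield exceeds
    the observed model-independent upper limit on signal events."""
    return n_expected > s95_observed

# Illustrative numbers only: 4.7/fb, a 50 fb cross-section, A = 0.10, eps = 0.80
n = expected_signal(lumi_fb=4.7, xsec_fb=50.0, acceptance=0.10, efficiency=0.80)
print(round(n, 3), is_excluded(n, s95_observed=15.0))  # 18.8 True
```

Combining several signal regions, or distinguishing conservative from aggressive limits, requires the fuller prescription described in the paper.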
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues change dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
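The source-analysis step named in the keywords, minimum-norm estimation, can be sketched as a Tikhonov-regularized linear inverse of the head model's leadfield matrix. The matrix sizes and regularization value below are illustrative assumptions, not parameters from the London Baby Lab pipeline.

```python
import numpy as np

def minimum_norm_estimate(leadfield, sensor_data, lam=0.1):
    """L2 minimum-norm inverse: x = L^T (L L^T + lambda * I)^-1 y.

    leadfield:   (n_sensors, n_sources) gain matrix from the head model
    sensor_data: (n_sensors, n_times)   EEG measurements
    Returns the (n_sources, n_times) source-amplitude estimates.
    """
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, sensor_data)

rng = np.random.default_rng(0)
L = rng.normal(size=(64, 500))   # 64 channels, 500 cortical sources (toy sizes)
y = rng.normal(size=(64, 100))   # 100 time samples
x_hat = minimum_norm_estimate(L, y)
print(x_hat.shape)  # (500, 100)
```

In practice the leadfield comes from the individual or age-specific head model, which is exactly where the pediatric MRI processing described above enters.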
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Institutions: Georgia Health Sciences University, Georgia Health Sciences University, Georgia Health Sciences University, Palo Alto Research Center, Palo Alto Research Center, University of Minnesota .
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2].
Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings.
First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints.
Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases.
Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms.
Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics [15,16]. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects [9,13]. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper.
We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have.
Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
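The core idea of virtual phylogenesis, categories of related objects produced by inheritance with random mutation, can be sketched in a few lines. The parameter-vector representation and Gaussian mutation below are simplifying assumptions for illustration, not the embryogenesis simulation the paper actually uses.

```python
import random

def virtual_phylogenesis(ancestor, n_children=3, n_generations=2,
                         sigma=0.05, seed=0):
    """Grow a 'category' of shape-parameter vectors by copying each member
    with small Gaussian mutations, generation by generation (an illustrative
    stand-in for mutating digital-embryo shapes)."""
    rng = random.Random(seed)
    generation = [list(ancestor)]
    lineage = []
    for _ in range(n_generations):
        children = []
        for parent in generation:
            for _ in range(n_children):
                children.append([g + rng.gauss(0, sigma) for g in parent])
        lineage.extend(children)
        generation = children
    return lineage

category = virtual_phylogenesis([0.5, 1.2, 0.8])
print(len(category))  # 3 children + 9 grandchildren = 12 category members
```

Because every member descends from one ancestor, within-category similarity arises from the process itself rather than from experimenter-imposed constraints, which is the property the first shortcoming above calls for.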
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission [1-3]. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and collaboration from the participants greatly affects the quality of the data [4].
Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, are not ideal for modeling human space-time activities, because they are limited by the accuracy of existing devices. There is also no readily available tool for efficient processing of the data for human behavior studies. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, that could be useful in public health studies such as infectious disease transmission modeling.
The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation [5] involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius.
Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping [6] and density volume rendering [7]. We also include a couple of other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical as well as visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
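The indoor/outdoor segmentation step can be sketched with a simple gap-based rule: treat long gaps between consecutive GPS fixes as indoor periods where the signal was lost. This heuristic is a common assumption for illustration and is not necessarily the rule used by the paper's automatic module.

```python
def segment_track(fixes, max_gap_s=60):
    """Split a sequence of (timestamp_s, lat, lon) GPS fixes into outdoor
    segments wherever the time gap between consecutive fixes exceeds
    max_gap_s; long gaps typically correspond to indoor periods."""
    segments, current = [], [fixes[0]]
    for prev, fix in zip(fixes, fixes[1:]):
        if fix[0] - prev[0] > max_gap_s:
            segments.append(current)
            current = []
        current.append(fix)
    segments.append(current)
    return segments

# Toy track: two short outdoor bouts separated by a ~6 min signal gap
track = [(0, 40.000, -74.000), (30, 40.001, -74.000),
         (400, 40.010, -74.010), (430, 40.011, -74.010)]
print(len(segment_track(track)))  # 2
```

Characteristics such as activity radius can then be computed per segment, as in the activity-space step described above.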
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
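The simplest of the semi-automated approaches, global intensity thresholding, can be sketched as follows. The toy volume, intensity values, and threshold are illustrative assumptions; real EM data would need the noise- and crispness-dependent triage described above before such a simple rule could work.

```python
import numpy as np

def threshold_segment(volume, threshold):
    """Global-intensity thresholding: return a binary mask of voxels whose
    intensity exceeds the threshold (the most basic segmentation step)."""
    return volume > threshold

def occupied_fraction(mask):
    """Fraction of the volume occupied by the segmented feature, one of the
    data set characteristics listed above."""
    return mask.mean()

vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, 1:3] = 200.0          # a bright 2x2x2 'organelle' in a dark volume
mask = threshold_segment(vol, 100.0)
print(int(mask.sum()), occupied_fraction(mask))  # 8 0.125
```

Surface rendering and quantitative analysis would then operate on such masks, however they were produced.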
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process of physicians and have shown promising results in evaluation. In the second component, additional features related to ICP, such as texture information and blood amount, are extracted from the CT scans, and other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, so as to recommend for or against invasive ICP monitoring.
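The skull-symmetry part of the ideal-midline step can be sketched as a brute-force search for the image column about which a binary skull mask best mirrors onto itself. This toy score uses only symmetry and ignores the anatomical features the full method also exploits; the mask, search range, and overlap score are illustrative assumptions.

```python
import numpy as np

def ideal_midline_column(skull_mask):
    """Return the column index about which mirroring the binary skull mask
    yields the most matched skull pixels (a symmetry-only toy estimate)."""
    h, w = skull_mask.shape
    best_col, best_score = None, -1
    for c in range(w // 4, 3 * w // 4):          # search near the image centre
        half = min(c, w - 1 - c)
        left = skull_mask[:, c - half:c]
        right = skull_mask[:, c + 1:c + 1 + half][:, ::-1]
        score = np.logical_and(left, right).sum()  # matched skull pixels
        if score > best_score:
            best_col, best_score = c, score
    return best_col

skull = np.zeros((20, 21), dtype=bool)
skull[:, 5] = skull[:, 15] = True    # toy 'skull' walls symmetric about column 10
print(ideal_midline_column(skull))   # 10
```

The actual midline is then found by shape matching against the segmented ventricles, and its deviation from this ideal line gives the midline shift.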
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Brain Imaging Investigation of the Neural Correlates of Observing Virtual Social Interactions
Institutions: University of Alberta, University of Illinois, University of Alberta, University of Alberta, University of Alberta, University of Illinois at Urbana-Champaign, University of Illinois at Urbana-Champaign.
The ability to gauge social interactions is crucial in the assessment of others' intentions. Factors such as facial expressions and body language affect our decisions in personal and professional life alike [1]. These "friend or foe" judgements are often based on first impressions, which in turn may affect our decisions to "approach or avoid". Previous studies investigating the neural correlates of social cognition tended to use static facial stimuli [2]. Here, we illustrate an experimental design in which whole-body animated characters were used in conjunction with functional magnetic resonance imaging (fMRI) recordings. Fifteen participants were presented with short movie-clips of guest-host interactions in a business setting while fMRI data were recorded; at the end of each movie, participants also provided ratings of the host's behaviour. This design mimics real-life situations more closely, and hence may contribute to a better understanding of the neural mechanisms of social interactions in healthy behaviour, and to gaining insight into possible causes of deficits in social behaviour in clinical conditions such as social anxiety and autism [3].
Neuroscience, Issue 53, Social Perception, Social Knowledge, Social Cognition Network, Non-Verbal Communication, Decision-Making, Event-Related fMRI
Using Learning Outcome Measures to assess Doctoral Nursing Education
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology that ensures a quality learning experience and a foundation for improvement based on qualitatively and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric