Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as independent of community composition; however, biotic feedbacks such as plant-soil feedbacks (PSFs) have large effects on plant performance and create environmental heterogeneity that depends on community composition. Understanding the importance of PSFs for plant community assembly requires understanding the role of heterogeneity in PSFs, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to it) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplication of soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in outcomes different from those predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, along with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will produce assembly outcomes different from those under heterogeneity extrinsic to community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
RNAi Screening for Host Factors Involved in Vaccinia Virus Infection using Drosophila Cells
Institutions: University of Pennsylvania .
Viral pathogens represent a significant public health threat; not only can viruses cause natural epidemics of human disease, but their potential use in bioterrorism is also a concern. A better understanding of the cellular factors that impact infection would facilitate the development of much-needed therapeutics. Recent advances in RNA interference (RNAi) technology, coupled with complete genome sequencing of several organisms, have led to the optimization of genome-wide, cell-based loss-of-function screens. Drosophila cells are particularly amenable to genome-scale screens because of the ease and efficiency of RNAi in this system [1]. Importantly, a wide variety of viruses can infect Drosophila cells, including a number of mammalian viruses of medical and agricultural importance [2,3,4]. Previous RNAi screens in Drosophila have identified host factors that are required for various steps in virus infection, including entry, translation, and RNA replication [5]. Moreover, many of the cellular factors required for viral replication in Drosophila cell culture are also limiting in human cells infected with these viruses [4,6,7,8,9]. Therefore, the identification of host factors co-opted during viral infection presents novel targets for antiviral therapeutics. Here we present a generalized protocol for a high-throughput RNAi screen to identify cellular factors involved in viral infection, using vaccinia virus as an example.
Cellular Biology, Issue 42, RNAi, high-throughput screening, virus-host interactions, Drosophila, viral infections
Monitoring the Wall Mechanics During Stent Deployment in a Vessel
Institutions: University of Nebraska-Lincoln.
Clinical trials have reported different restenosis rates for various stent designs [1]. It is speculated that stent-induced strain concentrations on the arterial wall lead to tissue injury, which initiates restenosis [2-7]. This hypothesis needs further investigation, including better quantification of the non-uniform strain distribution on the artery following stent implantation. A non-contact surface strain measurement method for the stented artery is presented in this work. The ARAMIS stereo optical surface strain measurement system uses two high-speed optical cameras to capture the motion of each reference point and resolve three-dimensional strains over the deforming surface [8,9]. As a mesh stent is deployed into a latex vessel with a random contrasting pattern sprayed or drawn on its outer surface, the surface strain is recorded at every instant of the deformation. The calculated strain distributions can then be used to understand the local lesion response, validate computational models, and formulate hypotheses for further in vivo studies.
Biomedical Engineering, Issue 63, Stent, vessel, interaction, strain distribution, stereo optical surface strain measurement system, bioengineering
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects the electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examining systems biology aspects of migraine, since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations, where environmental conditions can be controlled, are necessary. Here, it is important to recognize the limitations of acute slices and the distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling, since preparing the slices alone triggers pro-inflammatory changes that last days, epileptiform behavior due to the high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers.
In contrast, we examine immune signaling in mature hippocampal slice cultures, since the cultures closely parallel their in vivo counterpart, with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and allow SD to be easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived, and SD can be induced on consecutive days without injury, making this preparation the only means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, cDNA preamplification for detection of ultra-low-level targets such as interferon-gamma, using whole, regional, or specific-cell-enriched (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Electric Cell-substrate Impedance Sensing for the Quantification of Endothelial Proliferation, Barrier Function, and Motility
Institutions: Institute for Cardiovascular Research, VU University Medical Center, Institute for Cardiovascular Research, VU University Medical Center.
Electric Cell-substrate Impedance Sensing (ECIS) is an in vitro impedance-measuring system to quantify the behavior of cells within adherent cell layers. To this end, cells are grown in special culture chambers on top of opposing, circular gold electrodes. A constant small alternating current is applied between the electrodes, and the potential across them is measured. The insulating properties of the cell membrane create a resistance to the electrical current flow, resulting in an increased electrical potential between the electrodes. Measuring cellular impedance in this manner allows the automated study of cell attachment, growth, morphology, function, and motility. Although the ECIS measurement itself is straightforward and easy to learn, the underlying theory is complex, and selection of the right settings and correct analysis and interpretation of the data are not self-evident. Yet a clear protocol describing the individual steps from experimental design to preparation, realization, and analysis of the experiment is not available. In this article, the basic measurement principle as well as possible applications, experimental considerations, advantages, and limitations of the ECIS system are discussed. A guide is provided for the study of cell attachment, spreading, and proliferation; quantification of cell behavior in a confluent layer, with regard to barrier function, cell motility, and quality of cell-cell and cell-substrate adhesions; and quantification of wound healing and cellular responses to vasoactive stimuli. Representative results are discussed based on human microvascular endothelial cells (MVEC) and human umbilical vein endothelial cells (HUVEC), but are applicable to all adherently growing cells.
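The frequency dependence underlying an ECIS measurement can be illustrated with a deliberately simplified circuit sketch (not the full Giaever-Keese model): the bare electrode is modeled as a solution resistance in series with the electrode's double-layer capacitance, and a confluent cell layer adds a paracellular barrier resistance and membrane capacitance in series. All component values below are illustrative assumptions, not measured ECIS parameters.

```python
import math

def impedance(freq_hz, cells=True):
    """Toy series model of an ECIS electrode: solution resistance plus
    electrode double-layer capacitance; a confluent cell layer adds a
    barrier resistance in series with the membrane capacitance."""
    w = 2 * math.pi * freq_hz
    C_electrode = 30e-9    # double-layer capacitance (F), assumed
    R_solution = 1_000.0   # solution resistance (ohm), assumed
    z = R_solution + 1 / (1j * w * C_electrode)
    if cells:
        R_barrier = 5_000.0   # paracellular barrier resistance (ohm), assumed
        C_membrane = 10e-9    # membrane capacitance (F), assumed
        z += R_barrier + 1 / (1j * w * C_membrane)
    return z

# A cell-covered electrode shows higher impedance magnitude than a bare one
for f in (400, 4_000, 40_000):
    print(f"{f:>6} Hz  bare {abs(impedance(f, cells=False)):9.0f} ohm"
          f"  covered {abs(impedance(f, cells=True)):9.0f} ohm")
```

In this toy model the cell layer raises the measured impedance at every frequency, which is the signal ECIS tracks over time; real analysis separates barrier resistance and cell-substrate spacing by fitting the full frequency spectrum.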
Bioengineering, Issue 85, ECIS, Impedance Spectroscopy, Resistance, TEER, Endothelial Barrier, Cell Adhesions, Focal Adhesions, Proliferation, Migration, Motility, Wound Healing
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super-resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample, with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. With this approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. The data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization that have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging systems. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
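The ~10-30 nm localization precision quoted above can be estimated with the widely used Thompson-Larson-Webb approximation, which combines photon shot noise, finite pixel size, and background contributions. The parameter values in the example (PSF width, pixel size, photon count, background) are illustrative assumptions, not values from this protocol.

```python
import math

def localization_precision(s_nm, a_nm, n_photons, b):
    """Approximate 2D localization precision (Thompson-Larson-Webb):
    s_nm      : PSF standard deviation (nm)
    a_nm      : camera pixel size in sample space (nm)
    n_photons : detected photons from the molecule
    b         : background noise (photons per pixel, standard deviation)"""
    s2 = s_nm**2 + a_nm**2 / 12  # PSF variance corrected for pixelation
    shot_and_pixel = s2 / n_photons
    background = 8 * math.pi * s_nm**4 * b**2 / (a_nm**2 * n_photons**2)
    return math.sqrt(shot_and_pixel + background)

# Example: 120 nm PSF sigma, 100 nm pixels, 500 photons, background 5
print(round(localization_precision(120, 100, 500, 5), 1))  # ~9.1 nm
```

With a few hundred detected photons per molecule, precision on the order of 10 nm follows directly, consistent with the range stated above; dimmer molecules or higher background push the estimate toward 30 nm.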
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Scale-Up of Mammalian Cell Culture using a New Multilayered Flask
Institutions: BD Biosciences .
A growing number of cell-based applications require large numbers of cells. Single-layer T-flasks, which are adequate during small-scale expansion, may become cumbersome, laborious, and time-consuming when large numbers of cells are required. To address this need, the performance of a new multi-layered cell culture vessel that facilitates easy scale-up of cells from single-layer T-flasks will be discussed. The flasks tested are available in 3- and 5-layer formats and enable culture and complete recovery of three and five times the number of cells, respectively, compared to T-175 flasks. A key feature of the BD Multi-Flask is a mix/equilibration port that allows rapid in-vessel mixing as well as uniform distribution of cells and reagents within and between the layers of each vessel, and consistently produces cells that can be cultured in an environment congruent to that of T-175 flasks.
The design of these Multi-Flasks also allows convenient pipette access for adding reagents and cells directly into the flasks, efficient recovery of valuable cells and reagents, and a reduced risk of contamination due to pouring. For applications where pouring is preferred over pipetting, the design allows for minimal residual liquid retention, reducing wastage of valuable cells and reagents.
Basic Protocols, Issue 58, Multi-Flask, multi-layered, stackable, scale-up, cell culture, flasks
Strategies for Study of Neuroprotection from Cold-preconditioning
Institutions: The University of Chicago Medical Center.
Neurological injury is a frequent cause of morbidity and mortality from general anesthesia and related surgical procedures that could be alleviated by the development of effective, easy-to-administer, and safe preconditioning treatments. We seek to define the neural immune signaling responsible for cold-preconditioning as a means to identify novel targets for therapeutics development to protect the brain before injury onset. Low-level pro-inflammatory mediator signaling changes over time are essential for cold-preconditioning neuroprotection. This signaling is consistent with the basic tenets of physiological conditioning hormesis, which require that irritative stimuli reach a threshold magnitude with sufficient time for adaptation to the stimuli for protection to become evident.
Accordingly, delineation of the immune signaling involved in cold-preconditioning neuroprotection requires that biological systems, experimental manipulations, and technical capacities are highly reproducible and sensitive. Our approach is to use hippocampal slice cultures as an in vitro model that closely reflects their in vivo counterparts, with multi-synaptic neural networks influenced by mature and quiescent macroglia/microglia. This glial state is particularly important for microglia, since they are the principal source of cytokines, which are operative in the femtomolar range. Also, slice cultures can be maintained in vitro for several weeks, which is sufficient time to evoke activating stimuli and assess adaptive responses. Finally, environmental conditions can be accurately controlled using slice cultures, so that cytokine signaling of cold-preconditioning can be measured, mimicked, and modulated to dissect the critical signaling nodes. Cytokine signaling system analyses require the use of sensitive and reproducible multiplexed techniques. We use quantitative PCR for TNF-α to screen for microglial activation, followed by real-time qPCR array screening to assess tissue-wide cytokine changes. The latter is a highly sensitive and reproducible means to measure changes in multiple cytokine signaling systems simultaneously. Significant changes are confirmed with targeted qPCR and then protein detection. We probe for tissue-based cytokine protein changes using multiplexed microsphere flow cytometric assays based on Luminex technology. Cell-specific cytokine production is determined with double-label immunohistochemistry. Taken together, this brain tissue preparation and style of use, coupled with the suggested investigative strategies, may be an optimal approach for identifying potential targets for the development of novel therapeutics that could mimic the advantages of cold-preconditioning.
Neuroscience, Issue 43, innate immunity, hormesis, microglia, hippocampus, slice culture, immunohistochemistry, neural-immune, gene expression, real-time PCR
In Vitro Reconstitution of Light-harvesting Complexes of Plants and Green Algae
Institutions: VU University Amsterdam.
In plants and green algae, light is captured by the light-harvesting complexes (LHCs), a family of integral membrane proteins that coordinate chlorophylls and carotenoids. In vivo, these proteins are folded with pigments to form complexes that are inserted in the thylakoid membrane of the chloroplast. The high similarity in the chemical and physical properties of the members of the family, together with the fact that they can easily lose pigments during isolation, makes their purification in a native state challenging. An alternative approach to obtain homogeneous preparations of LHCs was developed by Plumley and Schmidt in 1987 [1], who showed that it was possible to reconstitute these complexes in vitro starting from purified pigments and unfolded apoproteins, resulting in complexes with properties very similar to those of native complexes. This opened the way to the use of bacterially expressed recombinant proteins for in vitro reconstitution. The reconstitution method is powerful for various reasons: (1) pure preparations of individual complexes can be obtained, (2) pigment composition can be controlled to assess their contribution to structure and function, and (3) recombinant proteins can be mutated to study the functional role of individual residues (e.g., pigment-binding sites) or protein domains (e.g., protein-protein interaction, folding). This method has been optimized in several laboratories and applied to most of the light-harvesting complexes. The protocol described here details the method of reconstituting light-harvesting complexes in vitro currently used in our laboratory, and examples describing applications of the method are provided.
Biochemistry, Issue 92, Reconstitution, Photosynthesis, Chlorophyll, Carotenoids, Light Harvesting Protein, Chlamydomonas reinhardtii, Arabidopsis thaliana
A New Approach for the Comparative Analysis of Multiprotein Complexes Based on 15N Metabolic Labeling and Quantitative Mass Spectrometry
Institutions: University of Münster, Carnegie Institution for Science.
The introduced protocol provides a tool for the analysis of multiprotein complexes in the thylakoid membrane, revealing insights into complex composition under different conditions. In this protocol, the approach is demonstrated by comparing the composition of the protein complex responsible for cyclic electron flow (CEF) in Chlamydomonas reinhardtii, isolated from genetically different strains. The procedure comprises the isolation of thylakoid membranes, followed by their separation into multiprotein complexes by sucrose density gradient centrifugation, SDS-PAGE, immunodetection, and comparative, quantitative mass spectrometry (MS) based on differential metabolic labeling (15N/14N) of the analyzed strains. Detergent-solubilized thylakoid membranes are loaded on sucrose density gradients at equal chlorophyll concentration. After ultracentrifugation, the gradients are separated into fractions, which are analyzed by mass spectrometry on an equal-volume basis. This approach allows the investigation of the composition within the gradient fractions and, moreover, analysis of the migration behavior of different proteins, with particular focus on ANR1, CAS, and PGRL1. Furthermore, the method is validated by confirming the results with immunoblotting and by supporting the findings of previous studies (the identification and PSI-dependent migration of proteins previously described to be part of the CEF supercomplex, such as PGRL1, FNR, and cyt f). Notably, this approach is applicable to a broad range of questions, and the protocol can be adopted, for example, for comparative analyses of multiprotein complex composition isolated under distinct environmental conditions.
Microbiology, Issue 85, Sucrose density gradients, Chlamydomonas, multiprotein complexes, 15N metabolic labeling, thylakoids
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Institutions: Trier University, Trier University.
Selection tasks, in which simple stimuli (e.g., letters) are presented and a target stimulus has to be selected against one or more distractor stimuli, are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation between the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. The repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters for response retrieval effects on human action.
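The two effects described above can be expressed as simple contrasts on the 2x2 design (response relation x distractor relation). The reaction times below are hypothetical values chosen only to illustrate the arithmetic: response retrieval appears as the interaction contrast, distractor inhibition as the main effect of distractor repetition.

```python
# Hypothetical mean reaction times (ms) for the four probe conditions
rt = {
    ("resp_rep", "dist_rep"): 520,
    ("resp_rep", "dist_chg"): 560,
    ("resp_chg", "dist_rep"): 600,
    ("resp_chg", "dist_chg"): 580,
}

# Response-retrieval (binding) effect: distractor repetition speeds responses
# when the response repeats but slows them when it changes -> interaction.
retrieval = ((rt[("resp_rep", "dist_chg")] - rt[("resp_rep", "dist_rep")])
             - (rt[("resp_chg", "dist_chg")] - rt[("resp_chg", "dist_rep")]))

# Distractor-inhibition effect: general advantage of distractor repetition,
# i.e. the main effect averaged over response relation.
inhibition = ((rt[("resp_rep", "dist_chg")] + rt[("resp_chg", "dist_chg")]) / 2
              - (rt[("resp_rep", "dist_rep")] + rt[("resp_chg", "dist_rep")]) / 2)

print(retrieval)   # interaction contrast in ms
print(inhibition)  # main-effect contrast in ms
```

A positive interaction contrast indicates response retrieval by the repeated distractor, while a positive main-effect contrast indicates a general benefit of distractor repetition; in real data these contrasts are computed per participant and tested with an ANOVA.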
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
Non-invasive Optical Measurement of Cerebral Metabolism and Hemodynamics in Infants
Institutions: Massachusetts General Hospital, Harvard Medical School, Université de Caen Basse-Normandie, Boston Children's Hospital, Harvard Medical School, ISS, INC..
Perinatal brain injury remains a significant cause of infant mortality and morbidity, but there is not yet an effective bedside tool that can accurately screen for brain injury, monitor injury evolution, or assess response to therapy. The energy used by neurons is derived largely from tissue oxidative metabolism, and neural hyperactivity and cell death are reflected by corresponding changes in cerebral oxygen metabolism (CMRO2). Thus, measures of CMRO2 reflect neuronal viability and provide critical diagnostic information, making CMRO2 an ideal target for bedside measurement of brain health.
Brain-imaging techniques such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT) yield measures of cerebral glucose and oxygen metabolism, but these techniques require the administration of radionuclides, so they are used in only the most acute cases.
Continuous-wave near-infrared spectroscopy (CWNIRS) provides non-invasive, non-ionizing measures of hemoglobin oxygen saturation (SO2) as a surrogate for cerebral oxygen consumption. However, SO2 is less than ideal as a surrogate for cerebral oxygen metabolism, as it is influenced by both oxygen delivery and consumption. Furthermore, measurements of SO2 are not sensitive enough to detect brain injury hours after the insult [1,2], because oxygen consumption and delivery reach equilibrium after acute transients [3]. We investigated the possibility of using more sophisticated NIRS optical methods to quantify cerebral oxygen metabolism at the bedside in healthy and brain-injured newborns. More specifically, we combined the frequency-domain NIRS (FDNIRS) measure of SO2 with the diffuse correlation spectroscopy (DCS) measure of a blood flow index (CBFi) to yield an index of CMRO2.
With the combined FDNIRS/DCS system we are able to quantify cerebral metabolism and hemodynamics. This represents an improvement over CWNIRS for assessing brain health, brain development, and response to therapy in neonates. Moreover, this method adheres to all neonatal intensive care unit (NICU) policies on infection control and institutional policies on laser safety. Future work will seek to integrate the two instruments to reduce acquisition time at the bedside and to implement real-time feedback on data quality to reduce the rate of data rejection.
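A common way to combine these measures into a CMRO2 index is via the Fick principle, CMRO2 ∝ CBF x (SaO2 − SvO2), with venous saturation estimated from the NIRS tissue saturation under an assumed venous fraction of the blood volume. The sketch below illustrates that calculation; the venous fraction (gamma) and the example values are assumptions for illustration, not values reported in this work.

```python
def cmro2_index(sao2, so2, cbf_index, gamma=0.75):
    """Index of cerebral oxygen metabolism from NIRS/DCS measures via the
    Fick principle: CMRO2 ~ CBF * (SaO2 - SvO2).
    sao2      : arterial oxygen saturation (fraction, e.g. pulse oximetry)
    so2       : tissue hemoglobin saturation from FDNIRS (fraction)
    cbf_index : DCS blood-flow index (arbitrary units)
    gamma     : assumed venous fraction of tissue blood volume, so that
                SO2 = gamma*SvO2 + (1 - gamma)*SaO2."""
    svo2 = (so2 - (1 - gamma) * sao2) / gamma  # estimated venous saturation
    return cbf_index * (sao2 - svo2)

# Example: arterial saturation 0.98, tissue SO2 0.65, arbitrary CBFi of 1.0
print(cmro2_index(0.98, 0.65, 1.0))
```

Because CBFi is in arbitrary units, the result is a relative index rather than an absolute metabolic rate; it is useful for tracking changes in the same infant or comparing groups under a consistent calibration.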
Medicine, Issue 73, Developmental Biology, Neurobiology, Neuroscience, Biomedical Engineering, Anatomy, Physiology, Near infrared spectroscopy, diffuse correlation spectroscopy, cerebral hemodynamic, cerebral metabolism, brain injury screening, brain health, brain development, newborns, neonates, imaging, clinical techniques
GENPLAT: an Automated Platform for Biomass Enzyme Discovery and Cocktail Optimization
Institutions: Michigan State University, Michigan State University.
The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high-throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex-lattice (fractional factorial) mixture models are designed using commercial design-of-experiments statistical software. Enzyme mixtures of high complexity are constructed by robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations.
GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. Making and testing mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such as T. reesei. Proteins can also be purified from commercial enzyme cocktails (e.g., Multifect Xylanase, Novozyme 188). An increasing number of pure enzymes, including glycosyl hydrolases, cell wall-active esterases, proteases, and lyases, are available from commercial sources, e.g., Megazyme (www.megazyme.com), NZYTech (www.nzytech.com), and PROZOMIX (www.prozomix.com).
Design-Expert software (Stat-Ease, Inc.) is used to create simplex-lattice designs and to analyze responses (in this case, Glc and Xyl release). Mixtures contain 4-20 components, which can vary in proportion between 0 and 100%. Assay points typically include the extreme vertices with a sufficient number of intervening points to generate a valid model. In the terminology of experimental design, most of our studies are "mixture" experiments, meaning that the sum of all components adds to a total fixed protein loading (expressed as mg/g glucan). The number of mixtures in the simplex-lattice depends on both the number of components in the mixture and the degree of polynomial (quadratic or cubic). For example, a 6-component experiment will entail 63 separate reactions with an augmented special cubic model, which can detect three-way interactions, whereas only 23 individual reactions are necessary with an augmented quadratic model. For mixtures containing more than eight components, a quadratic experimental design is more practical, and in our experience such models are usually statistically valid.
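The run counts above follow from mixture-design geometry. A plain {q, m} simplex lattice contains C(q + m − 1, m) blends, i.e. all q-component mixtures whose proportions are multiples of 1/m; augmented designs add further runs (overall centroid, axial check blends, replicates), which is why the 6-component counts quoted above (23 and 63) exceed the plain-lattice values. A minimal sketch of the lattice count:

```python
from math import comb

def simplex_lattice_points(q, m):
    """Number of blends in a plain {q, m} simplex-lattice design:
    q = number of mixture components, m = degree of the lattice
    (2 supports a quadratic model, 3 a cubic model)."""
    return comb(q + m - 1, m)

# Plain-lattice sizes before augmentation, for a few component counts
for q in (4, 6, 8):
    print(q, simplex_lattice_points(q, 2), simplex_lattice_points(q, 3))
```

The rapid growth of the cubic lattice with component count is why, as noted above, a quadratic design becomes the practical choice for mixtures of more than about eight components.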
All enzyme loadings are expressed as a percentage of the final total loading (typically 15 mg protein/g glucan in our experiments). For "core" enzymes, the lower limit is set to 5%, based on our experience that yields of Glc and/or Xyl are very low whenever any core enzyme is present at 0%; too many samples with very low Glc or Xyl yields produce poor models. Setting a lower limit in turn determines an upper limit. That is, for a six-component experiment, if the lower limit for each single component is set to 5%, then the upper limit of each single component will be 75%. The lower limits of all other enzymes, considered "accessory," are set to 0%. "Core" and "accessory" are somewhat arbitrary designations and will differ depending on the substrate, but in our studies the core enzymes for release of Glc from corn stover comprise the following enzymes from T. reesei: CBH1 (also known as Cel7A), CBH2 (Cel6A), EG1 (Cel7B), BG (β-glucosidase), EX3 (endo-β-1,4-xylanase, GH10), and BX (β-xylosidase).
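The constraint arithmetic above is easy to check in code. The sketch below (plain Python, not Design-Expert, and without the augmentation points the software adds) enumerates a {q, m} simplex-lattice and maps pseudo-component points onto a region with per-component lower bounds:

```python
from itertools import product

def simplex_lattice(q, m):
    """Enumerate the {q, m} simplex-lattice: every mixture whose
    component proportions are multiples of 1/m and sum to 1."""
    return [tuple(c / m for c in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

def apply_lower_bounds(point, lowers):
    """Rescale a pseudo-component point into the real mixture region
    defined by per-component lower bounds (which must sum to < 1)."""
    slack = 1.0 - sum(lowers)
    return tuple(lo + slack * p for lo, p in zip(lowers, point))
```

With six components each bounded below at 5%, the remaining slack is 70%, so a lattice vertex maps to an upper limit of 75% for any single enzyme, as stated above.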
Bioengineering, Issue 56, cellulase, cellobiohydrolase, glucanase, xylanase, hemicellulase, experimental design, biomass, bioenergy, corn stover, glycosyl hydrolase
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibrations of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light.
We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that, with the conventional sensing and control techniques used to stabilize fundamental laser beams, higher-order LG modes can be purified and stabilized to a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of the generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques, and with the use of standard spherical optics, makes them an ideal candidate for application in a future generation of high-precision interferometry.
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One reason for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases, based on analysis of the orientation of breast tissue patterns. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent the spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods can detect architectural distortion in prior mammograms, taken 15 months (on average) before the clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
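As an illustration of the first step only (the published pipeline also uses phase portraits, node maps, and fractal analysis), a small Gabor filter bank can estimate the dominant orientation of a texture patch. This is a toy sketch in plain Python; the kernel parameters are arbitrary choices, not the authors' values:

```python
import math

def gabor_kernel(theta, ksize=9, sigma=2.0, lam=4.0, gamma=0.5):
    """Real (even) Gabor kernel; theta is the direction of the
    sinusoidal intensity variation the filter is tuned to."""
    h = ksize // 2
    kern = []
    for y in range(-h, h + 1):
        row = []
        for x in range(-h, h + 1):
            xt = x * math.cos(theta) + y * math.sin(theta)
            yt = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xt * xt + gamma * gamma * yt * yt) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * xt / lam))
        kern.append(row)
    return kern

def filter_energy(img, kern):
    """Sum of squared responses of kern over the valid region of img."""
    n, ksize = len(img), len(kern)
    h = ksize // 2
    total = 0.0
    for cy in range(h, n - h):
        for cx in range(h, n - h):
            r = sum(kern[j][i] * img[cy + j - h][cx + i - h]
                    for j in range(ksize) for i in range(ksize))
            total += r * r
    return total

def dominant_orientation(img, n_angles=8):
    """Angle (radians) whose Gabor filter yields the largest energy."""
    angles = [i * math.pi / n_angles for i in range(n_angles)]
    return max(angles, key=lambda t: filter_energy(img, gabor_kernel(t)))
```

A synthetic patch of stripes varying along x gives its largest response to the matched filter at theta = 0; in the real method, per-pixel orientation fields feed the phase-portrait analysis rather than a single patch-level angle.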
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which segmentation approach to take.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
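Of the four approaches, thresholding followed by connected-component labeling is the easiest to sketch. The toy 2D version below (real EM volumes are 3D, far larger, and much noisier) assumes intensities are already normalized:

```python
from collections import deque

def threshold(img, t):
    """Binary mask: pixels at or above intensity t are foreground."""
    return [[1 if v >= t else 0 for v in row] for row in img]

def label_components(mask):
    """4-connected component labeling via breadth-first flood fill.
    Returns (label image, number of components found)."""
    n, m = len(mask), len(mask[0])
    labels = [[0] * m for _ in range(n)]
    current = 0
    for y in range(n):
        for x in range(m):
            if mask[y][x] and not labels[y][x]:
                current += 1
                labels[y][x] = current
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < n and 0 <= nx < m
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```

Each labeled component can then be surface-rendered or measured separately, which is the quantification step the workflow above culminates in.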
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of Wollongong, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles [1,2]. However, many challenges, such as energy density and battery lifetimes, need to be overcome before this battery technology can be widely implemented in such applications [3]. This research is challenging, and we outline a method to address these challenges using in situ neutron powder diffraction (NPD) to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries.
We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the 'roll-over' cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ NPD data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism for investigating the mechanisms that underlie developmental biology and for studying human disease pathology, due to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are specifically appealing for chemical genetics because of their ability to produce large clutches of transparent embryos, which are externally fertilized. Furthermore, zebrafish embryos can be easily drug-treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, using chemical genetics and WISH, the zebrafish becomes a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are not accessible or remain cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires only basic resources and can be accomplished by a single individual or a small team in an efficient period of time. Thus, this protocol provides a feasible strategy that research groups can implement to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways with medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several ways, i.e. voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
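FA, the voxelwise metric compared throughout these analyses, is a fixed function of the three diffusion-tensor eigenvalues. A minimal implementation of the standard formula, ranging from 0 (isotropic diffusion) to 1 (diffusion along a single axis):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the diffusion-tensor eigenvalues:
    FA = sqrt(1/2) * sqrt(((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2)
                          / (l1^2 + l2^2 + l3^2))."""
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0:
        return 0.0  # degenerate tensor: no diffusion signal
    return math.sqrt(0.5 * num / den)
```

For example, equal eigenvalues give FA = 0, while eigenvalues (1, 0, 0) give FA = 1; WM tracts fall in between, which is what the voxelwise and tractwise comparisons above quantify.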
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Protein Crystallization for X-ray Crystallography
Institutions: Yale University.
Using the three-dimensional structure of biological macromolecules to infer how they function is one of the most important fields of modern biology. The availability of atomic resolution structures provides a deep and unique understanding of protein function, and helps to unravel the inner workings of the living cell. To date, 86% of the Protein Data Bank (rcsb-PDB) entries are macromolecular structures that were determined using X-ray crystallography.
To obtain crystals suitable for crystallographic studies, the macromolecule (e.g. protein, nucleic acid, protein-protein complex or protein-nucleic acid complex) must be purified to homogeneity, or as close as possible to homogeneity. The homogeneity of the preparation is a key factor in obtaining crystals that diffract to high resolution (Bergfors, 1999; McPherson, 1999).
Crystallization requires bringing the macromolecule to supersaturation. The sample should therefore be concentrated to the highest possible concentration without causing aggregation or precipitation of the macromolecule (usually 2-50 mg/mL). Introducing a precipitating agent into the sample can promote the nucleation of protein crystals in the solution, which can result in large three-dimensional crystals growing from the solution. There are two main techniques to obtain crystals: vapor diffusion and batch crystallization. In vapor diffusion, a drop containing a mixture of precipitant and protein solutions is sealed in a chamber with pure precipitant. Water vapor then diffuses out of the drop until the osmolarity of the drop and the precipitant are equal (Figure 1A). The dehydration of the drop causes a slow concentration of both protein and precipitant until equilibrium is achieved, ideally in the crystal nucleation zone of the phase diagram. The batch method relies on bringing the protein directly into the nucleation zone by mixing protein with the appropriate amount of precipitant (Figure 1B). This method is usually performed under a paraffin/mineral oil mixture to prevent the diffusion of water out of the drop.
Here we will demonstrate two kinds of experimental setup for vapor diffusion, hanging drop and sitting drop, in addition to batch crystallization under oil.
Molecular Biology, Issue 47, protein crystallization, nucleic acid crystallization, vapor diffusion, X-ray crystallography, precipitant
Hydrogel Nanoparticle Harvesting of Plasma or Urine for Detecting Low Abundance Proteins
Institutions: George Mason University, Ceres Nanosciences.
Novel biomarker discovery plays a crucial role in providing more sensitive and specific disease detection. Unfortunately, many low-abundance biomarkers that exist in biological fluids cannot be easily detected with mass spectrometry or immunoassays because they are present in very low concentration, are labile, and are often masked by high-abundance proteins such as albumin or immunoglobulin. Bait-containing, poly(N-isopropylacrylamide) (NIPAm)-based nanoparticles are able to overcome these physiological barriers. In one step they capture, concentrate, and preserve biomarkers from body fluids. Low-molecular-weight analytes enter the core of the nanoparticle and are captured by different organic chemical dyes, which act as high-affinity protein baits. The nanoparticles are able to concentrate the proteins of interest by several orders of magnitude. This concentration factor is sufficient to raise protein levels to within the detection limits of current mass spectrometers, western blotting, and immunoassays. Nanoparticles can be incubated with a plethora of biological fluids, greatly enriching the concentration of low-molecular-weight proteins and peptides while excluding albumin and other high-molecular-weight proteins. Our data show that a 10,000-fold amplification in the concentration of a particular analyte can be achieved, enabling mass spectrometry and immunoassays to detect previously undetectable biomarkers.
Bioengineering, Issue 90, biomarker, hydrogel, low abundance, mass spectrometry, nanoparticle, plasma, protein, urine
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimate of the ideal midline is first computed based on the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, further features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend invasive ICP monitoring.
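The pre-screening step ultimately classifies a per-patient feature vector. As a stand-in illustration only — the system described uses SVMs in RapidMiner, and the feature names below (midline shift, blood amount, age, injury severity score) are ordered arbitrarily for this sketch — a nearest-centroid classifier in the same role:

```python
import math

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_fit(X, y):
    """One centroid per class label; a crude stand-in for SVM training.
    Features should be scaled to comparable ranges beforehand."""
    return {c: centroid([x for x, lab in zip(X, y) if lab == c])
            for c in sorted(set(y))}

def nearest_centroid_predict(model, x):
    """Predict the class whose centroid is closest in Euclidean distance."""
    return min(model, key=lambda c: math.dist(x, model[c]))
```

Any real classifier in this slot needs the feature selection and validation described above; this toy version only shows the shape of the prediction problem (feature vector in, ICP risk level out).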
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques