In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies of perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2].
Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. [7,8]). However, generally speaking, the existing methods have three broad types of shortcomings.
First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore differ from the variability in natural categories, or be optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of externally imposed constraints.
Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases.
Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms.
Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics [15,16]. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects [9,13]. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper.
We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have.
Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
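VM and VP are described above only at the algorithmic level. Purely as a loose, hypothetical illustration of the VP idea (not the authors' implementation), the following sketch grows an object category by repeated random 'descent with modification', with a shape reduced to a bare parameter vector and all function names and parameter values invented for the example:

```python
import random

def mutate(shape, sigma=0.1, rng=random):
    # Perturb each shape parameter independently (a stand-in for the
    # local shape deformations a real VM/VP system would apply to a mesh).
    return [x + rng.gauss(0.0, sigma) for x in shape]

def virtual_phylogenesis(ancestor, generations=3, offspring=2, rng=random):
    # Each generation, every current shape leaves `offspring` mutated
    # descendants; all descendants of one ancestor form one object
    # category, so within-category variability arises from the simulated
    # process rather than from experimenter-imposed rules.
    population = [ancestor]
    for _ in range(generations):
        population = [mutate(s, rng=rng) for s in population
                      for _ in range(offspring)]
    return population

category = virtual_phylogenesis([0.0, 0.0, 0.0])
print(len(category))  # 2**3 = 8 category members
```

Selection pressure (e.g. culling descendants by a fitness criterion) would be layered on top of this loop in the same way.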
Isolation of Ribosome Bound Nascent Polypeptides in vitro to Identify Translational Pause Sites Along mRNA
Institutions: Cleveland State University.
The rate of translational elongation is non-uniform. mRNA secondary structure, codon usage and mRNA-associated proteins may alter ribosome movement on the message (for review see [1]). However, it is now widely accepted that synonymous codon usage is the primary cause of non-uniform translational elongation rates [1]. Synonymous codons are not used with identical frequency: a bias exists, with some codons used more frequently than others [2]. Codon bias is organism- as well as tissue-specific [2,3]. Moreover, the frequency of codon usage is directly proportional to the concentration of the cognate tRNAs [4]. Thus, a frequently used codon is served by a larger pool of corresponding tRNAs, which implies that a frequent codon is translated faster than an infrequent one. Consequently, regions of mRNA enriched in rare codons (potential pause sites) will, as a rule, slow down ribosome movement on the message and cause accumulation of nascent peptides of the corresponding sizes [5-8]. These pause sites can affect protein expression, mRNA stability and protein folding (for review see [9]). Indeed, it has been shown that alleviating such pause sites can alter ribosome movement on mRNA and may subsequently affect the efficiency of co-translational (in vivo) protein folding [1,7,10,11]. To understand the process of protein folding in vivo, in the cell, which is ultimately coupled to protein synthesis, it is essential to gain comprehensive insight into the impact of codon usage and tRNA content on ribosome movement along mRNA during translational elongation.
Here we describe a simple technique that can be used to locate major translation pause sites for a given mRNA translated in various cell-free systems [6-8]. The procedure is based on the isolation of nascent polypeptides that accumulate on ribosomes during in vitro translation of a target mRNA. The rationale is that at low-frequency codons, the increased residence time of the ribosome results in increased amounts of nascent peptides of the corresponding sizes. In vitro transcribed mRNA is used for in vitro translation reactions in the presence of radioactively labeled amino acids to allow detection of the nascent chains. To isolate ribosome-bound nascent polypeptide complexes, the translation reaction is layered on top of a 30% glycerol solution and centrifuged. Nascent polypeptides in the polysomal pellet are then treated with ribonuclease A and resolved by SDS-PAGE. This technique can potentially be used for any protein; it allows analysis of ribosome movement along the mRNA and detection of the major pause sites. Additionally, this protocol can be adapted to study factors and conditions that alter ribosome movement and thus potentially the function or conformation of the protein.
Genetics, Issue 65, Molecular Biology, Ribosome, Nascent polypeptide, Co-translational protein folding, Synonymous codon usage, gene regulation
Measuring Growth and Gene Expression Dynamics of Tumor-Targeted S. Typhimurium Bacteria
Institutions: Massachusetts Institute of Technology, University of California, San Diego , University of California, San Diego , University of California, San Diego , Broad Institute of Harvard and MIT, Brigham and Women's Hospital, Massachusetts Institute of Technology, Howard Hughes Medical Institute.
The goal of these experiments is to generate quantitative time-course data on the growth and gene expression dynamics of attenuated S. typhimurium bacterial colonies growing inside tumors.
We generated model xenograft tumors in mice by subcutaneous injection of a human ovarian cancer cell line, OVCAR-8 (NCI DCTD Tumor Repository, Frederick, MD).
We transformed attenuated strains of S. typhimurium bacteria (ELH430:SL1344 phoPQ- [1]) with a constitutively expressed luciferase (luxCDABE) plasmid for visualization [2]. These strains specifically colonize tumors while remaining essentially non-virulent to the mouse [1]. Once measurable tumors were established, bacteria were injected intravenously via the tail vein at varying doses. Tumor-localized bacterial gene expression was monitored in real time over the course of 60 hours using an in vivo imaging system (IVIS). At each time point, tumors were excised, homogenized, and plated to quantify bacterial colonies for correlation with the gene expression data.
Together, these data yield a quantitative measure of the in vivo growth and gene expression dynamics of bacteria growing inside tumors.
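The plate-count time course lends itself to a standard growth-rate estimate. As a hypothetical illustration (the numbers and function name are invented, not data from these experiments), the specific growth rate follows from a log-linear fit of CFU counts:

```python
import math

def fit_growth_rate(times_h, cfu):
    # Least-squares fit of ln(CFU) versus time: the slope is the specific
    # growth rate (per hour); doubling time = ln(2) / slope.
    n = len(times_h)
    logs = [math.log(c) for c in cfu]
    t_mean = sum(times_h) / n
    y_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times_h, logs))
             / sum((t - t_mean) ** 2 for t in times_h))
    return slope, math.log(2) / slope

# Hypothetical plate-count time course (CFU per gram of tumor):
rate, t_double = fit_growth_rate([0, 12, 24, 36], [1e4, 1e5, 1e6, 1e7])
print(round(t_double, 2))  # hours per doubling
```

The same fit, applied per time window, would reveal whether intratumoral growth slows as the colony saturates.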
Infection, Issue 77, Cancer Biology, Immunology, Infectious Diseases, Microbiology, Genetics, Molecular Biology, Cellular Biology, Bioengineering, Biomedical Engineering, Bacteria, Synthetic Biology, Biological Agents, Time-Lapse Imaging, dynamics (physics), cancer therapy, bacteria population dynamics, in-vivo imaging, cell, imaging
Driving Simulation in the Clinic: Testing Visual Exploratory Behavior in Daily Life Activities in Patients with Visual Field Defects
Institutions: Universitätsmedizin Charité, Universitätsmedizin Charité, Humboldt Universität zu Berlin.
Patients suffering from homonymous hemianopia after infarction of the posterior cerebral artery (PCA) report different degrees of constraint in daily life, despite similar visual deficits. We assume this could be due to variable development of compensatory strategies such as altered visual scanning behavior. Scanning compensatory therapy (SCT) is studied as part of visual training after infarction, alongside vision restoration therapy. SCT consists of learning to make larger eye movements into the blind field, enlarging the visual field of search, which has been proven to be the most useful strategy [1], not only in natural search tasks but also in mastering daily life activities [2]. Nevertheless, in clinical routine it is difficult to identify individual levels and training effects of compensatory behavior, since this requires measurement of eye movements in a head-unrestrained condition. Studies have demonstrated that unrestrained head movements alter visual exploratory behavior compared to a head-restrained laboratory condition [3].
Martin et al. [4] and Hayhoe et al. [5] showed that behavior demonstrated in a laboratory setting cannot easily be transferred to a natural condition. Hence, our goal was to develop a study set-up that quickly uncovers different compensatory oculomotor strategies in a realistic testing situation: patients are tested in the clinical environment in a driving simulator. SILAB software (Wuerzburg Institute for Traffic Sciences GmbH, WIVW) was used to program driving scenarios of varying complexity and to record the driver's performance. The software was combined with a head-mounted infrared video pupil tracker recording head and eye movements (EyeSeeCam, University of Munich Hospital, Clinical Neurosciences).
The positioning of the patient in the driving simulator and the positioning, adjustment and calibration of the camera are demonstrated. Typical performances of a patient with and without a compensatory strategy and of a healthy control are illustrated in this pilot study. Different oculomotor behaviors (frequency and amplitude of eye and head movements) are evaluated very quickly during the drive itself by dynamic overlay pictures indicating where the subject's gaze is located on the screen, and by analyzing the recorded data. Compensatory gaze behavior in a patient leads to driving performance comparable to a healthy control, while the performance of a patient without compensatory behavior is significantly worse. The eye- and head-movement data as well as driving performance are discussed with respect to different oculomotor strategies and, in a broader context, with respect to possible training effects throughout the testing session and implications for rehabilitation potential.
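Once gaze traces are recorded, simple summary statistics can separate compensatory from non-compensatory scanning. The sketch below is a hypothetical illustration (the metric definitions and sample values are invented, not those used in this study):

```python
def hemifield_scan_stats(gaze_x_deg, blind_side="left"):
    # Two simple markers of compensatory scanning into the blind hemifield:
    # the fraction of gaze samples directed into it, and the largest
    # excursion into it (larger values suggest a compensatory strategy).
    into = [x for x in gaze_x_deg if (x < 0) == (blind_side == "left")]
    if not into:
        return 0.0, 0.0
    return len(into) / len(gaze_x_deg), max(abs(x) for x in into)

# Hypothetical horizontal gaze samples (degrees; negative = left):
frac, max_exc = hemifield_scan_stats([-20, -5, 3, 10, -15, 2, -25, 4])
print(frac, max_exc)
```

In practice these statistics would be computed on saccade-segmented, head-compensated gaze data rather than raw samples.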
Medicine, Issue 67, Neuroscience, Physiology, Anatomy, Ophthalmology, compensatory oculomotor behavior, driving simulation, eye movements, homonymous hemianopia, stroke, visual field defects, visual field enlargement
Optimize Flue Gas Settings to Promote Microalgae Growth in Photobioreactors via Computer Simulations
Institutions: Washington University in St. Louis, St. Louis, Wuhan University of China, Washington University in St. Louis.
Flue gas from power plants can promote algal cultivation and reduce greenhouse gas emissions [1]. Microalgae not only capture solar energy more efficiently than plants [3], but also synthesize advanced biofuels [2-4]. Generally, atmospheric CO2 is not a sufficient source to support maximal algal growth [5]. On the other hand, the high concentrations of CO2 in industrial exhaust gases have adverse effects on algal physiology. Consequently, both the cultivation conditions (such as nutrients and light) and the control of flue gas flow into the photobioreactors are important for developing an efficient “flue gas to algae” system. Researchers have proposed different photobioreactor configurations [4,6] and cultivation strategies [7,8] with flue gas. Here, we present a protocol that demonstrates how to use models to predict microalgal growth in response to flue gas settings. We perform both experimental illustration and model simulations to determine the favorable conditions for algal growth with flue gas. We develop a Monod-based model, coupled with mass transfer and light intensity equations, to simulate microalgal growth in a homogeneous photobioreactor. The model simulation compares algal growth and flue gas consumption under different flue gas settings. The model illustrates: 1) how algal growth is influenced by different volumetric mass transfer coefficients of CO2; 2) how the optimal CO2 concentration for algal growth can be found via the dynamic optimization approach (DOA); 3) how a rectangular on-off flue gas pulse can be designed to promote algal biomass growth and reduce flue gas usage. On the experimental side, we present a protocol for growing Chlorella under flue gas (generated by natural gas combustion). The experimental results qualitatively validate the model predictions that high-frequency flue gas pulses can significantly improve algal cultivation.
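The article specifies the full model; purely as an illustrative sketch of the general idea (a Monod growth term coupled to CO2 mass transfer under a rectangular on-off gas pulse), with all parameter values invented and the light-intensity term omitted:

```python
def simulate(pulse_period_h=1.0, duty=0.5, hours=48.0, dt=0.01):
    # Monod growth on dissolved CO2, with gas-liquid mass transfer while
    # the flue gas valve is open; when the gas is off, dissolved CO2 is
    # assumed to be stripped/consumed and decays.
    mu_max, Ks = 0.10, 0.5     # 1/h, half-saturation constant (illustrative)
    kLa, c_sat = 5.0, 20.0     # 1/h volumetric mass transfer, CO2 saturation
    k_loss = 2.0               # 1/h CO2 loss rate while the gas is off
    yield_xc = 0.5             # biomass yield per unit CO2 consumed
    x, c, t = 0.05, 0.0, 0.0   # biomass, dissolved CO2, time
    while t < hours:
        gas_on = (t % pulse_period_h) < duty * pulse_period_h
        mu = mu_max * c / (Ks + c)                       # Monod kinetics
        dc = kLa * (c_sat - c) if gas_on else -k_loss * c
        c = max(c + (dc - mu * x / yield_xc) * dt, 0.0)  # CO2 balance
        x += mu * x * dt                                 # biomass growth
        t += dt
    return x

# Same total gas-on fraction, different pulse frequency: frequent short
# pulses keep dissolved CO2 available and support more growth than long cycles.
print(simulate(pulse_period_h=0.5) > simulate(pulse_period_h=8.0))
```

This forward-Euler toy reproduces only the qualitative prediction about pulse frequency; the protocol's Simulink model adds light limitation and calibrated parameters.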
Environmental Sciences, Issue 80, Microbiology, Cellular Biology, Marine Biology, Primary Cell Culture, Chlorella, CO2, mass transfer, Monod model, On-off pulse, Simulink
Aseptic Laboratory Techniques: Plating Methods
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as the Material Safety Data Sheets (MSDS) for infectious substances to determine the biohazard classification, safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
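The enumeration methods (pour- and spread-plating) rely on standard plate-count arithmetic: the original concentration is the colony count divided by the total dilution and the volume plated. A minimal sketch, with hypothetical example numbers:

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    # Standard plate-count calculation: back-calculate the concentration
    # in the undiluted sample from the colonies counted on one plate.
    return colonies / (dilution_factor * volume_plated_ml)

# 180 colonies on a spread plate of 0.1 ml taken from the 10^-6 dilution tube:
print(cfu_per_ml(180, 1e-6, 0.1))  # ~1.8 x 10^9 CFU/ml
```

By convention, only plates bearing roughly 30-300 colonies are considered statistically reliable for this calculation.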
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
A Toolkit to Enable Hydrocarbon Conversion in Aqueous Environments
Institutions: Delft University of Technology, Delft University of Technology.
This work puts forward a toolkit that enables the conversion of alkanes by Escherichia coli and presents a proof of principle of its applicability. The toolkit consists of multiple standard interchangeable parts (BioBricks) [9] addressing the conversion of alkanes, the regulation of gene expression, and survival in toxic hydrocarbon-rich environments.
A three-step pathway for alkane degradation was implemented in E. coli to enable the conversion of medium- and long-chain alkanes to their respective alkanols, alkanals and, ultimately, alkanoic acids. The latter were metabolized via the native β-oxidation pathway. To facilitate the oxidation of medium-chain alkanes (C5-C13) and cycloalkanes (C5-C8), four genes (alkB2) of the alkane hydroxylase system from Gordonia were transformed into E. coli. For the conversion of long-chain alkanes (C15-C36), the ladA gene from Geobacillus thermodenitrificans was implemented. For the required further steps of the degradation process, ADH and ALDH (originating from G. thermodenitrificans) were introduced [10,11]. The activity was measured by resting cell assays, and enzyme activity was observed for each oxidative step.
To optimize process efficiency, expression was induced only under low-glucose conditions: a substrate-regulated promoter, pCaiF, was used. pCaiF is present in E. coli K12 and regulates the expression of the genes involved in the degradation of non-glucose carbon sources.
The last part of the toolkit - targeting survival - was implemented using the solvent tolerance genes PhPFDα and PhPFDβ, both from Pyrococcus horikoshii OT3. Organic solvents can induce cell stress and decrease survivability by negatively affecting protein folding. As chaperones, PhPFDα and β improve the protein folding process, e.g. in the presence of alkanes. Expression of these genes led to improved hydrocarbon tolerance, shown by an increased growth rate (up to 50%) in the presence of 10% n-hexane in the culture medium.
Summarizing, the results indicate that the toolkit enables E. coli to convert and tolerate hydrocarbons in aqueous environments. As such, it represents an initial step towards a sustainable solution for oil remediation using a synthetic biology approach.
Bioengineering, Issue 68, Microbiology, Biochemistry, Chemistry, Chemical Engineering, Oil remediation, alkane metabolism, alkane hydroxylase system, resting cell assay, prefoldin, Escherichia coli, synthetic biology, homologous interaction mapping, mathematical model, BioBrick, iGEM
Mapping Bacterial Functional Networks and Pathways in Escherichia Coli using Synthetic Genetic Arrays
Institutions: University of Toronto, University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GIs) [1]. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships [1]. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships [2]. Large-scale GI maps have been reported for eukaryotic organisms like yeast [3-7], but GI information remains sparse for prokaryotes [8], which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods [9,10].
Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale [9], using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format.
Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection [11]) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity [9,12,13]) to query the functional associations of non-essential and essential genes, respectively. After conjugation and the ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics. After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system [14]. GIs are revealed when the growth of a double mutant is either significantly better or worse than expected [9]. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process [2]. Here, the loss of a single gene is buffered, such that either single mutant is viable; however, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex [2], as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex, such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred [9].
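Scoring GIs from colony sizes commonly rests on a multiplicative fitness expectation: for independent genes, double-mutant fitness is the product of the single-mutant fitnesses, and the interaction score is the deviation from that product. The sketch below illustrates that general idea (not necessarily the exact eSGA scoring), with fitness expressed as colony size relative to wild type and all numbers invented:

```python
def gi_score(w_a, w_b, w_ab):
    # Multiplicative expectation: epsilon = observed double-mutant fitness
    # minus the product of the single-mutant fitnesses.
    # Negative epsilon = aggravating GI; positive epsilon = alleviating GI.
    return w_ab - w_a * w_b

# Fitness values = colony sizes normalized to wild type (hypothetical):
print(gi_score(0.8, 0.9, 0.30))  # aggravating: 0.30 observed vs 0.72 expected
print(gi_score(0.5, 0.6, 0.50))  # alleviating: 0.50 observed vs 0.30 expected
```

In a real screen, epsilon values are computed across the whole array and thresholded statistically before network construction.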
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set-point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set-point viral load and longitudinal CD4+ T cell decline.
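Replication capacity from such cultures is typically summarized as the slope of log-transformed viral antigen accumulation, normalized to the reference backbone grown in parallel. The following is a hypothetical sketch of that general calculation, not the authors' exact scoring; all values are invented:

```python
import math

def replication_capacity(p24_day2, p24_day7, ref_day2, ref_day7):
    # Slope of log10(p24) between two culture time points (here days 2 and 7),
    # divided by the slope of the parallel reference (e.g. parental MJ4)
    # culture, so a score of 1.0 means chimera and reference replicate alike.
    slope = (math.log10(p24_day7) - math.log10(p24_day2)) / 5.0
    ref_slope = (math.log10(ref_day7) - math.log10(ref_day2)) / 5.0
    return slope / ref_slope

# Hypothetical p24 readings (pg/ml): chimera grows 1,000-fold,
# reference grows 10,000-fold over the same interval:
print(round(replication_capacity(50, 5e4, 50, 5e5), 2))  # 0.75
```

Normalizing to a same-run reference culture absorbs day-to-day variation in cells and inocula, which is what makes scores comparable across 149 patient-derived chimeras.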
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Chromatin Interaction Analysis with Paired-End Tag Sequencing (ChIA-PET) for Mapping Chromatin Interactions and Understanding Transcription Regulation
Institutions: Agency for Science, Technology and Research, Singapore, A*STAR-Duke-NUS Neuroscience Research Partnership, Singapore, National University of Singapore, Singapore.
Genomes are organized into three-dimensional structures, adopting higher-order conformations inside the micron-sized nuclear space [7,2,12]. Such architectures are not random and involve interactions between gene promoters and regulatory elements [13]. The binding of transcription factors to specific regulatory sequences brings about a network of transcription regulation and coordination [1,14].
Chromatin Interaction Analysis by Paired-End Tag Sequencing (ChIA-PET) was developed to identify these higher-order chromatin structures [5,6]. Cells are fixed and interacting loci are captured by covalent DNA-protein cross-links. To minimize non-specific noise and reduce complexity, as well as to increase the specificity of the chromatin interaction analysis, chromatin immunoprecipitation (ChIP) is used against specific protein factors to enrich chromatin fragments of interest before proximity ligation. Ligation involving half-linkers subsequently forms covalent links between pairs of DNA fragments tethered together within individual chromatin complexes. The flanking MmeI restriction enzyme sites in the half-linkers allow extraction of paired end tag-linker-tag constructs (PETs) upon MmeI digestion. As the half-linkers are biotinylated, these PET constructs are purified using streptavidin magnetic beads. The purified PETs are ligated with next-generation sequencing adaptors, and a catalog of interacting fragments is generated via next-generation sequencers such as the Illumina Genome Analyzer. Mapping and bioinformatics analysis are then performed to identify ChIP-enriched binding sites and ChIP-enriched chromatin interactions [8].
We have produced a video to demonstrate critical aspects of the ChIA-PET protocol, especially the preparation of ChIP as the quality of ChIP plays a major role in the outcome of a ChIA-PET library. As the protocols are very long, only the critical steps are shown in the video.
Genetics, Issue 62, ChIP, ChIA-PET, Chromatin Interactions, Genomics, Next-Generation Sequencing
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. 
We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
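The quantitative strength of NMR noted above, calibration of every metabolite against a single reference compound, is simple proportionality arithmetic: signal area scales with concentration times the number of contributing protons. A minimal sketch, with a hypothetical lactate-versus-TSP example (all values invented):

```python
def metabolite_conc(area_met, n_h_met, area_ref, n_h_ref, conc_ref_mm):
    # NMR signal area is proportional to concentration times the number of
    # equivalent protons behind the resonance, so one reference compound of
    # known concentration calibrates every metabolite in the spectrum.
    return (area_met / area_ref) * (n_h_ref / n_h_met) * conc_ref_mm

# Hypothetical example: a lactate methyl doublet (3 H) quantified against
# 0.5 mM TSP (9 equivalent H):
print(metabolite_conc(area_met=12.0, n_h_met=3,
                      area_ref=10.0, n_h_ref=9, conc_ref_mm=0.5))  # ~1.8 mM
```

The same one-line formula applies to every resolved resonance, which is why no metabolite-by-metabolite calibration curve is needed.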
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
Rapid Synthesis and Screening of Chemically Activated Transcription Factors with GFP-based Reporters
Institutions: Princeton University, Princeton University, California Institute of Technology.
Synthetic biology aims to rationally design and build synthetic circuits with desired quantitative properties, as well as provide tools to interrogate the structure of native control circuits. In both cases, the ability to program gene expression in a rapid and tunable fashion, with no off-target effects, can be useful. We have constructed yeast strains containing the ACT1
promoter upstream of a URA3
cassette followed by the ligand-binding domain of the human estrogen receptor and VP16. By transforming this strain with a linear PCR product containing a DNA binding domain and selecting against the presence of URA3
, a constitutively expressed artificial transcription factor (ATF) can be generated by homologous recombination. ATFs engineered in this fashion can activate a unique target gene in the presence of inducer, thereby eliminating both the off-target activation and nonphysiological growth conditions found with commonly used conditional gene expression systems. A simple method for the rapid construction of GFP reporter plasmids that respond specifically to a native or artificial transcription factor of interest is also provided.
Genetics, Issue 81, transcription, transcription factors, artificial transcription factors, zinc fingers, Zif268, synthetic biology
Test Samples for Optimizing STORM Super-Resolution Microscopy
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy. However, because the image is acquired in a very different way from normal, by building it up molecule by molecule, there are significant challenges for users in trying to optimize their image acquisition. To aid this process and to gain more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
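The resolutions quoted above are set largely by single-molecule localization precision, which by a widely used approximation (Thompson's formula, simplified here by dropping its pixelation and background-noise terms) scales as the PSF width divided by the square root of the collected photon count. A minimal sketch with hypothetical values:

```python
import math

def localization_precision_nm(psf_sigma_nm, photons):
    # Simplified Thompson approximation: the precision of fitting a single
    # molecule's position improves with the square root of the number of
    # photons collected per switching event (pixel-size and background
    # terms omitted for clarity).
    return psf_sigma_nm / math.sqrt(photons)

# A dye yielding ~1,000 photons per switching event, PSF sigma ~150 nm:
print(round(localization_precision_nm(150.0, 1000), 1))  # ~4.7 nm
```

This is one reason dye choice and buffer conditions matter so much: brighter, longer-lasting switching events directly tighten each localization.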
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Institutions: University of California, Los Angeles.
In the biological sciences, there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Antonie van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with an impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq PCR.
PCR is a powerful amplification technique that can generate an ample supply of a specific DNA segment (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, PCR has pitfalls that can complicate the reaction and produce spurious results. When PCR fails, it can yield many nonspecific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels; sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced into the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
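The first two learning goals can be sketched in a few lines: a conventional three-step cycling program, plus a rough primer melting temperature from the Wallace rule (Tm = 2(A+T) + 4(G+C), valid only for short primers). The temperatures, times, and primer sequence are typical textbook values assumed for illustration, not values taken from this protocol.

```python
def wallace_tm(primer):
    """Rough melting temperature (Wallace rule, for primers under ~14 nt):
    Tm = 2*(A+T) + 4*(G+C) in degrees C."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

# A conventional three-step cycling program (step, temperature in C, seconds);
# these are typical textbook settings, not the protocol's own values.
cycling = [
    ("initial denaturation", 95, 120),
    ("denaturation", 95, 30),
    ("annealing", 55, 30),    # usually set ~5 C below the primer Tm
    ("extension", 72, 60),    # ~1 min per kb of amplicon for Taq
    ("final extension", 72, 300),
]

tm = wallace_tm("ATGCATGCATGC")  # a hypothetical 12-nt primer
```

In practice, the annealing temperature is then tuned around this estimate during optimization, which is exactly the kind of troubleshooting the protocol walks through.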
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Acquiring Fluorescence Time-lapse Movies of Budding Yeast and Analyzing Single-cell Dynamics using GRAFTS
Institutions: Massachusetts Institute of Technology.
Fluorescence time-lapse microscopy has become a powerful tool for studying many biological processes at the single-cell level. In particular, movies depicting the temporal dependence of gene expression provide insight into the dynamics of its regulation; however, there are many technical challenges to obtaining and analyzing fluorescence movies of single cells. We describe here a simple protocol using a commercially available microfluidic culture device to generate such data, and a MATLAB-based software package with a graphical user interface (GUI) to quantify the fluorescence images. The software segments and tracks cells, enables the user to visually curate errors in the data, and automatically assigns lineage and division times. The GUI further analyzes the time series to produce whole-cell traces as well as their first and second time derivatives. While the software was designed for S. cerevisiae, its modularity and versatility should allow it to serve as a platform for studying other cell types with few modifications.
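The first and second time derivatives produced by the GUI can be sketched as finite differences over a sampled fluorescence trace. The sketch below uses NumPy's central-difference gradient; the 10-minute sampling interval and the linear synthetic trace are assumptions for illustration, not details of the GRAFTS package.

```python
import numpy as np

def trace_derivatives(trace, dt_min=10.0):
    """First and second time derivatives of a single-cell fluorescence
    trace sampled every dt_min minutes, via central finite differences
    (one-sided differences at the endpoints)."""
    d1 = np.gradient(trace, dt_min)
    d2 = np.gradient(d1, dt_min)
    return d1, d2

# A linearly increasing synthetic trace: constant first derivative
# (the production rate) and zero second derivative.
t = np.arange(0, 200, 10.0)              # minutes
trace = 5.0 * t + 100.0                  # arbitrary fluorescence units
d1, d2 = trace_derivatives(trace)
```

On real single-cell data the trace is noisy, so smoothing before differentiation (not shown) is usually essential; differentiation amplifies noise.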
Microbiology, Issue 77, Cellular Biology, Molecular Biology, Genetics, Biophysics, Saccharomyces cerevisiae, Microscopy, Fluorescence, Cell Biology, microscopy/fluorescence and time-lapse, budding yeast, gene expression dynamics, segmentation, lineage tracking, image tracking, software, yeast, cells, imaging
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow their visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All of these characteristics need to be considered when deciding which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded, stained samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, and (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, typically one of these four approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
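As a minimal sketch of the semi-automated end of this spectrum (category 3 above), the fragment below binarizes a synthetic volume at a user-chosen intensity threshold and reports what fraction of the volume the candidate feature occupies, one of the data set characteristics the triage scheme weighs. Real pipelines add noise filtering, connected-component labeling, and manual curation; everything here is invented for illustration.

```python
import numpy as np

def threshold_segment(volume, level):
    """Semi-automated first pass: binarize an image volume at an intensity
    threshold chosen by the user. Returns a boolean mask of candidate
    feature voxels and the fraction of the volume they occupy."""
    mask = volume >= level
    return mask, float(mask.mean())

# Synthetic 3-D volume: a bright 5x5x5 cube (the 'feature') in a dark
# 20x20x20 background, standing in for, e.g., a stained organelle.
vol = np.zeros((20, 20, 20))
vol[5:10, 5:10, 5:10] = 1.0
mask, occupancy = threshold_segment(vol, 0.5)
```

When the feature occupies only a tiny fraction of the volume, as here, thresholding alone tends to be dominated by noise in real data, which is one reason the triage scheme may route such data sets to manual tracing instead.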
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals, including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors, such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the batch-to-batch variability of expression. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations, and stepwise design augmentation. The methodology is therefore useful not only for characterizing protein expression in plants but also for investigating other complex systems that lack a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
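The starting point of a DoE analysis, the full-factorial enumeration of factor-level combinations that the software then prunes to an optimal subset of runs, can be sketched as follows. The factor names and levels are invented for illustration and are not the ones used in the study.

```python
from itertools import product

def full_factorial(factors):
    """All combinations of factor levels (a full-factorial design).
    DoE software typically selects an optimal fraction of these runs;
    here we simply enumerate the complete design."""
    names = sorted(factors)  # fixed order so runs are reproducible
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Hypothetical factors loosely inspired by the study; the names and
# levels are illustrative assumptions only.
design = full_factorial({
    "promoter": ["35S", "nos"],
    "incubation_temp_C": [22, 25, 28],
    "plant_age_d": [35, 42],
})
```

Even this small design has 2 x 3 x 2 = 12 runs, which is why fractional designs and stepwise augmentation, as described above, become essential as the number of factors grows.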
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages over traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials, including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
RNA Secondary Structure Prediction Using High-throughput SHAPE
Institutions: Frederick National Laboratory for Cancer Research.
Understanding the function of RNA involved in biological processes requires a thorough knowledge of RNA structure. Toward this end, the methodology dubbed "high-throughput selective 2'-hydroxyl acylation analyzed by primer extension", or SHAPE, allows prediction of RNA secondary structure with single-nucleotide resolution. This approach utilizes chemical probing agents that preferentially acylate single-stranded or flexible regions of RNA in aqueous solution. Sites of chemical modification are detected by reverse transcription of the modified RNA, and the products of this reaction are fractionated by automated capillary electrophoresis (CE). Since reverse transcriptase pauses at RNA nucleotides modified by the SHAPE reagents, the resulting cDNA library indirectly maps the ribonucleotides that are single-stranded in the context of the folded RNA. Using ShapeFinder software, the electropherograms produced by automated CE are processed and converted into nucleotide reactivity tables, which are in turn converted into pseudo-energy constraints used in the RNAstructure (v5.3) prediction algorithm. The two-dimensional RNA structures obtained by combining SHAPE probing with in silico RNA secondary structure prediction have been found to be far more accurate than structures obtained using either method alone.
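The reactivity-to-constraint conversion can be sketched with the linear-log pseudo-energy form used by RNAstructure, dG = m*ln(reactivity + 1) + b. The slope and intercept below are the commonly cited defaults for this approach; they are assumptions here, not parameters quoted from this article.

```python
import math

def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
    """Convert a normalized SHAPE reactivity into a pseudo-free-energy
    term (kcal/mol) applied to base pairing during folding:
    dG = m*ln(reactivity + 1) + b. m and b are slope/intercept values
    commonly used with RNAstructure (assumed here). Negative
    reactivities conventionally flag missing data and contribute
    nothing."""
    if reactivity < 0:
        return 0.0
    return m * math.log(reactivity + 1.0) + b

# High reactivity (flexible, likely unpaired) yields a positive penalty
# for pairing that nucleotide; zero reactivity yields the intercept b,
# a small bonus favoring pairing.
penalty_hi = shape_pseudo_energy(2.0)
penalty_zero = shape_pseudo_energy(0.0)
```

Summed over a candidate structure, these terms bias the thermodynamic folding algorithm toward structures consistent with the probing data, which is why the combined method outperforms either alone.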
Genetics, Issue 75, Molecular Biology, Biochemistry, Virology, Cancer Biology, Medicine, Genomics, Nucleic Acid Probes, RNA Probes, RNA, High-throughput SHAPE, Capillary electrophoresis, RNA structure, RNA probing, RNA folding, secondary structure, DNA, nucleic acids, electropherogram, synthesis, transcription, high throughput, sequencing
Investigation of Early Plasma Evolution Induced by Ultrashort Laser Pulses
Institutions: Purdue University.
Early plasma is generated by high-intensity laser irradiation of a target and the subsequent ionization of the target material. Its dynamics play a significant role in laser-material interaction, especially in an air environment1-11. Early plasma evolution has been captured through pump-probe shadowgraphy1-3. However, the studied time frames and applied laser parameter ranges are limited. For example, direct examinations of plasma front locations and electron number densities within a delay time of 100 picoseconds (ps) with respect to the laser pulse peak are still very few, especially for ultrashort pulses with a duration around 100 femtoseconds (fs) and a low power density around 10^14 W/cm^2. Early plasma generated under these conditions has only been captured recently with high temporal and spatial resolutions12. The detailed setup strategy and procedures of this high-precision measurement are illustrated in this paper. The rationale of the measurement is optical pump-probe shadowgraphy: one ultrashort laser pulse is split into a pump pulse and a probe pulse, and the delay time between them can be adjusted by changing their beam path lengths. The pump pulse ablates the target and generates the early plasma, and the probe pulse propagates through the plasma region and detects the non-uniformity of the electron number density. In addition, animations are generated using the calculated results from the simulation model of Ref. 12 to illustrate plasma formation and evolution with very high resolution (0.04 to 1 ps).
Both the experimental method and the simulation method can be applied to a broad range of time frames and laser parameters. These methods can be used to examine the early plasma generated not only from metals, but also from semiconductors and insulators.
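The delay-line arrangement described above reduces to simple arithmetic: the pump-probe delay equals the extra path length divided by the speed of light, dt = dL/c. The sketch below does that conversion; the 30 mm round-trip figure is an invented example, not a value from the paper.

```python
C_MM_PER_PS = 0.299792458  # speed of light: ~0.3 mm of path per picosecond

def probe_delay_ps(extra_path_mm):
    """Pump-probe delay from the extra geometric path length the probe
    pulse travels relative to the pump pulse: dt = dL / c."""
    return extra_path_mm / C_MM_PER_PS

# Moving a retroreflecting delay stage by 15 mm adds 30 mm of round-trip
# path (an assumed geometry), giving roughly 100 ps of probe delay.
delay = probe_delay_ps(30.0)
```

The same relation explains the achievable resolution: scanning the stage in steps of a few micrometers changes the delay by only tens of femtoseconds.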
Physics, Issue 65, Mechanical Engineering, Early plasma, air ionization, pump-probe shadowgraph, molecular dynamics, Monte Carlo, particle-in-cell
Protocols for Implementing an Escherichia coli Based TX-TL Cell-Free Expression System for Synthetic Biology
Institutions: California Institute of Technology, California Institute of Technology, Massachusetts Institute of Technology, University of Minnesota.
Ideal cell-free expression systems can theoretically emulate an in vivo cellular environment in a controlled in vitro setting.1 This is useful for expressing proteins and genetic circuits in a controlled manner, as well as for providing a prototyping environment for synthetic biology.2,3 To achieve the latter goal, cell-free expression systems that preserve endogenous Escherichia coli transcription-translation mechanisms are able to reflect in vivo cellular dynamics more accurately than those based on T7 RNA polymerase transcription. We describe the preparation and execution of an efficient endogenous E. coli-based transcription-translation (TX-TL) cell-free expression system that can produce protein in amounts equivalent to T7-based systems at a 98% cost reduction relative to similar commercial systems.4,5 The preparation of buffers and crude cell extract is described, as well as the execution of a three-tube TX-TL reaction. The entire protocol takes five days to prepare and yields enough material for up to 3,000 single reactions from one preparation. Once prepared, each reaction takes under 8 hr from setup to data collection and analysis. Mechanisms of regulation and transcription exogenous to E. coli, such as lac/tet repressors and T7 RNA polymerase, can be supplemented.6
Endogenous properties, such as mRNA and DNA degradation rates, can also be adjusted.7
The TX-TL cell-free expression system has been demonstrated for large-scale circuit assembly, for exploring biological phenomena, and for the expression of proteins under both T7 and endogenous promoters.6,8
Accompanying mathematical models are available.9,10
The resulting system has unique applications in synthetic biology as a prototyping environment, or "TX-TL biomolecular breadboard."
Cellular Biology, Issue 79, Bioengineering, Synthetic Biology, Chemistry Techniques, Synthetic, Molecular Biology, control theory, TX-TL, cell-free expression, in vitro, transcription-translation, cell-free protein synthesis, synthetic biology, systems biology, Escherichia coli cell extract, biological circuits, biomolecular breadboard
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design; however, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Purifying Plasmid DNA from Bacterial Colonies Using the Qiagen Miniprep Kit
Institutions: University of California, Irvine (UCI).
Plasmid DNA purification from E. coli is a core technique in molecular cloning. Small-scale purification (miniprep) from less than 5 ml of bacterial culture is a quick way to verify clones or to isolate DNA for further enzymatic reactions (polymerase chain reaction and restriction enzyme digestion). Here, we video-record the general miniprep procedure using QIAGEN's QIAprep 8 Miniprep Kit, aiming to introduce this highly efficient technique to beginners in molecular biology. The procedure is based on alkaline lysis of E. coli cells followed by adsorption of DNA onto silica in the presence of high salt. It consists of three steps: 1) preparation and clearing of a bacterial lysate, 2) adsorption of DNA onto the QIAprep membrane, and 3) washing and elution of plasmid DNA. All steps are performed without phenol, chloroform, CsCl, ethidium bromide, or alcohol precipitation. The entire procedure usually takes less than 2 hours.
Issue 6, Basic Protocols, plasmid, DNA, purification, Qiagen
Using an Automated Cell Counter to Simplify Gene Expression Studies: siRNA Knockdown of IL-4 Dependent Gene Expression in Namalwa Cells
Institutions: Bio-Rad Laboratories.
The use of siRNA-mediated gene knockdown continues to be an important tool in studies of gene expression. siRNA studies are conducted not only to examine the effects of downregulating single genes, but also to interrogate signaling pathways and other complex interaction networks. These pathway analyses require both relevant cellular models and methods that minimally perturb cellular physiology. Electroporation is increasingly used as an effective way to introduce siRNA and other nucleic acids into difficult-to-transfect cell lines and primary cells without altering the signaling pathway under investigation. There are multiple critical steps to a successful siRNA experiment, and there are ways to simplify the work while improving data quality at several experimental stages. To help you get started with an siRNA-mediated gene knockdown project, we demonstrate how to perform a complete pathway study, from collecting and counting the cells prior to electroporation through post-transfection real-time PCR gene expression analysis. The study investigates the role of the transcriptional activator STAT6 in IL-4-dependent expression of CCL17 in a Burkitt lymphoma cell line (Namalwa). The techniques demonstrated are useful for a wide range of siRNA-based experiments on both adherent and suspension cells. We also show how to streamline cell counting with the TC10 automated cell counter, how to electroporate multiple samples simultaneously using the MXcell electroporation system, and how to simultaneously assess RNA quality and quantity with the Experion automated electrophoresis system.
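Knockdown data from real-time PCR are commonly quantified with the delta-delta-Ct method; the abstract does not state which analysis was used, so the sketch below is purely illustrative, with invented Ct values and an assumed ~100% primer efficiency.

```python
def fold_change_ddct(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression by the delta-delta-Ct method (a common, but
    here assumed, analysis for real-time PCR knockdown data). Assumes
    ~100% primer efficiency, i.e. one doubling per cycle:
    fold change = 2^-((Ct_t,treated - Ct_ref,treated)
                      - (Ct_t,control - Ct_ref,control))."""
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return 2.0 ** (-ddct)

# Invented example: the target gene's Ct shifts 2 cycles later after
# knockdown while the reference gene is unchanged, i.e. a ~4-fold
# reduction in expression.
fc = fold_change_ddct(27.0, 18.0, 25.0, 18.0)
```

The reference gene terms cancel out loading differences between samples, which is why a stable reference is as important as the target assay itself.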
Cellular Biology, Issue 38, Cell Counting, Gene Silencing, siRNA, Namalwa Cells, IL4, Gene Expression, Electroporation, Real Time PCR
Designing and Implementing Nervous System Simulations on LEGO Robots
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method that uses the commercially available LEGO Mindstorms NXT robotics platform to test systems-level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis toward or away from a stimulus, as described by Braitenberg, and are particularly well suited for investigation using the NXT platform.1
The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
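The decussating wiring described above can be sketched as a two-line control rule: with crossed connections, each sensor drives the motor on the opposite side, so the robot turns toward the stronger stimulus; uncrossed wiring turns it away. The function, sensor scale, and gain below are illustrative assumptions, not the LabVIEW implementation.

```python
def braitenberg_motor_speeds(left_sensor, right_sensor, crossed=True, gain=1.0):
    """Motor commands for a two-sensor, two-motor Braitenberg-style
    vehicle. With decussating (crossed) connections each sensor excites
    the opposite motor, producing taxis toward the stimulus; uncrossed
    wiring produces avoidance. Sensor readings are assumed in [0, 1].
    Returns (left_motor, right_motor)."""
    if crossed:
        return gain * right_sensor, gain * left_sensor
    return gain * left_sensor, gain * right_sensor

# Stimulus stronger on the right: crossed wiring speeds up the LEFT
# motor, steering the robot toward the stimulus.
left_m, right_m = braitenberg_motor_speeds(0.2, 0.8)
```

Running this rule in a loop against the robot's light or touch sensors, then running the animal under the same sensory conditions, is exactly the comparison the method above uses to refine the nervous system model.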
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments to achieve specific water quality goals across the landscape is of primary importance in watershed management. Traditional economic methods of finding the lowest-cost solution in the watershed context (e.g.) assume that off-site impacts can be accurately described as a proportion of the on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically based, spatially distributed hydrologic simulation models allows for a greater degree of realism in process representation, but requires the development of a simulation-optimization framework in which the model becomes an integral part of the optimization.
Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem while allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program that follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226 and a user-specified set of conservation practices and their costs, in order to search for the complete tradeoff frontiers between the costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
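The candidate-solution encoding and stochastic operators described above can be sketched in miniature. Here a candidate is a 0/1 vector marking which fields receive a conservation practice, a toy additive pollution model stands in for the SWAT simulation, and the nondominated (Pareto) filter plays the role that SPEA2's more sophisticated selection does in the real program. All costs, pollution reductions, and operator settings are invented for illustration.

```python
import random

def evaluate(solution, costs, pollution_cut):
    """Two minimized objectives for a practice allocation: total practice
    cost, and pollution remaining (a toy additive stand-in for SWAT)."""
    cost = sum(c for c, s in zip(costs, solution) if s)
    pollution = sum(p for p, s in zip(pollution_cut, solution) if not s)
    return cost, pollution

def mutate(solution, rate, rng):
    """Flip each field's practice on/off with probability `rate`."""
    return [1 - bit if rng.random() < rate else bit for bit in solution]

def pareto_front(evaluated):
    """Keep (solution, objectives) pairs not dominated on (cost, pollution)."""
    return [(s, (c, p)) for s, (c, p) in evaluated
            if not any(c2 <= c and p2 <= p and (c2, p2) != (c, p)
                       for _, (c2, p2) in evaluated)]

rng = random.Random(1)
costs = [3, 1, 4, 2]   # cost of applying a practice on each field
cut = [5, 2, 6, 1]     # pollution avoided if that field is treated

pop = [[rng.randint(0, 1) for _ in costs] for _ in range(20)]
for _ in range(50):
    # variation (mutation only here; the real program also recombines),
    # then environmental selection: keep the nondominated set
    pop = pop + [mutate(s, 0.2, rng) for s in pop]
    evaluated = [(s, evaluate(s, costs, cut)) for s in pop]
    pop = [s for s, _ in pareto_front(evaluated)][:20]

# the discovered cost/pollution tradeoff frontier
front = sorted({evaluate(s, costs, cut) for s in pop})
```

Sorting the surviving objective pairs by cost yields the tradeoff frontier: along it, lower pollution can only be bought with higher cost, which is exactly the information the maps and frontiers above present to watershed managers.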
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development