Extracellular protein:protein interactions between secreted or membrane-tethered proteins are critical both for initiating intercellular communication and for ensuring cohesion within multicellular organisms. Proteins predicted to form extracellular interactions are encoded by approximately a quarter of human genes1, but despite their importance and abundance, the majority of these proteins have no documented binding partner. This is primarily due to their biochemical intractability: membrane-embedded proteins are difficult to solubilise in their native conformation and contain structurally-important posttranslational modifications. In addition, interactions between receptor proteins are often extremely weak (half-lives < 1 second), precluding their detection with many commonly-used high-throughput methods2.
Here, we describe an assay, AVEXIS (AVidity-based EXtracellular Interaction Screen), that overcomes these technical challenges, enabling the detection of very weak protein interactions (t1/2 ≤ 0.1 sec) with a low false positive rate3. The assay is usually implemented in a high-throughput microtitre plate format to enable the systematic screening of many thousands of interactions (Fig. 1). It relies on the production of soluble recombinant protein libraries containing the ectodomain fragments of cell surface receptors or secreted proteins within which to screen for interactions; the approach is therefore suitable for type I, type II, and GPI-linked cell surface receptors and for secreted proteins, but not for multipass membrane proteins such as ion channels or transporters.
The recombinant protein libraries are produced using a convenient and high-level mammalian expression system4, to ensure that important posttranslational modifications such as glycosylation and disulphide bonds are added. Expressed recombinant proteins are secreted into the medium and produced in two forms: a biotinylated bait which can be captured on a streptavidin-coated solid phase suitable for screening, and a pentamerised enzyme-tagged (β-lactamase) prey. The bait and prey proteins are presented to each other in a binary fashion to detect direct interactions between them, similar to a conventional ELISA (Fig. 1). The pentamerisation of the proteins in the prey is achieved through a peptide sequence from the cartilage oligomeric matrix protein (COMP) and increases the local concentration of the ectodomains thereby providing significant avidity gains to enable even very transient interactions to be detected. By normalising the activities of both the bait and prey to predetermined levels prior to screening, we have shown that interactions having monomeric half-lives of 0.1 sec can be detected with low false positive rates3.
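The interaction half-lives quoted above relate to the dissociation rate by simple first-order kinetics, t1/2 = ln 2 / koff. A minimal sketch of that conversion (not part of the AVEXIS protocol itself; function names are illustrative):

```python
import math

def koff_from_half_life(t_half_s: float) -> float:
    """Dissociation rate constant (1/s) from a first-order half-life."""
    return math.log(2) / t_half_s

def half_life_from_koff(koff_per_s: float) -> float:
    """Monomeric interaction half-life (s) from the off-rate."""
    return math.log(2) / koff_per_s

# An interaction at the stated AVEXIS detection limit (t1/2 ~ 0.1 s)
koff = koff_from_half_life(0.1)
print(f"koff ~ {koff:.1f} 1/s")  # prints "koff ~ 6.9 1/s"
```

A half-life of 0.1 s thus corresponds to an off-rate of roughly 7 s⁻¹, which is why avidity gains from pentamerisation are needed to hold such complexes together long enough to detect.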
Large Scale Non-targeted Metabolomic Profiling of Serum by Ultra Performance Liquid Chromatography-Mass Spectrometry (UPLC-MS)
Institutions: Colorado State University.
Non-targeted metabolite profiling by ultra performance liquid chromatography coupled with mass spectrometry (UPLC-MS) is a powerful technique to investigate metabolism. The approach offers an unbiased and in-depth analysis that can enable the development of diagnostic tests and novel therapies and can further our understanding of disease processes. The inherent chemical diversity of the metabolome creates significant analytical challenges, and there is no single experimental approach that can detect all metabolites. Additionally, the biological variation in individual metabolism and the dependence of metabolism on environmental factors necessitate large sample numbers to achieve the statistical power required for meaningful biological interpretation. To address these challenges, this tutorial outlines an analytical workflow for large-scale non-targeted metabolite profiling of serum by UPLC-MS. The procedure includes guidelines for sample organization and preparation, data acquisition, quality control, and metabolite identification; it will enable reliable acquisition of data for large experiments and provide a starting point for laboratories new to non-targeted metabolite profiling by UPLC-MS.
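Quality control in large non-targeted runs typically relies on pooled QC samples injected at intervals, with features whose relative standard deviation (RSD) across the QC injections exceeds a threshold being discarded before statistics. The 30% cutoff, feature IDs, and function names below are illustrative conventions, not values prescribed by this protocol:

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation (%) of a feature across QC injections."""
    m = mean(values)
    return float("inf") if m == 0 else 100.0 * stdev(values) / m

def filter_features(qc_intensities, max_rsd=30.0):
    """Keep feature IDs whose QC RSD is at or below max_rsd percent."""
    return {fid for fid, vals in qc_intensities.items()
            if rsd_percent(vals) <= max_rsd}

qc = {
    "m231.09_t4.2": [1050, 980, 1010, 995],   # stable feature, RSD ~3%
    "m305.16_t7.8": [400, 900, 150, 620],     # unstable feature, RSD ~62%
}
print(sorted(filter_features(qc)))  # ['m231.09_t4.2']
```

Filtering on QC reproducibility like this keeps downstream statistics from being driven by analytically unstable features.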
Chemistry, Issue 73, Biochemistry, Genetics, Molecular Biology, Physiology, Genomics, Proteins, Proteomics, Metabolomics, Metabolite Profiling, Non-targeted metabolite profiling, mass spectrometry, Ultra Performance Liquid Chromatography, UPLC-MS, serum, spectrometry
Bimolecular Fluorescence Complementation
Institutions: University of Illinois at Chicago.
Defining the subcellular distribution of signaling complexes is imperative to understanding the output from that complex. Conventional methods such as immunoprecipitation do not provide information on the spatial localization of complexes. In contrast, bimolecular fluorescence complementation (BiFC) monitors both the interaction and the subcellular compartmentalization of protein complexes. In this method, a fluorescent protein is split into amino- and carboxy-terminal non-fluorescent fragments, which are then fused to two proteins of interest. Interaction of the proteins results in reconstitution of the fluorophore (Figure 1)1,2. A limitation of BiFC is that the complex is irreversible once the fragmented fluorophore is reconstituted3. This limitation is advantageous for detecting transient or weak interactions, but precludes a kinetic analysis of complex dynamics. An additional caveat is that the reconstituted fluorophore requires 30 min to mature and fluoresce, again precluding the observation of real-time interactions4. BiFC is a specific example of the protein fragment complementation assay (PCA), which employs reporter proteins such as green fluorescent protein variants (BiFC), dihydrofolate reductase, β-lactamase, and luciferase to measure protein:protein interactions5,6. Alternative methods to study protein:protein interactions in cells include fluorescence co-localization and Förster resonance energy transfer (FRET)7. For co-localization, two proteins are individually tagged, either directly with a fluorophore or by indirect immunofluorescence. However, this approach produces a high background of non-interacting proteins, making co-localization data difficult to interpret. In addition, owing to the resolution limit of confocal microscopy, two proteins may appear co-localized without necessarily interacting. With BiFC, fluorescence is only observed when the two proteins of interest interact. FRET is another excellent method for studying protein:protein interactions, but it can be technically challenging. FRET experiments require the donor and acceptor to be of similar brightness and stoichiometry in the cell, and one must account for bleed-through of the donor into the acceptor channel and vice versa. Unlike FRET, BiFC has little background fluorescence, requires little post-processing of image data, does not require high overexpression, and can detect weak or transient interactions. Bioluminescence resonance energy transfer (BRET) is similar to FRET except that the donor is an enzyme (e.g. luciferase) that catalyzes a substrate to become bioluminescent, thereby exciting an acceptor. BRET avoids the technical problems of bleed-through and high background fluorescence but cannot provide spatial information, because the substrate does not localize to specific compartments8. Overall, BiFC is an excellent method for visualizing the subcellular localization of protein complexes to gain insight into compartmentalized signaling.
Cellular Biology, Issue 50, Fluorescence, imaging, compartmentalized signaling, subcellular localization, signal transduction
Optimization and Utilization of Agrobacterium-mediated Transient Protein Production in Nicotiana
Institutions: Fraunhofer USA Center for Molecular Biotechnology.
Agrobacterium-mediated transient protein production in plants is a promising approach for producing vaccine antigens and therapeutic proteins within a short period of time. However, this technology is only just beginning to be applied to large-scale production, as many technological obstacles to scale-up are now being overcome. Here, we demonstrate a simple and reproducible method for industrial-scale transient protein production based on vacuum infiltration of Nicotiana plants with Agrobacteria carrying launch vectors. Optimization of Agrobacterium cultivation in AB medium allows direct dilution of the bacterial culture in Milli-Q water, simplifying the infiltration process. Among three tested species of Nicotiana, N. excelsiana (a hybrid of N. benthamiana × N. excelsior) was selected as the most promising host due to the ease of infiltration, high level of reporter protein production, and about two-fold higher biomass production under controlled environmental conditions. Induction of Agrobacterium harboring pBID4-GFP (Tobacco mosaic virus-based) with chemicals such as acetosyringone and monosaccharide had no effect on the protein production level. Infiltrating plants under a vacuum of 50 to 100 mbar for 30 or 60 sec resulted in about 95% infiltration of plant leaf tissues. Infiltration with the Agrobacterium laboratory strain GV3101 showed the highest protein production compared to the laboratory strains LBA4404 and C58C1 and the wild-type Agrobacterium strains at6, at10, at77 and A4. Co-expression of a viral RNA silencing suppressor, p23 or p19, in N. benthamiana resulted in earlier accumulation and increased production (15-25%) of the target protein (influenza virus hemagglutinin).
Plant Biology, Issue 86, Agroinfiltration, Nicotiana benthamiana, transient protein production, plant-based expression, viral vector, Agrobacteria
A Guide to Modern Quantitative Fluorescent Western Blotting with Troubleshooting Strategies
Institutions: University of Edinburgh.
The late 1970s saw the first publicly reported use of the western blot, a technique for assessing the presence and relative abundance of specific proteins within complex biological samples. Since then, western blotting methodology has become a common component of the molecular biologist's experimental repertoire. A cursory search of PubMed using the term “western blot” suggests that in excess of two hundred and twenty thousand published manuscripts had made use of this technique by the year 2014. Importantly, the last ten years have seen technical imaging advances coupled with the development of sensitive fluorescent labels, which have improved sensitivity and yielded even greater ranges of linear detection. The result is a now truly Quantifiable Fluorescence-based Western Blot (QFWB) that allows biologists to carry out comparative expression analyses with greater sensitivity and accuracy than ever before. Many “optimized” western blotting methodologies exist and are utilized in different laboratories. These often prove difficult to implement because they require subtle but undocumented procedural amendments. This protocol provides a comprehensive description of an established and robust QFWB method, complete with troubleshooting strategies.
Basic Protocols, Issue 93, western blotting, fluorescent, LI-COR, protein, quantitative analysis, loading control
Setting-up an In Vitro Model of Rat Blood-brain Barrier (BBB): A Focus on BBB Impermeability and Receptor-mediated Transport
Institutions: VECT-HORUS SAS, CNRS, NICN UMR 7259.
The blood-brain barrier (BBB) specifically regulates molecular and cellular flux between the blood and the nervous tissue. Our aim was to develop and characterize a highly reproducible rat syngeneic in vitro model of the BBB using co-cultures of primary rat brain endothelial cells (RBEC) and astrocytes to study receptors involved in transcytosis across the endothelial cell monolayer. Astrocytes were isolated by mechanical dissection following trypsin digestion and were frozen for later co-culture. RBEC were isolated from 5-week-old rat cortices. The brains were cleaned of meninges and white matter, and mechanically dissociated following enzymatic digestion. Thereafter, the tissue homogenate was centrifuged in bovine serum albumin to separate vessel fragments from nervous tissue. The vessel fragments underwent a second enzymatic digestion to free endothelial cells from their extracellular matrix. The remaining contaminating cells, such as pericytes, were further eliminated by plating the microvessel fragments in puromycin-containing medium. They were then passaged onto filters for co-culture with astrocytes grown on the bottom of the wells. RBEC expressed high levels of tight junction (TJ) proteins such as occludin, claudin-5 and ZO-1, with a typical localization at the cell borders. The transendothelial electrical resistance (TEER) of the brain endothelial monolayers, indicating the tightness of the TJs, reached 300 ohm·cm2 on average. The endothelial permeability coefficient (Pe) for lucifer yellow (LY) was highly reproducible, with an average of 0.26 ± 0.11 × 10-3 cm/min. Brain endothelial cells organized in monolayers expressed the efflux transporter P-glycoprotein (P-gp), showed polarized transport of rhodamine 123, a ligand for P-gp, and showed specific transport of transferrin-Cy3 and DiILDL across the endothelial cell monolayer. In conclusion, we provide a protocol for setting up an in vitro BBB model that is highly reproducible due to the quality assurance methods, and that is suitable for research on BBB transporters and receptors.
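A permeability coefficient of this kind is conventionally obtained by plotting the cleared volume of tracer against time: the slope gives the permeability-surface area product (PS), the clearance of an empty coated filter corrects for the insert membrane, and division by the filter surface area yields Pe. A hedged sketch of that arithmetic, with made-up example numbers (not values from this protocol):

```python
import numpy as np

def clearance_slope(times_min, cleared_vol_ul):
    """Slope of cleared volume vs. time = PS product (ul/min)."""
    slope, _ = np.polyfit(times_min, cleared_vol_ul, 1)
    return slope

def permeability_coefficient(ps_total, ps_filter, area_cm2):
    """Pe (cm/min): correct total clearance for the empty filter,
    then normalize by filter surface area.
    1/PSe = 1/PSt - 1/PSf ; Pe = PSe / S
    (PS in ul/min -> divide by 1000 for cm^3/min)."""
    ps_endo = 1.0 / (1.0 / ps_total - 1.0 / ps_filter)  # ul/min
    return (ps_endo / 1000.0) / area_cm2                # cm/min

t = np.array([20.0, 40.0, 60.0])            # sampling times, min
vol_cells = np.array([5.0, 10.0, 15.0])     # ul cleared, coculture insert
vol_filter = np.array([30.0, 60.0, 90.0])   # ul cleared, empty filter
ps_t = clearance_slope(t, vol_cells)        # 0.25 ul/min
ps_f = clearance_slope(t, vol_filter)       # 1.5 ul/min
pe = permeability_coefficient(ps_t, ps_f, area_cm2=1.12)  # ~0.27e-3 cm/min
```

With these illustrative numbers the result lands near the 0.26 × 10-3 cm/min average reported above, showing the order of magnitude a tight monolayer produces for lucifer yellow.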
Medicine, Issue 88, rat brain endothelial cells (RBEC), mouse, spinal cord, tight junction (TJ), receptor-mediated transport (RMT), low density lipoprotein (LDL), LDLR, transferrin, TfR, P-glycoprotein (P-gp), transendothelial electrical resistance (TEER)
Pre-clinical Evaluation of Tyrosine Kinase Inhibitors for Treatment of Acute Leukemia
Institutions: University of Colorado Anschutz Medical Campus, University Hospital of Essen.
Receptor tyrosine kinases have been implicated in the development and progression of many cancers, including both leukemia and solid tumors, and are attractive druggable therapeutic targets. Here we describe an efficient four-step strategy for the pre-clinical evaluation of tyrosine kinase inhibitors (TKIs) in the treatment of acute leukemia. Initially, western blot analysis is used to confirm target inhibition in cultured leukemia cells. Functional activity is then evaluated using clonogenic assays in methylcellulose or soft agar cultures. Experimental compounds that demonstrate activity in cell culture assays are evaluated in vivo using NOD-SCID-gamma (NSG) mice transplanted orthotopically with human leukemia cell lines. Initial in vivo pharmacodynamic studies evaluate target inhibition in leukemic blasts isolated from the bone marrow. This approach is used to determine the dose and schedule of administration required for effective target inhibition. Subsequent studies evaluate the efficacy of the TKIs in vivo using luciferase-expressing leukemia cells, thereby allowing non-invasive bioluminescent monitoring of leukemia burden and assessment of therapeutic response using an in vivo bioluminescence imaging system. This strategy has been effective for the evaluation of TKIs in vitro and in vivo and can be applied to the identification of molecularly-targeted agents with therapeutic potential or to the direct comparison and prioritization of multiple compounds.
Medicine, Issue 79, Leukemia, Receptor Protein-Tyrosine Kinases, Molecular Targeted Therapy, Therapeutics, novel small molecule inhibitor, receptor tyrosine kinase, leukemia
Quantitative Analysis of Chromatin Proteomes in Disease
Institutions: David Geffen School of Medicine at UCLA, Nora Eccles Harrison Cardiovascular Research and Training Institute, University of Utah.
In the nucleus reside the proteomes whose functions are most intimately linked with gene regulation. Adult mammalian cardiomyocyte nuclei are unique due to the high percentage of binucleated cells,1 the predominantly heterochromatic state of the DNA, and the non-dividing nature of the cardiomyocyte, which renders adult nuclei in a permanent state of interphase.2 Transcriptional regulation during development and disease has been well studied in this organ,3-5 but what remains relatively unexplored is the role played by the nuclear proteins responsible for DNA packaging and expression, and how these proteins control the changes in transcriptional programs that occur during disease.6 In the developed world, heart disease is the number one cause of mortality for both men and women.7 Insight into how nuclear proteins cooperate to regulate the progression of this disease is critical for advancing current treatment options.
Mass spectrometry is the ideal tool for addressing these questions, as it allows for an unbiased annotation of the nuclear proteome and relative quantification of how the abundance of these proteins changes with disease. While there have been several proteomic studies of mammalian nuclear protein complexes,8-13 there has been only one study examining the cardiac nuclear proteome, and it considered the entire nucleus rather than exploring the proteome at the level of nuclear subcompartments.15 In large part, this shortage of work is due to the difficulty of isolating cardiac nuclei. Cardiac nuclei occur within a rigid and dense actin-myosin apparatus to which they are connected via multiple extensions from the endoplasmic reticulum, to the extent that myocyte contraction alters their overall shape.16 Additionally, cardiomyocytes are 40% mitochondria by volume,17 which necessitates enrichment of the nuclei apart from the other organelles. Here we describe a protocol for cardiac nuclear enrichment and further fractionation into biologically-relevant compartments. Furthermore, we detail methods for label-free quantitative mass spectrometric dissection of these fractions, techniques amenable to in vivo experimentation in various animal models and organ systems where metabolic labeling is not feasible.
Medicine, Issue 70, Molecular Biology, Immunology, Genetics, Genomics, Physiology, Protein, DNA, Chromatin, cardiovascular disease, proteomics, mass spectrometry
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Institutions: The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects the electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examining the systems biology aspects of migraine, since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations, where environmental conditions can be controlled, are necessary. Here, it is important to recognize the limitations of acute slices and the distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling, since preparing the slices alone triggers pro-inflammatory changes that last days, epileptiform behavior due to the high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers.
In contrast, we examine immune signaling in mature hippocampal slice cultures, since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and allow SD to be easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived, and SD can be induced on consecutive days without injury, making this preparation the sole means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, the use of cDNA preamplification for detection of ultra-low-level targets such as interferon-gamma, using whole, regional, or specific cell-enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
The Fastest Western in Town: A Contemporary Twist on the Classic Western Blot Analysis
Institutions: University of California, San Francisco.
The Western blot techniques that were originally established in the late 1970s are still actively utilized today. However, this traditional method of Western blotting has several drawbacks, including low-quality resolution, spurious bands, decreased sensitivity, and poor protein integrity. Recent advances have drastically improved numerous aspects of the standard Western blot protocol to produce higher-quality qualitative and quantitative data. The Bis-Tris gel system, an alternative to the conventional Laemmli system, generates better protein separation and resolution, maintains protein integrity, and reduces electrophoresis to a 35 min run time. Moreover, the iBlot dry blotting system dramatically improves the efficacy and speed of protein transfer to the membrane, completing it in 7 min, in contrast to traditional protein transfer methods, which are often more inefficient and require lengthy transfer times. In combination with these highly innovative modifications, protein detection using infrared fluorescent imaging results in higher-quality, more accurate and consistent data compared to the standard Western blotting technique of chemiluminescence. This technology can simultaneously detect two different antigens on the same membrane by utilizing two-color near-infrared dyes that are visualized in different fluorescent channels. Furthermore, the linearity and broad dynamic range of fluorescent imaging allow for the precise quantification of both strong and weak protein bands. Thus, this protocol describes the key improvements to the classic Western blotting method; these advancements significantly increase the quality of data while greatly reducing the performance time of the experiment.
Basic Protocol, Issue 84, Western blot, Bis-Tris, electrophoresis, dry blotting, protein transfer, infrared, Fluorescence, quantification, Antibody, Protein
Discovering Protein Interactions and Characterizing Protein Function Using HaloTag Technology
Institutions: Promega Corporation, MS Bioworks LLC.
Research in proteomics has exploded in recent years with advances in mass spectrometry capabilities that have led to the characterization of numerous proteomes, including those from viruses, bacteria, and yeast. In comparison, analysis of the human proteome lags behind, partially due to the sheer number of proteins that must be studied, but also due to the complexity of the networks and interactions these present. To specifically address the challenges of understanding the human proteome, we have developed HaloTag technology for protein isolation, which is particularly effective for the isolation of multiprotein complexes and allows more efficient capture of weak or transient interactions and/or proteins in low abundance. HaloTag is a genetically encoded protein fusion tag designed for covalent, specific, and rapid immobilization or labelling of proteins with various ligands. Leveraging these properties, numerous applications were developed for mammalian cells to characterize protein function, and here we present methodologies including protein pull-downs, used for the discovery of novel interactions or for functional assays, and cellular localization. We find significant advantages in the speed, specificity, and covalent capture of fusion proteins to surfaces for proteomic analysis as compared to other, traditional non-covalent approaches. We demonstrate these advantages and the broad utility of the technology using two important epigenetic proteins as examples: the human bromodomain protein BRD4 and the histone deacetylase HDAC1. These examples demonstrate the power of this technology in enabling the discovery of novel interactions and the characterization of cellular localization in eukaryotes, which together will further the understanding of human functional proteomics.
Cellular Biology, Issue 89, proteomics, HaloTag, protein interactions, mass spectrometry, bromodomain proteins, BRD4, histone deacetylase (HDAC), HDAC cellular assays, and confocal imaging
V3 Stain-free Workflow for a Practical, Convenient, and Reliable Total Protein Loading Control in Western Blotting
Institutions: Bio-Rad Laboratories.
The western blot is a very useful and widely adopted lab technique, but its execution is challenging. The workflow is often characterized as a "black box" because the experimentalist does not know whether it has been performed successfully until the last of several steps. Moreover, the quality of western blot data is sometimes challenged due to a lack of effective quality control tools throughout the western blotting process. Here we describe the V3 western workflow, which applies stain-free technology to address the major concerns associated with the traditional western blot protocol. This workflow allows researchers: 1) to run a gel in about 20-30 min; 2) to visualize sample separation quality within 5 min after the gel run; 3) to transfer proteins in 3-10 min; 4) to verify transfer efficiency quantitatively; and, most importantly, 5) to validate changes in the level of the protein of interest using a total protein loading control. This novel approach eliminates the need to strip and reprobe the blot for housekeeping proteins such as β-actin, β-tubulin, GAPDH, etc.
The V3 stain-free workflow makes the western blot process faster, more transparent, more quantitative, and more reliable.
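Total protein normalization of this kind reduces to simple arithmetic: each target band is divided by the total stain-free signal of its lane, and the resulting ratios are expressed relative to a control lane. A minimal sketch, with made-up lane values for illustration:

```python
def normalize_lanes(target_signal, total_protein_signal):
    """Target band intensity divided by total lane protein, per lane."""
    return [t / p for t, p in zip(target_signal, total_protein_signal)]

def fold_change(normalized, control_index=0):
    """Express each lane relative to the control lane."""
    ref = normalized[control_index]
    return [n / ref for n in normalized]

target = [1200.0, 2600.0, 1150.0]     # target band volumes, arbitrary units
total = [50000.0, 52000.0, 48000.0]   # stain-free whole-lane signal
norm = normalize_lanes(target, total)
print([round(f, 2) for f in fold_change(norm)])  # [1.0, 2.08, 1.0]
```

Because the denominator is the whole lane rather than a single housekeeping band, small loading differences between lanes are corrected without assuming that any one housekeeping protein is unchanged.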
Basic Protocol, Issue 82, Biotechnology, Pharmaceutical, Protein electrophoresis, Western blot, Stain-Free, loading control, total protein normalization, stain-free technology
siRNA Screening to Identify Ubiquitin and Ubiquitin-like System Regulators of Biological Pathways in Cultured Mammalian Cells
Institutions: University of Dundee.
Post-translational modification of proteins with ubiquitin and ubiquitin-like molecules (UBLs) is emerging as a dynamic cellular signaling network that regulates diverse biological pathways, including the hypoxia response, proteostasis, the DNA damage response, and transcription. To better understand how UBLs regulate pathways relevant to human disease, we have compiled a human siRNA “ubiquitome” library consisting of 1,186 siRNA duplex pools targeting all known and predicted components of UBL system pathways. This library can be screened against a range of cell lines expressing reporters of diverse biological pathways to determine which UBL components act as positive or negative regulators of the pathway in question. Here, we describe a protocol utilizing this library to identify ubiquitome regulators of the HIF1A-mediated cellular response to hypoxia using a transcription-based luciferase reporter. An initial assay development stage is performed to establish suitable screening parameters for the cell line before performing the screen in three stages: primary, secondary, and tertiary/deconvolution screening. The use of targeted rather than whole-genome siRNA libraries is becoming increasingly popular, as it offers the advantage of reporting only on the members of the pathway in which the investigators are most interested. Despite the inherent limitations of siRNA screening, in particular false positives caused by siRNA off-target effects, the identification of genuine novel regulators of the pathways in question outweighs these shortcomings, which can be overcome by performing a series of carefully undertaken control experiments.
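Hit calling in the primary screen of a reporter assay like this is commonly done with a robust z-score, using the plate median and MAD rather than mean and SD so that strong hits do not inflate the dispersion estimate. The ±3 cutoff and well values below are common illustrative conventions, not values taken from this protocol:

```python
from statistics import median

def robust_z_scores(values):
    """Robust z-score: (x - median) / (1.4826 * MAD)."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    scale = 1.4826 * mad  # 1.4826 makes MAD consistent with SD for normals
    return [(v - med) / scale for v in values]

def call_hits(well_ids, values, cutoff=3.0):
    """Wells whose reporter signal deviates beyond +/- cutoff."""
    return [w for w, z in zip(well_ids, robust_z_scores(values))
            if abs(z) >= cutoff]

wells = ["A01", "A02", "A03", "A04", "A05", "A06", "A07"]
signal = [1.00, 0.98, 1.03, 0.99, 1.01, 0.35, 1.02]  # luciferase ratios
print(call_hits(wells, signal))  # ['A06']
```

Wells called as hits in the primary stage would then proceed to the secondary and deconvolution stages described above, where individual duplexes are tested to flag off-target effects.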
Biochemistry, Issue 87, siRNA screening, ubiquitin, UBL, ubiquitome, hypoxia, HIF1A, High-throughput, mammalian cells, luciferase reporter
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and alignment of the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape depending on the target protein1. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment2. Reliably identifying these regions was the focus of our work.
Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics3-5 to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)6-8. We sought a solution that minimized the need for difficult-to-define, ad hoc parameters, which often compromise resolution and lessen the intuitive usability of a tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite state classifications that are often utilized.
Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by application of the appropriate tool. We further aimed to replace these two distinct models with a single, more versatile model that can capably address the entire spectrum of data types.
To meet these objectives, we first constructed a statistical framework that naturally models ChIPseq data structures using a cutting-edge advance in HMMs9, which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which in turn define segments of enrichment. Our analysis revealed that our Bayesian Change Point (BCP) algorithm has reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters, illustrating both its versatility and its ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor10,11 and epigenetic data12 to illustrate its usefulness.
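The Bayesian HMM behind BCP is beyond a short example, but the change-point idea it rests on can be illustrated with a much simpler scheme: pick the split of a read-density vector that maximally reduces the within-segment squared error, then recurse. This toy binary segmentation is illustrative only (including the `min_gain` stopping rule) and is not the authors' algorithm:

```python
def sse(xs):
    """Sum of squared errors of a segment around its mean."""
    if not xs:
        return 0.0
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_split(xs):
    """Index and SSE reduction of the best single change point."""
    total = sse(xs)
    best_i, best_gain = None, 0.0
    for i in range(1, len(xs)):
        gain = total - sse(xs[:i]) - sse(xs[i:])
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i, best_gain

def change_points(xs, min_gain=10.0):
    """Recursive binary segmentation; returns sorted break indices."""
    i, gain = best_split(xs)
    if i is None or gain < min_gain:
        return []
    left = change_points(xs[:i], min_gain)
    right = [i + j for j in change_points(xs[i:], min_gain)]
    return left + [i] + right

density = [2, 3, 2, 2, 3, 15, 14, 16, 15, 2, 3, 2]  # reads per bin
print(change_points(density))  # [5, 9]
```

The breaks at bins 5 and 9 delimit the enriched segment, the same role the change points play in BCP; the Bayesian treatment replaces the ad hoc `min_gain` threshold with a posterior over segmentations.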
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
Voltage Biasing, Cyclic Voltammetry, & Electrical Impedance Spectroscopy for Neural Interfaces
Institutions: Purdue University, University of Wisconsin-Madison, University of Michigan, Purdue University.
Electrical impedance spectroscopy (EIS) and cyclic voltammetry (CV) measure properties of the electrode-tissue interface without additional invasive procedures, and can be used to monitor electrode performance over the long term. EIS measures electrical impedance at multiple frequencies, and increases in impedance indicate increased glial scar formation around the device, while cyclic voltammetry measures the charge carrying capacity of the electrode, and indicates how charge is transferred at different voltage levels. As implanted electrodes age, EIS and CV data change, and electrode sites that previously recorded spiking neurons often exhibit significantly lower efficacy for neural recording. The application of a brief voltage pulse to implanted electrode arrays, known as rejuvenation, can bring back spiking activity on otherwise silent electrode sites for a period of time. Rejuvenation alters EIS and CV, and can be monitored by these complementary methods. Typically, EIS is measured daily as an indication of the tissue response at the electrode site. If spikes are absent in a channel that previously had spikes, then CV is used to determine the charge carrying capacity of the electrode site, and rejuvenation can be applied to improve the interface efficacy. CV and EIS are then repeated to check the changes at the electrode-tissue interface, and neural recordings are collected. The overall goal of rejuvenation is to extend the functional lifetime of implanted arrays.
Neuroscience, Issue 60, neuroprosthesis, electrode-tissue interface, rejuvenation, neural engineering, neuroscience, neural implant, electrode, brain-computer interface, electrochemistry
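The frequency dependence that EIS measures can be sketched with a simplified Randles equivalent circuit: solution resistance in series with a charge-transfer resistance in parallel with the double-layer capacitance. The component values below are illustrative assumptions, not measurements from any particular electrode.

```python
import cmath, math

def electrode_impedance(freq_hz, r_s=1e3, r_ct=1e6, c_dl=1e-9):
    """Impedance of a simplified Randles cell: R_s in series with
    (R_ct parallel to C_dl). All component values are illustrative."""
    omega = 2 * math.pi * freq_hz
    z_c = 1 / (1j * omega * c_dl)           # double-layer capacitor
    z_par = (r_ct * z_c) / (r_ct + z_c)     # R_ct || C_dl
    return r_s + z_par

# |Z| falls with frequency as the capacitance shunts the charge-transfer
# resistance; glial scarring would show up as a rise in these magnitudes.
mags = [abs(electrode_impedance(f)) for f in (10, 1e3, 1e5)]
```

Tracking such spectra over days is what lets increases in impedance serve as a proxy for tissue encapsulation at the electrode site.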
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging systems. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
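The ~10-30 nm precision quoted above can be estimated with the standard Thompson-Larson-Webb expression, which combines photon shot noise, pixelation, and background. All numerical values below (PSF width, photon count, pixel size, background) are illustrative assumptions, not values from this protocol.

```python
import math

def localization_precision(s_nm, n_photons, pixel_nm, bg_photons):
    """Thompson-Larson-Webb estimate of single-molecule localization
    precision in nm: shot-noise, pixelation, and background terms."""
    s2 = s_nm ** 2
    a2 = pixel_nm ** 2
    var = (s2 + a2 / 12) / n_photons \
        + (8 * math.pi * s2 ** 2 * bg_photons ** 2) / (a2 * n_photons ** 2)
    return math.sqrt(var)

# Illustrative numbers: 135 nm PSF std, 300 detected photons,
# 100 nm pixels, 5 background photons per pixel.
sigma = localization_precision(s_nm=135, n_photons=300, pixel_nm=100, bg_photons=5)
```

With these assumed values the estimate lands in the tens-of-nanometer range, consistent with the precision stated in the abstract; brighter probes or lower background tighten it further.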
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
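The "software-guided setup of optimal experiment combinations" can be sketched at its simplest as a full factorial design enumerating every combination of factor levels. The factor names and levels below are hypothetical placeholders, not the actual factors used in the study.

```python
from itertools import product

# Hypothetical factors for a transient-expression DoE; names and levels
# are illustrative assumptions, not those from the study.
factors = {
    "promoter":      ["35S", "nos"],
    "temperature_C": [22, 25],
    "leaf_age":      ["young", "old"],
}

# Full factorial design: one run per combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

DoE software typically starts from such an enumeration and then selects an optimal fraction of it (and augments it step-wise), since full factorials grow exponentially with the number of factors.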
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Hence, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open-access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
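The minimum-norm estimation named in the keywords can be stated compactly. Under the standard assumption of a linear forward model with leadfield matrix $L$ derived from the head model, the Tikhonov-regularized minimum-norm source estimate is

```latex
\hat{J} = L^{\top} \left( L L^{\top} + \lambda^{2} I \right)^{-1} X
```

where $X$ holds the sensor recordings, $\hat{J}$ the estimated cortical source amplitudes, and $\lambda$ the regularization parameter trading off data fit against source power. The choice of head model enters entirely through $L$, which is why age-appropriate or individual MRI-based models matter for pediatric data.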
Viability Assays for Cells in Culture
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
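The claim that "these measurements are all linear and correlate with the number of cells plated" is typically verified with an ordinary least-squares fit of signal against seeded cell number. The calibration numbers below are fabricated for illustration.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for signal vs. cells plated."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Hypothetical calibration: wells seeded with known cell numbers and the
# resulting assay signal in arbitrary units (fabricated values).
cells  = [0, 10000, 20000, 40000]
signal = [50, 1050, 2050, 4050]
slope, intercept = linear_fit(cells, signal)
```

A non-zero intercept here would correspond to the background signal of an empty well, which is why blank wells are included in such calibrations.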
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation: inconsistent techniques generate unreliable results. The technique described here encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. It improves the enrichment of low-abundance molecules, with a resulting increase in sensitivity that ultimately yields more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification, since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several complementary ways: voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA) is combined with fiber tracking (FT) and tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis at an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by a controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
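The fractional anisotropy metric used in the voxelwise comparisons has a closed form in the three eigenvalues of the diffusion tensor; a minimal sketch with illustrative eigenvalues:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three diffusion-tensor eigenvalues: 0 for isotropic
    diffusion, approaching 1 for strongly directional (fiber-like) diffusion."""
    md = (l1 + l2 + l3) / 3  # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

iso = fractional_anisotropy(1.0, 1.0, 1.0)   # isotropic voxel
wm  = fractional_anisotropy(1.7, 0.3, 0.2)   # fiber-like WM voxel
```

Healthy WM tracts show high FA because diffusion is constrained along axons; the group comparisons above look for regional FA reductions relative to controls.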
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions is regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the purity and amount appropriate for in-depth analysis by mass spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants, and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome, and establish it as a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation , histone variants, chromatome, hPTMs cross-talks
A Protocol for Analyzing Hepatitis C Virus Replication
Institutions: Cedars-Sinai Medical Center, David Geffen School of Medicine at UCLA.
Hepatitis C Virus (HCV) affects 3% of the world's population and causes serious liver ailments including chronic hepatitis, cirrhosis, and hepatocellular carcinoma. HCV is an enveloped RNA virus belonging to the family Flaviviridae. Current treatment is not fully effective and causes adverse side effects. There is no HCV vaccine available. Thus, continued effort is required for developing a vaccine and better therapy. An HCV cell culture system is critical for studying various stages of HCV growth including viral entry, genome replication, packaging, and egress. In the procedure presented here, we used a wild-type intragenotype 2a chimeric virus, FNX-HCV, and a recombinant FNX-Rluc virus carrying a Renilla luciferase reporter gene to study virus replication. A human hepatoma cell line (Huh-7 based) was used for transfection of in vitro transcribed HCV genomic RNAs. Cell-free culture supernatants, protein lysates, and total RNA were harvested at various time points post-transfection to assess HCV growth. HCV genome replication status was evaluated by quantitative RT-PCR and by visualizing the presence of HCV double-stranded RNA. HCV protein expression was verified by Western blot and immunofluorescence assays using antibodies specific for the HCV NS3 and NS5A proteins. HCV RNA-transfected cells released infectious particles into the culture supernatant, and the viral titer was measured. Luciferase assays were utilized to assess the replication level and infectivity of the reporter HCV. In conclusion, we present various virological assays for characterizing different stages of the HCV replication cycle.
Infectious Diseases, Issue 88, Hepatitis C Virus, HCV, Tumor-virus, Hepatitis C, Cirrhosis, Liver Cancer, Hepatocellular Carcinoma
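Quantitative RT-PCR readouts like those above are commonly summarized with the 2^-ΔΔCt method. The Ct values and the choice of reference gene below are illustrative assumptions, not values from this protocol.

```python
def fold_change(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Relative target RNA level by the 2^-ddCt method:
    normalize to a reference gene, then to a control condition."""
    d_ct = ct_target - ct_ref                 # dCt in the test condition
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl  # dCt in the control condition
    return 2 ** -(d_ct - d_ct_ctrl)

# Hypothetical Ct values: HCV RNA vs. a housekeeping reference, late
# time point vs. an early post-transfection baseline.
fc = fold_change(ct_target=20, ct_ref=18, ct_target_ctrl=25, ct_ref_ctrl=18)
```

Each unit decrease in ΔΔCt corresponds to a doubling of relative RNA abundance, so the method assumes roughly equal amplification efficiency for target and reference.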
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Universite de Montreal, Universite de Montreal, Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. This is not what one would predict for diseases with reduced reproductive fitness, unless there was a high new mutation rate. Finally, there are the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
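The prediction that de novo mutations "would more frequently come from males" follows directly from the quoted male-to-female mutation-rate ratio: with ratio r, the expected paternal share of new mutations is r / (r + 1). A one-line worked example for the 4-6:1 range cited above:

```python
def paternal_fraction(male_to_female_ratio):
    """Expected share of de novo mutations of paternal origin, given the
    male-to-female germline mutation-rate ratio r: r / (r + 1)."""
    r = male_to_female_ratio
    return r / (r + 1)

# The 4:1 to 6:1 range cited in the abstract.
low, high = paternal_fraction(4), paternal_fraction(6)
```

So the quoted ratios imply that roughly 80-86% of de novo point mutations should be of paternal origin, before any paternal-age effect is taken into account.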
Modified Annexin V/Propidium Iodide Apoptosis Assay For Accurate Assessment of Cell Death
Institutions: University of Alberta, University of Alberta.
Studies of cellular apoptosis have been significantly impacted since the introduction of flow cytometry-based methods. Propidium iodide (PI) is widely used in conjunction with Annexin V to determine if cells are viable, apoptotic, or necrotic through differences in plasma membrane integrity and permeability1,2. The Annexin V/PI protocol is a commonly used approach for studying apoptotic cells3. PI is used more often than other nuclear stains because it is economical, stable, and a good indicator of cell viability, based on its capacity to exclude dye in living cells4,5. The ability of PI to enter a cell is dependent upon the permeability of the membrane; PI does not stain live or early apoptotic cells due to the presence of an intact plasma membrane1,2,6. In late apoptotic and necrotic cells, the integrity of the plasma and nuclear membranes decreases7,8, allowing PI to pass through the membranes, intercalate into nucleic acids, and display red fluorescence1,2,9. Unfortunately, we find that conventional Annexin V/PI protocols lead to a significant number of false positive events (up to 40%), which are associated with PI staining of RNA within the cytoplasmic compartment10. Primary cells and cell lines in a broad range of animal models are affected, with large cells (nuclear:cytoplasmic ratios <0.5) showing the highest occurrence10. Herein, we demonstrate a modified Annexin V/PI method that provides a significant improvement for assessment of cell death compared to conventional methods. This protocol takes advantage of changes in cellular permeability during cell fixing to promote entry of RNase A into cells following staining. Both the timing and concentration of RNase A have been optimized for removal of cytoplasmic RNA. The result is a significant improvement over conventional Annexin V/PI protocols (<5% events with cytoplasmic PI staining).
Cellular Biology, Issue 50, Apoptosis, cell death, propidium iodide, Annexin V, necrosis, immunology
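The populations the assay distinguishes correspond to quadrant gates on the two fluorescence channels. A minimal sketch of that gating logic; the threshold is an arbitrary illustrative value that would in practice be set per experiment from unstained and single-stained controls.

```python
def classify_cell(annexin_v, pi, threshold=100.0):
    """Quadrant gating for Annexin V / PI flow cytometry events.
    The threshold is illustrative; real gates come from control samples."""
    a_pos, p_pos = annexin_v > threshold, pi > threshold
    if not a_pos and not p_pos:
        return "viable"
    if a_pos and not p_pos:
        return "early apoptotic"
    if a_pos and p_pos:
        return "late apoptotic/necrotic"
    return "necrotic (PI only)"

label = classify_cell(annexin_v=500, pi=20)
```

The false positives described above are events that land in a PI-positive quadrant only because cytoplasmic RNA binds PI, which is exactly what the RNase A step removes.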
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: a midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process of physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, and to recommend for or against invasive ICP monitoring.
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
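The paper's prediction model uses SVMs in RapidMiner; as a hedged illustration of the same feature-based classification step, here is a tiny nearest-centroid stand-in (not the authors' method) over two fabricated features, midline shift and blood volume.

```python
import math

def nearest_centroid_predict(train, labels, sample):
    """Classify a sample by the nearest class centroid in feature space.
    A deliberately simple stand-in for the SVM step described in the paper."""
    centroids = {}
    for x, y in zip(train, labels):
        centroids.setdefault(y, []).append(x)
    best, best_d = None, float("inf")
    for label, pts in centroids.items():
        c = [sum(col) / len(pts) for col in zip(*pts)]  # class centroid
        d = math.dist(c, sample)
        if d < best_d:
            best, best_d = label, d
    return best

# Fabricated (midline shift mm, blood volume ml) features and ICP labels.
features  = [[0.5, 1.0], [1.0, 2.0], [6.0, 15.0], [8.0, 20.0]]
icp_level = ["normal", "normal", "elevated", "elevated"]
pred = nearest_centroid_predict(features, icp_level, [7.0, 18.0])
```

An SVM replaces the centroid distance with a margin-maximizing decision boundary, which matters when classes overlap; the pipeline shape (features in, class label out) is the same.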
MISSION esiRNA for RNAi Screening in Mammalian Cells
Institutions: Max Planck Institute of Molecular Cell Biology and Genetics.
RNA interference (RNAi) is a basic cellular mechanism for the control of gene expression. RNAi is induced by short double-stranded RNAs, also known as small interfering RNAs (siRNAs). The short double-stranded RNAs originate from longer double-stranded precursors through the activity of Dicer, a protein of the RNase III family of endonucleases. The resulting fragments are components of the RNA-induced silencing complex (RISC), directing it to the cognate target mRNA. RISC cleaves the target mRNA, thereby reducing the expression of the encoded protein1,2,3. RNAi has become a powerful and widely used experimental method for loss-of-function studies in mammalian cells.
Currently, two main methods are available for the production of small interfering RNAs. One method involves chemical synthesis, whereas an alternative method employs endonucleolytic cleavage of target-specific long double-stranded RNAs by RNase III in vitro. Thereby, a diverse pool of siRNA-like oligonucleotides is produced, also known as endoribonuclease-prepared siRNA or esiRNA. A comparison of the efficacy of chemically synthesized siRNAs and esiRNAs shows that both triggers are potent in target-gene silencing. Differences can, however, be seen in specificity: many single chemically synthesized siRNAs produce prominent off-target effects, whereas the complex mixture inherent in esiRNAs leads to a more specific knockdown10.
In this study, we present the design of genome-scale MISSION esiRNA libraries and their utilization for RNAi screening, exemplified by a DNA-content screen for the identification of genes involved in cell cycle progression. We show how to optimize the transfection protocol and the assay for screening in high throughput. We also demonstrate how large datasets can be evaluated statistically and present methods to validate primary hits. Finally, we give potential starting points for further functional characterization of validated hits.
Cellular Biology, Issue 39, MISSION, esiRNA, RNAi, cell cycle, high throughput screening
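The statistical evaluation of large screening datasets mentioned above often starts with per-plate z-scores for hit calling. A minimal sketch; the readout values below are fabricated, and real screens typically prefer robust variants (median/MAD) over mean and standard deviation.

```python
import statistics

def z_scores(values):
    """Per-plate z-scores: how many standard deviations each well's
    readout lies from the plate mean."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# Fabricated per-well readouts (e.g. fraction of cells in G2/M);
# one well shows a clear cell-cycle arrest phenotype.
readout = [0.10, 0.12, 0.11, 0.09, 0.45, 0.10]
hits = [i for i, z in enumerate(z_scores(readout)) if abs(z) > 2]
```

Wells flagged this way are the "primary hits" that then go into the validation steps the abstract describes, typically with independent esiRNAs against the same gene.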