Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.
PUBLISHED: 01-06-2010
ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage that grows only with the number of reactions and species, rather than the worst-case quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
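The two ideas above can be sketched in a few lines. This is a minimal direct-method Gillespie simulation over a hypothetical two-reaction toy network (not the yeast MAPK models from the abstract): a species-to-reactions dependency map is built once, so after each firing only the propensities that depend on a changed species are recomputed.

```python
import random

# Toy network (hypothetical): A + B -> C, and C -> A.
# Each reaction: (rate constant, reactant species, state-change map)
reactions = [
    (1.0, ("A", "B"), {"A": -1, "B": -1, "C": +1}),
    (0.5, ("C",),     {"C": -1, "A": +1}),
]

# Bipartite dependency map: species -> indices of reactions whose propensity
# depends on that species. Storage grows with reactions + species, not with
# the square of the number of reactions.
depends_on = {}
for i, (_, reactants, _) in enumerate(reactions):
    for s in reactants:
        depends_on.setdefault(s, set()).add(i)

def propensity(i, state):
    k, reactants, _ = reactions[i]
    for s in reactants:
        k *= state[s]          # mass-action propensity
    return k

def ssa(state, t_end, seed=0):
    rng = random.Random(seed)
    t = 0.0
    a = [propensity(i, state) for i in range(len(reactions))]
    while True:
        a0 = sum(a)
        if a0 == 0:
            break
        dt = rng.expovariate(a0)        # time to next event
        if t + dt > t_end:
            break
        t += dt
        # Pick a reaction with probability proportional to its propensity.
        r, j = rng.random() * a0, 0
        while j < len(a) - 1 and r >= a[j]:
            r -= a[j]
            j += 1
        touched = set()
        for s, d in reactions[j][2].items():
            state[s] += d
            touched |= depends_on.get(s, set())
        for i in touched:               # update only affected propensities
            a[i] = propensity(i, state)
    return state

final = ssa({"A": 100, "B": 100, "C": 0}, t_end=10.0)
```

Note the invariant of the toy network: both reactions conserve the combined count of A and C, which gives a quick sanity check on any run.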
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
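The "software-guided setup of optimal experiment combinations" mentioned above starts, in the simplest case, from a full-factorial design. A minimal sketch, with entirely hypothetical factor names and levels (not the study's actual design), generates every combination of two-level factors:

```python
from itertools import product

# Hypothetical two-level factors for a transient-expression experiment;
# names and levels are illustrative only.
factors = {
    "promoter":        ["35S", "double-35S"],
    "incubation_temp": [22, 25],   # degrees C
    "plant_age_days":  [35, 42],
}

# Full-factorial design: one run per combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

With k two-level factors this produces 2^k runs; DoE software then prunes or augments such designs to keep the experiment count tractable, as the abstract's "complexity reduction" and "step-wise design augmentation" describe.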
Increasing cDNA Yields from Single-cell Quantities of mRNA in Standard Laboratory Reverse Transcriptase Reactions using Acoustic Microstreaming
Authors: Wah Chin Boon, Karolina Petkovic-Duran, Yonggang Zhu, Richard Manasseh, Malcolm K. Horne, Tim D. Aumann.
Institutions: University of Melbourne, CSIRO Materials Science and Engineering, Faculty of Engineering and Industrial Sciences.
Correlating gene expression with cell behavior is ideally done at the single-cell level. However, this is not easily achieved because the small amount of labile mRNA present in a single cell (1-5% of 1-50 pg total RNA, or 0.01-2.5 pg mRNA, per cell [1]) mostly degrades before it can be reverse transcribed into a stable cDNA copy. For example, using standard laboratory reagents and hardware, only a small number of genes can be qualitatively assessed per cell [2]. One way to increase the efficiency of standard laboratory reverse transcriptase (RT) reactions (i.e. standard reagents in microliter volumes) comprising single-cell amounts of mRNA would be to mix the reagents more rapidly so the mRNA can be converted to cDNA before it degrades. However, this is not trivial because at microliter scales liquid flow is laminar, i.e. currently available methods of mixing (shaking, vortexing, and trituration) fail to produce sufficient chaotic motion to mix the reagents effectively. To solve this problem, micro-scale mixing techniques must be used [3,4]. A number of microfluidic mixing technologies have been developed that successfully increase RT reaction yields [5-8]. However, microfluidic technologies require specialized hardware that is relatively expensive and not yet widely available. A cheaper, more convenient solution is desirable. The main objective of this study is to demonstrate how applying a novel "micromixing" technique to standard laboratory RT reactions comprising single-cell quantities of mRNA significantly increases their cDNA yields. We find cDNA yields increase by approximately 10-100-fold, which enables: (1) greater numbers of genes to be analyzed per cell; (2) more quantitative analysis of gene expression; and (3) better detection of low-abundance genes in single cells. The micromixing is based on acoustic microstreaming [9-12], a phenomenon where sound waves propagating around a small obstacle create a mean flow near the obstacle.
We have developed an acoustic microstreaming-based device ("micromixer") with a key simplification: acoustic microstreaming can be achieved at audio frequencies by ensuring the system has a liquid-air interface with a small radius of curvature [13]. The meniscus of a microliter volume of solution in a tube provides an appropriately small radius of curvature. The use of audio frequencies means that the hardware can be inexpensive and versatile [13], and nucleic acids and other biochemical reagents are not damaged as they can be with standard laboratory sonicators.
Bioengineering, Issue 53, neuroscience, brain, cells, reverse transcription, qPCR, gene expression, acoustic microstreaming, micromixer, microfluidics
Monitoring the Reductive and Oxidative Half-Reactions of a Flavin-Dependent Monooxygenase using Stopped-Flow Spectrophotometry
Authors: Elvira Romero, Reeder Robinson, Pablo Sobrado.
Institutions: Virginia Polytechnic Institute and State University.
Aspergillus fumigatus siderophore A (SidA) is an FAD-containing monooxygenase that catalyzes the hydroxylation of ornithine in the biosynthesis of hydroxamate siderophores that are essential for virulence (e.g. ferricrocin or N',N",N'''-triacetylfusarinine C) [1]. The reaction catalyzed by SidA can be divided into reductive and oxidative half-reactions (Scheme 1). In the reductive half-reaction, the oxidized FAD bound to Af SidA is reduced by NADPH [2,3]. In the oxidative half-reaction, the reduced cofactor reacts with molecular oxygen to form a C4a-hydroperoxyflavin intermediate, which transfers an oxygen atom to ornithine. Here, we describe a procedure to measure the rates and detect the different spectral forms of SidA using a stopped-flow instrument installed in an anaerobic glove box. In the stopped-flow instrument, small volumes of reactants are rapidly mixed, and after the flow is stopped by the stop syringe (Figure 1), the spectral changes of the solution in the observation cell are recorded over time. In the first part of the experiment, we show how the stopped-flow instrument can be used in single-mixing mode, where the anaerobic reduction of the flavin in Af SidA by NADPH is measured directly. We then use double-mixing settings, where Af SidA is first anaerobically reduced by NADPH for a designated period of time in an aging loop and then reacted with molecular oxygen in the observation cell (Figure 1). Anaerobic buffers are necessary for this experiment because, when only the reductive half-reaction is monitored, any oxygen in the solutions will react with the reduced flavin cofactor and form a C4a-hydroperoxyflavin intermediate that ultimately decays back to the oxidized flavin. This would prevent accurate measurement of the rates of reduction, since the enzyme would turn over completely.
When the oxidative half-reaction is being studied, the enzyme must be reduced in the absence of oxygen so that only the steps between reduction and oxidation are observed. One of the buffers used in this experiment is oxygen-saturated so that the oxidative half-reaction can be studied at higher concentrations of oxygen. These are the procedures typically carried out when studying either the reductive or oxidative half-reactions of flavin-containing monooxygenases. The time scale of the pre-steady-state experiments performed with the stopped-flow instrument is milliseconds to seconds, which allows the determination of intrinsic rate constants and the detection and identification of intermediates in the reaction [4]. The procedures described here can be applied to other flavin-dependent monooxygenases [5,6].
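Rates from such stopped-flow traces are commonly extracted by fitting an exponential. A minimal sketch, on a synthetic single-exponential absorbance trace (not real instrument data), recovers the observed rate constant by linearizing ln(A - A_inf) against time; in practice A_inf would itself be fitted rather than assumed known:

```python
import math

# Synthetic stopped-flow trace: A(t) = A_inf + dA * exp(-k_obs * t).
k_true, A_inf, dA = 12.0, 0.10, 0.35          # k_obs in s^-1 (hypothetical)
times = [i * 0.005 for i in range(1, 60)]      # 5 ms sampling
absorbance = [A_inf + dA * math.exp(-k_true * t) for t in times]

# Linearize: ln(A - A_inf) = ln(dA) - k_obs * t, then least-squares slope.
y = [math.log(a - A_inf) for a in absorbance]
n = len(times)
mt, my = sum(times) / n, sum(y) / n
k_obs = -(sum((t - mt) * (yi - my) for t, yi in zip(times, y))
          / sum((t - mt) ** 2 for t in times))
```

On noiseless data the fitted k_obs matches the simulated rate; with real traces, nonlinear least-squares fitting of all three parameters is the standard approach.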
Bioengineering, Issue 61, Stopped-flow, kinetic mechanism, SidA, C4a-hydroperoxyflavin, monooxygenase, Aspergillus fumigatus
Small Bowel Transplantation In Mice
Authors: Fengchun Liu, Sang-Mo Kang.
Institutions: University of California, San Francisco - UCSF.
Since 1990, the development of tacrolimus-based immunosuppression, improved surgical techniques, a broader array of potent immunosuppressive medications, infection prophylaxis, and more suitable patient selection have improved actuarial graft and patient survival rates for all types of intestinal transplantation. Patients with irreversible intestinal failure and complications of parenteral nutrition should now be routinely considered for small intestine transplantation. However, survival rates for small bowel transplantation have been slow to improve and remain unsatisfactory compared with renal, liver, heart, and lung transplantation. Further progress may depend on a better understanding of the immunology and physiology of the graft, which can be greatly facilitated by animal models. Wider use of the mouse small bowel transplantation model is needed for studying the immunology and physiology of the transplanted gut, as well as for developing efficient methods of diagnosing early rejection. However, use of this model has been limited because the surgical techniques involved are extremely challenging. We have developed a modified technique. When making the anastomosis of the portal vein and inferior vena cava, two stay sutures are placed at the proximal and distal apices of the recipient's inferior vena cava and the donor's portal vein. After one knot with the proximal apex stay suture, the left walls of the inferior vena cava and the donor's portal vein are closed with a continuous suture from inside the inferior vena cava; the right walls are then closed with a continuous 10-0 suture from outside the inferior vena cava. This method is easier to perform because the anastomosis is made from just one side of the inferior vena cava, and 10-0 suture is the right size to avoid bleeding and thrombosis. In this article, we provide details of the technique to supplement the video.
Issue 7, Immunology, Transplantation, Transplant Rejection, Small Bowel
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
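The daily harvesting and visualization of time-stamped event records described above can be illustrated with a minimal sketch. The real system uses a MATLAB-based language; this Python version, with entirely hypothetical event codes and timestamps, just bins events by day so per-day counts can be plotted or compared:

```python
from collections import Counter

# Toy time-stamped event record: (seconds since session start, event code).
# Codes and times are hypothetical, in the spirit of the article's records.
events = [
    (12.0, "hopper1_entry"), (75.3, "pellet"), (90100.0, "hopper2_entry"),
    (91000.5, "pellet"), (172900.0, "hopper1_entry"),
]

SECONDS_PER_DAY = 86400

# Bin events by (day, code) to quantify each subject's daily progress.
daily = Counter((int(t // SECONDS_PER_DAY), code) for t, code in events)
```

A real analysis would keep the full event trail and derive latencies and inter-event intervals from the same structure, as the abstract describes.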
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
High-throughput Image Analysis of Tumor Spheroids: A User-friendly Software Application to Measure the Size of Spheroids Automatically and Accurately
Authors: Wenjin Chen, Chung Wong, Evan Vosburgh, Arnold J. Levine, David J. Foran, Eugenia Y. Xu.
Institutions: Raymond and Beverly Sackler Foundation, New Jersey, Rutgers University, Rutgers University, Institute for Advanced Study, New Jersey.
The increasing use of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use, free image analysis software that meets this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different spreadsheet forms for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis engine, adapted for large numbers of images, which provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with the uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of varying quality. This high-throughput image analysis software markedly reduces labor and speeds up the analysis process.
Adopting this software should help make 3D tumor spheroids a routine in vitro model for drug screens in industry and academia.
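Given the major and minor axial lengths that such a tool measures, spheroid volume follows from simple geometry. A minimal sketch, assuming the spheroid is modeled as a prolate spheroid rotationally symmetric about its major axis (the exact formula used by SpheroidSizer may differ):

```python
import math

def spheroid_volume(major, minor):
    """Volume of a prolate spheroid from its two axial lengths.

    Assumes V = (pi/6) * major * minor**2, a common convention for
    tumor-volume estimates; units follow the inputs (e.g. um -> um^3).
    """
    return math.pi / 6.0 * major * minor ** 2

# e.g. a hypothetical 400 um x 300 um spheroid
v = spheroid_volume(400.0, 300.0)
```

A quick consistency check: with equal axes the formula reduces to the volume of a sphere of that diameter.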
Cancer Biology, Issue 89, computer programming, high-throughput, image analysis, tumor spheroids, 3D, software application, cancer therapy, drug screen, neuroendocrine tumor cell line, BON-1, cancer research
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track the transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against the Viroseq genotyping method, an HIV-1 drug resistance test approved by the United States Food and Drug Administration (FDA). Limitations of the method include that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Reconstitution of a Kv Channel into Lipid Membranes for Structural and Functional Studies
Authors: Sungsoo Lee, Hui Zheng, Liang Shi, Qiu-Xing Jiang.
Institutions: University of Texas Southwestern Medical Center at Dallas.
To study lipid-protein interactions in a reductionistic fashion, it is necessary to incorporate membrane proteins into membranes of well-defined lipid composition. We are studying lipid-dependent gating effects in a prototype voltage-gated potassium (Kv) channel and have worked out detailed procedures to reconstitute the channels into different membrane systems. Our reconstitution procedures take into account both the detergent-induced fusion of vesicles and the fusion of protein/detergent micelles with lipid/detergent mixed micelles, as well as the importance of reaching an equilibrium distribution of lipids among the protein/detergent/lipid and detergent/lipid mixed micelles. Our data suggest that the insertion of the channels in the lipid vesicles is relatively random in orientation, and the reconstitution efficiency is so high that no detectable protein aggregates were seen in fractionation experiments. We have utilized the reconstituted channels to determine the conformational states of the channels in different lipids, record electrical activities of a small number of channels incorporated in planar lipid bilayers, screen for conformation-specific ligands from a phage-displayed peptide library, and support the growth of 2D crystals of the channels in membranes. The reconstitution procedures described here may be adapted for studying other membrane proteins in lipid bilayers, especially for the investigation of lipid effects on eukaryotic voltage-gated ion channels.
Molecular Biology, Issue 77, Biochemistry, Genetics, Cellular Biology, Structural Biology, Biophysics, Membrane Lipids, Phospholipids, Carrier Proteins, Membrane Proteins, Micelles, Molecular Motor Proteins, life sciences, biochemistry, Amino Acids, Peptides, and Proteins, lipid-protein interaction, channel reconstitution, lipid-dependent gating, voltage-gated ion channel, conformation-specific ligands, lipids
Quantitative Optical Microscopy: Measurement of Cellular Biophysical Features with a Standard Optical Microscope
Authors: Kevin G. Phillips, Sandra M. Baker-Groberg, Owen J.T. McCarty.
Institutions: Oregon Health & Science University, School of Medicine, Oregon Health & Science University, School of Medicine, Oregon Health & Science University, School of Medicine.
We describe the use of a standard optical microscope to perform quantitative measurements of mass, volume, and density on cellular specimens through a combination of bright field and differential interference contrast imagery. Two primary approaches are presented: noninterferometric quantitative phase microscopy (NIQPM), to perform measurements of total cell mass and subcellular density distribution, and Hilbert transform differential interference contrast microscopy (HTDIC), to determine volume. NIQPM is based on a simplified model of wave propagation, termed the paraxial approximation, with three underlying assumptions: low numerical aperture (NA) illumination, weak scattering, and weak absorption of light by the specimen. Fortunately, unstained cellular specimens satisfy these assumptions, and low NA illumination is easily achieved on commercial microscopes. HTDIC is used to obtain volumetric information from through-focus DIC imagery under high NA illumination conditions. High NA illumination enables enhanced sectioning of the specimen along the optical axis. Hilbert transform processing of the DIC image stacks greatly enhances edge detection algorithms for localization of the specimen borders in three dimensions by separating the gray values of the specimen intensity from those of the background. The primary advantages of NIQPM and HTDIC lie in their technological accessibility using "off-the-shelf" microscopes. There are two basic limitations of these methods: first, slow z-stack acquisition on commercial scopes currently precludes the investigation of phenomena faster than 1 frame/min; second, diffraction effects restrict the utility of NIQPM and HTDIC to objects from 0.2 μm up to 10 μm (NIQPM) and 20 μm (HTDIC) in diameter, respectively. Hence, the specimen and its associated time dynamics of interest must meet certain size and temporal constraints to enable the use of these methods.
Excitingly, most fixed cellular specimens are readily investigated with these methods.
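The Hilbert-transform step that separates specimen intensity from background can be illustrated in one dimension. In this sketch, a synthetic DIC-like line profile (not real microscope data) gives a bipolar, derivative-like response at a specimen edge; the magnitude of the analytic signal converts it into a single peak centered on the edge, which is what makes subsequent edge localization easier. A naive DFT is used to keep the example dependency-free:

```python
import cmath
import math

def dft(x, inverse=False):
    """Naive discrete Fourier transform (O(n^2); fine for a small demo)."""
    n = len(x)
    sign = 2j if inverse else -2j
    out = [sum(x[m] * cmath.exp(sign * math.pi * k * m / n) for m in range(n))
           for k in range(n)]
    return [v / n for v in out] if inverse else out

def analytic_magnitude(x):
    """|analytic signal|: zero negative frequencies, double positive ones."""
    n = len(x)
    X = dft(x)
    for k in range(n):
        if k == 0 or (n % 2 == 0 and k == n // 2):
            pass                      # keep DC (and Nyquist) unchanged
        elif k < (n + 1) // 2:
            X[k] *= 2                 # positive frequencies
        else:
            X[k] = 0                  # negative frequencies
    return [abs(v) for v in dft(X, inverse=True)]

n = 201
ts = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]        # t in [-1, 1]
profile = [-t * math.exp(-t * t / 0.01) for t in ts]     # bipolar edge signal
mag = analytic_magnitude(profile)

peak = mag.index(max(mag))   # envelope maximum marks the edge position
```

The bipolar profile crosses zero exactly at the edge, where its envelope is maximal; in HTDIC the same idea is applied to whole DIC image stacks.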
Bioengineering, Issue 86, Label-free optics, quantitative microscopy, cellular biophysics, cell mass, cell volume, cell density
A High Throughput MHC II Binding Assay for Quantitative Analysis of Peptide Epitopes
Authors: Regina Salvat, Leonard Moise, Chris Bailey-Kellogg, Karl E. Griswold.
Institutions: Dartmouth College, University of Rhode Island, Dartmouth College.
Biochemical assays with recombinant human MHC II molecules can provide rapid, quantitative insights into immunogenic epitope identification, deletion, or design [1,2]. Here, a peptide-MHC II binding assay is scaled to 384-well format. The scaled-down protocol reduces reagent costs by 75% and is higher throughput than previously described 96-well protocols [1,3-5]. Specifically, the experimental design permits robust and reproducible analysis of up to 15 peptides against one MHC II allele per 384-well ELISA plate. Using a single liquid handling robot, this method allows one researcher to analyze approximately ninety test peptides in triplicate over a range of eight concentrations and four MHC II allele types in less than 48 hr. Others working in the fields of protein deimmunization or vaccine design and development may find the protocol useful in facilitating their own work. In particular, the step-by-step instructions and the visual format of JoVE should allow other users to quickly and easily establish this methodology in their own labs.
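Binding data collected over an eight-point concentration series are typically summarized as an IC50. A minimal sketch, on synthetic competition-binding readouts (hypothetical values, not from the protocol), estimates the IC50 by log-linear interpolation between the two bracketing concentrations; a full analysis would instead fit a four-parameter logistic curve:

```python
import math

# Hypothetical readout: fraction of labelled tracer peptide still bound at
# eight test-peptide concentrations (nM). Synthetic data.
conc = [0.1, 0.4, 1.6, 6.4, 25.6, 102.4, 409.6, 1638.4]
bound = [0.98, 0.95, 0.86, 0.65, 0.40, 0.20, 0.08, 0.03]

def ic50(conc, bound, level=0.5):
    """Log-linear interpolation of the concentration where binding = level."""
    pairs = list(zip(conc, bound))
    for (c1, b1), (c2, b2) in zip(pairs, pairs[1:]):
        if b1 >= level >= b2:
            f = (b1 - level) / (b1 - b2)
            return 10 ** (math.log10(c1)
                          + f * (math.log10(c2) - math.log10(c1)))
    raise ValueError("level not bracketed by the data")

est = ic50(conc, bound)
```

Here the 50%-bound point falls between 6.4 and 25.6 nM, so the interpolated IC50 lands between those two concentrations.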
Biochemistry, Issue 85, Immunoassay, Protein Immunogenicity, MHC II, T cell epitope, High Throughput Screen, Deimmunization, Vaccine Design
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
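The oscillation technique reduces to the standard physical-pendulum relations: the moment of inertia about the suspension axis follows from the oscillation period, and the parallel-axis theorem shifts it to the center of mass located with the reaction board. A minimal sketch with hypothetical numbers (the article's actual protocol may differ in detail):

```python
import math

def inertia_from_oscillation(period_s, mass_kg, d_m, g=9.81):
    """Moment of inertia of a prosthesis swung as a physical pendulum.

    period_s : oscillation period about the suspension axis (s)
    mass_kg  : prosthesis mass (kg)
    d_m      : axis-to-center-of-mass distance (m), e.g. from a reaction board

    Returns (I_axis, I_cm): inertia about the suspension axis,
    I = T^2 * m * g * d / (4 * pi^2), and about a parallel axis through
    the center of mass via the parallel-axis theorem, I_cm = I - m * d^2.
    """
    I_axis = period_s ** 2 * mass_kg * g * d_m / (4.0 * math.pi ** 2)
    I_cm = I_axis - mass_kg * d_m ** 2
    return I_axis, I_cm

# Hypothetical prosthesis: 1.2 kg, CM 0.25 m below the axis, period 1.1 s.
I_axis, I_cm = inertia_from_oscillation(1.1, 1.2, 0.25)
```

The parallel-axis correction is always positive, so I_cm is necessarily smaller than I_axis; a negative I_cm would signal a measurement error.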
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Quantitative Real-Time PCR using the Thermo Scientific Solaris qPCR Assay
Authors: Christy Ogrean, Ben Jackson, James Covino.
Institutions: Thermo Scientific Solaris qPCR Products.
The Solaris qPCR Gene Expression Assay is a novel type of primer/probe set designed to simplify the qPCR process while maintaining the sensitivity and accuracy of the assay. These primer/probe sets are pre-designed against >98% of the human and mouse genomes and feature significant improvements over previously available technologies. These improvements were made possible by a novel design algorithm developed by Thermo Scientific bioinformatics experts. Several convenient features have been incorporated into the Solaris qPCR Assay to streamline the process of performing quantitative real-time PCR. First, the protocol is similar to commonly employed alternatives, so the methods used during qPCR are likely to be familiar. Second, the master mix is blue, which makes setting up the qPCR reactions easier to track. Third, the thermal cycling conditions are the same for all assays (genes), making it possible to run many samples at a time and reducing the potential for error. Finally, the probe and primer sequence information are provided, simplifying the publication process. Here, we demonstrate how to obtain the appropriate Solaris reagents using the GENEius product search feature found on the ordering website, and how to use the Solaris reagents for performing qPCR using the standard curve method.
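The standard curve method itself is a small calculation: Ct values from a dilution series of known template amounts are regressed against log10 quantity, and unknowns are read off the fitted line. A minimal sketch with hypothetical dilution-series data:

```python
import math

# Hypothetical standard curve: 10-fold dilution series of a template of
# known copy number, with the Ct measured at each dilution.
copies = [1e6, 1e5, 1e4, 1e3, 1e2]
ct     = [15.1, 18.4, 21.8, 25.2, 28.5]

# Least-squares line: Ct = slope * log10(copies) + intercept.
x = [math.log10(c) for c in copies]
n = len(x)
mx, my = sum(x) / n, sum(ct) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

# Amplification efficiency: perfect doubling gives slope -3.32 (eff = 100%).
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(ct_unknown):
    """Copy number of an unknown sample from its Ct, via the standard curve."""
    return 10 ** ((ct_unknown - intercept) / slope)

q = quantify(20.0)
```

A slope near -3.32 cycles per 10-fold dilution indicates close-to-ideal efficiency; slopes far from that value flag assay problems before any unknowns are quantified.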
Cellular Biology, Issue 40, qPCR, probe, real-time PCR, molecular biology, Solaris, primer, gene expression assays
Synthesis of Hypervalent Iodonium Alkynyl Triflates for the Application of Generating Cyanocarbenes
Authors: I. F. Dempsey Hyatt, Daniel J. Nasrallah, Mitchell P. Croatt.
Institutions: University of North Carolina at Greensboro.
The procedures described in this article involve the synthesis and isolation of hypervalent iodonium alkynyl triflates (HIATs) and their subsequent reactions with azides to form cyanocarbene intermediates. The synthesis of hypervalent iodonium alkynyl triflates can be facile, but difficulties stem from their isolation and reactivity. In particular, the necessity to use filtration under inert atmosphere at -45 °C for some HIATs requires special care and equipment. Once isolated, the compounds can be stored and used in reactions with azides to form cyanocarbene intermediates. The evidence for cyanocarbene generation is shown by visible extrusion of dinitrogen as well as the characterization of products that occur from O-H insertion, sulfoxide complexation, and cyclopropanation. A side reaction of the cyanocarbene formation is the generation of a vinylidene-carbene and the conditions to control this process are discussed. There is also potential to form a hypervalent iodonium alkenyl triflate and the means of isolation and control of its generation are provided. The O-H insertion reaction involves using a HIAT, sodium azide or tetrabutylammonium azide, and methanol as solvent/substrate. The sulfoxide complexation reaction uses a HIAT, sodium azide or tetrabutylammonium azide, and dimethyl sulfoxide as solvent. The cyclopropanations can be performed with or without the use of solvent. The azide source must be tetrabutylammonium azide and the substrate shown is styrene.
Chemistry, Issue 79, Iodine Compounds, Azides, Hydrocarbons, Cyclic, Nitriles, Onium Compounds, Explosive Agents, chemistry (general), chemistry of compounds, chemistry of elements, Organic Chemicals, azides, carbenes, cyanides, hypervalent compounds, synthetic methods, organic
Preparation and Use of Samarium Diiodide (SmI2) in Organic Synthesis: The Mechanistic Role of HMPA and Ni(II) Salts in the Samarium Barbier Reaction
Authors: Dhandapani V. Sadasivam, Kimberly A. Choquette, Robert A. Flowers II.
Institutions: Lehigh University.
Although initially considered an esoteric reagent, SmI2 has become a common tool for synthetic organic chemists. SmI2 is generated through the addition of molecular iodine to samarium metal in THF [1-3]. It is a mild and selective single-electron reductant, and its versatility is a result of its ability to initiate a wide range of reductions, including C-C bond-forming and cascade or sequential reactions. SmI2 can reduce a variety of functional groups including sulfoxides and sulfones, phosphine oxides, epoxides, alkyl and aryl halides, carbonyls, and conjugated double bonds [2-12]. One of the fascinating features of SmI2-mediated reactions is the ability to manipulate the outcome of reactions through the selective use of cosolvents or additives. In most instances, additives are essential in controlling the rate of reduction and the chemo- or stereoselectivity of reactions [13,14]. Additives commonly utilized to fine-tune the reactivity of SmI2 can be classified into three major groups: (1) Lewis bases (HMPA, other electron-donor ligands, chelating ethers, etc.), (2) proton sources (alcohols, water, etc.), and (3) inorganic additives (Ni(acac)2, FeCl3, etc.) [3]. Understanding the mechanism of SmI2 reactions and the role of the additives enables utilization of the full potential of the reagent in organic synthesis. The Sm-Barbier reaction is chosen to illustrate the synthetic importance and mechanistic role of two common additives in this reaction: HMPA and Ni(II).
The Sm-Barbier reaction is similar to the traditional Grignard reaction, with the only difference being that the alkyl halide, carbonyl, and Sm reductant are mixed simultaneously in one pot [1,15]. Examples of Sm-mediated Barbier reactions with a range of coupling partners have been reported [1,3,7,10,12], and the reaction has been utilized in key steps of the synthesis of large natural products [16,17]. Previous studies on the effect of additives on SmI2 reactions have shown that HMPA enhances the reduction potential of SmI2 by coordinating to the samarium metal center, producing a more powerful [13,14,18], sterically encumbered reductant [19-21], and in some cases playing an integral role in post-electron-transfer steps, facilitating subsequent bond-forming events [22]. In the Sm-Barbier reaction, HMPA has been shown to additionally activate the alkyl halide by forming a complex in a pre-equilibrium step [23]. Ni(II) salts are a catalytic additive used frequently in Sm-mediated transformations [24-27]. Though critical for success, the mechanistic role of Ni(II) was not known in these reactions. Recently it has been shown that SmI2 reduces Ni(II) to Ni(0), and the reaction then proceeds through organometallic Ni(0) chemistry [28]. These mechanistic studies highlight that, although the same Barbier product is obtained, the use of different additives in the SmI2 reaction drastically alters the mechanistic pathway of the reaction. The protocol for running these SmI2-initiated reactions is described.
Chemistry, Issue 72, Organic Chemistry, Chemical Engineering, Biochemistry, Samarium diiodide, Sml2, Samarium-Barbier Reaction, HMPA, hexamethylphosphoramide, Ni(II), Nickel(II) acetylacetonate, nickel, samarium, iodine, additives, synthesis, catalyst, reaction, synthetic organic chemistry
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape only with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
In Situ SIMS and IR Spectroscopy of Well-defined Surfaces Prepared by Soft Landing of Mass-selected Ions
Authors: Grant E. Johnson, K. Don Dasitha Gunaratne, Julia Laskin.
Institutions: Pacific Northwest National Laboratory.
Soft landing of mass-selected ions onto surfaces is a powerful approach for the highly-controlled preparation of materials that are inaccessible using conventional synthesis techniques. Coupling soft landing with in situ characterization using secondary ion mass spectrometry (SIMS) and infrared reflection absorption spectroscopy (IRRAS) enables analysis of well-defined surfaces under clean vacuum conditions. The capabilities of three soft-landing instruments constructed in our laboratory are illustrated for the representative system of surface-bound organometallics prepared by soft landing of mass-selected ruthenium tris(bipyridine) dications, [Ru(bpy)3]2+ (bpy = bipyridine), onto carboxylic acid terminated self-assembled monolayer surfaces on gold (COOH-SAMs). In situ time-of-flight (TOF)-SIMS provides insight into the reactivity of the soft-landed ions. In addition, the kinetics of charge reduction, neutralization and desorption occurring on the COOH-SAM both during and after ion soft landing are studied using in situ Fourier transform ion cyclotron resonance (FT-ICR)-SIMS measurements. In situ IRRAS experiments provide insight into how the structure of organic ligands surrounding metal centers is perturbed through immobilization of organometallic ions on COOH-SAM surfaces by soft landing. Collectively, the three instruments provide complementary information about the chemical composition, reactivity and structure of well-defined species supported on surfaces.
Chemistry, Issue 88, soft landing, mass selected ions, electrospray, secondary ion mass spectrometry, infrared spectroscopy, organometallic, catalysis
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to establish reliable experiments and analyze data effectively. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, so suitable instrumentation is available in most institutions; an excellent range of protocols is already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
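To make the idea of extracting a dissociation constant from melting-temperature shifts concrete, here is a minimal sketch (not the authors' analysis) that fits a single-site saturation model, Tm(L) = Tm0 + dTm·L/(Kd + L), to invented synthetic data. The grid-search fitting routine, concentrations, and parameter values are all assumptions for illustration:

```python
# Hypothetical sketch: estimate Kd from DSF Tm shifts with a simple
# single-site saturation model. All numbers are invented.
import numpy as np

def fit_kd(conc, tm_obs, kd_grid):
    """Grid-search Kd; for each candidate Kd the model is linear in
    (Tm0, dTm_max), so those two are solved by linear least squares."""
    best = None
    for kd in kd_grid:
        f = conc / (kd + conc)                  # fractional occupancy
        A = np.column_stack([np.ones_like(conc), f])
        coef, *_ = np.linalg.lstsq(A, tm_obs, rcond=None)
        rss = float(np.sum((A @ coef - tm_obs) ** 2))
        if best is None or rss < best[0]:
            best = (rss, kd)
    return best[1]

# Synthetic titration: ligand concentrations (uM) and noisy Tm (deg C)
conc = np.array([0.0, 1, 2, 5, 10, 20, 50, 100, 200])
rng = np.random.default_rng(0)
tm_obs = 55.0 + 6.0 * conc / (12.0 + conc) + rng.normal(0, 0.1, conc.size)

kd_fit = fit_kd(conc, tm_obs, np.geomspace(0.5, 500, 400))
print(round(kd_fit, 1))   # should recover roughly the true Kd of 12 uM
```

A real analysis would account for the thermodynamics of unfolding rather than assuming a hyperbolic Tm shift, but the fitting pattern (scan the nonlinear parameter, solve the linear ones) carries over.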
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. The data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We describe the use of PAFP and PSFP expression to image two protein species in fixed cells, as well as extension of the technique to living cells.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Analysis of Oxidative Stress in Zebrafish Embryos
Authors: Vera Mugoni, Annalisa Camporeale, Massimo M. Santoro.
Institutions: University of Torino, Vesalius Research Center, VIB.
High levels of reactive oxygen species (ROS) may shift the cellular redox state towards oxidative stress. This situation causes oxidation of molecules (lipids, DNA, proteins) and leads to cell death. Oxidative stress also impacts the progression of several pathological conditions such as diabetes, retinopathies, neurodegeneration, and cancer. Thus, it is important to define tools to investigate oxidative stress conditions not only at the level of single cells but also in the context of whole organisms. Here, we consider the zebrafish embryo as a useful in vivo system to perform such studies and present a protocol to measure oxidative stress in vivo. Taking advantage of fluorescent ROS probes and zebrafish transgenic fluorescent lines, we developed two different methods to measure oxidative stress in vivo: i) a "whole embryo ROS-detection method" for qualitative measurement of oxidative stress and ii) a "single-cell ROS detection method" for quantitative measurements of oxidative stress. Herein, we demonstrate the efficacy of these procedures by increasing oxidative stress in tissues with oxidizing agents and physiological or genetic methods. This protocol is amenable to forward genetic screens and will help address cause-effect relationships of ROS in animal models of oxidative stress-related pathologies such as neurological disorders and cancer.
Developmental Biology, Issue 89, Danio rerio, zebrafish embryos, endothelial cells, redox state analysis, oxidative stress detection, in vivo ROS measurements, FACS (fluorescence activated cell sorter), molecular probes
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which segmentation approach to take. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include, but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with an impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that can complicate the reaction, producing spurious results. When PCR fails, it can produce many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced into the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to: (1) set up reactions and thermal cycling conditions for a conventional PCR experiment; (2) understand the function of various reaction components and their overall effect on a PCR experiment; (3) design and optimize a PCR experiment for any DNA template; and (4) troubleshoot failed PCR experiments.
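As a small illustration of the melting-temperature considerations behind primer design (not part of this protocol), the sketch below implements two textbook Tm estimates: the Wallace rule for short primers and a simple GC-content formula for longer ones. Real primer-design tools typically use nearest-neighbor thermodynamics instead.

```python
# Hedged sketch of two common rule-of-thumb primer Tm estimates.
def primer_tm(seq):
    """Estimate primer melting temperature (deg C).

    For primers shorter than 14 nt, use the Wallace rule:
        Tm = 2*(A+T) + 4*(G+C)
    For longer primers, use a GC-content approximation:
        Tm = 64.9 + 41*(G+C - 16.4)/N
    """
    seq = seq.upper()
    a, t = seq.count("A"), seq.count("T")
    g, c = seq.count("G"), seq.count("C")
    n = len(seq)
    if n < 14:
        return 2 * (a + t) + 4 * (g + c)
    return 64.9 + 41 * (g + c - 16.4) / n

print(primer_tm("ACGTACGTACGT"))          # 12-mer, Wallace rule
print(primer_tm("ACGTACGTACGTACGTACGT"))  # 20-mer, GC-content formula
```

Both formulas ignore salt and primer concentration, which is why optimization of annealing temperature is still typically done empirically, as the protocol describes.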
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
DNA-based Fish Species Identification Protocol
Authors: Rachel Formosa, Harini Ravi, Scott Happe, Danielle Huffman, Natalia Novoradovskaya, Robert Kincaid, Steve Garrett.
Institutions: Agilent Technologies.
We have developed a fast, simple, and accurate DNA-based screening method to identify the fish species present in fresh and processed seafood samples. This versatile method employs PCR amplification of genomic DNA extracted from fish samples, followed by restriction fragment length polymorphism (RFLP) analysis to generate fragment patterns that can be resolved on the Agilent 2100 Bioanalyzer and matched to the correct species using RFLP pattern matching software. The fish identification method uses a simple, reliable, spin column-based protocol to isolate DNA from fish samples. The samples are treated with proteinase K to release the nucleic acids into solution. DNA is then isolated by suspending the sample in binding buffer and loading onto a micro-spin cup containing a silica-based fiber matrix. The nucleic acids in the sample bind to the fiber matrix. The immobilized nucleic acids are washed to remove contaminants, and total DNA is recovered in a final volume of 100 μl. The isolated DNA is ready for PCR amplification with the provided primers that bind to sequences found in all fish genomes. The PCR products are then digested with three different restriction enzymes and resolved on the Agilent 2100 Bioanalyzer. The fragment lengths produced in the digestion reactions can be used to determine the species of fish from which the DNA sample was prepared, using the RFLP pattern matching software containing a database of experimentally derived RFLP patterns from commercially relevant fish species.
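The core of RFLP pattern matching can be sketched as a tolerance-based comparison of observed fragment lengths against a reference database. The species names, fragment sizes, and scoring rule below are invented for illustration and do not reflect the actual software's algorithm:

```python
# Hypothetical sketch of RFLP pattern matching: score each reference
# species by the fraction of its fragments that find an observed
# fragment within a relative sizing tolerance.
def match_species(observed, database, tol=0.05):
    """observed: list of fragment lengths (bp); database: dict mapping
    species name -> list of reference fragment lengths (bp).
    Returns (species, score) pairs, best match first."""
    scores = {}
    for species, ref in database.items():
        hits = sum(
            any(abs(o - r) <= tol * r for o in observed) for r in ref
        )
        scores[species] = hits / len(ref)
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Invented reference patterns for two made-up species
db = {
    "species_A": [120, 240, 380],
    "species_B": [150, 300, 330],
}
print(match_species([118, 243, 375], db))  # species_A should rank first
```

Electrophoretic sizing has a few percent uncertainty, which is why a tolerance window (rather than exact equality) is the natural matching criterion.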
Cellular Biology, Issue 38, seafood, fish, mislabeling, authenticity, PCR, Bioanalyzer, food, RFLP, identity
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way from conventional imaging, built up molecule by molecule, users face some significant challenges in optimizing image acquisition. To aid this process and provide more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics, it is then possible to optimize the imaging procedure from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems that result in the 'mislocalization' phenomenon.
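The localization precision behind resolutions of this kind is often estimated with the Thompson-Larson-Webb expression, which combines photon shot noise, pixelation, and background. A hedged sketch (the parameter values are invented, and this calculation is not part of the protocol):

```python
import math

# Thompson-Larson-Webb estimate of single-molecule localization
# precision: var = (s^2 + a^2/12)/N + 8*pi*s^4*b^2/(a^2*N^2)
def localization_precision(s, a, b, N):
    """s: PSF standard deviation (nm); a: pixel size (nm);
    b: background noise (photons/pixel, std dev);
    N: photons collected from the molecule.
    Returns the estimated localization precision in nm."""
    var = (s**2 + a**2 / 12) / N + 8 * math.pi * s**4 * b**2 / (a**2 * N**2)
    return math.sqrt(var)

# Invented but plausible values: 150 nm PSF width, 100 nm pixels,
# 5 photons/pixel background, 1,000 photons per switching event
print(localization_precision(s=150, a=100, b=5, N=1000))
```

The formula makes the key trade-off explicit: precision improves roughly as the square root of the photon count, which is why dye choice and buffer conditions (which control photons per switching event) matter so much for final image resolution.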
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Purifying Plasmid DNA from Bacterial Colonies Using the Qiagen Miniprep Kit
Authors: Shenyuan Zhang, Michael D. Cahalan.
Institutions: University of California, Irvine (UCI).
Plasmid DNA purification from E. coli is a core technique for molecular cloning. Small-scale purification (miniprep) from less than 5 ml of bacterial culture is a quick way to verify clones or isolate DNA for further enzymatic reactions (polymerase chain reaction and restriction enzyme digestion). Here, we video-recorded the general miniprep procedure using QIAGEN's QIAprep 8 Miniprep Kit, aiming to introduce this highly efficient technique to beginners in molecular biology. The whole procedure is based on alkaline lysis of E. coli cells followed by adsorption of DNA onto silica in the presence of high salt. It consists of three steps: 1) preparation and clearing of a bacterial lysate, 2) adsorption of DNA onto the QIAprep membrane, and 3) washing and elution of plasmid DNA. All steps are performed without the use of phenol, chloroform, CsCl, ethidium bromide, and without alcohol precipitation. It usually takes less than 2 hours to finish the entire procedure.
Issue 6, Basic Protocols, plasmid, DNA, purification, Qiagen
Using SCOPE to Identify Potential Regulatory Motifs in Coregulated Genes
Authors: Viktor Martyanov, Robert H. Gross.
Institutions: Dartmouth College.
SCOPE is an ensemble motif finder that uses three component algorithms in parallel to identify potential regulatory motifs by over-representation and motif position preference1. Each component algorithm is optimized to find a different kind of motif. By taking the best of these three approaches, SCOPE performs better than any single algorithm, even in the presence of noisy data1. In this article, we utilize a web version of SCOPE2 to examine genes that are involved in telomere maintenance. SCOPE has been incorporated into at least two other motif finding programs3,4 and has been used in other studies5-8. The three algorithms that comprise SCOPE are BEAM9, which finds non-degenerate motifs (ACCGGT), PRISM10, which finds degenerate motifs (ASCGWT), and SPACER11, which finds longer bipartite motifs (ACCnnnnnnnnGGT). These three algorithms have been optimized to find their corresponding type of motif. Together, they allow SCOPE to perform extremely well. Once a gene set has been analyzed and candidate motifs identified, SCOPE can look for other genes that contain the motif which, when added to the original set, will improve the motif score. This can occur through over-representation or motif position preference. Working with partial gene sets that have biologically verified transcription factor binding sites, SCOPE was able to identify most of the rest of the genes also regulated by the given transcription factor. Output from SCOPE shows candidate motifs, their significance, and other information both as a table and as a graphical motif map. FAQs and video tutorials are available at the SCOPE web site, which also includes a "Sample Search" button that allows the user to perform a trial run. SCOPE has a user-friendly interface that enables novice users to access the algorithms' full power without having to become experts in the bioinformatics of motif finding. As input, SCOPE can take a list of genes or FASTA sequences, entered in browser text fields or read from a file. The output from SCOPE contains a list of all identified motifs with their scores, number of occurrences, fraction of genes containing the motif, and the algorithm used to identify the motif. For each motif, result details include a consensus representation of the motif, a sequence logo, a position weight matrix, and a list of instances for every motif occurrence (with exact positions and "strand" indicated). Results are returned in a browser window and also optionally by email. Previous papers describe the SCOPE algorithms in detail1,2,9-11.
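SCOPE's actual scoring is more sophisticated, but the over-representation idea it relies on can be sketched as a simple binomial tail probability: how likely is it to see at least k motif-containing genes in a set of n, given a background rate p of genes containing the motif by chance? The numbers below are invented for illustration:

```python
from math import comb

# Hedged sketch of motif over-representation scoring (not SCOPE's
# actual statistic): binomial probability of observing at least k
# motif-containing genes out of n, given background rate p.
def overrep_pvalue(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# e.g. 8 of 10 co-regulated genes contain the motif, while only 10%
# of background genes do: a very unlikely pattern by chance
print(overrep_pvalue(8, 10, 0.1))
```

This also shows why adding a motif-containing gene to the set can improve a motif's score, as described above: increasing both k and n while the background rate stays fixed drives the tail probability down.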
Genetics, Issue 51, gene regulation, computational biology, algorithm, promoter sequence motif
Electroporation of Mycobacteria
Authors: Renan Goude, Tanya Parish.
Institutions: Barts and the London School of Medicine and Dentistry, Barts and the London School of Medicine and Dentistry.
High-efficiency transformation is a major limitation in the study of mycobacteria. The genus Mycobacterium can be difficult to transform; this is mainly caused by the thick and waxy cell wall, but is compounded by the fact that most molecular techniques have been developed for distantly related species such as Escherichia coli and Bacillus subtilis. In spite of these obstacles, mycobacterial plasmids have been identified, and DNA transformation of many mycobacterial species has now been described. The most successful method for introducing DNA into mycobacteria is electroporation. Many parameters contribute to successful transformation; these include the species/strain, the nature of the transforming DNA, the selectable marker used, the growth medium, and the conditions for the electroporation pulse. Optimized methods for the transformation of both slow- and fast-growing species are detailed here. Transformation efficiencies for different mycobacterial species and with various selectable markers are reported.
Microbiology, Issue 15, Springer Protocols, Mycobacteria, Electroporation, Bacterial Transformation, Transformation Efficiency, Bacteria, Tuberculosis, M. Smegmatis, Springer Protocols
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching its content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms display the most relevant videos available, which can sometimes result in matched videos with only a slight relation.