JoVE Visualize
Related JoVE Video
Pubmed Article
A Bayesian interpretation of the particle swarm optimization and its kernel extension.
Particle swarm optimization is a popular method for solving difficult optimization problems. There have been attempts to formulate the method in formal probabilistic or stochastic terms (e.g. bare bones particle swarm) with the aim of achieving more generality and explaining the practical behavior of the method. Here we present a Bayesian interpretation of particle swarm optimization. This interpretation provides a formal framework for the incorporation of prior knowledge about the problem that is being solved. Furthermore, it allows the particle swarm method to be extended through the use of kernel functions that represent an intermediary transformation of the data into a different space where the optimization problem is expected to be easier to solve; such a transformation can be seen as a form of prior knowledge about the nature of the optimization problem. We derive the commonly used particle swarm methods from the general Bayesian formulation as particular cases.
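For readers unfamiliar with the underlying algorithm, the sketch below shows the canonical (non-Bayesian) particle swarm update that the paper generalizes; the objective function, swarm size, and coefficient values are illustrative assumptions, not parameters taken from the article.

    import numpy as np

    def pso(objective, dim=2, n_particles=30, n_iter=200,
            w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
        """Minimize `objective` with a basic particle swarm (illustrative sketch)."""
        lo, hi = bounds
        rng = np.random.default_rng(0)
        x = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
        v = np.zeros_like(x)                              # particle velocities
        pbest = x.copy()                                  # personal best positions
        pbest_val = np.apply_along_axis(objective, 1, x)  # personal best values
        gbest = pbest[np.argmin(pbest_val)]               # global best position

        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            # Canonical velocity update: inertia + cognitive + social terms.
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            vals = np.apply_along_axis(objective, 1, x)
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)]
        return gbest, pbest_val.min()

    # Example: minimize the sphere function.
    best_x, best_f = pso(lambda p: float(np.sum(p**2)))
    print(best_x, best_f)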
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Published: 08-30-2013
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
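To make the oriented-filter step concrete, the following sketch builds a small bank of Gabor kernels and estimates the dominant texture orientation at each pixel; the kernel parameters and the synthetic test image are illustrative assumptions and do not reproduce the authors' filter design.

    import numpy as np
    from scipy.signal import fftconvolve

    def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, wavelength=8.0, gamma=0.5):
        """Real-valued Gabor kernel at orientation `theta` (radians)."""
        half = ksize // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2.0 * sigma**2))
        return envelope * np.cos(2.0 * np.pi * xr / wavelength)

    def dominant_orientation(image, n_angles=12):
        """Per-pixel dominant orientation from a small Gabor filter bank."""
        angles = np.linspace(0, np.pi, n_angles, endpoint=False)
        responses = np.stack([np.abs(fftconvolve(image, gabor_kernel(theta=a), mode="same"))
                              for a in angles])
        return angles[np.argmax(responses, axis=0)]   # radians, same shape as image

    # Toy example: an oriented sinusoidal texture at ~45 degrees.
    yy, xx = np.mgrid[0:128, 0:128]
    img = np.sin(2 * np.pi * (xx + yy) / 16.0)
    print(np.degrees(np.median(dominant_orientation(img))))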
26 Related JoVE Articles!
Formulation of Diblock Polymeric Nanoparticles through Nanoprecipitation Technique
Authors: Shrirang Karve, Michael E. Werner, Natalie D. Cummings, Rohit Sukumar, Edina C. Wang, Ying-Ao Zhang, Andrew Z. Wang.
Institutions: University of North Carolina School of Medicine, University of North Carolina.
Nanotechnology is a relatively new branch of science that involves harnessing the unique properties of particles that are nanometers in scale (nanoparticles). Nanoparticles can be engineered in a precise fashion where their size, composition and surface chemistry can be carefully controlled. This enables unprecedented freedom to modify some of the fundamental properties of their cargo, such as solubility, diffusivity, biodistribution, release characteristics and immunogenicity. Since their inception, nanoparticles have been utilized in many areas of science and medicine, including drug delivery, imaging, and cell biology1-4. However, they have not been fully utilized outside of "nanotechnology laboratories" due to perceived technical barriers. In this article, we describe a simple method to synthesize a polymer based nanoparticle platform that has a wide range of potential applications. The first step is to synthesize a diblock co-polymer that has both a hydrophobic domain and hydrophilic domain. Using PLGA and PEG as model polymers, we describe a conjugation reaction using EDC/NHS chemistry5 (Fig 1). We also discuss the polymer purification process. The synthesized diblock co-polymer can self-assemble into nanoparticles in the nanoprecipitation process through hydrophobic-hydrophilic interactions. The described polymer nanoparticle is very versatile. The hydrophobic core of the nanoparticle can be utilized to carry poorly soluble drugs for drug delivery experiments6. Furthermore, the nanoparticles can overcome the problem of toxic solvents for poorly soluble molecular biology reagents, such as wortmannin, which requires a solvent like DMSO. However, DMSO can be toxic to cells and interfere with the experiment. These poorly soluble drugs and reagents can be effectively delivered using polymer nanoparticles with minimal toxicity. Polymer nanoparticles can also be loaded with fluorescent dye and utilized for intracellular trafficking studies. Lastly, these polymer nanoparticles can be conjugated to targeting ligands through surface PEG. Such targeted nanoparticles can be utilized to label specific epitopes on or in cells7-10.
Bioengineering, Issue 55, Nanoparticles, nanomedicine, drug delivery, polymeric micelles, polymeric nanoparticles, diblock co-polymers, nanoplatform, nanoparticle molecular imaging, polymer conjugation.
AC Electrokinetic Phenomena Generated by Microelectrode Structures
Authors: Robert Hart, Jonghyun Oh, Jorge Capurro, Hongseok (Moses) Noh.
Institutions: Drexel University.
The field of AC electrokinetics is rapidly growing due to its ability to perform dynamic fluid and particle manipulation on the micro- and nano-scale, which is essential for Lab-on-a-Chip applications. AC electrokinetic phenomena use electric fields to generate forces that act on fluids or suspended particles (including those made of dielectric or biological material) and cause them to move in astonishing ways1, 2. Within a single channel, AC electrokinetics can accomplish many essential on-chip operations such as active micro-mixing, particle separation, particle positioning and micro-patterning. A single device may accomplish several of those operations by simply adjusting operating parameters such as frequency or amplitude of the applied voltage. Suitable electric fields can be readily created by micro-electrodes integrated into microchannels. It is clear from the tremendous growth in this field that AC electrokinetics will likely have a profound effect on healthcare diagnostics3-5, environmental monitoring6 and homeland security7. In general, there are three AC electrokinetic phenomena (AC electroosmosis, dielectrophoresis and the AC electrothermal effect), each with unique dependencies on the operating parameters. A change in these operating parameters can cause one phenomenon to become dominant over another, thus changing the particle or fluid behavior. It is difficult to predict the behavior of particles and fluids due to the complicated physics that underlie AC electrokinetics. It is the goal of this publication to explain the physics and elucidate particle and fluid behavior. Our analysis also covers how to fabricate the electrode structures that generate these fields, and how to interpret a wide range of experimental observations using several popular device designs. This video article will help scientists and engineers understand these phenomena and may encourage them to start using AC electrokinetics in their research.
Bioengineering, Issue 17, AC Electrokinetics, AC Electroosmosis, Dielectrophoresis, Electrothermal Effect, Microelectrode, Microfluidics, Simulation, Microsphere, Microfabrication
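For background on the frequency dependence described in the entry above, the time-averaged dielectrophoretic force on a spherical particle is commonly written as follows; this is a textbook expression rather than a formula quoted from the article.

    \[
    \langle \mathbf{F}_{\mathrm{DEP}} \rangle = 2\pi \varepsilon_m r^{3}\, \mathrm{Re}\!\left[K(\omega)\right]\, \nabla \lvert \mathbf{E}_{\mathrm{rms}} \rvert^{2},
    \qquad
    K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\varepsilon_m^{*}},
    \qquad
    \varepsilon^{*} = \varepsilon - \frac{i\sigma}{\omega}
    \]

where r is the particle radius, the subscripts m and p denote the suspending medium and the particle, ε and σ are permittivity and conductivity, and ω is the angular frequency of the applied field. The sign of the Clausius-Mossotti factor Re[K(ω)] determines whether particles collect at field maxima (positive DEP) or at field minima (negative DEP).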
Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method
Authors: Xueqi Liu, Hong-Wei Wang.
Institutions: Yale University.
Single particle electron microscopy (EM) reconstruction has recently become a popular tool to obtain the three-dimensional (3D) structure of large macromolecular complexes. Compared to X-ray crystallography, it has some unique advantages. First, single particle EM reconstruction does not need to crystallize the protein sample, which is the bottleneck in X-ray crystallography, especially for large macromolecular complexes. Secondly, it does not need large amounts of protein samples. Compared with milligrams of proteins necessary for crystallization, single particle EM reconstruction only needs several micro-liters of protein solution at nano-molar concentrations, using the negative staining EM method. However, except for a few macromolecular assemblies with high symmetry, single particle EM is limited to relatively low resolution (worse than 1 nm) for many specimens, especially those without symmetry. This technique is also limited by the size of the molecules under study, i.e., about 100 kDa for negatively stained specimens and 300 kDa for frozen-hydrated specimens in general. For a new sample of unknown structure, we generally use a heavy metal solution to embed the molecules by negative staining. The specimen is then examined in a transmission electron microscope to take two-dimensional (2D) micrographs of the molecules. Ideally, the protein molecules have a homogeneous 3D structure but exhibit different orientations in the micrographs. These micrographs are digitized and processed in computers as "single particles". Using two-dimensional alignment and classification techniques, homogeneous molecules in the same views are clustered into classes. Their averages enhance the signal of the molecule's 2D shapes. After we assign the particles with the proper relative orientation (Euler angles), we will be able to reconstruct the 2D particle images into a 3D virtual volume. In single particle 3D reconstruction, an essential step is to correctly assign the proper orientation of each single particle. There are several methods to assign the view for each particle, including the angular reconstitution1 and random conical tilt (RCT) method2. In this protocol, we describe our procedure for obtaining the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our protocol of electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample into a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. Then the specimen is inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs that will be used later for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model by using the projection matching refinement method3.
Structural Biology, Issue 49, Electron microscopy, single particle three-dimensional reconstruction, exosome complex, negative staining
High-throughput Crystallization of Membrane Proteins Using the Lipidic Bicelle Method
Authors: Rachna Ujwal, Jeff Abramson.
Institutions: University of California Los Angeles, David Geffen School of Medicine, UCLA.
Membrane proteins (MPs) play a critical role in many physiological processes such as pumping specific molecules across the otherwise impermeable membrane bilayer that surrounds all cells and organelles. Alterations in the function of MPs result in many human diseases and disorders; thus, an intricate understanding of their structures remains a critical objective for biological research. However, structure determination of MPs remains a significant challenge often stemming from their hydrophobicity. MPs have substantial hydrophobic regions embedded within the bilayer. Detergents are frequently used to solubilize these proteins from the bilayer, generating a protein-detergent micelle that can then be manipulated in a similar manner as soluble proteins. Traditionally, crystallization trials proceed using a protein-detergent mixture, but such mixtures often resist crystallization or produce crystals of poor quality. These problems arise due to the detergent's inability to adequately mimic the bilayer, resulting in poor stability and heterogeneity. In addition, the detergent shields the hydrophobic surface of the MP, reducing the surface area available for crystal contacts. To circumvent these drawbacks, MPs can be crystallized in lipidic media, which more closely simulates their endogenous environment, and has recently become a de novo technique for MP crystallization. Lipidic cubic phase (LCP) is a three-dimensional lipid bilayer penetrated by an interconnected system of aqueous channels1. Although monoolein is the lipid of choice, related lipids such as monopalmitolein and monovaccenin have also been used to make LCP2. MPs are incorporated into the LCP where they diffuse in three dimensions and feed crystal nuclei. A great advantage of the LCP is that the protein remains in a more native environment, but the method has a number of technical disadvantages including high viscosity (requiring specialized apparatuses) and difficulties in crystal visualization and manipulation3,4. Because of these technical difficulties, we utilized another lipidic medium for crystallization: bicelles5,6 (Figure 1). Bicelles are lipid/amphiphile mixtures formed by blending a phosphatidylcholine lipid (DMPC) with an amphiphile (CHAPSO) or a short-chain lipid (DHPC). Within each bicelle disc, the lipid molecules generate a bilayer while the amphiphile molecules line the apolar edges providing beneficial properties of both bilayers and detergents. Importantly, below their transition temperature, protein-bicelle mixtures have a reduced viscosity and are manipulated in a similar manner as detergent-solubilized MPs, making bicelles compatible with crystallization robots. Bicelles have been successfully used to crystallize several membrane proteins5,7-11 (Table 1). This growing collection of proteins demonstrates the versatility of bicelles for crystallizing both alpha helical and beta sheet MPs from prokaryotic and eukaryotic sources. Because of these successes and the simplicity of high-throughput implementation, bicelles should be part of every membrane protein crystallographer's arsenal. In this video, we describe the bicelle methodology and provide a step-by-step protocol for setting up high-throughput crystallization trials of purified MPs using standard robotics.
Molecular Biology, Issue 59, membrane proteins crystallization, bicelle, lipidic crystallization
Flying Insect Detection and Classification with Inexpensive Sensors
Authors: Yanping Chen, Adena Why, Gustavo Batista, Agenor Mafra-Neto, Eamonn Keogh.
Institutions: University of California, Riverside, University of São Paulo - USP, ISCA Technologies.
An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research, and allow for the development of many useful applications in vector and pest control for both medical and agricultural entomology. Given this, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach allows us to efficiently learn classification models that are very robust to over-fitting; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate the findings with large-scale experiments that dwarf all previous works combined, as measured by the number of insects and the number of species considered.
Bioengineering, Issue 92, flying insect detection, automatic insect classification, pseudo-acoustic optical sensors, Bayesian classification framework, flight sound, circadian rhythm
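To illustrate the kind of Bayesian classification framework referred to above, here is a minimal sketch that combines a wingbeat-frequency likelihood with a circadian (time-of-day) prior; the species models and all numerical values are invented for illustration and are not the authors' trained classifiers.

    import numpy as np

    # Hypothetical species models: wingbeat frequency ~ Normal(mu, sigma) in Hz, plus a
    # crude circadian prior defined by each species' typical flight-activity window.
    SPECIES = {
        "species A": {"mu": 465.0, "sigma": 40.0, "active_hours": set(range(6, 19))},
        "species B": {"mu": 350.0, "sigma": 35.0, "active_hours": set(range(18, 24)) | set(range(0, 6))},
    }

    def gaussian_pdf(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

    def circadian_prior(model, hour, p_active=0.8):
        """Probability of a flight at this hour, given the species' activity window."""
        n_in = len(model["active_hours"])
        n_out = 24 - n_in
        return p_active / n_in if hour % 24 in model["active_hours"] else (1.0 - p_active) / n_out

    def classify(wingbeat_hz, hour):
        """Posterior over species from one intrinsic feature (frequency) and one extrinsic feature (time)."""
        scores = {name: gaussian_pdf(wingbeat_hz, m["mu"], m["sigma"]) * circadian_prior(m, hour)
                  for name, m in SPECIES.items()}
        total = sum(scores.values())
        return {name: s / total for name, s in scores.items()}

    print(classify(wingbeat_hz=430.0, hour=21))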
Magnetic Tweezers for the Measurement of Twist and Torque
Authors: Jan Lipfert, Mina Lee, Orkide Ordu, Jacob W. J. Kerssemakers, Nynke H. Dekker.
Institutions: Delft University of Technology.
Single-molecule techniques make it possible to investigate the behavior of individual biological molecules in solution in real time. These techniques include so-called force spectroscopy approaches such as atomic force microscopy, optical tweezers, flow stretching, and magnetic tweezers. Amongst these approaches, magnetic tweezers have distinguished themselves by their ability to apply torque while maintaining a constant stretching force. Here, it is illustrated how such a “conventional” magnetic tweezers experimental configuration can, through a straightforward modification of its field configuration to minimize the magnitude of the transverse field, be adapted to measure the degree of twist in a biological molecule. The resulting configuration is termed the freely-orbiting magnetic tweezers. Additionally, it is shown how further modification of the field configuration can yield a transverse field with a magnitude intermediate between that of the “conventional” magnetic tweezers and the freely-orbiting magnetic tweezers, which makes it possible to directly measure the torque stored in a biological molecule. This configuration is termed the magnetic torque tweezers. The accompanying video explains in detail how the conversion of conventional magnetic tweezers into freely-orbiting magnetic tweezers and magnetic torque tweezers can be accomplished, and demonstrates the use of these techniques. These adaptations maintain all the strengths of conventional magnetic tweezers while greatly expanding the versatility of this powerful instrument.
Bioengineering, Issue 87, magnetic tweezers, magnetic torque tweezers, freely-orbiting magnetic tweezers, twist, torque, DNA, single-molecule techniques
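As background on how torque is typically extracted in a magnetic torque tweezers configuration (a standard analysis approach, not a step quoted from this protocol): the weak transverse field forms an angular trap whose stiffness is calibrated from thermal fluctuations, and the shift in the bead's mean rotation angle then reports the molecular torque.

    \[
    k_\theta = \frac{k_B T}{\langle \delta\theta^{2} \rangle},
    \qquad
    \tau_{\mathrm{molecule}} = -k_\theta \left( \langle\theta\rangle - \langle\theta\rangle_{0} \right)
    \]

where ⟨δθ²⟩ is the variance of the bead's rotation-angle fluctuations, and ⟨θ⟩ − ⟨θ⟩₀ is the change in the mean rotation angle after twist is introduced into the tethered molecule; the sign convention is chosen so that the angular trap balances the torque stored in the tether.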
Preparation of Primary Neurons for Visualizing Neurites in a Frozen-hydrated State Using Cryo-Electron Tomography
Authors: Sarah H. Shahmoradian, Mauricio R. Galiano, Chengbiao Wu, Shurui Chen, Matthew N. Rasband, William C. Mobley, Wah Chiu.
Institutions: Baylor College of Medicine, University of California at San Diego.
Neurites, both dendrites and axons, are neuronal cellular processes that enable the conduction of electrical impulses between neurons. Defining the structure of neurites is critical to understanding how these processes move materials and signals that support synaptic communication. Electron microscopy (EM) has been traditionally used to assess the ultrastructural features within neurites; however, the exposure to organic solvent during dehydration and resin embedding can distort structures. An important unmet goal is the formulation of procedures that allow for structural evaluations not impacted by such artifacts. Here, we have established a detailed and reproducible protocol for growing and flash-freezing whole neurites of different primary neurons on electron microscopy grids followed by their examination with cryo-electron tomography (cryo-ET). This technique allows for 3-D visualization of frozen, hydrated neurites at nanometer resolution, facilitating assessment of their morphological differences. Our protocol yields an unprecedented view of dorsal root ganglion (DRG) neurites, and a visualization of hippocampal neurites in their near-native state. As such, these methods create a foundation for future studies on neurites of both normal neurons and those impacted by neurological disorders.
Neuroscience, Issue 84, Neurons, Cryo-electron Microscopy, Electron Microscope Tomography, Brain, rat, primary neuron culture, morphological assay
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Researchers across incredibly diverse fields are applying phylogenetics to their research questions. However, many of them are new to the topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article will serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
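As a concrete, minimal counterpart to the pipeline outlined above, the sketch below parses a pre-computed multiple sequence alignment and builds a quick distance-based (neighbor-joining) tree with Biopython; this is a simpler stand-in for the maximum-likelihood and Bayesian reconstructions the article describes, and the input file name is a placeholder.

    # Requires Biopython (pip install biopython).
    from Bio import AlignIO, Phylo
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

    # Placeholder input: a FASTA multiple sequence alignment produced by any MSA program.
    alignment ="aligned_sequences.fasta", "fasta")

    # Pairwise identity distances -> neighbor-joining tree (a fast approximation,
    # not a substitute for the ML/Bayesian analyses described in the article).
    distances = DistanceCalculator("identity").get_distance(alignment)
    tree = DistanceTreeConstructor().nj(distances)
    tree.root_at_midpoint()

    Phylo.draw_ascii(tree)                         # quick text visualization
    Phylo.write(tree, "nj_tree.nwk", "newick")     # export for a dedicated tree viewer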
Creating Dynamic Images of Short-lived Dopamine Fluctuations with lp-ntPET: Dopamine Movies of Cigarette Smoking
Authors: Evan D. Morris, Su Jin Kim, Jenna M. Sullivan, Shuo Wang, Marc D. Normandin, Cristian C. Constantinescu, Kelly P. Cosgrove.
Institutions: Yale University, Massachusetts General Hospital, University of California, Irvine.
We describe experimental and statistical steps for creating dopamine movies of the brain from dynamic PET data. The movies represent minute-to-minute fluctuations of dopamine induced by smoking a cigarette. The smoker is imaged during a natural smoking experience while other possible confounding effects (such as head motion, expectation, novelty, or aversion to smoking repeatedly) are minimized. We present the details of our unique analysis. Conventional methods for PET analysis estimate time-invariant kinetic model parameters which cannot capture short-term fluctuations in neurotransmitter release. Our analysis - yielding a dopamine movie - is based on our work with kinetic models and other decomposition techniques that allow for time-varying parameters 1-7. This aspect of the analysis - temporal variation - is key to our work. Because our model is also linear in parameters, it is practical, computationally, to apply at the voxel level. The analysis technique comprises five main steps: pre-processing, modeling, statistical comparison, masking and visualization. Pre-processing is applied to the PET data with a unique 'HYPR' spatial filter 8 that reduces spatial noise but preserves critical temporal information. Modeling identifies the time-varying function that best describes the dopamine effect on 11C-raclopride uptake. The statistical step compares the fit of our (lp-ntPET) model 7 to a conventional model 9. Masking restricts treatment to those voxels best described by the new model. Visualization maps the dopamine function at each voxel to a color scale and produces a dopamine movie. Interim results and sample dopamine movies of cigarette smoking are presented.
Behavior, Issue 78, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Medicine, Anatomy, Physiology, Image Processing, Computer-Assisted, Receptors, Dopamine, Dopamine, Functional Neuroimaging, Binding, Competitive, mathematical modeling (systems analysis), Neurotransmission, transient, dopamine release, PET, modeling, linear, time-invariant, smoking, F-test, ventral-striatum, clinical techniques
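For readers unfamiliar with the statistical comparison step, the following is a generic sketch of a voxel-level F-test between a nested "conventional" model and a richer time-varying model; the design matrices and threshold here are placeholders and do not reproduce the lp-ntPET implementation.

    import numpy as np
    from scipy import stats

    def rss(y, X):
        """Residual sum of squares of an ordinary least-squares fit y ~ X."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r), X.shape[1]

    def f_test_nested(y, X_small, X_full):
        """F-statistic and p-value comparing a nested model to a fuller model."""
        rss0, p0 = rss(y, X_small)
        rss1, p1 = rss(y, X_full)
        n = y.size
        F = ((rss0 - rss1) / (p1 - p0)) / (rss1 / (n - p1))
        return F, stats.f.sf(F, p1 - p0, n - p1)

    # Toy example with placeholder design matrices (conventional-model columns vs.
    # the same columns plus one stand-in time-varying response term).
    rng = np.random.default_rng(1)
    n = 60
    X_small = np.column_stack([np.ones(n), np.linspace(0, 1, n)])
    extra = np.exp(-np.linspace(0, 5, n))
    X_full = np.column_stack([X_small, extra])
    y = X_full @ np.array([1.0, 0.5, 2.0]) + rng.normal(0, 0.1, n)
    F, p = f_test_nested(y, X_small, X_full)
    print(F, p)   # voxels with p below a chosen threshold would be kept in the mask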
Live Cell Imaging of Alphaherpes Virus Anterograde Transport and Spread
Authors: Matthew P. Taylor, Radomir Kratchmarov, Lynn W. Enquist.
Institutions: Montana State University, Princeton University.
Advances in live cell fluorescence microscopy techniques, as well as the construction of recombinant viral strains that express fluorescent fusion proteins, have enabled real-time visualization of transport and spread of alphaherpes virus infection of neurons. Novel fusions of fluorescent proteins to viral membrane, tegument, and capsid components, in conjunction with live cell imaging, have identified viral particle assemblies undergoing transport within axons. Similar tools have been successfully employed for analyses of cell-cell spread of viral particles to quantify the number and diversity of virions transmitted between cells. Importantly, the techniques of live cell imaging of anterograde transport and spread produce a wealth of information, including particle transport velocities, distributions of particles, and temporal analyses of protein localization. Alongside classical viral genetic techniques, these methodologies have provided critical insights into important mechanistic questions. In this article we describe in detail the imaging methods that were developed to answer basic questions of alphaherpes virus transport and spread.
Virology, Issue 78, Infection, Immunology, Medicine, Molecular Biology, Cellular Biology, Microbiology, Genetics, Microscopy, Fluorescence, Neurobiology, Herpes virus, fluorescent protein, epifluorescent microscopy, neuronal culture, axon, virion, video microscopy, virus, live cell, imaging
Efficient Polyethylene Glycol (PEG) Mediated Transformation of the Moss Physcomitrella patens
Authors: Yen-Chun Liu, Luis Vidali.
Institutions: Worcester Polytechnic Institute- WPI.
A simple and efficient method to transform Physcomitrella patens protoplasts is described. This method is adapted from protocols for Physcomitrella protonemal protoplast and Arabidopsis mesophyll protoplast transformation1. Due to its capacity to undergo efficient mitotic homologous recombination, Physcomitrella patens has emerged as an important model system in recent years2. This capacity allows high frequencies of gene targeting3-9, which are not seen in other model plants such as Arabidopsis. To take full advantage of this system, we need an effective and easy method to deliver DNA into moss cells. The most common ways to transform this moss are particle bombardment10 and PEG-mediated DNA uptake11. Although particle bombardment can produce a high transformation efficiency12, gene guns are not readily available to many laboratories and the protocol is difficult to standardize. On the other hand, PEG-mediated transformation does not require specialized equipment, and can be performed in any laboratory with a sterile hood. Here, we show a simple and highly efficient method for transformation of moss protoplasts. This method can generate more than 120 transient transformants per microgram of DNA, which is an improvement over the most efficient protocol previously reported13. Because of its simplicity, efficiency, and reproducibility, this method can be applied to projects requiring large numbers of transformants as well as for routine transformation.
Plant Biology, Issue 50, Transformation, Physcomitrella patens, polyethylene glycol, protoplasts
Nanomanipulation of Single RNA Molecules by Optical Tweezers
Authors: William Stephenson, Gorby Wan, Scott A. Tenenbaum, Pan T. X. Li.
Institutions: University at Albany, State University of New York.
A large portion of the human genome is transcribed but not translated. In this post-genomic era, regulatory functions of RNA have been shown to be increasingly important. As RNA function often depends on its ability to adopt alternative structures, it is difficult to predict RNA three-dimensional structures directly from sequence. Single-molecule approaches show potential to solve the problem of RNA structural polymorphism by monitoring molecular structures one molecule at a time. This work presents a method to precisely manipulate the folding and structure of single RNA molecules using optical tweezers. First, methods to synthesize molecules suitable for single-molecule mechanical work are described. Next, various calibration procedures to ensure the proper operation of the optical tweezers are discussed. Then, the various experiments are explained. To demonstrate the utility of the technique, results of mechanically unfolding RNA hairpins and a single RNA kissing complex are presented as examples. In these examples, the nanomanipulation technique was used to study folding of each structural domain, including secondary and tertiary, independently. Lastly, the limitations and future applications of the method are discussed.
Bioengineering, Issue 90, RNA folding, single-molecule, optical tweezers, nanomanipulation, RNA secondary structure, RNA tertiary structure
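One of the standard calibration checks alluded to above is the equipartition method (a general optical-trapping relation, not necessarily the specific procedure used in this article): the trap stiffness along one axis follows from the thermal position fluctuations of the trapped bead,

    \[
    k_{\mathrm{trap}} = \frac{k_B T}{\langle x^{2} \rangle}
    \]

where ⟨x²⟩ is the variance of the bead's displacement from the trap center, k_B is the Boltzmann constant, and T the absolute temperature; forces on the tethered molecule then follow from F = k_trap · x.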
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
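To make the "sequence selection by energy minimization" stage concrete, here is a toy Monte Carlo sketch that searches sequence space for lower scores under a stand-in energy function; the scoring function, mutation scheme, and starting sequence are invented for illustration and bear no relation to the potentials or algorithms used by Protein WISDOM.

    import math
    import random

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    HYDROPHOBIC = set("AILMFVWY")

    def toy_energy(seq):
        """Stand-in 'potential energy' that rewards hydrophobic residues at even positions.
        A real design stage would use a physics- or knowledge-based potential on a structural template."""
        return -sum(1.0 for i, aa in enumerate(seq) if i % 2 == 0 and aa in HYDROPHOBIC)

    def select_sequence(start_seq, n_steps=5000, temperature=1.0, seed=0):
        """Metropolis Monte Carlo search over single-point mutations in sequence space."""
        rng = random.Random(seed)
        current, current_e = list(start_seq), toy_energy(start_seq)
        best, best_e = current[:], current_e
        for _ in range(n_steps):
            trial = current[:]
            trial[rng.randrange(len(trial))] = rng.choice(AMINO_ACIDS)
            trial_e = toy_energy(trial)
            if trial_e <= current_e or rng.random() < math.exp(-(trial_e - current_e) / temperature):
                current, current_e = trial, trial_e
                if current_e < best_e:
                    best, best_e = current[:], current_e
        return "".join(best), best_e

    print(select_sequence("MSTNAKQLVDEWFGHRKY"))   # placeholder starting sequence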
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that can complicate the reaction and produce spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
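Since the keywords above mention primer design and melting temperature, here is a small sketch of two commonly used Tm approximations (the Wallace rule and a basic GC-content formula); these are textbook estimates, not calculations prescribed by the article.

    def tm_wallace(primer):
        """Wallace rule: Tm = 2(A+T) + 4(G+C); reasonable for short primers (< ~14 nt)."""
        p = primer.upper()
        return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

    def tm_gc(primer):
        """Basic GC-content estimate: Tm = 64.9 + 41*(G+C - 16.4)/N, for longer primers."""
        p = primer.upper()
        gc = p.count("G") + p.count("C")
        return 64.9 + 41.0 * (gc - 16.4) / len(p)

    primer = "ACGTGGTCAAGGCTAGCTAA"   # placeholder 20-mer, not a primer from the article
    print(tm_wallace(primer), round(tm_gc(primer), 1))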
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics for biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among the biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, as the image is acquired in a very different way from normal, by building up an image molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics, it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems that result in the 'mislocalization' phenomenon.
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
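One per-molecule image-quality metric commonly reported by localization software is the localization precision; a widely used approximation (often attributed to Thompson and co-workers, and not specific to rainSTORM) is

    \[
    \sigma_{xy}^{2} \approx \frac{s^{2} + a^{2}/12}{N} + \frac{8\pi s^{4} b^{2}}{a^{2} N^{2}}
    \]

where s is the standard deviation of the point spread function, a the pixel size, N the number of photons collected from the molecule, and b the background noise per pixel; brighter molecules and lower background therefore yield better localization.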
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Authors: Lori E. Lowes, Benjamin D. Hedley, Michael Keeney, Alison L. Allan.
Institutions: London Health Sciences Centre, Western University, Lawson Health Research Institute.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
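The "one reference compound" quantitation mentioned above rests on the linearity of the NMR response; in its usual form (a standard relation, not a formula quoted from the article), a metabolite concentration follows from integrated peak areas as

    \[
    c_{\mathrm{met}} = c_{\mathrm{ref}} \cdot \frac{I_{\mathrm{met}}}{I_{\mathrm{ref}}} \cdot \frac{n_{\mathrm{ref}}}{n_{\mathrm{met}}}
    \]

where I denotes the integrated peak area, n the number of equivalent nuclei (e.g., protons) contributing to each peak, and c_ref the known concentration of the reference compound.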
Confocal Imaging of Confined Quiescent and Flowing Colloid-polymer Mixtures
Authors: Rahul Pandey, Melissa Spannuth, Jacinta C. Conrad.
Institutions: University of Houston.
The behavior of confined colloidal suspensions with attractive interparticle interactions is critical to the rational design of materials for directed assembly1-3, drug delivery4, improved hydrocarbon recovery5-7, and flowable electrodes for energy storage8. Suspensions containing fluorescent colloids and non-adsorbing polymers are appealing model systems, as the ratio of the polymer radius of gyration to the particle radius and concentration of polymer control the range and strength of the interparticle attraction, respectively. By tuning the polymer properties and the volume fraction of the colloids, colloid fluids, fluids of clusters, gels, crystals, and glasses can be obtained9. Confocal microscopy, a variant of fluorescence microscopy, allows an optically transparent and fluorescent sample to be imaged with high spatial and temporal resolution in three dimensions. In this technique, a small pinhole or slit blocks the emitted fluorescent light from regions of the sample that are outside the focal volume of the microscope optical system. As a result, only a thin section of the sample in the focal plane is imaged. This technique is particularly well suited to probe the structure and dynamics in dense colloidal suspensions at the single-particle scale: the particles are large enough to be resolved using visible light and diffuse slowly enough to be captured at typical scan speeds of commercial confocal systems10. Improvements in scan speeds and analysis algorithms have also enabled quantitative confocal imaging of flowing suspensions11-16,37. In this paper, we demonstrate confocal microscopy experiments to probe the confined phase behavior and flow properties of colloid-polymer mixtures. We first prepare colloid-polymer mixtures that are density- and refractive-index matched. Next, we report a standard protocol for imaging quiescent dense colloid-polymer mixtures under varying confinement in thin wedge-shaped cells. Finally, we demonstrate a protocol for imaging colloid-polymer mixtures during microchannel flow.
Chemistry, Issue 87, confocal microscopy, particle tracking, colloids, suspensions, confinement, gelation, microfluidics, image correlation, dynamics, suspension flow
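As a minimal illustration of the particle-location step underlying the quantitative imaging described above, the sketch below finds particle centroids in a single synthetic confocal slice by thresholding and labeling; real colloid tracking uses more refined locating and frame-to-frame linking algorithms, and every parameter here is an illustrative assumption.

    import numpy as np
    from scipy import ndimage

    def locate_particles(image, threshold=0.5, min_pixels=5):
        """Centroids (row, col) of bright blobs in a 2D image slice."""
        mask = image > threshold
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
        return np.array(ndimage.center_of_mass(image, labels, keep))

    # Synthetic slice: two Gaussian "particles" plus noise.
    yy, xx = np.mgrid[0:128, 0:128]
    img = (np.exp(-((yy - 40) ** 2 + (xx - 50) ** 2) / 18.0)
           + np.exp(-((yy - 90) ** 2 + (xx - 100) ** 2) / 18.0)
           + 0.05 * np.random.default_rng(2).random((128, 128)))
    print(locate_particles(img))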
Particle Agglutination Method for Poliovirus Identification
Authors: Minetaro Arita, Souji Masujima, Takaji Wakita, Hiroyuki Shimizu.
Institutions: National Institute of Infectious Diseases, Fujirebio Inc.
In the Global Polio Eradication Initiative, laboratory diagnosis plays a critical role by isolating and identifying poliovirus (PV) from the stool samples of acute flaccid paralysis (AFP) cases. In the World Health Organization (WHO) Global Polio Laboratory Network, PV isolation and identification are currently being performed by using a cell culture system and real-time RT-PCR, respectively. In the post-eradication era of PV, simple and rapid identification procedures would be helpful for rapid confirmation of polio cases at the national laboratories. In the present study, we show the procedure of a novel particle agglutination (PA) assay developed for PV identification. This PA assay utilizes the interaction between the PV receptor (PVR) molecule and the virion, which is specific for PV and has uniform affinity for all PV serotypes. The procedure is simple (a one-step reaction in reaction plates) and rapid (results can be obtained within 2 h of reaction), and the result is observed visually as agglutination of gelatin particles.
Immunology, Issue 50, Poliovirus, identification, particle agglutination, virus receptor
Recording Multicellular Behavior in Myxococcus xanthus Biofilms using Time-lapse Microcinematography
Authors: Rion G. Taylor, Roy D. Welch.
Institutions: University of South Carolina (USC), Syracuse University.
A swarm of the δ-proteobacterium Myxococcus xanthus contains millions of cells that act as a collective, coordinating movement through a series of signals to create complex, dynamic patterns as a response to environmental cues. These patterns are self-organizing and emergent; they cannot be predicted by observing the behavior of the individual cells. Using a time-lapse microcinematography tracking assay, we identified a distinct emergent pattern in M. xanthus called chemotaxis, defined as the directed movement of a swarm up a nutrient gradient toward its source 1. In order to efficiently characterize chemotaxis via time-lapse microcinematography, we developed a highly modifiable plate complex (Figure 1) and constructed a cluster of 8 microscopes (Figure 2), each capable of capturing time-lapse videos. The assay is rigorous enough to allow consistent replication of quantifiable data, and the resulting videos allow us to observe and track subtle changes in swarm behavior. Once captured, the videos are transferred to an analysis/storage computer with enough memory to process and store thousands of videos. The flexibility of this setup has proven useful to several members of the M. xanthus community.
Microbiology, Issue 42, microcinematography, Myxococcus, chemotaxis, time-lapse
Transformation of Plasmid DNA into E. coli Using the Heat Shock Method
Authors: Alexandrine Froger, James E. Hall.
Institutions: University of California, Irvine (UCI).
Transformation of plasmid DNA into E. coli using the heat shock method is a basic technique of molecular biology. It consists of inserting a foreign plasmid or ligation product into bacteria. This video protocol describes the traditional method of transformation using commercially available chemically competent bacteria from Genlantis. After a short incubation in ice, a mixture of chemically competent bacteria and DNA is placed at 42°C for 45 seconds (heat shock) and then placed back in ice. SOC media is added and the transformed cells are incubated at 37°C for 30 min with agitation. To be assured of isolating colonies irrespective of transformation efficiency, two quantities of transformed bacteria are plated. This traditional protocol can be used successfully to transform most commercially available competent bacteria. The turbocells from Genlantis can also be used in a novel 3-minute transformation protocol, described in the instruction manual.
Issue 6, Basic Protocols, DNA, transformation, plasmid, cloning
Electroporation of Mycobacteria
Authors: Renan Goude, Tanya Parish.
Institutions: Barts and the London School of Medicine and Dentistry.
Achieving high-efficiency transformation is a major limitation in the study of mycobacteria. The genus Mycobacterium can be difficult to transform; this is mainly caused by the thick and waxy cell wall, but is compounded by the fact that most molecular techniques have been developed for distantly-related species such as Escherichia coli and Bacillus subtilis. In spite of these obstacles, mycobacterial plasmids have been identified and DNA transformation of many mycobacterial species has now been described. The most successful method for introducing DNA into mycobacteria is electroporation. Many parameters contribute to successful transformation; these include the species/strain, the nature of the transforming DNA, the selectable marker used, the growth medium, and the conditions for the electroporation pulse. Optimized methods for the transformation of both slow- and fast-growers are detailed here. Transformation efficiencies for different mycobacterial species and with various selectable markers are reported.
Microbiology, Issue 15, Springer Protocols, Mycobacteria, Electroporation, Bacterial Transformation, Transformation Efficiency, Bacteria, Tuberculosis, M. Smegmatis, Springer Protocols
Quantitatively Measuring In situ Flows using a Self-Contained Underwater Velocimetry Apparatus (SCUVA)
Authors: Kakani Katija, Sean P. Colin, John H. Costello, John O. Dabiri.
Institutions: Woods Hole Oceanographic Institution, Roger Williams University, Whitman Center, Providence College, California Institute of Technology.
The ability to directly measure velocity fields in a fluid environment is necessary to provide empirical data for studies in fields as diverse as oceanography, ecology, biology, and fluid mechanics. Field measurements introduce practical challenges such as environmental conditions, animal availability, and the need for field-compatible measurement techniques. To avoid these challenges, scientists typically use controlled laboratory environments to study animal-fluid interactions. However, it is reasonable to question whether one can extrapolate natural behavior (i.e., that which occurs in the field) from laboratory measurements. Therefore, in situ quantitative flow measurements are needed to accurately describe the swimming of animals in their natural environment. We designed a self-contained, portable device that operates independently of any connection to the surface, and can provide quantitative measurements of the flow field surrounding an animal. This apparatus, a self-contained underwater velocimetry apparatus (SCUVA), can be operated by a single scuba diver at depths of up to 40 m. Due to the added complexity inherent in field conditions, additional considerations and preparation are required when compared to laboratory measurements. These considerations include, but are not limited to, operator motion, predicting the position of swimming targets, the availability of natural suspended particulate, and orientation of SCUVA relative to the flow of interest. The following protocol is intended to address these common field challenges and to maximize measurement success.
Bioengineering, Issue 56, In situ DPIV, SCUVA, animal flow measurements, zooplankton, propulsion
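To illustrate the velocimetry principle behind SCUVA-style measurements, the sketch below recovers the displacement between two image frames from the peak of their cross-correlation, the core operation of digital particle image velocimetry; the synthetic frames and window size are illustrative, not the instrument's actual processing chain.

    import numpy as np

    def piv_displacement(window_a, window_b):
        """Displacement (dy, dx) of window_b relative to window_a via FFT cross-correlation."""
        a = window_a - window_a.mean()
        b = window_b - window_b.mean()
        corr = np.fft.irfft2(np.fft.rfft2(a).conj() * np.fft.rfft2(b), s=a.shape)
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap indices so shifts are reported in the range [-N/2, N/2).
        return tuple(p - s if p > s // 2 else p for p, s in zip(peak, a.shape))

    # Synthetic test: shift a random particle image by (3, -2) pixels.
    rng = np.random.default_rng(3)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))
    print(piv_displacement(frame1, frame2))   # expected (3, -2) for a pure circular shift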

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.