JoVE Visualize
 
Pubmed Article
Unit operation optimization for the manufacturing of botanical injections using a design space approach: a case study of water precipitation.
PLoS ONE
PUBLISHED: 01-01-2014
The quality by design (QbD) concept is a paradigm for improving the quality control of botanical injections. In this work, the water precipitation process for the manufacturing of Xueshuantong injection, a botanical injection made from Notoginseng Radix et Rhizoma, was optimized using a design space approach as a case study. Saponin recovery and total saponin purity (TSP) in the supernatant were identified as the critical quality attributes (CQAs) of water precipitation, based on a risk assessment covering all processes of Xueshuantong injection manufacturing. An Ishikawa diagram and fractional factorial design experiments were applied to determine the critical process parameters (CPPs). Dry matter content of the concentrated extract (DMCC), amount of water added (AWA), and stirring speed (SS) were identified as CPPs. Box-Behnken designed experiments were carried out to develop models between the CPPs and the process CQAs. Determination coefficients were higher than 0.86 for all models. High TSP in the supernatant can be obtained when DMCC is low and SS is high. Saponin recoveries decreased as DMCC increased, and incomplete collection of the supernatant was the main reason for the loss of saponins. The design space was calculated using a Monte Carlo simulation method with an acceptable probability of 0.90. The recommended normal operating region is a DMCC of 0.38-0.41 g/g, an AWA of 3.7-4.9 g/g, and an SS of 280-350 rpm, with a probability of more than 0.919 of attaining the CQA criteria. Verification experiments showed that operating DMCC, SS, and AWA within the design space attains the CQA criteria with high probability.
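The design-space step described in this abstract lends itself to a compact numerical illustration. The sketch below shows how an acceptance probability can be estimated by Monte Carlo simulation over a grid of candidate operating points and thresholded at 0.90; the response-surface models, noise levels, and CQA acceptance limits are hypothetical placeholders, not the fitted models or criteria from this study.

```python
# Minimal sketch of probability-based design-space mapping by Monte Carlo simulation.
# The response-surface models, noise levels, and CQA limits are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)

def predict_tsp(dmcc, awa, ss):
    """Hypothetical model: total saponin purity (%) as a function of the CPPs."""
    return 70.0 - 25.0 * dmcc + 0.01 * ss + 0.5 * awa

def predict_recovery(dmcc, awa, ss):
    """Hypothetical model: saponin recovery (%) as a function of the CPPs."""
    return 98.0 - 20.0 * dmcc + 0.8 * awa

TSP_MIN, REC_MIN = 60.0, 88.0      # hypothetical CQA acceptance limits (%)
SD_TSP, SD_REC = 1.5, 1.0          # hypothetical prediction uncertainties (SD, %)
N_DRAWS = 2000                     # Monte Carlo draws per candidate operating point

def acceptance_probability(dmcc, awa, ss):
    """Probability that both CQAs meet their limits, given prediction uncertainty."""
    tsp = predict_tsp(dmcc, awa, ss) + rng.normal(0.0, SD_TSP, N_DRAWS)
    rec = predict_recovery(dmcc, awa, ss) + rng.normal(0.0, SD_REC, N_DRAWS)
    return np.mean((tsp >= TSP_MIN) & (rec >= REC_MIN))

# Keep the grid points whose acceptance probability reaches the 0.90 threshold.
design_space = [
    (round(d, 2), round(a, 1), s)
    for d in np.linspace(0.30, 0.50, 11)      # DMCC (g/g)
    for a in np.linspace(3.0, 6.0, 7)         # AWA (g/g)
    for s in (250, 300, 350)                  # SS (rpm)
    if acceptance_probability(d, a, s) >= 0.90
]
print(f"{len(design_space)} grid points fall inside the probability-based design space")
```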
ABSTRACT
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
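As a hedged illustration of the kind of software-guided experiment setup described above, the sketch below generates a two-level 2^(4-1) fractional factorial design in which a fourth factor is aliased with the three-way interaction of the first three; the factor names are invented examples, not the factors examined in the article.

```python
# Sketch of generating a two-level fractional factorial design (2^(4-1)).
# Factor names are hypothetical examples for illustration only.
from itertools import product

base_factors = ["promoter", "plant_age", "incubation_temp"]  # hypothetical factors

runs = []
for levels in product((-1, +1), repeat=len(base_factors)):
    run = dict(zip(base_factors, levels))
    # Alias a fourth factor with the three-way interaction (D = ABC), so that
    # four factors are screened in only eight runs instead of sixteen.
    run["incubation_time"] = levels[0] * levels[1] * levels[2]
    runs.append(run)

for i, run in enumerate(runs, start=1):
    print(f"run {i}: {run}")
```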
24 Related JoVE Articles!
Manufacturing of Three-dimensionally Microstructured Nanocomposites through Microfluidic Infiltration
Authors: Rouhollah Dermanaki-Farahani, Louis Laberge Lebel, Daniel Therriault.
Institutions: École Polytechnique de Montréal.
Microstructured composite beams reinforced with complex three-dimensionally (3D) patterned nanocomposite microfilaments are fabricated via nanocomposite infiltration of 3D interconnected microfluidic networks. The manufacturing of the reinforced beams begins with the fabrication of microfluidic networks, which involves layer-by-layer deposition of fugitive ink filaments using a dispensing robot, filling the empty space between filaments with a low viscosity resin, curing the resin, and finally removing the ink. Self-supported 3D structures with other geometries and many layers (e.g. a few hundred layers) could be built using this method. The resulting tubular microfluidic networks are then infiltrated with thermosetting nanocomposite suspensions containing nanofillers (e.g. single-walled carbon nanotubes), and subsequently cured. The infiltration is done by applying a pressure gradient between the two ends of the empty network (either by applying a vacuum or by vacuum-assisted microinjection). Prior to the infiltration, the nanocomposite suspensions are prepared by dispersing nanofillers into polymer matrices using ultrasonication and three-roll mixing methods. The nanocomposites (i.e. the infiltrated materials) are then solidified under UV exposure/heat cure, resulting in a 3D-reinforced composite structure. The technique presented here enables the design of functional nanocomposite macroscopic products for microengineering applications such as actuators and sensors.
Chemistry, Issue 85, Microstructures, Nanocomposites, 3D-patterning, Infiltration, Direct-write assembly, Microfluidic networks
Design and Construction of an Urban Runoff Research Facility
Authors: Benjamin G. Wherley, Richard H. White, Kevin J. McInnes, Charles H. Fontanier, James C. Thomas, Jacqueline A. Aitkenhead-Peterson, Steven T. Kelly.
Institutions: Texas A&M University, The Scotts Miracle-Gro Company.
As the urban population increases, so does the area of irrigated urban landscape. Summer water use in urban areas can be 2-3x the winter baseline water use due to increased demand for landscape irrigation. Improper irrigation practices and large rainfall events can result in runoff from urban landscapes, which has the potential to carry nutrients and sediments into local streams and lakes where they may contribute to eutrophication. A 1,000 m2 facility was constructed that consists of 24 individual 33.6 m2 field plots, each equipped for measuring total runoff volume over time and for collecting runoff subsamples at selected intervals to quantify chemical constituents in the runoff water from simulated urban landscapes. Runoff volumes from the first and second trials had coefficient of variability (CV) values of 38.2 and 28.7%, respectively. CV values for runoff pH, EC, and Na concentration for both trials were all under 10%. Concentrations of DOC, TDN, DON, PO4-P, K+, Mg2+, and Ca2+ had CV values less than 50% in both trials. Overall, the results of testing performed after sod installation at the facility indicated good uniformity between plots for runoff volumes and chemical constituents. The large plot size is sufficient to include much of the natural variability and therefore provides a better simulation of urban landscape ecosystems.
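The plot-to-plot uniformity reported above is summarized with the coefficient of variability, i.e. the standard deviation expressed as a percentage of the mean. A minimal sketch with made-up runoff volumes (not data from the facility) is shown below.

```python
# Coefficient of variability (CV) for a set of plot runoff volumes.
# The volumes below are made-up example numbers, not measurements from the facility.
import statistics

runoff_volumes_l = [412.0, 530.5, 389.2, 610.8, 455.1]   # hypothetical volumes (liters)

mean = statistics.mean(runoff_volumes_l)
sd = statistics.stdev(runoff_volumes_l)                  # sample standard deviation
cv_percent = 100.0 * sd / mean
print(f"CV = {cv_percent:.1f}%")
```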
Environmental Sciences, Issue 90, urban runoff, landscapes, home lawns, turfgrass, St. Augustinegrass, carbon, nitrogen, phosphorus, sodium
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Authors: Angela J. Brandt, Gaston A. del Pino, Jean H. Burns.
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as being independent of the community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly necessitates understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, together with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field that enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation, where inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low-abundance molecules, with a resulting increase in sensitivity, is obtained, ultimately leading to more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and a strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
Collection and Extraction of Saliva DNA for Next Generation Sequencing
Authors: Michael R. Goode, Soo Yeon Cheong, Ning Li, William C. Ray, Christopher W. Bartlett.
Institutions: The Research Institute at Nationwide Children's Hospital, The Ohio State University.
The preferred source of DNA in human genetics research is blood, or cell lines derived from blood, as these sources yield large quantities of high quality DNA. However, DNA extraction from saliva can yield high quality DNA with little to no degradation/fragmentation that is suitable for a variety of DNA assays without the expense of a phlebotomist, and saliva can even be acquired through the mail. Yet, at present, no saliva DNA collection/extraction protocols for next generation sequencing have been presented in the literature. This protocol optimizes the parameters of saliva collection/storage and DNA extraction so that the resulting DNA is of sufficient quality and quantity for DNA assays with the highest standards, including microarray genotyping and next generation sequencing.
Medicine, Issue 90, DNA collection, saliva, DNA extraction, Next generation sequencing, DNA purification, DNA
Reduced-gravity Environment Hardware Demonstrations of a Prototype Miniaturized Flow Cytometer and Companion Microfluidic Mixing Technology
Authors: William S. Phipps, Zhizhong Yin, Candice Bae, Julia Z. Sharpe, Andrew M. Bishara, Emily S. Nelson, Aaron S. Weaver, Daniel Brown, Terri L. McKay, DeVon Griffin, Eugene Y. Chan.
Institutions: DNA Medicine Institute, Harvard Medical School, NASA Glenn Research Center, ZIN Technologies.
Until recently, astronaut blood samples were collected in-flight, transported to earth on the Space Shuttle, and analyzed in terrestrial laboratories. If humans are to travel beyond low Earth orbit, a transition towards space-ready, point-of-care (POC) testing is required. Such testing needs to be comprehensive, easy to perform in a reduced-gravity environment, and unaffected by the stresses of launch and spaceflight. Countless POC devices have been developed to mimic laboratory scale counterparts, but most have narrow applications and few have demonstrable use in an in-flight, reduced-gravity environment. In fact, demonstrations of biomedical diagnostics in reduced gravity are limited altogether, making component choice and certain logistical challenges difficult to approach when seeking to test new technology. To help fill the void, we are presenting a modular method for the construction and operation of a prototype blood diagnostic device and its associated parabolic flight test rig that meet the standards for flight-testing onboard a parabolic flight, reduced-gravity aircraft. The method first focuses on rig assembly for in-flight, reduced-gravity testing of a flow cytometer and a companion microfluidic mixing chip. Components are adaptable to other designs and some custom components, such as a microvolume sample loader and the micromixer may be of particular interest. The method then shifts focus to flight preparation, by offering guidelines and suggestions to prepare for a successful flight test with regard to user training, development of a standard operating procedure (SOP), and other issues. Finally, in-flight experimental procedures specific to our demonstrations are described.
Cellular Biology, Issue 93, Point-of-care, prototype, diagnostics, spaceflight, reduced gravity, parabolic flight, flow cytometry, fluorescence, cell counting, micromixing, spiral-vortex, blood mixing
Rapid and Low-cost Prototyping of Medical Devices Using 3D Printed Molds for Liquid Injection Molding
Authors: Philip Chung, J. Alex Heller, Mozziyar Etemadi, Paige E. Ottoson, Jonathan A. Liu, Larry Rand, Shuvo Roy.
Institutions: University of California, San Francisco, University of Southern California.
Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.
Bioengineering, Issue 88, liquid injection molding, reaction injection molding, molds, 3D printing, fused deposition modeling, rapid prototyping, medical devices, low cost, low volume, rapid turnaround time.
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
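Because the abstract emphasizes that NMR allows all metabolites in a spectrum to be quantified against a single reference compound, a brief sketch of that calculation is given below; the peak integrals, proton counts, and reference concentration are hypothetical example values.

```python
# Quantification of a metabolite from NMR peak integrals against a single internal
# reference of known concentration. All numbers below are hypothetical examples.

def metabolite_concentration(integral_met, n_protons_met,
                             integral_ref, n_protons_ref, conc_ref_mm):
    """Concentration from peak integrals normalized per contributing proton."""
    return conc_ref_mm * (integral_met / n_protons_met) / (integral_ref / n_protons_ref)

# Example: a 3-proton metabolite signal quantified against a 9-proton reference singlet.
conc = metabolite_concentration(integral_met=2.4, n_protons_met=3,
                                integral_ref=6.0, n_protons_ref=9,
                                conc_ref_mm=0.5)
print(f"metabolite concentration = {conc:.2f} mM")
```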
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
Mechanical Expansion of Steel Tubing as a Solution to Leaky Wellbores
Authors: Mileva Radonjic, Darko Kupresan.
Institutions: Louisiana State University.
Wellbore cement, a procedural component of wellbore completion operations, primarily provides zonal isolation and mechanical support of the metal pipe (casing), and protects metal components from corrosive fluids. These are essential for uncompromised wellbore integrity. Cements can undergo multiple forms of failure, such as debonding at the cement/rock and cement/metal interfaces, fracturing, and defects within the cement matrix. Failures and defects within the cement will ultimately lead to fluid migration, resulting in inter-zonal fluid migration and premature well abandonment. Currently, there are over 1.8 million operating wells worldwide and over one third of these wells have leak related problems defined as Sustained Casing Pressure (SCP)1. The focus of this research was to develop an experimental setup at bench-scale to explore the effect of mechanical manipulation of wellbore casing-cement composite samples as a potential technology for the remediation of gas leaks. The experimental methodology utilized in this study enabled formation of an impermeable seal at the pipe/cement interface in a simulated wellbore system. Successful nitrogen gas flow-through measurements demonstrated that an existing microannulus was sealed at laboratory experimental conditions and fluid flow prevented by mechanical manipulation of the metal/cement composite sample. Furthermore, this methodology can be applied not only for the remediation of leaky wellbores, but also in plugging and abandonment procedures as well as wellbore completions technology, and potentially preventing negative impacts of wellbores on subsurface and surface environments.
Physics, Issue 93, Leaky wellbores, Wellbore cement, Microannular gas flow, Sustained casing pressure, Expandable casing technology.
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorous, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Authors: William R. Brant, Siegbert Schmid, Guodong Du, Helen E. A. Brand, Wei Kong Pang, Vanessa K. Peterson, Zaiping Guo, Neeraj Sharma.
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered as promising candidates for higher-energy applications such as electric vehicles.1,2 However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications.3 This research is challenging, and we outline a method to address these challenges using in situ NPD to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries. We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the ‘roll-over’ cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment and initial directions are presented on how to analyze such complex in situ data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
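To give a flavor of what analyzing a time-stamped behavioral event record involves, a small stand-in example is sketched below; the published system uses the authors' own MATLAB-based toolchain, so this Python snippet with invented event names is only an illustration of the general idea of collapsing the raw record into daily per-mouse summaries.

```python
# Illustrative stand-in (not the authors' MATLAB code): collapse a time-stamped
# event record into daily head-entry counts per mouse. Event names are invented.
from collections import Counter
from datetime import datetime

event_record = [
    ("2014-03-01 08:15:02", "mouse01", "head_entry_hopper1"),
    ("2014-03-01 08:15:05", "mouse01", "pellet_delivered"),
    ("2014-03-02 21:40:11", "mouse01", "head_entry_hopper2"),
    ("2014-03-02 21:41:00", "mouse02", "head_entry_hopper1"),
]

daily_head_entries = Counter()
for stamp, subject, event in event_record:
    if event.startswith("head_entry"):
        day = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S").date()
        daily_head_entries[(subject, day)] += 1

for (subject, day), count in sorted(daily_head_entries.items()):
    print(subject, day, count)
```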
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that with conventional sensing and control techniques that are known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of the generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
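The "more homogeneous light intensity distribution" of higher-order LG modes can be visualized directly from the analytic mode shape. The sketch below evaluates the unnormalized transverse intensity of an LG(p,l) mode at its waist; the mode indices and waist size are arbitrary example values, not parameters from the experiment.

```python
# Unnormalized transverse intensity of a Laguerre-Gauss LG_{p,l} mode at its waist.
# Mode indices and beam waist are arbitrary example values.
import numpy as np
from scipy.special import genlaguerre

def lg_intensity(r, phi, p=3, l=3, w0=1.0e-3):
    """|u_{p,l}(r, phi)|^2 at the beam waist w0 (overall normalization omitted)."""
    rho = 2.0 * r**2 / w0**2
    radial = np.sqrt(rho)**abs(l) * genlaguerre(p, abs(l))(rho) * np.exp(-rho / 2.0)
    return np.abs(radial * np.exp(1j * l * phi))**2

r = np.linspace(0.0, 3.0e-3, 7)          # radial positions (m)
print(lg_intensity(r, phi=0.0))          # ring-structured profile of an LG33-type mode
```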
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
Specific Marking of HIV-1 Positive Cells using a Rev-dependent Lentiviral Vector Expressing the Green Fluorescent Protein
Authors: Jia Guo, Clinton Enos, Yuntao Wu.
Institutions: George Mason University.
Most HIV-responsive expression vectors are based on the HIV promoter, the long terminal repeat (LTR). While responsive to an early HIV protein, Tat, the LTR is also responsive to cellular activation states and to the local chromatin activity where the integration has occurred. This can result in high HIV-independent activity, and has restricted the usefulness of LTR-based reporters for marking HIV-positive cells 1,2,3. Here, we constructed an expression lentiviral vector that possesses, in addition to the Tat-responsive LTR, numerous HIV DNA sequences that include the Rev-response element and HIV splicing sites 4,5,6. The vector was incorporated into a lentiviral reporter virus, permitting highly specific detection of replicating HIV in living cell populations. The activity of the vector was measured by expression of the green fluorescent protein (GFP). The application of this vector as reported here offers a novel alternative to existing methods, such as in situ PCR or HIV antigen staining, for identifying HIV-positive cells. The vector can also express therapeutic genes for basic or clinical experimentation to target HIV-positive cells.
Infectious Disease, Issue 43, HIV-1, Rev, GFP, lentiviral vector, RRE
Protein Crystallization for X-ray Crystallography
Authors: Moshe A. Dessau, Yorgo Modis.
Institutions: Yale University.
Using the three-dimensional structure of biological macromolecules to infer how they function is one of the most important fields of modern biology. The availability of atomic resolution structures provides a deep and unique understanding of protein function, and helps to unravel the inner workings of the living cell. To date, 86% of the Protein Data Bank (rcsb-PDB) entries are macromolecular structures that were determined using X-ray crystallography. To obtain crystals suitable for crystallographic studies, the macromolecule (e.g. protein, nucleic acid, protein-protein complex or protein-nucleic acid complex) must be purified to homogeneity, or as close as possible to homogeneity. The homogeneity of the preparation is a key factor in obtaining crystals that diffract to high resolution (Bergfors, 1999; McPherson, 1999). Crystallization requires bringing the macromolecule to supersaturation. The sample should therefore be concentrated to the highest possible concentration without causing aggregation or precipitation of the macromolecule (usually 2-50 mg/ mL). Introducing the sample to precipitating agent can promote the nucleation of protein crystals in the solution, which can result in large three-dimensional crystals growing from the solution. There are two main techniques to obtain crystals: vapor diffusion and batch crystallization. In vapor diffusion, a drop containing a mixture of precipitant and protein solutions is sealed in a chamber with pure precipitant. Water vapor then diffuses out of the drop until the osmolarity of the drop and the precipitant are equal (Figure 1A). The dehydration of the drop causes a slow concentration of both protein and precipitant until equilibrium is achieved, ideally in the crystal nucleation zone of the phase diagram. The batch method relies on bringing the protein directly into the nucleation zone by mixing protein with the appropriate amount of precipitant (Figure 1B). This method is usually performed under a paraffin/mineral oil mixture to prevent the diffusion of water out of the drop. Here we will demonstrate two kinds of experimental setup for vapor diffusion, hanging drop and sitting drop, in addition to batch crystallization under oil.
Molecular Biology, Issue 47, protein crystallization, nucleic acid crystallization, vapor diffusion, X-ray crystallography, precipitant
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers: pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the sole means to-date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein related targets with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
A Simple Chelex Protocol for DNA Extraction from Anopheles spp.
Authors: Mulenga Musapa, Taida Kumwenda, Mtawa Mkulama, Sandra Chishimba, Douglas E. Norris, Philip E. Thuma, Sungano Mharakurwa.
Institutions: Malaria Institute at Macha, Johns Hopkins Bloomberg School of Public Health.
Endemic countries are increasingly adopting molecular tools for efficient typing, identification and surveillance against malaria parasites and vector mosquitoes, as an integral part of their control programs1,2,3,4,5. For sustainable establishment of these accurate approaches in operations research to strengthen malaria control and elimination efforts, simple and affordable methods with parsimonious reagent and equipment requirements are essential6,7,8. Here we present a simple Chelex-based technique for extracting malaria parasite and vector DNA from field collected mosquito specimens. We morphologically identified 72 Anopheles gambiae s.l. from 156 mosquitoes captured by pyrethrum spray catches in sleeping rooms of households within a 2,000 km2 vicinity of the Malaria Institute at Macha. After dissection to separate the head and thorax from the abdomen for all 72 Anopheles gambiae s.l. mosquitoes, the two sections were individually placed in 1.5 ml microcentrifuge tubes and submerged in 20 μl of deionized water. Using a sterile pipette tip, each mosquito section was separately homogenized to a uniform suspension in the deionized water. Of the ensuing homogenate from each mosquito section, 10 μl was retained while the other 10 μl was transferred to a separate autoclaved 1.5 ml tube. The separate aliquots were subjected to DNA extraction by either the simplified Chelex or the standard salting-out extraction protocol9,10. The salting-out protocol is so called, and widely used, because it employs high salt concentrations in lieu of hazardous organic solvents (such as phenol and chloroform) for the protein precipitation step during DNA extraction9. Extracts were used as templates for PCR amplification using primers targeting the arthropod mitochondrial nicotinamide adenine dinucleotide (NADH) dehydrogenase subunit 4 gene (ND4) to check DNA quality11, a PCR for identification of Anopheles gambiae sibling species10, and a nested PCR for typing of Plasmodium falciparum infection12. Comparison using the DNA quality (ND4) PCR showed 93% sensitivity and 82% specificity for the Chelex approach relative to the established salting-out protocol. Corresponding values of sensitivity and specificity were 100% and 78%, respectively, using the sibling species identification PCR, and 92% and 80%, respectively, for the P. falciparum detection PCR. There were no significant differences in the proportion of samples giving an amplicon signal with the Chelex or the regular salting-out protocol across all three PCR applications. The Chelex approach required three simple reagents and 37 min to complete, while the salting-out protocol entailed 10 different reagents and 2 hr 47 min of processing time, including an overnight step. Our results show that the Chelex method is comparable to the existing salting-out extraction and can be substituted as a simple and sustainable approach in resource-limited settings where a constant reagent supply chain is often difficult to maintain.
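The sensitivity and specificity figures quoted above compare each Chelex result against the salting-out protocol taken as the reference. A worked illustration of that calculation is shown below; the 2 x 2 counts are hypothetical, chosen only to show the arithmetic, and are not the study's actual tallies.

```python
# Sensitivity and specificity of the Chelex result relative to the salting-out
# protocol as the reference method. The counts below are hypothetical examples.

def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # Chelex positive among reference positives
    specificity = tn / (tn + fp)   # Chelex negative among reference negatives
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=53, fn=4, tn=9, fp=2)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")   # 93%, 82%
```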
Infection, Issue 71, Immunology, Infectious Diseases, Genetics, Molecular Biology, Microbiology, Parasitology, Entomology, Malaria, Plasmodium falciparum, vector, Anopheles, Diptera, mosquitoes, Chelex, DNA, extraction, PCR, dissection, insect, vector, pathogen
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction and produce spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
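Since primer melting temperature (Tm) features prominently in PCR optimization, two common rule-of-thumb estimates are sketched below; the primer sequence is an arbitrary example, and real designs should rely on nearest-neighbor calculators rather than these approximations.

```python
# Two rule-of-thumb primer Tm estimates commonly quoted in PCR guides.
# The primer sequence is an arbitrary example; use nearest-neighbor methods for real work.

def tm_wallace(primer):
    """Wallace rule, 2(A+T) + 4(G+C); reasonable only for short oligos (< ~14 nt)."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def tm_gc(primer):
    """GC-content approximation often used for longer primers."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(p)

primer = "AGCTTGCATGCCTGCAGGTC"   # arbitrary 20-mer used for illustration
print(f"Wallace rule: {tm_wallace(primer)} °C, GC method: {tm_gc(primer):.1f} °C")
```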
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
A Practical and Novel Method to Extract Genomic DNA from Blood Collection Kits for Plasma Protein Preservation
Authors: Jon Waters, Vishal Dhere, Adam Benjamin, Arvind Sekar, Archana Kumar, Sampath Prahalad, David T. Okou, Subra Kugathasan.
Institutions: Emory University School of Medicine and Children's Health Care of Atlanta.
Laboratory tests can be done on the cellular or fluid portions of the blood. The choice of blood collection tube determines the portion of the blood that can be analyzed (whole blood, plasma or serum). Laboratories involved in studying the genetic basis of human disorders rely on anticoagulated whole blood collected in EDTA-containing vacutainers as the source of DNA for genetic/genomic analysis. Because most clinical laboratories perform biochemical, serologic and viral testing as a first step in phenotypic outcome investigation, anticoagulated blood is also collected in heparin-containing tubes (plasma tubes). Therefore, when DNA and plasma are needed for simultaneous and parallel analyses of both genomic and proteomic data, it is customary to collect blood in both EDTA and heparin tubes. If blood could be collected in a single tube and serve as a source for both plasma and DNA, that method would be considered an advancement over existing methods. The use of the compacted blood after plasma extraction represents an alternative source of genomic DNA, thus minimizing the amount of blood processed and reducing the number of samples required from each patient. This would ultimately save time and resources. The BD P100 blood collection system for plasma protein preservation was created as an improvement over previous plasma or serum collection tubes1, to stabilize the protein content of blood, enabling better protein biomarker discovery and proteomics experimentation from human blood. The BD P100 tubes contain 15.8 ml of spray-dried K2EDTA and a lyophilized proprietary broad-spectrum cocktail of protease inhibitors to prevent coagulation and stabilize the plasma proteins. They also include a mechanical separator, which provides a physical barrier between plasma and cell pellets after centrifugation. A few methods have been devised to extract DNA from clotted blood samples collected in older plasma tubes2-4. Challenges with these methods were mainly associated with the type of separator inside the tubes (a gel separator) and included difficulty in recovering the clotted blood, the inconvenience of fragmenting or dispersing the clot, and obstruction of the clot extraction by the separation gel. We present the first method that extracts and purifies genomic DNA from blood drawn in the new BD P100 tubes. We compare the quality of the DNA sample from P100 tubes to that from EDTA tubes. Our approach is simple and efficient. It involves four major steps: 1) the use of a plasma BD P100 (BD Diagnostics, Sparks, MD, USA) tube with a mechanical separator for blood collection, 2) the removal of the mechanical separator using a combination of sucrose and a sterile paperclip metallic hook, 3) the separation of the buffy coat layer containing the white cells, and 4) the isolation of the genomic DNA from the buffy coat using a regular commercial DNA extraction kit or a similar standard protocol.
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biochemistry, Hematology, Proteins, Genomics, genomic DNA, blood collection, P100 tubes, DNA extraction, buffy coat isolation, genotyping assays, red blood, whole blood, plasma, DNA, assay, genotyping
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Antigens Protected Functional Red Blood Cells By The Membrane Grafting Of Compact Hyperbranched Polyglycerols
Authors: Rafi Chapanian, Iren Constantinescu, Donald E. Brooks, Mark D. Scott, Jayachandran Kizhakkedathu.
Institutions: University of British Columbia.
Red blood cell (RBC) transfusion is vital for the treatment of a number of acute and chronic medical problems such as thalassemia major and sickle cell anemia 1-3. Due to the presence of a multitude of antigens on the RBC surface (~308 known antigens 4), patients on chronic blood transfusion therapy develop alloantibodies due to the mismatch of minor antigens on transfused RBCs 4, 5. Grafting of hydrophilic polymers such as polyethylene glycol (PEG) and hyperbranched polyglycerol (HPG) forms an exclusion layer on the RBC membrane that prevents the interaction of antibodies with surface antigens without affecting the passage of small molecules such as oxygen, glucose, and ions3. At present, no method is available for the generation of universal red blood donor cells, in part because of the daunting challenge presented by the large number of antigens (protein and carbohydrate based) on the RBC surface; the development of such methods would significantly improve transfusion safety and dramatically improve the availability and use of RBCs. In this report, the experiments that are used to develop antigen-protected functional RBCs by the membrane grafting of HPG, and their characterization, are presented. HPGs are highly biocompatible compact polymers 6, 7, and are expected to be located within the cell glycocalyx that surrounds the lipid membrane 8, 9 and to mask RBC surface antigens10, 11.
Immunology, Issue 71, Bioengineering, Pathology, Chemistry, Biochemistry, Hematology, polymers, Blood transfusion, surface antigens, antigen camouflage, RBC modification, hyperbranched polyglycerol, HPG, red blood cells, RBC, whole blood, flow cytometry
Measurement of Tactile Allodynia in a Murine Model of Bacterial Prostatitis
Authors: Marsha L Quick, Joseph D Done, Praveen Thumbikat.
Institutions: Northwestern University Feinberg School of Medicine.
Uropathogenic Escherichia coli (UPEC) are pathogens that play an important role in urinary tract infections and bacterial prostatitis1. We have recently shown that UPEC have an important role in the initiation of chronic pelvic pain2, a feature of chronic prostatitis/chronic pelvic pain syndrome (CP/CPPS)3,4. Infection of the prostate by clinically relevant UPEC can initiate and establish chronic pain through mechanisms that may involve tissue damage and the initiation of mechanisms of autoimmunity5. A challenge to understanding the pathogenesis of UPEC in the prostate is the relative inaccessibility of the prostate gland to manipulation. We utilized a previously described intraurethral infection method6 to deliver a clinical strain of UPEC into male mice, thereby establishing an ascending infection of the prostate. Here, we describe our protocols for standardizing the bacterial inoculum7 as well as the procedure for catheterizing anesthetized male mice for instillation of bacteria. CP/CPPS is primarily characterized by the presence of tactile allodynia4. Behavior testing was based on the concept of cutaneous hyperalgesia resulting from referred visceral pain8-10. An irritable focus in visceral tissues reduces cutaneous pain thresholds, allowing for an exaggerated response to normally non-painful stimuli (allodynia). Application of normal force to the skin results in abnormal responses that tend to increase with the intensity of the underlying visceral pain. We describe methodology in NOD/ShiLtJ mice that utilizes von Frey fibers to quantify tactile allodynia over time in response to a single infection with UPEC bacteria.
Infection, Issue 71, Immunology, Infectious Diseases, Microbiology, Medicine, Urology, Pathology, Autoimmune Diseases, Bacterial Infections and Mycoses, Male Urogenital Diseases, Bacterial pathogenesis, pain, autoimmunity, prostatitis, catheterization, mice, animal model
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection optimization stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates closing the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
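For readers curious about how this kind of matching can be implemented in practice, here is a minimal sketch using TF-IDF cosine similarity. It illustrates the general idea only and is not JoVE's actual matching algorithm; the example texts are invented.

```python
# Minimal sketch of abstract-to-video matching with TF-IDF cosine similarity.
# This is an illustration of the general idea, not JoVE's actual algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = [
    "design of experiments for transient protein expression in plants",
    "liquid injection molding of silicone medical devices",
    "NMR spectroscopy of brain tissue extracts for metabolomics",
]
pubmed_abstract = "A design space approach was used to optimize a botanical injection process."

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(video_descriptions + [pubmed_abstract])
scores = cosine_similarity(tfidf[-1], tfidf[:-1]).ravel()

# Rank the videos by similarity to the abstract, most relevant first.
for score, title in sorted(zip(scores, video_descriptions), reverse=True):
    print(f"{score:.2f}  {title}")
```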

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.