PubMed Article
Building a comprehensive mill-level database for the Industrial Sectors Integrated Solutions (ISIS) model of the U.S. pulp and paper sector.
PLoS ONE
PUBLISHED: 03-26-2015
Air emissions from the U.S. pulp and paper sector have been federally regulated since 1978; however, regulations are periodically reviewed and revised to improve efficiency and effectiveness of existing emission standards. The Industrial Sectors Integrated Solutions (ISIS) model for the pulp and paper sector is currently under development at the U.S. Environmental Protection Agency (EPA), and can be utilized to facilitate multi-pollutant, sector-based analyses that are performed in conjunction with regulatory development. The model utilizes a multi-sector, multi-product dynamic linear modeling framework that evaluates the economic impact of emission reduction strategies for multiple air pollutants. The ISIS model considers facility-level economic, environmental, and technical parameters, as well as sector-level market data, to estimate the impacts of environmental regulations on the pulp and paper industry. Specifically, the model can be used to estimate U.S. and global market impacts of new or more stringent air regulations, such as impacts on product price, exports and imports, market demands, capital investment, and mill closures. One major challenge to developing a representative model is the need for an extensive amount of data. This article discusses the collection and processing of data for use in the model, as well as the methods used for building the ISIS pulp and paper database that facilitates the required analyses to support the air quality management of the pulp and paper sector.
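The abstract describes ISIS as a multi-sector, multi-product dynamic linear modeling framework. As a rough illustration only (not the actual ISIS formulation), the sketch below sets up the kind of linear program such a framework solves: choose mill production levels that meet demand at least cost while respecting capacities and an emission cap. All mills, costs, emission factors, and limits are hypothetical.

```python
# Minimal linear-programming sketch of a sector-model decision problem.
# All numbers are hypothetical and illustrative only.
from scipy.optimize import linprog

cost_per_ton = [410.0, 455.0, 430.0]   # $/ton produced at mills A, B, C (hypothetical)
emis_per_ton = [2.1, 1.4, 1.8]         # tons pollutant per ton product (hypothetical)
capacity     = [800.0, 600.0, 700.0]   # tons/period per mill
demand       = 1500.0                  # total tons/period that must be supplied
emission_cap = 2600.0                  # tons pollutant allowed per period

# linprog minimizes c @ x subject to A_ub @ x <= b_ub and A_eq @ x == b_eq.
res = linprog(
    c=cost_per_ton,
    A_ub=[emis_per_ton],           # total emissions <= cap
    b_ub=[emission_cap],
    A_eq=[[1.0, 1.0, 1.0]],        # total production == demand
    b_eq=[demand],
    bounds=[(0, cap) for cap in capacity],
    method="highs",
)
print("production by mill (tons):", res.x)
print("total cost ($):", res.fun)
```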
ABSTRACT
The kraft process separates lignin from the polysaccharides within the lignocellulose of wood chips, yielding pulp for high-quality paper. Black liquor is a pulping waste generated by the kraft process that has potential for downstream bioconversion. However, the recalcitrant nature of lignocellulose and its chemical derivatives, which constitute the majority of the available organic carbon within black liquor, together with the liquor's basic pH, present challenges to microbial biodegradation of this waste material. The methods described here for collecting and modifying black liquor for microbial growth are aimed at using this pulp waste to convert lignin, organic acids, and polysaccharide degradation byproducts into valuable chemicals. The lignocellulose extraction techniques presented provide a reproducible method for preparing lignocellulose growth substrates and for understanding the metabolic capacities of cultured microorganisms. Gas chromatography-mass spectrometry enables the identification and quantification of the fermentation products that result from microbial growth on pulping waste. Used together, these methods can help determine the metabolic activity of microorganisms with the potential to produce fermentation products that add value to the pulping system and reduce effluent waste, thereby increasing potential paper-milling profits and offering additional uses for black liquor.
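The abstract mentions GC-MS identification and quantification of fermentation products. As a minimal sketch of one common quantification route, the code below fits an external calibration line (peak area vs. standard concentration) and inverts it for unknowns; the standards, peak areas, and the choice of external rather than internal calibration are assumptions, not details from the protocol.

```python
import numpy as np

# Hypothetical calibration standards for one fermentation product (e.g. acetate)
std_conc_mM = np.array([0.5, 1.0, 2.5, 5.0, 10.0])             # known concentrations
std_area    = np.array([1.1e4, 2.2e4, 5.4e4, 1.1e5, 2.2e5])    # GC-MS peak areas

# Linear external calibration: area = slope * conc + intercept
slope, intercept = np.polyfit(std_conc_mM, std_area, 1)

def quantify(peak_area):
    """Convert a sample peak area to concentration (mM) via the calibration line."""
    return (peak_area - intercept) / slope

sample_areas = np.array([3.0e4, 8.7e4])
print("estimated concentrations (mM):", quantify(sample_areas))
```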
21 Related JoVE Articles!
Studying Aggression in Drosophila (fruit flies)
Authors: Sibu Mundiyanapurath, Sarah Certel, Edward A. Kravitz.
Institutions: Harvard Medical School.
Aggression is an innate behavior that evolved in the framework of defending or obtaining resources. This complex social behavior is influenced by genetic, hormonal and environmental factors. In many organisms, aggression is critical to survival but controlling and suppressing aggression in distinct contexts also has become increasingly important. In recent years, invertebrates have become increasingly useful as model systems for investigating the genetic and systems biological basis of complex social behavior. This is in part due to the diverse repertoire of behaviors exhibited by these organisms. In the accompanying video, we outline a method for analyzing aggression in Drosophila whose design encompasses important eco-ethological constraints. Details include steps for: making a fighting chamber; isolating and painting flies; adding flies to the fight chamber; and video taping fights. This approach is currently being used to identify candidate genes important in aggression and in elaborating the neuronal circuitry that underlies the output of aggression and other social behaviors.
Neuroscience, Issue 2, Drosophila, behavior
Towards Biomimicking Wood: Fabricated Free-standing Films of Nanocellulose, Lignin, and a Synthetic Polycation
Authors: Karthik Pillai, Fernando Navarro Arzate, Wei Zhang, Scott Renneckar.
Institutions: Virginia Tech, Illinois Institute of Technology - Moffett Campus, University of Guadalajara.
Woody materials consist of plant cell walls that contain a layered secondary cell wall composed of structural polymers of polysaccharides and lignin. The layer-by-layer (LbL) assembly process, which relies on the assembly of oppositely charged molecules from aqueous solutions, was used to build a freestanding composite film of isolated wood polymers of lignin and oxidized nanofibril cellulose (NFC). To facilitate the assembly of these negatively charged polymers, a positively charged polyelectrolyte, poly(diallyldimethylammonium chloride) (PDDA), was used as a linking layer to create this simplified model cell wall. The layered adsorption process was studied quantitatively using quartz crystal microbalance with dissipation monitoring (QCM-D) and ellipsometry. The results showed that layer mass/thickness per adsorbed layer increased as a function of the total number of layers. The surface coverage of the adsorbed layers was studied with atomic force microscopy (AFM). Complete coverage of the surface with lignin was found in all deposition cycles; however, surface coverage by NFC increased with the number of layers. The adsorption process was carried out for 250 cycles (500 bilayers) on a cellulose acetate (CA) substrate. Transparent free-standing LbL-assembled nanocomposite films were obtained when the CA substrate was later dissolved in acetone. Scanning electron microscopy (SEM) of the fractured cross-sections showed a lamellar structure, and the thickness per adsorption cycle (PDDA-Lignin-PDDA-NC) was estimated to be 17 nm for the two different lignin types used in the study. The data indicate a film with a highly controlled architecture in which nanocellulose and lignin are spatially deposited on the nanoscale (a polymer-polymer nanocomposite), similar to what is observed in the native cell wall.
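QCM-D adsorption data are commonly converted from frequency shift to areal mass with the Sauerbrey relation, Δm = −C·Δf/n, which assumes a thin, rigid, evenly distributed film (dissipative, viscoelastic layers need more elaborate models). The sketch below uses the standard constant for a 5 MHz AT-cut crystal; the frequency shifts are hypothetical, and this is an illustration rather than the analysis used in the article.

```python
# Sauerbrey conversion of QCM-D frequency shifts to areal mass.
# Assumes a thin, rigid film; C = 17.7 ng cm^-2 Hz^-1 is the standard
# mass-sensitivity constant for a 5 MHz AT-cut crystal.
C_SAUERBREY = 17.7  # ng cm^-2 Hz^-1

def sauerbrey_mass(delta_f_hz, overtone_n):
    """Areal mass (ng/cm^2) from the frequency shift of overtone n."""
    return -C_SAUERBREY * delta_f_hz / overtone_n

# Hypothetical shifts measured at the 3rd overtone after successive layers
shifts = [-12.0, -26.5, -40.8]  # Hz
for i, df in enumerate(shifts, start=1):
    print(f"layer {i}: {sauerbrey_mass(df, overtone_n=3):.1f} ng/cm^2")
```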
Plant Biology, Issue 88, nanocellulose, thin films, quartz crystal microbalance, layer-by-layer, LbL
Lignin Down-regulation of Zea mays via dsRNAi and Klason Lignin Analysis
Authors: Sang-Hyuck Park, Rebecca Garlock Ong, Chuansheng Mei, Mariam Sticklen.
Institutions: University of Arizona, Michigan State University, The Institute for Advanced Learning and Research.
To facilitate the use of lignocellulosic biomass as an alternative bioenergy resource, a pretreatment step is needed during biological conversion processes to open up the structure of the plant cell wall and increase the accessibility of the cell wall carbohydrates. Lignin, a polyphenolic material present in many cell wall types, is known to be a significant hindrance to enzyme access. Reducing the lignin content to a level that does not interfere with the structural integrity and defense system of the plant might be a valuable step toward reducing the costs of bioethanol production. In this study, we genetically down-regulated one of the lignin biosynthesis-related genes, cinnamoyl-CoA reductase (ZmCCR1), via a double-stranded RNA interference technique. The ZmCCR1_RNAi construct was integrated into the maize genome using the particle bombardment method. Transgenic maize plants grew normally as compared to the wild-type control plants, without compromised biomass growth or defense mechanisms, with the exception of brown coloration in the leaf mid-ribs, husks, and stems of the transgenic plants. Microscopic analyses, in conjunction with the histological assay, revealed that the leaf sclerenchyma fibers were thinned but that the structure and size of other major vascular system components were not altered. The lignin content in the transgenic maize was reduced by 7-8.7%, the crystalline cellulose content increased in response to the lignin reduction, and hemicelluloses remained unchanged. These analyses may indicate that carbon flow was shifted from lignin biosynthesis to cellulose biosynthesis. This article delineates the procedures used to down-regulate the lignin content in maize via RNAi technology, and the cell wall compositional analyses used to verify the effect of the modifications on the cell wall structure.
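Klason (acid-insoluble) lignin is typically reported as the mass of acid-insoluble residue relative to the oven-dry, extractive-free sample, sometimes with an acid-soluble fraction from a UV measurement added. The arithmetic sketch below uses hypothetical masses chosen to mimic a reduction in the range reported here; consult the article's protocol for the exact procedure.

```python
def klason_lignin_percent(residue_g, sample_dry_g, acid_soluble_percent=0.0):
    """Klason (acid-insoluble) lignin as a percentage of oven-dry sample mass,
    optionally adding an acid-soluble lignin fraction determined by UV."""
    acid_insoluble = 100.0 * residue_g / sample_dry_g
    return acid_insoluble + acid_soluble_percent

# Hypothetical wild-type vs. transgenic measurements
wt = klason_lignin_percent(residue_g=0.0412, sample_dry_g=0.2000)   # ~20.6 %
tg = klason_lignin_percent(residue_g=0.0380, sample_dry_g=0.2000)   # ~19.0 %
print(f"relative lignin reduction: {100.0 * (wt - tg) / wt:.1f} %")
```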
Bioengineering, Issue 89, Zea mays, cinnamoyl-CoA reductase (CCR), dsRNAi, Klason lignin measurement, cell wall carbohydrate analysis, gas chromatography (GC)
Transcript and Metabolite Profiling for the Evaluation of Tobacco Tree and Poplar as Feedstock for the Bio-based Industry
Authors: Colin Ruprecht, Takayuki Tohge, Alisdair Fernie, Cara L. Mortimer, Amanda Kozlo, Paul D. Fraser, Norma Funke, Igor Cesarino, Ruben Vanholme, Wout Boerjan, Kris Morreel, Ingo Burgert, Notburga Gierlinger, Vincent Bulone, Vera Schneider, Andrea Stockero, Juan Navarro-Aviñó, Frank Pudel, Bart Tambuyser, James Hygate, Jon Bumstead, Louis Notley, Staffan Persson.
Institutions: Max Planck Institute for Molecular Plant Physiology, Royal Holloway, University of London, VIB, UGhent, ETH Zurich, EMPA, Royal Institute of Technology (KTH), European Research and Project Office GmbH, ABBA Gaia S.L., Pflanzenöltechnologie, Capax Environmental Services, Green Fuels, Neutral Consulting Ltd, University of Melbourne.
The global demand for food, feed, energy, and water poses extraordinary challenges for future generations. It is evident that robust platforms for the exploration of renewable resources are necessary to overcome these challenges. Within the multinational framework MultiBioPro, we are developing biorefinery pipelines to maximize the use of plant biomass. More specifically, we use poplar and tobacco tree (Nicotiana glauca) as target crop species for improving saccharification, isoprenoid and long-chain hydrocarbon contents, fiber quality, and suberin and lignin contents. The methods used to obtain these outputs include GC-MS, LC-MS, and RNA sequencing platforms. The metabolite pipelines are well-established tools for generating these types of data, but are limited in that only well-characterized metabolites can be analyzed. Deep sequencing allows us to include all transcripts present during the developmental stages of the tobacco tree leaf, but these have to be mapped back to the sequence of Nicotiana tabacum. With these set-ups, we aim at a basic understanding of the underlying processes and at establishing an industrial framework to exploit the outcomes. In the longer term, we believe that the data generated here will provide the means for a sustainable biorefinery process using poplar and tobacco tree as raw materials. To date, the basal levels of metabolites in the samples have been analyzed, and the protocols utilized are provided in this article.
Environmental Sciences, Issue 87, botany, plants, Biorefining, Poplar, Tobacco tree, Arabidopsis, suberin, lignin, cell walls, biomass, long-chain hydrocarbons, isoprenoids, Nicotiana glauca, systems biology
Reduced-gravity Environment Hardware Demonstrations of a Prototype Miniaturized Flow Cytometer and Companion Microfluidic Mixing Technology
Authors: William S. Phipps, Zhizhong Yin, Candice Bae, Julia Z. Sharpe, Andrew M. Bishara, Emily S. Nelson, Aaron S. Weaver, Daniel Brown, Terri L. McKay, DeVon Griffin, Eugene Y. Chan.
Institutions: DNA Medicine Institute, Harvard Medical School, NASA Glenn Research Center, ZIN Technologies.
Until recently, astronaut blood samples were collected in-flight, transported to Earth on the Space Shuttle, and analyzed in terrestrial laboratories. If humans are to travel beyond low Earth orbit, a transition towards space-ready, point-of-care (POC) testing is required. Such testing needs to be comprehensive, easy to perform in a reduced-gravity environment, and unaffected by the stresses of launch and spaceflight. Countless POC devices have been developed to mimic laboratory-scale counterparts, but most have narrow applications and few have demonstrable use in an in-flight, reduced-gravity environment. In fact, demonstrations of biomedical diagnostics in reduced gravity are limited altogether, making component choice and certain logistical challenges difficult to approach when seeking to test new technology. To help fill the void, we present a modular method for the construction and operation of a prototype blood diagnostic device and its associated parabolic flight test rig that meet the standards for flight-testing onboard a parabolic-flight, reduced-gravity aircraft. The method first focuses on rig assembly for in-flight, reduced-gravity testing of a flow cytometer and a companion microfluidic mixing chip. Components are adaptable to other designs, and some custom components, such as a microvolume sample loader and the micromixer, may be of particular interest. The method then shifts focus to flight preparation, offering guidelines and suggestions for a successful flight test with regard to user training, development of a standard operating procedure (SOP), and other issues. Finally, in-flight experimental procedures specific to our demonstrations are described.
Cellular Biology, Issue 93, Point-of-care, prototype, diagnostics, spaceflight, reduced gravity, parabolic flight, flow cytometry, fluorescence, cell counting, micromixing, spiral-vortex, blood mixing
Integrated Field Lysimetry and Porewater Sampling for Evaluation of Chemical Mobility in Soils and Established Vegetation
Authors: Audrey R. Matteson, Denis J. Mahoney, Travis W. Gannon, Matthew L. Polizzotto.
Institutions: North Carolina State University.
Potentially toxic chemicals are routinely applied to land to meet growing demands on waste management and food production, but the fate of these chemicals is often not well understood. Here we demonstrate an integrated field lysimetry and porewater sampling method for evaluating the mobility of chemicals applied to soils and established vegetation. Lysimeters, open columns made of metal or plastic, are driven into bare or vegetated soils. Porewater samplers, which are commercially available and use vacuum to collect percolating soil water, are installed at predetermined depths within the lysimeters. At prearranged times following chemical application to experimental plots, porewater is collected, and lysimeters, containing soil and vegetation, are exhumed. By analyzing chemical concentrations in the lysimeter soil, vegetation, and porewater, downward leaching rates, soil retention capacities, and plant uptake for the chemical of interest may be quantified. Because field lysimetry and porewater sampling are conducted under natural environmental conditions and with minimal soil disturbance, the derived results reflect realistic scenarios and provide valuable information for chemical management. As chemicals are increasingly applied to land worldwide, the described techniques may be utilized to determine whether applied chemicals pose adverse effects to human health or the environment.
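The leaching, retention, and uptake quantities mentioned above amount to a mass balance over the lysimeter compartments. The sketch below partitions a hypothetical applied dose among soil, vegetation, and collected porewater; all concentrations and compartment sizes are invented for illustration and are not from the article.

```python
# Hypothetical partitioning of an applied chemical among lysimeter compartments.
applied_mg = 50.0                          # mass applied to the lysimeter plot

soil_mg       = 0.62 * 40.0    # (mg/kg soil) * (kg soil in column)
vegetation_mg = 0.85 * 1.2     # (mg/kg plant tissue) * (kg tissue harvested)
porewater_mg  = 0.04 * 35.0    # (mg/L porewater) * (L percolate collected)

for name, mass in [("soil", soil_mg), ("vegetation", vegetation_mg),
                   ("porewater (leached)", porewater_mg)]:
    print(f"{name:20s} {mass:6.2f} mg  ({100.0 * mass / applied_mg:4.1f} % of applied)")

unaccounted = applied_mg - (soil_mg + vegetation_mg + porewater_mg)
print(f"{'unaccounted':20s} {unaccounted:6.2f} mg  (degradation, volatilization, error)")
```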
Environmental Sciences, Issue 89, Lysimetry, porewater, soil, chemical leaching, pesticides, turfgrass, waste
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among the biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements, or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Removal of Trace Elements by Cupric Oxide Nanoparticles from Uranium In Situ Recovery Bleed Water and Its Effect on Cell Viability
Authors: Jodi R. Schilz, K. J. Reddy, Sreejayan Nair, Thomas E. Johnson, Ronald B. Tjalkens, Kem P. Krueger, Suzanne Clark.
Institutions: University of New Mexico, University of Wyoming, Colorado State University, California Northstate University.
In situ recovery (ISR) is the predominant method of uranium extraction in the United States. During ISR, uranium is leached from an ore body and extracted through ion exchange. The resultant production bleed water (PBW) contains contaminants such as arsenic and other heavy metals. Samples of PBW from an active ISR uranium facility were treated with cupric oxide nanoparticles (CuO-NPs). CuO-NP treatment of PBW reduced priority contaminants, including arsenic, selenium, uranium, and vanadium. Untreated and CuO-NP-treated PBW was used as the liquid component of the cell growth media, and changes in viability were determined by the MTT (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide) assay in human embryonic kidney (HEK 293) and human hepatocellular carcinoma (Hep G2) cells. CuO-NP treatment was associated with improved HEK 293 and Hep G2 cell viability. Limitations of this method include dilution of the PBW by growth media components and during osmolality adjustment, as well as the necessary pH adjustment. The method is limited in its wider context by these dilution effects and by the changes in the pH of the PBW, which is traditionally slightly acidic; however, it could have broader use in assessing CuO-NP treatment of more neutral waters.
Environmental Sciences, Issue 100, Energy production, uranium in situ recovery, water decontamination, nanoparticles, toxicity, cytotoxicity, in vitro cell culture
Phage Phenomics: Physiological Approaches to Characterize Novel Viral Proteins
Authors: Savannah E. Sanchez, Daniel A. Cuevas, Jason E. Rostron, Tiffany Y. Liang, Cullen G. Pivaroff, Matthew R. Haynes, Jim Nulton, Ben Felts, Barbara A. Bailey, Peter Salamon, Robert A. Edwards, Alex B. Burgin, Anca M. Segall, Forest Rohwer.
Institutions: San Diego State University, Argonne National Laboratory, Broad Institute.
Current investigations into phage-host interactions are dependent on extrapolating knowledge from (meta)genomes. Interestingly, 60 - 95% of all phage sequences share no homology to currently annotated proteins. As a result, a large proportion of phage genes are annotated as hypothetical. This reality heavily affects the annotation of both structural and auxiliary metabolic genes. Here we present phenomic methods designed to capture the physiological response(s) of a selected host during expression of one of these unknown phage genes. Multi-phenotype Assay Plates (MAPs) are used to monitor the diversity of host substrate utilization and subsequent biomass formation, while metabolomics provides byproduct analysis by monitoring metabolite abundance and diversity. Both tools are used simultaneously to provide a phenotypic profile associated with expression of a single putative phage open reading frame (ORF). Representative results for both methods are compared, highlighting the phenotypic profile differences of a host carrying either putative structural or metabolic phage genes. In addition, the visualization techniques and high throughput computational pipelines that facilitated experimental analysis are presented.
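MAP data are essentially growth curves (e.g., optical density over time) across many substrates, and common summary metrics are the area under the growth curve and the maximum specific growth rate. The sketch below computes both from a hypothetical logistic-like curve; it is not the computational pipeline referenced in the article.

```python
import numpy as np

# Hypothetical MAP well: optical density measured hourly on one substrate.
time_h = np.arange(0, 25, 1.0)
od = 0.05 + 0.9 / (1.0 + np.exp(-(time_h - 12.0) / 2.0))  # logistic-like curve

auc = np.trapz(od - od[0], time_h)              # area under curve above the baseline
mu = np.max(np.gradient(np.log(od), time_h))    # rough maximum specific growth rate (1/h)

print(f"area under growth curve: {auc:.2f} OD*h")
print(f"approximate max specific growth rate: {mu:.3f} 1/h")
```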
Immunology, Issue 100, phenomics, phage, viral metagenome, Multi-phenotype Assay Plates (MAPs), continuous culture, metabolomics
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
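As a minimal sketch of the DoE idea described above, the code below generates a two-level full factorial design for three coded factors and fits a linear model with two-way interactions by least squares. The factor names, responses, and the use of a full factorial are illustrative assumptions; the study itself used software-guided optimal designs with stepwise augmentation.

```python
import itertools
import numpy as np

# Two-level full factorial design for three hypothetical factors
# (coded -1/+1): e.g., incubation temperature, plant age, 5'UTR variant.
levels = [-1.0, 1.0]
design = np.array(list(itertools.product(levels, repeat=3)))  # 8 runs x 3 factors

# Hypothetical measured responses (e.g., DsRed fluorescence) for the 8 runs.
y = np.array([52.0, 61.0, 48.0, 66.0, 55.0, 70.0, 50.0, 78.0])

# Model matrix: intercept, main effects, and two-way interactions.
A, B, C = design.T
X = np.column_stack([np.ones(len(y)), A, B, C, A * B, A * C, B * C])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, c in zip(["intercept", "A", "B", "C", "A:B", "A:C", "B:C"], coef):
    print(f"{name:9s} {c:+7.2f}")
```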
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes and includes discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
Assembly, Loading, and Alignment of an Analytical Ultracentrifuge Sample Cell
Authors: Andrea Balbo, Huaying Zhao, Patrick H. Brown, Peter Schuck.
Institutions: Dynamics of Macromolecular Assembly, Laboratory of Bioengineering and Physical Science.
The analytical ultracentrifuge (AUC) is a powerful biophysical tool that allows us to record macromolecular sedimentation profiles during high speed centrifugation. When properly planned and executed, an AUC sedimentation velocity or sedimentation equilibrium experiment can reveal a great deal about a protein with regard to size and shape, sample purity, sedimentation coefficient, oligomerization states, and protein-protein interactions. This technique, however, requires a rigorous level of technical attention. Sample cells hold a sectored centerpiece sandwiched between two window assemblies. They are sealed with a torque pressure of around 120-140 in-lb. Reference buffer and sample are loaded into the centerpiece sectors and, after sealing, the cells are precisely aligned in a titanium rotor so that the optical detection systems scan both sample and reference buffer in the same radial path midline through each centerpiece sector while rotating at speeds of up to 60,000 rpm and under very high vacuum. Not only is proper sample cell assembly critical, sample cell components are very expensive and must be properly cared for to ensure they are in optimum working condition in order to avoid leaks and breakage during experiments. Handle windows carefully, for even the slightest crack or scratch can lead to breakage in the centrifuge. The contact between centerpiece and windows must be as tight as possible; i.e., no Newton's rings should be visible after torque pressure is applied. Dust, lint, scratches, and oils on either the windows or the centerpiece all compromise this contact and can very easily lead to leaking of solutions from one sector to another or out of the centerpiece altogether. Not only are precious samples lost, leaking of solutions during an experiment will cause an imbalance of pressure in the cell that often leads to broken windows and centerpieces. In addition, plug gaskets and housing plugs must be securely in place to avoid solutions being pulled out of the centerpiece sector through the loading holes by the high vacuum in the centrifuge chamber. Window liners and gaskets must be free of breaks and cracks that could cause movement resulting in broken windows. This video demonstrates our procedures for sample cell assembly, torquing, loading, and rotor alignment to help minimize component damage, solution leaking, and breakage during the perfect AUC experiment.
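For sedimentation velocity, the sedimentation coefficient follows from the motion of the boundary, s = (1/ω²)·d ln r/dt, usually reported in Svedbergs (1 S = 1e-13 s). The sketch below fits ln r against time for hypothetical scan data; real analyses typically fit the full Lamm equation with dedicated software, so this is only the classic back-of-the-envelope estimate.

```python
import numpy as np

rpm = 50000.0
omega = 2.0 * np.pi * rpm / 60.0          # angular velocity, rad/s

# Hypothetical boundary midpoint radius (cm) at successive scan times (s).
t_s = np.array([0.0, 600.0, 1200.0, 1800.0, 2400.0])
r_cm = np.array([6.000, 6.030, 6.061, 6.091, 6.122])

# s = (d ln r / dt) / omega^2 ; fit ln r vs t and take the slope.
slope, _ = np.polyfit(t_s, np.log(r_cm), 1)
s_seconds = slope / omega**2
print(f"sedimentation coefficient: {s_seconds / 1e-13:.2f} S")
```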
Basic Protocols, Issue 33, analytical ultracentrifugation, sedimentation velocity, sedimentation equilibrium, protein characterization, sedimentation coefficient
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
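The outcomes reported above (time in target range, severe hypoglycemia < 40 mg/dL, clinical hypoglycemia < 70 mg/dL, hyperglycemia > 180 mg/dL) are simple proportions of point-of-care readings. The sketch below computes them from a hypothetical list of readings; the target range used is an assumed example, since protocols define their own.

```python
def glucose_metrics(readings_mg_dl, target=(80, 150)):
    """Fraction of blood-glucose readings in each category.
    The target range is an assumed example; protocols define their own."""
    n = len(readings_mg_dl)
    lo, hi = target
    return {
        "in target range":      sum(lo <= bg <= hi for bg in readings_mg_dl) / n,
        "severe hypo (<40)":    sum(bg < 40 for bg in readings_mg_dl) / n,
        "clinical hypo (<70)":  sum(bg < 70 for bg in readings_mg_dl) / n,
        "hyperglycemia (>180)": sum(bg > 180 for bg in readings_mg_dl) / n,
    }

# Hypothetical hourly readings for one patient-day
readings = [210, 185, 160, 142, 128, 119, 105, 98, 92, 101, 88, 65, 95, 110]
for k, v in glucose_metrics(readings).items():
    print(f"{k:22s} {100 * v:5.1f} %")
```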
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Long-term Lethal Toxicity Test with the Crustacean Artemia franciscana
Authors: Loredana Manfra, Federica Savorelli, Marco Pisapia, Erika Magaletti, Anna Maria Cicero.
Institutions: Institute for Environmental Protection and Research, Regional Agency for Environmental Protection in Emilia-Romagna.
Our research activities target the use of biological methods for the evaluation of environmental quality, with particular reference to saltwater/brackish water and sediment. The choice of biological indicators must be based on reliable scientific knowledge and, where possible, on the availability of standardized procedures. In this article, we present a standardized protocol that uses the marine crustacean Artemia to evaluate the toxicity of chemicals and/or of marine environmental matrices. Scientists have proposed the brine shrimp (Artemia) as a suitable candidate for the development of a standard bioassay for worldwide use. A number of papers have been published on the toxic effects of various chemicals and toxicants on brine shrimp (Artemia). The major advantage of this crustacean for toxicity studies is the ready availability of dry cysts; these can be used immediately in testing, without the need for difficult cultivation1,2. Cyst-based toxicity assays are cheap, continuously available, simple, and reliable, and are thus an important answer to routine needs for toxicity screening, industrial monitoring requirements, or regulatory purposes3. The proposed method uses mortality as the endpoint. The number of survivors is counted and the percentage of deaths calculated. Larvae are considered dead if they do not exhibit any internal or external movement during several seconds of observation4. This procedure was standardized by testing a reference substance (sodium dodecyl sulfate); some results are reported in this work. This article accompanies a video that describes the performance of the toxicity testing procedure, showing all the steps of the protocol.
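Since mortality is the endpoint, the basic calculation is a percentage of dead larvae; when control mortality is non-zero, toxicologists often apply Abbott's correction. The sketch below computes raw and Abbott-corrected mortality from hypothetical counts; the cited protocol may handle control mortality differently.

```python
def percent_mortality(dead, total):
    return 100.0 * dead / total

def abbott_corrected(treated_pct, control_pct):
    """Abbott's formula: mortality attributable to the treatment alone."""
    return 100.0 * (treated_pct - control_pct) / (100.0 - control_pct)

# Hypothetical counts after long-term exposure to one SDS concentration
control = percent_mortality(dead=2, total=30)    # ~6.7 %
treated = percent_mortality(dead=17, total=30)   # ~56.7 %

print(f"raw treated mortality:      {treated:.1f} %")
print(f"Abbott-corrected mortality: {abbott_corrected(treated, control):.1f} %")
```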
Chemistry, Issue 62, Artemia franciscana, bioassays, chemical substances, crustaceans, marine environment
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economic methods of finding the lowest-cost solution in the watershed context (e.g.,5,12,20) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates the modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226 and a user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between the costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
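The sketch below illustrates the bare multiobjective idea: each candidate is a binary vector of practice placements evaluated for (cost, pollution), and the nondominated set approximates the cost-pollution tradeoff frontier. The pollution surrogate is a toy stand-in for the SWAT model, and the selection scheme is far simpler than SPEA2; nothing here reflects the actual program described in the article.

```python
import random

N_FIELDS = 30
COST_PER_FIELD = [random.uniform(1.0, 5.0) for _ in range(N_FIELDS)]   # $ (toy)
LOAD_REDUCTION = [random.uniform(0.5, 3.0) for _ in range(N_FIELDS)]   # kg N (toy)
BASELINE_LOAD = 60.0

def evaluate(x):
    """Return (cost, pollution) for a placement vector x (1 = practice installed).
    A real application would call a watershed model such as SWAT here."""
    cost = sum(c for c, xi in zip(COST_PER_FIELD, x) if xi)
    load = BASELINE_LOAD - sum(r for r, xi in zip(LOAD_REDUCTION, x) if xi)
    return cost, max(load, 0.0)

def dominates(a, b):
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def pareto_front(pop):
    scored = [(x, evaluate(x)) for x in pop]
    return [(x, f) for x, f in scored
            if not any(dominates(g, f) for _, g in scored if g != f)]

def mutate(x, rate=0.05):
    return [1 - xi if random.random() < rate else xi for xi in x]

# Simple evolutionary loop: keep nondominated solutions, refill by mutation.
pop = [[random.randint(0, 1) for _ in range(N_FIELDS)] for _ in range(60)]
for _ in range(100):
    parents = [x for x, _ in pareto_front(pop)]
    pop = parents + [mutate(random.choice(parents)) for _ in range(60 - len(parents))]

for _, (cost, load) in sorted(pareto_front(pop), key=lambda p: p[1][0])[:5]:
    print(f"cost ${cost:6.1f}  ->  remaining load {load:5.1f} kg")
```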
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Isolation, Characterization and Comparative Differentiation of Human Dental Pulp Stem Cells Derived from Permanent Teeth by Using Two Different Methods
Authors: Razieh Karamzadeh, Mohamadreza Baghaban Eslaminejad, Reza Aflatoonian.
Institutions: Royan Institute for Stem Cell Biology and Technology, ACECR, Tehran, Iran, Royan Institute for Reproductive Biomedicine, ACECR, Tehran, Iran.
Developing wisdom teeth are an easily accessible source of stem cells during adulthood and can be obtained through routine orthodontic treatments. Human dental pulp-derived stem cells (hDPSCs) possess high proliferation potential and multi-lineage differentiation capacity compared to ordinary sources of adult stem cells1-8; therefore, hDPSCs are good candidates for autologous transplantation in tissue engineering and regenerative medicine. Along with these benefits, possessing mesenchymal stem cell (MSC) features, such as an immunomodulatory effect, makes hDPSCs even more valuable in the case of allograft transplantation6,9,10. Therefore, the primary step in using this source of stem cells is to select the best protocol for isolating hDPSCs from pulp tissue. To achieve this goal, it is crucial to investigate the effect of various isolation conditions on different cellular behaviors, such as common surface markers and differentiation capacity. Here, we separate human pulp tissue from impacted third molar teeth and then use both existing protocols from the literature for isolating hDPSCs11-13, i.e., enzymatic dissociation of pulp tissue (DPSC-ED) or outgrowth from tissue explants (DPSC-OG). We tried to facilitate the isolation methods by using a dental diamond disk. The cells were then characterized in terms of stromal-associated markers (CD73, CD90, CD105, and CD44), hematopoietic/endothelial markers (CD34, CD45, and CD11b), a perivascular marker (CD146), and STRO-1. Afterwards, the two protocols were compared based on differentiation potency into odontoblasts by both quantitative polymerase chain reaction (qPCR) and Alizarin Red staining. qPCR was used to assess the expression of the mineralization-related genes alkaline phosphatase (ALP), matrix extracellular phosphoglycoprotein (MEPE), and dentin sialophosphoprotein (DSPP).14
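Relative expression from qPCR data is commonly calculated with the Livak 2^(−ΔΔCt) method against a housekeeping gene. The sketch below uses hypothetical Ct values; the article does not state which normalization scheme it used, so this is only an illustration of the general calculation.

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Livak 2^(-ddCt) fold change of a target gene vs. a reference gene,
    comparing a sample condition to a control condition."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Hypothetical Ct values: DSPP in differentiated DPSC-ED cells vs. undifferentiated,
# normalized to a housekeeping gene such as GAPDH (assumed for illustration).
fold = relative_expression(ct_target_sample=24.1, ct_ref_sample=17.8,
                           ct_target_control=27.6, ct_ref_control=17.9)
print(f"DSPP fold change: {fold:.1f}x")
```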
Stem Cell Biology, Issue 69, Medicine, Developmental Biology, Cellular Biology, Bioengineering, Dental pulp tissue, Human third molar, Human dental pulp stem cells, hDPSC, Odontoblasts, Outgrown stem cells, MSC, differentiation
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
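The abstract states that the assay signals are linear and correlate with the number of cells plated. The sketch below checks that claim on hypothetical data by fitting signal against cells plated and reporting R², then expresses treated wells as a percentage of control; all values are invented for illustration.

```python
import numpy as np

# Hypothetical standard curve: cells plated per well vs. assay signal (e.g., ATP luminescence)
cells = np.array([5e3, 1e4, 2e4, 4e4, 8e4])
signal = np.array([1.1e4, 2.3e4, 4.4e4, 9.1e4, 1.8e5])

slope, intercept = np.polyfit(cells, signal, 1)
predicted = slope * cells + intercept
r_squared = 1.0 - np.sum((signal - predicted) ** 2) / np.sum((signal - signal.mean()) ** 2)
print(f"linearity of signal vs. cells plated: R^2 = {r_squared:.3f}")

# Viability of treated wells expressed as percent of untreated controls
control_wells = np.array([9.0e4, 8.8e4, 9.3e4])
treated_wells = np.array([5.1e4, 4.8e4, 5.4e4])
print(f"viability: {100.0 * treated_wells.mean() / control_wells.mean():.1f} % of control")
```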
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Identifying DNA Mutations in Purified Hematopoietic Stem/Progenitor Cells
Authors: Ziming Cheng, Ting Zhou, Azhar Merchant, Thomas J. Prihoda, Brian L. Wickes, Guogang Xu, Christi A. Walter, Vivienne I. Rebel.
Institutions: UT Health Science Center at San Antonio.
In recent years, it has become apparent that genomic instability is tightly related to many developmental disorders, cancers, and aging. Given that stem cells are responsible for ensuring tissue homeostasis and repair throughout life, it is reasonable to hypothesize that the stem cell population is critical for preserving genomic integrity of tissues. Therefore, significant interest has arisen in assessing the impact of endogenous and environmental factors on genomic integrity in stem cells and their progeny, aiming to understand the etiology of stem-cell based diseases. LacI transgenic mice carry a recoverable λ phage vector encoding the LacI reporter system, in which the LacI gene serves as the mutation reporter. The result of a mutated LacI gene is the production of β-galactosidase that cleaves a chromogenic substrate, turning it blue. The LacI reporter system is carried in all cells, including stem/progenitor cells and can easily be recovered and used to subsequently infect E. coli. After incubating infected E. coli on agarose that contains the correct substrate, plaques can be scored; blue plaques indicate a mutant LacI gene, while clear plaques harbor wild-type. The frequency of blue (among clear) plaques indicates the mutant frequency in the original cell population the DNA was extracted from. Sequencing the mutant LacI gene will show the location of the mutations in the gene and the type of mutation. The LacI transgenic mouse model is well-established as an in vivo mutagenesis assay. Moreover, the mice and the reagents for the assay are commercially available. Here we describe in detail how this model can be adapted to measure the frequency of spontaneously occurring DNA mutants in stem cell-enriched Lin-IL7R-Sca-1+cKit++(LSK) cells and other subpopulations of the hematopoietic system.
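As described above, the mutant frequency is simply the number of blue (mutant) plaques divided by the total plaques screened. The sketch below adds a Wilson score confidence interval, since mutant plaques are rare events; the plaque counts are hypothetical, and the interval choice is an assumption rather than part of the published assay.

```python
import math

def mutant_frequency(blue_plaques, total_plaques, z=1.96):
    """Mutant frequency (blue / total) with a Wilson score 95% confidence interval."""
    n, p = total_plaques, blue_plaques / total_plaques
    denom = 1.0 + z * z / n
    center = (p + z * z / (2.0 * n)) / denom
    half = z * math.sqrt(p * (1.0 - p) / n + z * z / (4.0 * n * n)) / denom
    return p, (center - half, center + half)

# Hypothetical plaque counts from LSK-cell DNA packaged and plated on E. coli
freq, (lo, hi) = mutant_frequency(blue_plaques=12, total_plaques=250000)
print(f"mutant frequency: {freq:.2e}  (95% CI {lo:.2e} - {hi:.2e})")
```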
Infection, Issue 84, In vivo mutagenesis, hematopoietic stem/progenitor cells, LacI mouse model, DNA mutations, E. coli
A Sensitive and Specific Quantitation Method for Determination of Serum Cardiac Myosin Binding Protein-C by Electrochemiluminescence Immunoassay
Authors: Diederik W.D. Kuster, David Barefield, Suresh Govindan, Sakthivel Sadayappan.
Institutions: Loyola University Chicago.
Biomarkers are becoming increasingly more important in clinical decision-making, as well as basic science. Diagnosing myocardial infarction (MI) is largely driven by detecting cardiac-specific proteins in patients' serum or plasma as an indicator of myocardial injury. Having recently shown that cardiac myosin binding protein-C (cMyBP-C) is detectable in the serum after MI, we have proposed it as a potential biomarker for MI. Biomarkers are typically detected by traditional sandwich enzyme-linked immunosorbent assays. However, this technique requires a large sample volume, has a small dynamic range, and can measure only one protein at a time. Here we show a multiplex immunoassay in which three cardiac proteins can be measured simultaneously with high sensitivity. Measuring cMyBP-C in uniplex or together with creatine kinase MB and cardiac troponin I showed comparable sensitivity. This technique uses the Meso Scale Discovery (MSD) method of multiplexing in a 96-well plate combined with electrochemiluminescence for detection. While only small sample volumes are required, high sensitivity and a large dynamic range are achieved. Using this technique, we measured cMyBP-C, creatine kinase MB, and cardiac troponin I levels in serum samples from 16 subjects with MI and compared the results with 16 control subjects. We were able to detect all three markers in these samples and found all three biomarkers to be increased after MI. This technique is, therefore, suitable for the sensitive detection of cardiac biomarkers in serum samples.
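Immunoassay concentrations are typically read off a four-parameter logistic (4PL) standard curve fitted to calibrator signals (the MSD software normally performs this fit). The sketch below fits and inverts a 4PL with scipy on hypothetical cMyBP-C calibrators and electrochemiluminescence counts; nothing here is taken from the article's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, low, high, ec50, hill):
    """Four-parameter logistic: signal as a function of analyte concentration."""
    return high + (low - high) / (1.0 + (x / ec50) ** hill)

# Hypothetical cMyBP-C calibrators (ng/ml) and ECL counts
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
counts = np.array([120, 180, 420, 1500, 5200, 12000, 16500])

params, _ = curve_fit(four_pl, conc, counts,
                      p0=[100.0, 17000.0, 10.0, 1.0], bounds=(0, np.inf))

def invert_4pl(signal, low, high, ec50, hill):
    """Invert the 4PL to recover concentration from a sample signal."""
    return ec50 * ((low - signal) / (signal - high)) ** (1.0 / hill)

print(f"sample at 3000 counts ~ {invert_4pl(3000.0, *params):.2f} ng/ml")
```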
Molecular Biology, Issue 78, Cellular Biology, Biochemistry, Genetics, Biomedical Engineering, Medicine, Cardiology, Heart Diseases, Myocardial Ischemia, Myocardial Infarction, Cardiovascular Diseases, cardiovascular disease, immunoassay, cardiac myosin binding protein-C, cardiac troponin I, creatine kinase MB, electrochemiluminescence, multiplex biomarkers, ELISA, assay
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research question(s). However, many researchers are new to this topic and so it presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user-guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. The vision for this article would be that it could serve as a practical training tool for researchers embarking on phylogenetic studies and also serve as an educational resource that could be incorporated into a classroom or teaching-lab.
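The pipeline described above relies on dedicated programs for alignment, model selection, and maximum-likelihood or Bayesian tree building. As a deliberately simple, self-contained illustration of getting from an alignment to a tree, the sketch below builds a neighbor-joining (distance) tree with Biopython; it assumes a pre-aligned FASTA file named "aligned.fasta" and is not a substitute for the ML and Bayesian analyses the article covers.

```python
# A simple distance-based (neighbor-joining) tree with Biopython,
# as a stand-in illustration of the alignment-to-tree step.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("aligned.fasta", "fasta")    # assumed pre-aligned sequences

calculator = DistanceCalculator("identity")           # pairwise identity distances
distance_matrix = calculator.get_distance(alignment)

tree = DistanceTreeConstructor().nj(distance_matrix)  # neighbor-joining tree
Phylo.draw_ascii(tree)                                # quick text visualization
Phylo.write(tree, "nj_tree.nwk", "newick")            # save for a tree viewer
```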
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Quantification of Heavy Metals and Other Inorganic Contaminants on the Productivity of Microalgae
Authors: Katerine Napan, Derek Hess, Brian McNeil, Jason C. Quinn.
Institutions: Utah State University.
Increasing demand for renewable fuels has researchers investigating the feasibility of alternative feedstocks, such as microalgae. Inherent advantages include high potential yield, use of non-arable land, and integration with waste streams. The nutrient requirements of a large-scale microalgae production system will require the coupling of cultivation systems with industrial waste resources, such as carbon dioxide from flue gas and nutrients from wastewater. Inorganic contaminants present in these wastes can bioaccumulate in microalgal biomass, negatively impacting productivity and limiting end use. This study focuses on the experimental evaluation of the impact and the fate of 14 inorganic contaminants (As, Cd, Co, Cr, Cu, Hg, Mn, Ni, Pb, Sb, Se, Sn, V and Zn) on Nannochloropsis salina growth. Microalgae were cultivated in photobioreactors illuminated at 984 µmol m-2 sec-1 and maintained at pH 7 in a growth medium polluted with inorganic contaminants at levels expected based on the composition found in commercial coal flue gas systems. Contaminants present in the biomass and the medium at the end of a 7 day growth period were analytically quantified through cold vapor atomic absorption spectrometry for Hg and through inductively coupled plasma mass spectrometry for As, Cd, Co, Cr, Cu, Mn, Ni, Pb, Sb, Se, Sn, V and Zn. Results show that N. salina is sensitive to this multi-metal environment, with a statistical decrease in biomass yield upon the introduction of these contaminants. The techniques presented here are adequate for quantifying algal growth and determining the fate of inorganic contaminants.
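The reported "statistical decrease in biomass yield" is the kind of comparison a two-sample test provides. The sketch below compares hypothetical control and contaminant-exposed biomass yields with Welch's t-test; the replicate values are invented, and the study's actual statistical design may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical end-of-run biomass yields (g/L) from replicate photobioreactors
control = np.array([2.10, 2.25, 2.05, 2.18])
contaminated = np.array([1.62, 1.71, 1.55, 1.68])

# Welch's t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(control, contaminated, equal_var=False)

print(f"mean control:      {control.mean():.2f} g/L")
print(f"mean contaminated: {contaminated.mean():.2f} g/L")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```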
Environmental Sciences, Issue 101, algae, heavy metals, Nannochloropsis salina, photobioreactor, flue gas, inductively coupled plasma mass spectrometry, ICPMS, cold vapor atomic absorption spectrometry, CVAAS
What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.