Pubmed Article
A comparison of the inflammatory and proteolytic effects of dung biomass and cigarette smoke exposure in the lung.
PLoS ONE
Biomass is the energy source for cooking and heating for billions of people worldwide. Despite their prevalent use and their potential impact on global health, the effects of these fuels on lung biology and function remain poorly understood.
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Published: 11-28-2014
ABSTRACT
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics for biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
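As a rough illustration (not part of the published protocol) of two of the simplest characterization steps mentioned above, the sketch below computes a gravimetric moisture percentage and screens measured metal concentrations against user-supplied limits; the limit and sample values are placeholders, not the actual IBI thresholds or the study's data.

# Minimal sketch, assuming placeholder values; substitute the real IBI limits.
def moisture_percent(mass_wet_g, mass_dry_g):
    """Moisture content as a percentage of the as-received (wet) mass."""
    return 100.0 * (mass_wet_g - mass_dry_g) / mass_wet_g

def screen_metals(measured_mg_kg, limits_mg_kg):
    """Return the metals whose measured concentration exceeds its limit."""
    return {m: c for m, c in measured_mg_kg.items() if c > limits_mg_kg.get(m, float("inf"))}

measured = {"As": 25.0, "Cr": 120.0, "Cu": 180.0, "Pb": 150.0}   # hypothetical values, mg/kg
limits   = {"As": 13.0, "Cr": 93.0,  "Cu": 143.0, "Pb": 121.0}   # placeholder limits, mg/kg
print(moisture_percent(105.2, 98.7))    # ~6.2 %
print(screen_metals(measured, limits))  # metals flagged for follow-up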
26 Related JoVE Articles!
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Culturing of Human Nasal Epithelial Cells at the Air Liquid Interface
Authors: Loretta Müller, Luisa E. Brighton, Johnny L. Carson, William A. Fischer II, Ilona Jaspers.
Institutions: The University of North Carolina at Chapel Hill, The University of North Carolina at Chapel Hill, The University of North Carolina at Chapel Hill, The University of North Carolina at Chapel Hill.
In vitro models using human primary epithelial cells are essential in understanding key functions of the respiratory epithelium in the context of microbial infections or inhaled agents. Direct comparisons of cells obtained from diseased populations allow us to characterize different phenotypes and dissect the underlying mechanisms mediating changes in epithelial cell function. Culturing epithelial cells from the human tracheobronchial region has been well documented, but is limited by the availability of human lung tissue or the invasiveness associated with obtaining bronchial brush biopsies. Nasal epithelial cells are obtained through much less invasive superficial nasal scrape biopsies, and subjects can be biopsied multiple times with no significant side effects. Additionally, the nose is the entry point to the respiratory system and therefore one of the first sites to be exposed to any kind of air-borne stressor, such as microbial agents, pollutants, or allergens. Briefly, nasal epithelial cells obtained from human volunteers are expanded on coated tissue culture plates, and then transferred onto cell culture inserts. Upon reaching confluency, cells continue to be cultured at the air-liquid interface (ALI) for several weeks, which creates more physiologically relevant conditions. The ALI culture condition uses defined media leading to a differentiated epithelium that exhibits morphological and functional characteristics similar to the human nasal epithelium, with both ciliated and mucus-producing cells. Tissue culture inserts with differentiated nasal epithelial cells can be manipulated in a variety of ways depending on the research questions (treatment with pharmacological agents, transduction with lentiviral vectors, exposure to gases, or infection with microbial agents) and analyzed for numerous different endpoints ranging from cellular and molecular pathways to functional changes and morphology. In vitro models of differentiated human nasal epithelial cells will enable investigators to address novel and important research questions by using organotypic experimental models that largely mimic the nasal epithelium in vivo.
Cellular Biology, Issue 80, Epithelium, Cell culture models, ciliated, air pollution, co-culture models, nasal epithelium
Quantitative In vitro Assay to Measure Neutrophil Adhesion to Activated Primary Human Microvascular Endothelial Cells under Static Conditions
Authors: Kevin Wilhelmsen, Katherine Farrar, Judith Hellman.
Institutions: University of California, San Francisco, University of California, San Francisco.
The vascular endothelium plays an integral part in the inflammatory response. During the acute phase of inflammation, endothelial cells (ECs) are activated by host mediators or directly by conserved microbial components or host-derived danger molecules. Activated ECs express cytokines, chemokines and adhesion molecules that mobilize, activate and retain leukocytes at the site of infection or injury. Neutrophils are the first leukocytes to arrive, and adhere to the endothelium through a variety of adhesion molecules present on the surfaces of both cells. The main functions of neutrophils are to directly eliminate microbial threats, promote the recruitment of other leukocytes through the release of additional factors, and initiate wound repair. Therefore, their recruitment and attachment to the endothelium is a critical step in the initiation of the inflammatory response. In this report, we describe an in vitro neutrophil adhesion assay using calcein AM-labeled primary human neutrophils to quantitate the extent of microvascular endothelial cell activation under static conditions. This method has the additional advantage that the same samples quantitated by fluorescence spectrophotometry can also be visualized directly using fluorescence microscopy for a more qualitative assessment of neutrophil binding.
Immunology, Issue 78, Cellular Biology, Infection, Molecular Biology, Medicine, Biomedical Engineering, Biophysics, Endothelium, Vascular, Neutrophils, Inflammation, Inflammation Mediators, Neutrophil, Leukocyte Adhesion, Endothelial cells, assay
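To make the quantitation step concrete, here is a minimal sketch (not the authors' analysis script) of how calcein fluorescence of adherent neutrophils might be expressed as a percentage of the total fluorescence added per well, after subtracting a no-cell background; all plate-reader values are hypothetical.

import numpy as np

background = np.array([52.0, 48.0, 50.0])          # wells with buffer only (no neutrophils)
total_input = np.array([4120.0, 4050.0, 4190.0])   # fluorescence of the full neutrophil input
adherent = np.array([910.0, 1480.0, 1395.0])       # wells after washing off non-adherent cells

bg = background.mean()
percent_adhesion = 100.0 * (adherent - bg) / (total_input.mean() - bg)
print(percent_adhesion.round(1))   # % of added neutrophils retained per well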
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen for its particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
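As a simplified sketch of the general logic of converting plate-reader fluorescence into a potential activity rate (nmol MUB g^-1 dry soil hr^-1): a standard-curve emission coefficient converts net fluorescence to nmol of released fluorophore, which is then scaled from the assay aliquot to the whole slurry and normalized by time and soil mass. The function and numbers below are illustrative assumptions, not the published calculation, and omit quench, substrate, and homogenate control corrections; check against the protocol before use.

def potential_activity(net_fluorescence, emission_coef, buffer_vol_ml,
                       aliquot_vol_ml, incubation_hr, dry_soil_g):
    # emission_coef: fluorescence units per nmol MUB, from a standard curve
    nmol_released = net_fluorescence / emission_coef
    # scale from the assay-well aliquot up to the whole soil slurry
    nmol_in_slurry = nmol_released * (buffer_vol_ml / aliquot_vol_ml)
    return nmol_in_slurry / (incubation_hr * dry_soil_g)

# hypothetical numbers: 1 g soil homogenized in 91 ml buffer, 0.2 ml per well, 3 hr incubation
print(potential_activity(net_fluorescence=1250.0, emission_coef=95.0,
                         buffer_vol_ml=91.0, aliquot_vol_ml=0.2,
                         incubation_hr=3.0, dry_soil_g=1.0))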
Videomorphometric Analysis of Hypoxic Pulmonary Vasoconstriction of Intra-pulmonary Arteries Using Murine Precision Cut Lung Slices
Authors: Renate Paddenberg, Petra Mermer, Anna Goldenberg, Wolfgang Kummer.
Institutions: Justus-Liebig-University.
Acute alveolar hypoxia causes hypoxic pulmonary vasoconstriction (HPV) - also known as the von Euler-Liljestrand mechanism - which serves to match lung perfusion to ventilation. To date, the underlying mechanisms are not fully understood. The major vascular segment contributing to HPV is the intra-acinar artery. This vessel section is responsible for the blood supply of an individual acinus, which is defined as the portion of lung distal to a terminal bronchiole. Intra-acinar arteries are mostly located in that part of the lung that cannot be selectively reached by a number of commonly used techniques such as measurement of the pulmonary artery pressure in isolated perfused lungs or force recordings from dissected proximal pulmonary artery segments1,2. The analysis of subpleural vessels by real-time confocal laser scanning luminescence microscopy is limited to vessels of up to 50 µm in diameter3. We provide a technique to study HPV of murine intra-pulmonary arteries with inner diameters in the range of 20-100 µm. It is based on the videomorphometric analysis of cross-sectioned arteries in precision cut lung slices (PCLS). This method allows the quantitative measurement of vasoreactivity of small intra-acinar arteries with inner diameters between 20 and 40 µm, which are located at gussets of alveolar septa next to alveolar ducts, and of larger pre-acinar arteries with inner diameters between 40 and 100 µm, which run adjacent to bronchi and bronchioles. In contrast to real-time imaging of subpleural vessels in anesthetized and ventilated mice, videomorphometric analysis of PCLS occurs under conditions free of shear stress. In our experimental model both arterial segments exhibit a monophasic HPV when exposed to medium gassed with 1% O2, and the response fades after 30-40 min of hypoxia.
Medicine, Issue 83, Hypoxic pulmonary vasoconstriction, murine lungs, precision cut lung slices, intra-pulmonary, pre- and intra-acinar arteries, videomorphometry
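The usual videomorphometric readout is the cross-sectional luminal area of a vessel expressed as a percentage of its pre-hypoxic baseline; the sketch below illustrates that normalization with hypothetical segmented areas, and is not code from the article.

import numpy as np

time_min = np.arange(0, 60, 10)                                      # minutes after switching gas
area_um2 = np.array([1210.0, 1190.0, 860.0, 790.0, 905.0, 1100.0])   # segmented lumen areas (made up)

baseline = area_um2[0]                      # area before exposure to 1% O2
area_pct = 100.0 * area_um2 / baseline
print(dict(zip(time_min.tolist(), area_pct.round(1))))
# constriction appears as values < 100%; a fading response drifts back toward baseline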
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
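The end product of such a DoE analysis is a set of predictive equations; as a generic illustration (not the authors' model or software output), the sketch below fits a two-factor quadratic response-surface model by ordinary least squares. The factor names and response values are hypothetical stand-ins (e.g. x1 = incubation temperature, x2 = plant age; y = DsRed yield).

import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1, 1])     # coded factor levels
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0])
y  = np.array([12.0, 15.5, 14.2, 21.0, 18.1, 17.8, 18.4, 14.9, 17.2])

# model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2]).astype(float)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef.round(2))   # b0, b1, b2, b12, b11, b22 of the fitted predictive equation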
Towards Biomimicking Wood: Fabricated Free-standing Films of Nanocellulose, Lignin, and a Synthetic Polycation
Authors: Karthik Pillai, Fernando Navarro Arzate, Wei Zhang, Scott Renneckar.
Institutions: Virginia Tech, Virginia Tech, Illinois Institute of Technology- Moffett Campus, University of Guadalajara, Virginia Tech, Virginia Tech.
Woody materials are comprised of plant cell walls that contain a layered secondary cell wall composed of structural polymers of polysaccharides and lignin. The layer-by-layer (LbL) assembly process, which relies on the assembly of oppositely charged molecules from aqueous solutions, was used to build a freestanding composite film of isolated wood polymers of lignin and oxidized nanofibril cellulose (NFC). To facilitate the assembly of these negatively charged polymers, a positively charged polyelectrolyte, poly(diallyldimethylammonium chloride) (PDDA), was used as a linking layer to create this simplified model cell wall. The layered adsorption process was studied quantitatively using quartz crystal microbalance with dissipation monitoring (QCM-D) and ellipsometry. The results showed that layer mass/thickness per adsorbed layer increased as a function of the total number of layers. The surface coverage of the adsorbed layers was studied with atomic force microscopy (AFM). Complete coverage of the surface with lignin was found in all deposition cycles for this system; however, surface coverage by NFC increased with the number of layers. The adsorption process was carried out for 250 cycles (500 bilayers) on a cellulose acetate (CA) substrate. Transparent free-standing LbL-assembled nanocomposite films were obtained when the CA substrate was later dissolved in acetone. Scanning electron microscopy (SEM) of the fractured cross-sections showed a lamellar structure, and the thickness per adsorption cycle (PDDA-Lignin-PDDA-NC) was estimated to be 17 nm for the two different lignin types used in the study. The data indicate a film with highly controlled architecture in which nanocellulose and lignin are spatially deposited on the nanoscale (a polymer-polymer nanocomposite), similar to what is observed in the native cell wall.
Plant Biology, Issue 88, nanocellulose, thin films, quartz crystal microbalance, layer-by-layer, LbL
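For thin, rigid films, QCM-D frequency shifts are commonly converted to areal mass with the Sauerbrey relation, delta_m = -C * delta_f / n, with C approximately 17.7 ng cm^-2 Hz^-1 for a 5 MHz crystal and n the overtone number; it underestimates mass for soft, hydrated layers. The sketch below is a generic illustration with made-up frequency shifts, not the study's data or analysis software.

def sauerbrey_mass(delta_f_hz, overtone=3, c_ng_cm2_hz=17.7):
    # areal mass (ng/cm^2) from the frequency shift of a given overtone
    return -c_ng_cm2_hz * delta_f_hz / overtone

# hypothetical shifts after successive PDDA / lignin / NFC deposition steps
for layer, df in [("PDDA", -6.2), ("lignin", -18.5), ("NFC", -31.0)]:
    print(layer, round(sauerbrey_mass(df), 1), "ng/cm^2")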
Transcript and Metabolite Profiling for the Evaluation of Tobacco Tree and Poplar as Feedstock for the Bio-based Industry
Authors: Colin Ruprecht, Takayuki Tohge, Alisdair Fernie, Cara L. Mortimer, Amanda Kozlo, Paul D. Fraser, Norma Funke, Igor Cesarino, Ruben Vanholme, Wout Boerjan, Kris Morreel, Ingo Burgert, Notburga Gierlinger, Vincent Bulone, Vera Schneider, Andrea Stockero, Juan Navarro-Aviñó, Frank Pudel, Bart Tambuyser, James Hygate, Jon Bumstead, Louis Notley, Staffan Persson.
Institutions: Max Planck Institute for Molecular Plant Physiology, Royal Holloway, University of London, VIB, UGhent, ETH Zurich, EMPA, Royal Institute of Technology (KTH), European Research and Project Office GmbH, ABBA Gaia S.L., Pflanzenöltechnologie, Capax Environmental Services, Green Fuels, Neutral Consulting Ltd, University of Melbourne.
The global demand for food, feed, energy, and water poses extraordinary challenges for future generations. It is evident that robust platforms for the exploration of renewable resources are necessary to overcome these challenges. Within the multinational framework MultiBioPro we are developing biorefinery pipelines to maximize the use of plant biomass. More specifically, we use poplar and tobacco tree (Nicotiana glauca) as target crop species for improving saccharification, isoprenoid and long chain hydrocarbon contents, fiber quality, and suberin and lignin contents. The methods used to obtain these outputs include GC-MS, LC-MS and RNA sequencing platforms. The metabolite pipelines are well-established tools for generating these types of data, but are limited in that only well-characterized metabolites can be used. The deep sequencing will allow us to include all transcripts present during the developmental stages of the tobacco tree leaf, but these have to be mapped back to the sequence of Nicotiana tabacum. With these set-ups, we aim at a basic understanding of the underlying processes and at establishing an industrial framework to exploit the outcomes. In a more long-term perspective, we believe that the data generated here will provide the means for a sustainable biorefinery process using poplar and tobacco tree as raw materials. To date, the basal levels of metabolites in the samples have been analyzed, and the protocols utilized are provided in this article.
Environmental Sciences, Issue 87, botany, plants, Biorefining, Poplar, Tobacco tree, Arabidopsis, suberin, lignin, cell walls, biomass, long-chain hydrocarbons, isoprenoids, Nicotiana glauca, systems biology
In vitro Cell Culture Model for Toxic Inhaled Chemical Testing
Authors: Shama Ahmad, Aftab Ahmad, Keith B. Neeves, Tara Hendry-Hofer, Joan E. Loader, Carl W. White, Livia Veress.
Institutions: University of Colorado, Colorado School of Mines.
Cell cultures are indispensable for developing and studying the efficacy of therapeutic agents prior to their use in animal models. We have the unique ability to model well-differentiated human airway epithelium and heart muscle cells. This is an invaluable tool for studying the deleterious effects of toxic inhaled chemicals, such as chlorine, which normally interact with cell surfaces and form various byproducts upon reacting with water, limiting their effects in submerged cultures. Our model using well-differentiated human airway epithelial cell cultures at the air-liquid interface circumvents this limitation and provides an opportunity to evaluate critical mechanisms of toxicity of potentially poisonous inhaled chemicals. We describe enhanced loss of membrane integrity, caspase release and death upon exposure to a toxic inhaled chemical such as chlorine. In this article, we propose methods to model chlorine exposure in mammalian heart and airway epithelial cells in culture and simple tests to evaluate its effect on these cell types.
Bioengineering, Issue 87, air-liquid interface, chlorine exposure, toxic inhaled chemicals, Transepithelial Electrical Resistance, Immunocytochemistry
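Transepithelial electrical resistance (TEER), one of the endpoints listed above, is conventionally reported in ohm * cm^2 after subtracting a blank insert and multiplying by the insert's growth area; the values below are hypothetical and the snippet is only an illustration of that arithmetic.

def teer_ohm_cm2(measured_ohm, blank_ohm, membrane_area_cm2):
    return (measured_ohm - blank_ohm) * membrane_area_cm2

area = 1.12                                         # cm^2, e.g. a 12-well insert
baseline = teer_ohm_cm2(780.0, 120.0, area)
post_exposure = teer_ohm_cm2(310.0, 120.0, area)
print(baseline, post_exposure, round(100.0 * post_exposure / baseline, 1))
# a drop in TEER after chlorine exposure indicates loss of epithelial barrier integrity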
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
Evaluation of Integrated Anaerobic Digestion and Hydrothermal Carbonization for Bioenergy Production
Authors: M. Toufiq Reza, Maja Werner, Marcel Pohl, Jan Mumme.
Institutions: Leibniz Institute for Agricultural Engineering.
Lignocellulosic biomass is one of the most abundant yet underutilized renewable energy resources. Both anaerobic digestion (AD) and hydrothermal carbonization (HTC) are promising technologies for bioenergy production from biomass, in terms of biogas and HTC biochar, respectively. In this study, the combination of AD and HTC is proposed to increase overall bioenergy production. Wheat straw was anaerobically digested in a novel upflow anaerobic solid state reactor (UASS) under both mesophilic (37 °C) and thermophilic (55 °C) conditions. Wet digestate from thermophilic AD was hydrothermally carbonized at 230 °C for 6 hr for HTC biochar production. At thermophilic temperature, the UASS system yielded an average of 165 L CH4/kgVS (VS: volatile solids), and 121 L CH4/kgVS at mesophilic temperature, over 200 days of continuous operation. Meanwhile, 43.4 g of HTC biochar with an energy content of 29.6 MJ/kg (dry biochar) was obtained from HTC of 1 kg digestate (dry basis) from mesophilic AD. The combination of AD and HTC, in this particular set of experiments, yielded 13.2 MJ of energy per 1 kg of dry wheat straw, which is at least 20% higher than HTC alone and 60.2% higher than AD alone.
Environmental Sciences, Issue 88, Biomethane, Hydrothermal Carbonization (HTC), Calorific Value, Lignocellulosic Biomass, UASS, Anaerobic Digestion
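A back-of-the-envelope sketch of how the two energy streams can be combined is given below; the methane heating value (~36 MJ per standard m^3) is an assumed round number and the input quantities are illustrative, not the paper's mass and energy balance.

CH4_HEATING_VALUE_MJ_PER_M3 = 36.0      # assumed approximate heating value of methane

def biogas_energy_mj(ch4_liters):
    return (ch4_liters / 1000.0) * CH4_HEATING_VALUE_MJ_PER_M3

def biochar_energy_mj(biochar_kg, hhv_mj_per_kg):
    return biochar_kg * hhv_mj_per_kg

# e.g. 165 L CH4 plus 43.4 g of biochar at 29.6 MJ/kg
combined = biogas_energy_mj(165.0) + biochar_energy_mj(0.0434, 29.6)
print(round(combined, 2), "MJ (illustrative combination of the two streams)")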
Creating Dynamic Images of Short-lived Dopamine Fluctuations with lp-ntPET: Dopamine Movies of Cigarette Smoking
Authors: Evan D. Morris, Su Jin Kim, Jenna M. Sullivan, Shuo Wang, Marc D. Normandin, Cristian C. Constantinescu, Kelly P. Cosgrove.
Institutions: Yale University, Yale University, Yale University, Yale University, Massachusetts General Hospital, University of California, Irvine.
We describe experimental and statistical steps for creating dopamine movies of the brain from dynamic PET data. The movies represent minute-to-minute fluctuations of dopamine induced by smoking a cigarette. The smoker is imaged during a natural smoking experience while other possible confounding effects (such as head motion, expectation, novelty, or aversion to smoking repeatedly) are minimized. We present the details of our unique analysis. Conventional methods for PET analysis estimate time-invariant kinetic model parameters which cannot capture short-term fluctuations in neurotransmitter release. Our analysis - yielding a dopamine movie - is based on our work with kinetic models and other decomposition techniques that allow for time-varying parameters 1-7. This aspect of the analysis - temporal-variation - is key to our work. Because our model is also linear in parameters, it is practical, computationally, to apply at the voxel level. The analysis technique is comprised of five main steps: pre-processing, modeling, statistical comparison, masking and visualization. Preprocessing is applied to the PET data with a unique 'HYPR' spatial filter 8 that reduces spatial noise but preserves critical temporal information. Modeling identifies the time-varying function that best describes the dopamine effect on 11C-raclopride uptake. The statistical step compares the fit of our (lp-ntPET) model 7 to a conventional model 9. Masking restricts treatment to those voxels best described by the new model. Visualization maps the dopamine function at each voxel to a color scale and produces a dopamine movie. Interim results and sample dopamine movies of cigarette smoking are presented.
Behavior, Issue 78, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Medicine, Anatomy, Physiology, Image Processing, Computer-Assisted, Receptors, Dopamine, Dopamine, Functional Neuroimaging, Binding, Competitive, mathematical modeling (systems analysis), Neurotransmission, transient, dopamine release, PET, modeling, linear, time-invariant, smoking, F-test, ventral-striatum, clinical techniques
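The statistical step described above compares a conventional model against the richer time-varying model at each voxel; a generic way to do this for nested least-squares fits is an extra-sum-of-squares F-test, sketched below with hypothetical residual sums of squares and frame/parameter counts (this is an illustration of the idea, not the authors' pipeline).

from scipy import stats

n_frames, p_restricted, p_full = 90, 3, 7      # hypothetical frame count and parameter counts
rss_restricted, rss_full = 131.4, 104.9        # hypothetical weighted RSS from the two fits

F = ((rss_restricted - rss_full) / (p_full - p_restricted)) / (rss_full / (n_frames - p_full))
p_value = stats.f.sf(F, p_full - p_restricted, n_frames - p_full)
print(round(F, 2), round(p_value, 5))          # voxels passing the threshold enter the mask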
Evaluation of Respiratory System Mechanics in Mice using the Forced Oscillation Technique
Authors: Toby K. McGovern, Annette Robichaud, Liah Fereydoonzad, Thomas F. Schuessler, James G. Martin.
Institutions: McGill University , SCIREQ Scientific Respiratory Equipment Inc..
The forced oscillation technique (FOT) is a powerful, integrative and translational tool permitting the experimental assessment of lung function in mice in a comprehensive, detailed, precise and reproducible manner. It provides measurements of respiratory system mechanics through the analysis of pressure and volume signals acquired in reaction to predefined, small amplitude, oscillatory airflow waveforms, which are typically applied at the subject's airway opening. The present protocol details the steps required to adequately execute forced oscillation measurements in mice using a computer-controlled piston ventilator (flexiVent; SCIREQ Inc, Montreal, Qc, Canada). The description is divided into four parts: preparatory steps, mechanical ventilation, lung function measurements, and data analysis. It also includes details of how to assess airway responsiveness to inhaled methacholine in anesthetized mice, a common application of this technique which also extends to other outcomes and various lung pathologies. Measurements obtained in naïve mice as well as from an oxidative-stress driven model of airway damage are presented to illustrate how this tool can contribute to a better characterization and understanding of studied physiological changes or disease models as well as to applications in new research areas.
Medicine, Issue 75, Biomedical Engineering, Anatomy, Physiology, Biophysics, Pathology, lung diseases, asthma, respiratory function tests, respiratory system, forced oscillation technique, respiratory system mechanics, airway hyperresponsiveness, flexiVent, lung physiology, lung, oxidative stress, ventilator, cannula, mice, animal model, clinical techniques
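One of the simplest analyses of the recorded pressure, flow, and volume signals is the single-compartment equation of motion, Pao(t) = Rrs * V'(t) + Ers * V(t) + P0, fitted by multiple linear regression; the sketch below uses synthetic signals with made-up parameter values and is only illustrative (the flexiVent software also fits richer models, such as the constant-phase model).

import numpy as np

t = np.linspace(0, 1, 200)                              # s
volume = 0.25 * np.sin(2 * np.pi * 2.5 * t)             # mL, ~2.5 Hz oscillation
flow = np.gradient(volume, t)                           # mL/s
pressure = 0.9 * flow + 28.0 * volume + 2.0             # cmH2O, made-up Rrs, Ers, P0

X = np.column_stack([flow, volume, np.ones_like(t)])
rrs, ers, p0 = np.linalg.lstsq(X, pressure, rcond=None)[0]
print(round(rrs, 2), round(ers, 1), round(p0, 2))       # recovers 0.9, 28.0, 2.0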
Protein Transfection of Mouse Lung
Authors: Patrick Geraghty, Robert Foronjy.
Institutions: St. Luke's Roosevelt Medical Center.
Increasing protein expression enables researchers to better understand the functional role of that protein in regulating key biological processes1. In the lung, this has typically been achieved through genetic approaches that utilize transgenic mice2,3 or viral or non-viral vectors that elevate protein levels via increased gene expression4. Transgenic mice are costly and time-consuming to generate, and the random insertion of a transgene or chronic gene expression can alter normal lung development and thus limit the utility of the model5. While conditional transgenics avert problems associated with chronic gene expression6, the reverse tetracycline-controlled transactivator (rtTA) mice, which are used to generate conditional expression, develop spontaneous air space enlargement7. As with transgenics, the use of viral and non-viral vectors is expensive8 and can provoke dose-dependent inflammatory responses that confound results9 and hinder expression10. Moreover, the efficacy of repeated doses is limited by enhanced immune responses to the vector11,12. Researchers are developing adeno-associated viral (AAV) vectors that provoke less inflammation and have longer expression within the lung13. Using β-galactosidase, we present a method for rapidly and effectively increasing protein expression within the lung using a direct protein transfection technique. This protocol mixes a fixed amount of purified protein with 20 μl of a lipid-based transfection reagent (Pro-Ject, Pierce Bio) to allow penetration into the lung tissue itself. The liposomal protein mixture is then injected into the lungs of the mice via the trachea using a microsprayer (Penn Century, Philadelphia, PA). The microsprayer generates a fine plume of liquid aerosol throughout the lungs. Using this technique, we have demonstrated uniform deposition of the injected protein throughout the airways and the alveoli of mice14. The lipid transfection technique allows the use of a small amount of protein to achieve an effect. This limits the inflammatory response that otherwise would be provoked by high protein administration. Indeed, using this technique we previously reported that we were able to significantly increase PP2A activity in the lung without affecting lung lavage cellularity15. Lung lavage cellularity taken 24 hr after challenge was comparable to controls (27±4 control vs. 31±5 albumin transfected; N=6 per group). Moreover, it increases protein levels without inducing lung developmental changes or architectural changes that can occur in transgenic models. However, the need for repeated administrations may make this technique less favorable for studies examining the effects of long-term increases in protein expression. This would be particularly true for proteins with short half-lives.
Molecular Biology, Issue 75, Medicine, Biomedical Engineering, Bioengineering, Biochemistry, Genetics, Cellular Biology, Anatomy, Physiology, Proteins, Torso, Tissues, Cells, Animal Structures, Respiratory System, Eukaryota, Immune System Diseases, Respiratory Tract Diseases, Natural Science Disciplines, Life Sciences (General), transfection, lung, protein, mice, inflammation, animal model
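For readers who want to reproduce the kind of comparison reported above (27±4 vs. 31±5, n = 6 per group), a two-sample t-test from summary statistics is one option; the sketch assumes the "±" values are standard deviations (if they are SEMs, convert with SD = SEM * sqrt(n) first) and is not taken from the article.

from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(mean1=27.0, std1=4.0, nobs1=6,
                                             mean2=31.0, std2=5.0, nobs2=6)
print(round(t_stat, 2), round(p_value, 3))   # non-significant, consistent with "comparable to controls"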
Comprehensive Compositional Analysis of Plant Cell Walls (Lignocellulosic biomass) Part I: Lignin
Authors: Cliff E. Foster, Tina M. Martin, Markus Pauly.
Institutions: Michigan State University (MSU), Michigan State University (MSU).
The need for renewable, carbon neutral, and sustainable raw materials for industry and society has become one of the most pressing issues for the 21st century. This has rekindled interest in the use of plant products as industrial raw materials for the production of liquid fuels for transportation1 and other products such as biocomposite materials7. Plant biomass remains one of the greatest untapped reserves on the planet4. It is mostly comprised of cell walls that are composed of energy-rich polymers including cellulose, various hemicelluloses (matrix polysaccharides), and the polyphenol lignin6, and is thus sometimes termed lignocellulosics. However, plant cell walls have evolved to be recalcitrant to degradation as walls provide tensile strength to cells and the entire plant, ward off pathogens, and allow water to be transported throughout the plant; in the case of trees up to more than 100 m above ground level. Due to the various functions of walls, there is an immense structural diversity within the walls of different plant species and cell types within a single plant4. Hence, depending on which crop species, crop variety, or plant tissue is used for a biorefinery, the processing steps for depolymerization by chemical/enzymatic processes and subsequent fermentation of the various sugars to liquid biofuels need to be adjusted and optimized. This fact underpins the need for a thorough characterization of plant biomass feedstocks. Here we describe a comprehensive analytical methodology that enables the determination of the composition of lignocellulosics and is amenable to medium- to high-throughput analysis. In this first part we focus on the analysis of the polyphenol lignin (Figure 1). The method starts off with preparing destarched cell wall material. The resulting lignocellulosic material is then split up to determine its lignin content by acetyl bromide solubilization3, and its lignin composition in terms of its syringyl, guaiacyl and p-hydroxyphenyl units5. The protocol for analyzing the carbohydrates in lignocellulosic biomass, including cellulose content and matrix polysaccharide composition, is discussed in Part II2.
Plant Biology, Issue 37, cell walls, lignin, GC-MS, biomass, compositional analysis
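The acetyl bromide soluble lignin (ABSL) determination ultimately reduces to Beer-Lambert arithmetic: the absorbance of the solubilized lignin at ~280 nm is converted to a concentration with a literature extinction coefficient and then to a percentage of the sample mass. The sketch below is a generic illustration; the coefficient (17.75 L g^-1 cm^-1 is one commonly cited value for grasses) and all sample numbers are placeholders to be replaced with values appropriate for the species and protocol.

def absl_percent(abs_280, extinction_l_g_cm, path_cm, volume_ml, sample_mg):
    conc_g_per_l = abs_280 / (extinction_l_g_cm * path_cm)    # Beer-Lambert
    lignin_mg = conc_g_per_l * (volume_ml / 1000.0) * 1000.0  # g/L -> g -> mg in the assay volume
    return 100.0 * lignin_mg / sample_mg

print(round(absl_percent(abs_280=0.45, extinction_l_g_cm=17.75,
                         path_cm=1.0, volume_ml=10.0, sample_mg=2.0), 1))  # % ABSL, illustrative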
Comprehensive Compositional Analysis of Plant Cell Walls (Lignocellulosic biomass) Part II: Carbohydrates
Authors: Cliff E. Foster, Tina M. Martin, Markus Pauly.
Institutions: Michigan State University (MSU), Michigan State University (MSU).
The need for renewable, carbon neutral, and sustainable raw materials for industry and society has become one of the most pressing issues for the 21st century. This has rekindled interest in the use of plant products as industrial raw materials for the production of liquid fuels for transportation2 and other products such as biocomposite materials6. Plant biomass remains one of the greatest untapped reserves on the planet4. It is mostly comprised of cell walls that are composed of energy-rich polymers including cellulose, various hemicelluloses, and the polyphenol lignin5, and is thus sometimes termed lignocellulosics. However, plant cell walls have evolved to be recalcitrant to degradation as walls contribute extensively to the strength and structural integrity of the entire plant. Despite its necessary rigidity, the cell wall is a highly dynamic entity that is metabolically active and plays crucial roles in numerous cell activities such as plant growth and differentiation5. Due to the various functions of walls, there is an immense structural diversity within the walls of different plant species and cell types within a single plant4. Hence, depending on which crop species, crop variety, or plant tissue is used for a biorefinery, the processing steps for depolymerization by chemical/enzymatic processes and subsequent fermentation of the various sugars to liquid biofuels need to be adjusted and optimized. This fact underpins the need for a thorough characterization of plant biomass feedstocks. Here we describe a comprehensive analytical methodology that enables the determination of the composition of lignocellulosics and is amenable to medium- to high-throughput analysis (Figure 1). The method starts off with preparing destarched cell wall material. The resulting lignocellulosic material is then split up to determine the monosaccharide composition of its hemicelluloses and other matrix polysaccharides1, and its content of crystalline cellulose7. The protocol for analyzing the lignin components in lignocellulosic biomass is discussed in Part I3.
Plant Biology, Issue 37, cell walls, polysaccharide, cellulose, hemicellulose, sugar composition, GC-MS
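When monosaccharides released by hydrolysis are quantified (e.g. by GC-MS), their masses are usually converted back to the corresponding anhydro-sugar masses in the polymer using the standard water-loss corrections (162/180 for hexoses, 132/150 for pentoses). The sketch below illustrates only that bookkeeping with made-up values; it is not the article's calculation sheet.

ANHYDRO = {"Glc": 162.0 / 180.0, "Gal": 162.0 / 180.0, "Man": 162.0 / 180.0,
           "Xyl": 132.0 / 150.0, "Ara": 132.0 / 150.0}

measured_ug = {"Glc": 45.0, "Xyl": 120.0, "Ara": 22.0, "Gal": 8.0, "Man": 5.0}  # illustrative
polysaccharide_ug = {s: m * ANHYDRO[s] for s, m in measured_ug.items()}
total = sum(polysaccharide_ug.values())
weight_pct = {s: round(100.0 * v / total, 1) for s, v in polysaccharide_ug.items()}
print(weight_pct)   # weight % of each anhydro sugar in the matrix polysaccharide fraction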
Label-free in situ Imaging of Lignification in Plant Cell Walls
Authors: Martin Schmidt, Pradeep Perera, Adam M. Schwartzberg, Paul D. Adams, P. James Schuck.
Institutions: University of California, Berkeley, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Meeting growing energy demands safely and efficiently is a pressing global challenge. Therefore, research into biofuels production that seeks to find cost-effective and sustainable solutions has become a topical and critical task. Lignocellulosic biomass is poised to become the primary source of biomass for the conversion to liquid biofuels1-6. However, the recalcitrance of these plant cell wall materials to cost-effective and efficient degradation presents a major impediment for their use in the production of biofuels and chemicals4. In particular, lignin, a complex and irregular poly-phenylpropanoid heteropolymer, is problematic for the postharvest deconstruction of lignocellulosic biomass. For example, in biomass conversion for biofuels, it inhibits saccharification in processes aimed at producing simple sugars for fermentation7. The effective use of plant biomass for industrial purposes is in fact largely dependent on the extent to which the plant cell wall is lignified. The removal of lignin is a costly and limiting factor8, and lignin has therefore become a key plant breeding and genetic engineering target in order to improve cell wall conversion. Analytical tools that permit the accurate and rapid characterization of lignification in plant cell walls are becoming increasingly important for evaluating a large number of breeding populations. Extractive procedures for the isolation of native components such as lignin are inevitably destructive, bringing about significant chemical and structural modifications9-11. Analytical chemical in situ methods are thus invaluable tools for the compositional and structural characterization of lignocellulosic materials. Raman microscopy is a technique that relies on inelastic or Raman scattering of monochromatic light, like that from a laser, where the shift in energy of the laser photons is related to molecular vibrations and presents an intrinsic label-free molecular "fingerprint" of the sample. Raman microscopy can afford non-destructive and comparatively inexpensive measurements with minimal sample preparation, giving insights into chemical composition and molecular structure in a close to native state. Chemical imaging by confocal Raman microscopy has been previously used for the visualization of the spatial distribution of cellulose and lignin in wood cell walls12-14. Based on these earlier results, we have recently adopted this method to compare lignification in wild type and lignin-deficient transgenic Populus trichocarpa (black cottonwood) stem wood15. Analyzing the lignin Raman bands16,17 in the spectral region between 1,600 and 1,700 cm-1, lignin signal intensity and localization were mapped in situ. Our approach visualized differences in lignin content, localization, and chemical composition. Most recently, we demonstrated Raman imaging of cell wall polymers in Arabidopsis thaliana with sub-μm lateral resolution18. Here, this method is presented, affording visualization of lignin in plant cell walls and comparison of lignification in different tissues, samples, or species without staining or labeling of the tissues.
Plant Biology, Issue 45, Raman microscopy, lignin, poplar wood, Arabidopsis thaliana
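Conceptually, a lignin map is produced by integrating each pixel's Raman spectrum over the 1,600-1,700 cm^-1 band; the sketch below shows that reduction on a random hyperspectral cube (summed intensity as a simple stand-in for band integration) and is not the authors' imaging software.

import numpy as np

wavenumber = np.linspace(300, 3200, 1024)          # cm^-1 axis of each spectrum
cube = np.random.rand(64, 64, wavenumber.size)     # (y, x, spectrum); random placeholder data

band = (wavenumber >= 1600) & (wavenumber <= 1700) # lignin band
lignin_map = cube[:, :, band].sum(axis=2)          # per-pixel band intensity
print(lignin_map.shape, float(lignin_map.max()))   # 64 x 64 lignin intensity image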
Isolation of Mouse Respiratory Epithelial Cells and Exposure to Experimental Cigarette Smoke at Air Liquid Interface
Authors: Hilaire C. Lam, Augustine M.K. Choi, Stefan W. Ryter.
Institutions: Harvard Medical School, University of Pittsburgh.
Pulmonary epithelial cells can be isolated from the respiratory tract of mice and cultured at air-liquid interface (ALI) as a model of differentiated respiratory epithelium. A protocol is described for isolating and exposing these cells to mainstream cigarette smoke (CS), in order to study epithelial cell responses to CS exposure. The protocol consists of three parts: the isolation of airway epithelial cells from mouse trachea, the culturing of these cells at air-liquid interface (ALI) as fully differentiated epithelial cells, and the delivery of calibrated mainstream CS to these cells in culture. The ALI culture system allows the culture of respiratory epithelia under conditions that more closely resemble their physiological setting than ordinary liquid culture systems. The study of molecular and lung cellular responses to CS exposure is a critical component of understanding the impact of environmental air pollution on human health. Research findings in this area may ultimately contribute towards understanding the etiology of chronic obstructive pulmonary disease (COPD), and other tobacco-related diseases, which represent major global health problems.
Medicine, Issue 48, Air-Liquid Interface, Cell isolation, Cigarette smoke, Epithelial cells
GENPLAT: an Automated Platform for Biomass Enzyme Discovery and Cocktail Optimization
Authors: Jonathan Walton, Goutami Banerjee, Suzana Car.
Institutions: Michigan State University, Michigan State University.
The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex lattice (fractional factorial) mixture models are designed using commercial Design of Experiment statistical software. Enzyme mixtures of high complexity are constructed using robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations. GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such as T. reesei. Proteins can also be purified from commercial enzyme cocktails (e.g., Multifect Xylanase, Novozyme 188). An increasing number of pure enzymes, including glycosyl hydrolases, cell wall-active esterases, proteases, and lyases, are available from commercial sources, e.g., Megazyme, Inc. (www.megazyme.com), NZYTech (www.nzytech.com), and PROZOMIX (www.prozomix.com). Design-Expert software (Stat-Ease, Inc.) is used to create simplex-lattice designs and to analyze responses (in this case, Glc and Xyl release). Mixtures contain 4-20 components, which can vary in proportion between 0 and 100%. Assay points typically include the extreme vertices with a sufficient number of intervening points to generate a valid model. In the terminology of experimental design, most of our studies are "mixture" experiments, meaning that the sum of all components adds to a total fixed protein loading (expressed as mg/g glucan). The number of mixtures in the simplex-lattice depends on both the number of components in the mixture and the degree of polynomial (quadratic or cubic). For example, a 6-component experiment will entail 63 separate reactions with an augmented special cubic model, which can detect three-way interactions, whereas only 23 individual reactions are necessary with an augmented quadratic model. For mixtures containing more than eight components, a quadratic experimental design is more practical, and in our experience such models are usually statistically valid.
All enzyme loadings are expressed as a percentage of the final total loading (which for our experiments is typically 15 mg protein/g glucan). For "core" enzymes, the lower percentage limit is set to 5%. This limit was derived from our experience in which yields of Glc and/or Xyl were very low if any core enzyme was present at 0%. Poor models result from too many samples showing very low Glc or Xyl yields. Setting a lower limit in turn determines an upper limit. That is, for a six-component experiment, if the lower limit for each single component is set to 5%, then the upper limit of each single component will be 75%. The lower limits of all other enzymes considered as "accessory" are set to 0%. "Core" and "accessory" are somewhat arbitrary designations and will differ depending on the substrate, but in our studies the core enzymes for release of Glc from corn stover comprise the following enzymes from T. reesei: CBH1 (also known as Cel7A), CBH2 (Cel6A), EG1(Cel7B), BG (β-glucosidase), EX3 (endo-β1,4-xylanase, GH10), and BX (β-xylosidase).
Bioengineering, Issue 56, cellulase, cellobiohydrolase, glucanase, xylanase, hemicellulase, experimental design, biomass, bioenergy, corn stover, glycosyl hydrolase
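To make the constrained mixture-design bookkeeping concrete, the sketch below (a generic illustration, not Design-Expert's algorithm) enumerates the points of a {6, 2} simplex lattice and maps them onto real enzyme proportions with a 5% lower bound on each core enzyme via the pseudo-component transform x_i = L_i + (1 - sum(L)) * z_i; note that with six components at a 5% floor, each vertex lands at 75%, matching the upper limit described above. Augmented designs add further points (e.g. center and axial blends) on top of these lattice points.

from itertools import combinations_with_replacement

def simplex_lattice(q, m):
    """All q-component proportion vectors on a {q, m} simplex lattice."""
    points = set()
    for combo in combinations_with_replacement(range(q), m):
        counts = tuple(combo.count(i) / m for i in range(q))
        points.add(counts)
    return sorted(points)

def to_real_proportions(z, lower_bounds):
    slack = 1.0 - sum(lower_bounds)
    return tuple(round(l + slack * zi, 3) for l, zi in zip(lower_bounds, z))

q, m = 6, 2                           # 6 enzymes, quadratic-degree lattice
lower = [0.05] * q                    # 5% floor on each core enzyme
design = [to_real_proportions(z, lower) for z in simplex_lattice(q, m)]
print(len(design), design[0])         # 21 lattice points before augmentation; vertex maxes at 0.75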
Continuously-stirred Anaerobic Digester to Convert Organic Wastes into Biogas: System Setup and Basic Operation
Authors: Joseph G. Usack, Catherine M. Spirito, Largus T. Angenent.
Institutions: Cornell University.
Anaerobic digestion (AD) is a bioprocess that is commonly used to convert complex organic wastes into a useful biogas with methane as the energy carrier 1-3. Increasingly, AD is being used in industrial, agricultural, and municipal waste(water) treatment applications 4,5. The use of AD technology allows plant operators to reduce waste disposal costs and offset energy utility expenses. In addition to treating organic wastes, energy crops are being converted into the energy carrier methane 6,7. As the application of AD technology broadens for the treatment of new substrates and co-substrate mixtures 8, so does the demand for a reliable testing methodology at the pilot- and laboratory-scale. Anaerobic digestion systems have a variety of configurations, including the continuously stirred tank reactor (CSTR), plug flow (PF), and anaerobic sequencing batch reactor (ASBR) configurations 9. The CSTR is frequently used in research due to its simplicity in design and operation, but also for its advantages in experimentation. Compared to other configurations, the CSTR provides greater uniformity of system parameters, such as temperature, mixing, chemical concentration, and substrate concentration. Ultimately, when designing a full-scale reactor, the optimum reactor configuration will depend on the character of a given substrate among many other nontechnical considerations. However, all configurations share fundamental design features and operating parameters that render the CSTR appropriate for most preliminary assessments. If researchers and engineers use an influent stream with relatively high concentrations of solids, then lab-scale bioreactor configurations cannot be fed continuously due to plugging problems of lab-scale pumps with solids or settling of solids in tubing. For that scenario with continuous mixing requirements, lab-scale bioreactors are fed periodically and we refer to such configurations as continuously stirred anaerobic digesters (CSADs). This article presents a general methodology for constructing, inoculating, operating, and monitoring a CSAD system for the purpose of testing the suitability of a given organic substrate for long-term anaerobic digestion. The construction section of this article will cover building the lab-scale reactor system. The inoculation section will explain how to create an anaerobic environment suitable for seeding with an active methanogenic inoculum. The operating section will cover operation, maintenance, and troubleshooting. The monitoring section will introduce testing protocols using standard analyses. The use of these measures is necessary for reliable experimental assessments of substrate suitability for AD. This protocol should provide greater protection against a common mistake made in AD studies, which is to conclude that reactor failure was caused by the substrate in use, when really it was improper user operation 10.
Bioengineering, Issue 65, Environmental Engineering, Chemistry, Anaerobic Digestion, Bioenergy, Biogas, Methane, Organic Waste, Methanogenesis, Energy Crops
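Two routine operating parameters tracked when feeding such a lab-scale digester are the hydraulic retention time (HRT) and the organic loading rate (OLR); the sketch below shows the standard definitions with illustrative numbers and is not part of the published protocol.

def hrt_days(working_volume_l, feed_volume_l_per_day):
    return working_volume_l / feed_volume_l_per_day

def olr_gvs_per_l_day(feed_volume_l_per_day, feed_vs_g_per_l, working_volume_l):
    return feed_volume_l_per_day * feed_vs_g_per_l / working_volume_l

V, Q, VS = 10.0, 0.5, 60.0           # L working volume, L/day feed, g VS/L in the feed (hypothetical)
print(hrt_days(V, Q), olr_gvs_per_l_day(Q, VS, V))   # 20 days, 3.0 g VS L^-1 d^-1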
Bronchoalveolar Lavage (BAL) for Research; Obtaining Adequate Sample Yield
Authors: Andrea M. Collins, Jamie Rylance, Daniel G. Wootton, Angela D. Wright, Adam K. A. Wright, Duncan G. Fullerton, Stephen B. Gordon.
Institutions: National Institute for Health Research, Royal Liverpool and Broadgreen University Hospital Trust, Liverpool School of Tropical Medicine, University of Liverpool, Royal Liverpool and Broadgreen University Hospital Trust, University Hospital Aintree.
We describe a research technique for fiberoptic bronchoscopy with bronchoalveolar lavage (BAL) using manual hand held suction in order to remove nonadherent cells and lung lining fluid from the mucosal surface. In research environments, BAL allows sampling of innate (lung macrophage), cellular (B- and T- cells), and humoral (immunoglobulin) responses within the lung. BAL is internationally accepted for research purposes and since 1999 the technique has been performed in > 1,000 subjects in the UK and Malawi by our group. Our technique uses gentle hand-held suction of instilled fluid; this is designed to maximize BAL volume returned and apply minimum shear force on ciliated epithelia in order to preserve the structure and function of cells within the BAL fluid and to preserve viability to facilitate the growth of cells in ex vivo culture. The research technique therefore uses a larger volume instillate (typically in the order of 200 ml) and employs manual suction to reduce cell damage. Patients are given local anesthetic, offered conscious sedation (midazolam), and tolerate the procedure well with minimal side effects. Verbal and written subject information improves tolerance and written informed consent is mandatory. Safety of the subject is paramount. Subjects are carefully selected using clear inclusion and exclusion criteria. This protocol includes a description of the potential risks, and the steps taken to mitigate them, a list of contraindications, pre- and post-procedure checks, as well as precise bronchoscopy and laboratory techniques.
Medicine, Issue 85, Research bronchoscopy, bronchoalveolar lavage (BAL), fiberoptic bronchoscopy, lymphocyte, macrophage
Right Ventricular Systolic Pressure Measurements in Combination with Harvest of Lung and Immune Tissue Samples in Mice
Authors: Wen-Chi Chen, Sung-Hyun Park, Carol Hoffman, Cecil Philip, Linda Robinson, James West, Gabriele Grunig.
Institutions: New York University School of Medicine, Tuxedo, Vanderbilt University Medical Center, New York University School of Medicine.
The function of the right heart is to pump blood through the lungs, thus linking right heart physiology and pulmonary vascular physiology. Inflammation is a common modifier of heart and lung function, by elaborating cellular infiltration, production of cytokines and growth factors, and by initiating remodeling processes 1. Compared to the left ventricle, the right ventricle is a low-pressure pump that operates in a relatively narrow zone of pressure changes. Increased pulmonary artery pressures are associated with increased pressure in the lung vascular bed and pulmonary hypertension 2. Pulmonary hypertension is often associated with inflammatory lung diseases, for example chronic obstructive pulmonary disease, or autoimmune diseases 3. Because pulmonary hypertension confers a bad prognosis for quality of life and life expectancy, much research is directed towards understanding the mechanisms that might be targets for pharmaceutical intervention 4. The main challenge for the development of effective management tools for pulmonary hypertension remains the complexity of the simultaneous understanding of molecular and cellular changes in the right heart, the lungs and the immune system. Here, we present a procedural workflow for the rapid and precise measurement of pressure changes in the right heart of mice and the simultaneous harvest of samples from heart, lungs and immune tissues. The method is based on the direct catheterization of the right ventricle via the jugular vein in close-chested mice, first developed in the late 1990s as a surrogate measure of pressures in the pulmonary artery5-13. The organized team-approach facilitates a very rapid right heart catheterization technique. This makes it possible to perform the measurements in mice that spontaneously breathe room air. The organization of the work-flow in distinct work-areas reduces time delay and opens the possibility to simultaneously perform physiology experiments and harvest immune, heart and lung tissues. The procedural workflow outlined here can be adapted for a wide variety of laboratory settings and study designs, from small, targeted experiments, to large drug screening assays. The simultaneous acquisition of cardiac physiology data that can be expanded to include echocardiography5,14-17 and harvest of heart, lung and immune tissues reduces the number of animals needed to obtain data that move the scientific knowledge base forward. The procedural workflow presented here also provides an ideal basis for gaining knowledge of the networks that link immune, lung and heart function. The same principles outlined here can be adapted to study other or additional organs as needed.
Immunology, Issue 71, Medicine, Anatomy, Physiology, Cardiology, Surgery, Cardiovascular Abnormalities, Inflammation, Respiration Disorders, Immune System Diseases, Cardiac physiology, mouse, pulmonary hypertension, right heart function, lung immune response, lung inflammation, lung remodeling, catheterization, mice, tissue, animal model
50023
Play Button
Small Volume (1-3L) Filtration of Coastal Seawater Samples
Authors: David A. Walsh, Elena Zaikova, Steven J. Hallam.
Institutions: University of British Columbia - UBC.
The workflow begins with the collection of coastal marine waters for downstream microbial community, nutrient, and trace gas analyses. For today's demonstration, samples were collected from the deck of the HMS John Strickland operating in Saanich Inlet. This video documents small volume (~1 L) filtration of microbial biomass from the water column. The protocol is an extension of the large volume sampling protocol described earlier, with one major difference: here there is no pre-filtration step, so all size classes of biomass down to the 0.22 μm filter cut-off are collected. Samples collected this way are ideal for nucleic acid analysis. The set-up, filtration, and clean-up steps each take about 20-30 minutes. If two peristaltic pumps are used simultaneously, up to 8 samples can be filtered at the same time. To prevent biofilm formation between sampling trips, all filtration equipment must be rinsed with dilute HCl and deionized water and autoclaved immediately after use.
Molecular Biology, Issue 28, microbiology, seawater, filtration, biomass concentration
1163
Play Button
Electroporation of Mycobacteria
Authors: Renan Goude, Tanya Parish.
Institutions: Barts and the London School of Medicine and Dentistry, Barts and the London School of Medicine and Dentistry.
High-efficiency transformation is a major limitation in the study of mycobacteria. The genus Mycobacterium can be difficult to transform; this is mainly caused by the thick and waxy cell wall, but is compounded by the fact that most molecular techniques have been developed for distantly related species such as Escherichia coli and Bacillus subtilis. In spite of these obstacles, mycobacterial plasmids have been identified, and DNA transformation of many mycobacterial species has now been described. The most successful method for introducing DNA into mycobacteria is electroporation. Many parameters contribute to successful transformation; these include the species/strain, the nature of the transforming DNA, the selectable marker used, the growth medium, and the conditions of the electroporation pulse. Optimized methods for the transformation of both slow- and fast-growers are detailed here. Transformation efficiencies for different mycobacterial species and with various selectable markers are reported.
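Transformation efficiency is conventionally reported as colony-forming units (CFU) per microgram of transforming DNA, corrected for the fraction of the recovery culture that was plated. The minimal sketch below shows that calculation; all numbers are hypothetical and not taken from the protocol.

# Illustrative only: CFU per microgram of DNA, scaled to the whole recovery volume.
# All values below are hypothetical examples, not data from the protocol.

def transformation_efficiency(colonies, dna_ug, volume_plated_ul, recovery_volume_ul):
    """Return CFU per microgram of DNA, scaled to the whole recovery volume."""
    fraction_plated = volume_plated_ul / recovery_volume_ul
    return colonies / (dna_ug * fraction_plated)

if __name__ == "__main__":
    # e.g. 150 colonies from plating 100 ul of a 1 ml recovery, using 0.5 ug plasmid DNA
    eff = transformation_efficiency(colonies=150, dna_ug=0.5,
                                    volume_plated_ul=100.0, recovery_volume_ul=1000.0)
    print(f"Transformation efficiency: {eff:.1e} CFU/ug DNA")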
Microbiology, Issue 15, Springer Protocols, Mycobacteria, Electroporation, Bacterial Transformation, Transformation Efficiency, Bacteria, Tuberculosis, M. smegmatis
761
Play Button
Investigating the Microbial Community in the Termite Hindgut - Interview
Authors: Jared Leadbetter.
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter explains why the termite-gut microbial community is an excellent system for studying the complex interactions between microbes. The symbiotic relationship between the host insect and lignocellulose-degrading gut microbes is explained, as well as the industrial uses of these microbes for degrading plant biomass and generating biofuels.
Microbiology, Issue 4, microbial community, diversity
196
Play Button
Tracheotomy: A Method for Transplantation of Stem Cells to the Lung
Authors: Yakov Peter.
Institutions: Harvard Medical School.
Lung disease is a leading cause of death and is likely to become an epidemic given increases in pollution and smoking worldwide. Advances in stem cell therapy may alleviate many of the symptoms associated with lung disease and induce alveolar repair in adults. Concurrent with the ongoing search for stem cells applicable to human treatment, precise delivery and homing to the site of disease must be ensured for successful therapy. Here, I report that stem cells can safely be instilled via the trachea, opening a direct route to the lung. This method involves a skin incision, caudal insertion of a cannula into and along the tracheal lumen, and injection of a stem cell-vehicle mixture into the airways of the lung. A broad range of media solutions and stabilizers can be instilled via tracheotomy, making it possible to deliver a wider range of cell types. With the alveolar epithelium confining these cells to the lumen, lung expansion and negative pressure during inhalation may also assist in stem cell integration. Tracheal delivery of stem cells, with its quick uptake and ability to handle a large range of treatments, could accelerate the development of cell-based therapies, opening new avenues for the treatment of lung disease.
Cellular Biology, Issue 2, lung, stem cells, transplantation, trachea
163

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms still attempt to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
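As a rough illustration of abstract-to-video matching in general (this is not JoVE's actual algorithm, and the texts below are placeholders), one generic approach is to rank video descriptions against an abstract by TF-IDF cosine similarity:

# Illustrative only: a generic text-similarity sketch (TF-IDF + cosine similarity)
# for ranking method-video descriptions against a PubMed abstract. NOT JoVE's
# actual matching algorithm; the example texts are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_videos(abstract, video_descriptions, top_n=10):
    """Return (video_index, similarity) pairs for the top_n most similar descriptions."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([abstract] + list(video_descriptions))
    scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    ranked = sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    abstract = "Bronchoalveolar lavage sampling of lung macrophages after smoke exposure."
    videos = ["Fiberoptic bronchoscopy with bronchoalveolar lavage in research subjects.",
              "Electroporation of mycobacteria for high-efficiency transformation.",
              "Small volume filtration of coastal seawater samples."]
    for idx, score in rank_videos(abstract, videos, top_n=3):
        print(f"video {idx}: similarity {score:.2f}")

A production system would also weight recency, keyword fields, and curated subject categories, which is why purely lexical similarity can occasionally surface only loosely related videos.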