PubMed Article
Benchmarking successional progress in a quantitative food web.
PLoS ONE
PUBLISHED: 01-01-2014
Central to ecology and ecosystem management, succession theory aims to mechanistically explain and predict the assembly and development of ecological communities. Yet processes at lower hierarchical levels, e.g. at the species and functional group level, are rarely mechanistically linked to the under-investigated system-level processes which drive changes in ecosystem properties and functioning and are comparable across ecosystems. As a model system for secondary succession, seasonal plankton succession during the growing season is readily observable and largely driven autogenically. We used a long-term dataset from large, deep Lake Constance comprising biomasses, auto- and heterotrophic production, food quality, functional diversity, and mass-balanced food webs of the energy and nutrient flows between functional guilds of plankton and, in part, fish. Extracting population- and system-level indices from this dataset, we tested current hypotheses about the directionality of successional progress which are rooted in ecosystem theory, the metabolic theory of ecology, quantitative food web theory, thermodynamics, and information theory. Our results indicate that successional progress in Lake Constance is quantifiable, passing through predictable stages. Mean body mass, functional diversity, predator-prey weight ratios, trophic positions, system residence times of carbon and nutrients, and the complexity of the energy flow patterns increased during succession. In contrast, both the mass-specific metabolic activity and the system export decreased, while the succession rate exhibited a bimodal pattern. The weighted connectance introduced here represents a suitable index for assessing the evenness and interconnectedness of energy flows during succession. Diverging from earlier predictions, ascendency and eco-exergy did not increase during succession. Linking aspects of functional diversity to metabolic theory and food web complexity, we reconcile previously disjoint bodies of ecological theory to form a complete picture of successional progress within a pelagic food web. This comprehensive synthesis may be used as a benchmark for quantifying successional progress in other ecosystems.
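The paper introduces its own weighted-connectance index; as a rough illustration of the idea, here is a minimal Python sketch of a flow-weighted (Shannon-entropy-based) connectance in the spirit of quantitative food web theory, assuming a guild-by-guild flow matrix. The function and the toy flow values are ours, not the authors'.

```python
import numpy as np

def weighted_connectance(flows):
    """Effective (flow-weighted) connectance of a food web.

    A minimal Shannon-entropy sketch: the effective number of links
    follows from the evenness of the flow distribution and is
    normalized by the S^2 possible links. Illustrative only; the
    paper defines its own weighted-connectance index.
    """
    flows = np.asarray(flows, dtype=float)
    s = flows.shape[0]                      # number of functional guilds
    p = flows[flows > 0] / flows.sum()      # flow proportions
    shannon = -(p * np.log(p)).sum()        # entropy of the flow distribution
    effective_links = np.exp(shannon)       # effective number of links
    return effective_links / s**2           # weighted connectance

# Toy 3-guild example: even flows give a higher weighted connectance
even = np.array([[0, 5, 5], [0, 0, 5], [0, 0, 0]])
skewed = np.array([[0, 14, 0.5], [0, 0, 0.5], [0, 0, 0]])
print(weighted_connectance(even))    # ~0.333 (3 equal links / 9)
print(weighted_connectance(skewed))  # smaller: flows are uneven
```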
Related JoVE Video
Authors: Jennifer L Soong, Dan Reuss, Colin Pinney, Ty Boyack, Michelle L Haddix, Catherine E Stewart, M. Francesca Cotrufo.
Published: 01-16-2014
ABSTRACT
Tracing rare stable isotopes from plant material through the ecosystem provides the most sensitive information about ecosystem processes, from CO2 fluxes and soil organic matter formation to small-scale stable-isotope biomarker probing. Coupling multiple stable isotopes such as 13C with 15N, 18O or 2H has the potential to reveal even more information about complex stoichiometric relationships during biogeochemical transformations. Isotope-labeled plant material has been used in various studies of litter decomposition and soil organic matter formation1-4. From these and other studies, however, it has become apparent that structural components of plant material behave differently from metabolic components (i.e. leachable low molecular weight compounds) in terms of microbial utilization and long-term carbon storage5-7. The ability to study structural and metabolic components separately provides a powerful new tool for advancing the forefront of ecosystem biogeochemical studies. Here we describe a method for producing 13C- and 15N-labeled plant material that is either uniformly labeled throughout the plant or differentially labeled in structural and metabolic plant components, and we present the construction and operation of a continuous 13C and 15N labeling chamber that can be modified to meet various research needs. Uniformly labeled plant material is produced by continuous labeling from seedling to harvest, while differential labeling is achieved by removing the growing plants from the chamber weeks prior to harvest. Representative results from growing Andropogon gerardii Kaw demonstrate the system's ability to efficiently label plant material at the targeted levels. Through this method we have produced plant material with a 4.4 atom% 13C and 6.7 atom% 15N uniform plant label, or material that is differentially labeled by up to 1.29 atom% 13C and 0.56 atom% 15N in its metabolic and structural components (hot water extractable and hot water residual components, respectively). Challenges lie in maintaining proper temperature, humidity, CO2 concentration, and light levels in an airtight 13C-CO2 atmosphere for successful plant production. This chamber description represents a useful research tool to effectively produce uniformly or differentially multi-isotope labeled plant material for use in experiments on ecosystem biogeochemical cycling.
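A minimal sketch of the two-source mixing arithmetic behind such a chamber's CO2 supply, assuming the labeled tank is 99 atom% 13C and natural-abundance CO2 is about 1.1 atom% 13C; the function name and tank values are illustrative assumptions, with the 4.4 atom% target taken from the abstract.

```python
# Two-source mixing arithmetic for the chamber CO2 supply: what fraction
# of 99 atom% 13C-CO2 must be blended with natural-abundance CO2 to hit
# a target enrichment? A minimal sketch; the actual chamber also controls
# temperature, humidity, CO2 concentration, and light.
NATURAL_13C = 1.1    # atom% 13C of natural-abundance CO2 (approx.)
LABELED_13C = 99.0   # atom% 13C of the labeled tank (assumed)

def labeled_fraction(target_atom_pct):
    """Fraction of labeled CO2 in the blend for a target atom% 13C."""
    return (target_atom_pct - NATURAL_13C) / (LABELED_13C - NATURAL_13C)

# The uniform label reported in the abstract was 4.4 atom% 13C:
f = labeled_fraction(4.4)
print(f"labeled CO2 fraction: {f:.3f}")   # ~0.034, i.e. ~3.4% of the supply
```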
21 Related JoVE Articles!
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Authors: Andreas Florian Haas, Ben Knowles, Yan Wei Lim, Tracey McDole Somera, Linda Wegley Kelly, Mark Hatay, Forest Rohwer.
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). The methods span several scientific disciplines and combine well-established protocols with some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, were established only in recent years and are still undergoing constant improvement, which has led to a variety of sampling and sample-processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum information essential to characterizing and understanding the underlying mechanisms of viral and microbial community dynamics. The collection gives easy-to-follow guidelines for conducting comprehensive surveys and discusses critical steps and potential caveats pertinent to each technique.
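For the direct-count step, the conversion from counts per microscope field to abundance per ml is standard arithmetic; a minimal sketch follows, with all example values (field area, filter dimensions, volume filtered) hypothetical.

```python
import math

def abundance_per_ml(mean_count_per_field, filter_area_mm2,
                     field_area_mm2, volume_filtered_ml):
    """Convert epifluorescence microscopy counts (e.g. DAPI- or
    SYBR-stained cells or virus-like particles) to abundance per ml.
    Standard count arithmetic; example values below are hypothetical."""
    total_on_filter = mean_count_per_field * (filter_area_mm2 / field_area_mm2)
    return total_on_filter / volume_filtered_ml

# e.g. a filter with a 17 mm effective stained diameter, 1 ml of seawater:
effective_area = math.pi * (17 / 2) ** 2        # ~227 mm^2
print(f"{abundance_per_ml(45, effective_area, 0.01, 1.0):.2e} per ml")
```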
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
52131
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing, while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: the first part involves preparation of the samples for testing; the second is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix; and the third quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation, with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes, and critical parameters for assay success are discussed.
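Quantification against a standard curve is the usual final step of such an assay; a generic sketch follows using log-log interpolation between standards, which is not necessarily the kit's documented analysis. All fluorescence rates and concentrations are hypothetical.

```python
import numpy as np

# Quantify an unknown against a BoNT standard curve. A generic
# standard-curve sketch (log-log interpolation), not the analysis
# procedure documented for the BoTest Matrix kit; all numbers are
# hypothetical fluorescence rates (RFU/hr) for known toxin loads.
std_conc = np.array([0.1, 1.0, 10.0, 100.0])      # pM BoNT/A standards
std_rate = np.array([12.0, 95.0, 640.0, 2100.0])  # measured RFU/hr

def interp_conc(sample_rate):
    """Interpolate toxin concentration on log-log axes."""
    return 10 ** np.interp(np.log10(sample_rate),
                           np.log10(std_rate), np.log10(std_conc))

print(f"{interp_conc(300.0):.2f} pM")  # unknown well at 300 RFU/hr -> ~4 pM
```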
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
51170
Studying Food Reward and Motivation in Humans
Authors: Hisham Ziauddeen, Naresh Subramaniam, Victoria C. Cambridge, Nenad Medic, Ismaa Sadaf Farooqi, Paul C. Fletcher.
Institutions: University of Cambridge, University of Cambridge, University of Cambridge, Addenbrooke's Hospital.
A key challenge in studying reward processing in humans is to go beyond subjective self-report measures and quantify different aspects of reward, such as hedonics, motivation, and goal value, in more objective ways. This is particularly relevant for understanding overeating and obesity as well as their potential treatments. This paper describes a set of measures of food-related motivation that use handgrip force as a motivational measure. These methods can be used to examine changes in food-related motivation with metabolic (satiety) and pharmacological manipulations and to evaluate interventions targeted at overeating and obesity. However, to understand food-related decision making in the complex food environment, it is essential to be able to ascertain the reward goal values that guide the decisions and behavioral choices that people make. These values are hidden, but it is possible to ascertain them more objectively using metrics such as willingness to pay, and a method for this is described. Both sets of methods provide quantitative measures of motivation and goal value that can be compared within and between individuals.
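One plausible way to reduce a grip-force trace to a single motivation index is the force-time integral normalized to the participant's maximum voluntary contraction; the sketch below assumes that metric and uses synthetic data, and may differ from the scoring used in the paper.

```python
import numpy as np

def grip_effort_index(force_trace_n, sample_hz, mvc_n):
    """Force-time integral of a handgrip trace, normalized to the
    participant's maximum voluntary contraction (MVC). One plausible
    way to score motivated effort; the paper's exact metric may differ."""
    dt = 1.0 / sample_hz
    return np.trapz(force_trace_n, dx=dt) / mvc_n   # MVC-normalized N*s

# Hypothetical 2 s squeeze sampled at 100 Hz, MVC = 400 N:
t = np.arange(0, 2, 0.01)
trace = 250 * np.sin(np.pi * t / 2) ** 2            # ramp up and down
print(f"effort index: {grip_effort_index(trace, 100, 400.0):.2f}")
```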
Behavior, Issue 85, Food reward, motivation, grip force, willingness to pay, subliminal motivation
51281
Lignin Down-regulation of Zea mays via dsRNAi and Klason Lignin Analysis
Authors: Sang-Hyuck Park, Rebecca Garlock Ong, Chuansheng Mei, Mariam Sticklen.
Institutions: University of Arizona, Michigan State University, The Institute for Advanced Learning and Research, Michigan State University.
To facilitate the use of lignocellulosic biomass as an alternative bioenergy resource, a pretreatment step is needed during biological conversion processes to open up the structure of the plant cell wall and increase the accessibility of the cell wall carbohydrates. Lignin, a polyphenolic material present in many cell wall types, is known to be a significant hindrance to enzyme access. Reducing lignin content to a level that does not interfere with the structural integrity and defense system of the plant might be a valuable step toward reducing the costs of bioethanol production. In this study, we genetically down-regulated one of the lignin biosynthesis-related genes, cinnamoyl-CoA reductase (ZmCCR1), via a double-stranded RNA interference technique. The ZmCCR1_RNAi construct was integrated into the maize genome using the particle bombardment method. Transgenic maize plants grew normally as compared to wild-type control plants, without compromised biomass growth or defense mechanisms, except for a brown coloration in the leaf mid-ribs, husks, and stems of the transgenic plants. Microscopic analyses, in conjunction with a histological assay, revealed that the leaf sclerenchyma fibers were thinned but that the structure and size of other major vascular system components were not altered. The lignin content in the transgenic maize was reduced by 7-8.7%, the crystalline cellulose content increased in response to the lignin reduction, and hemicelluloses remained unchanged. These analyses may indicate that carbon flow shifted from lignin biosynthesis to cellulose biosynthesis. This article delineates the procedures used to down-regulate lignin content in maize via RNAi technology, and the cell wall compositional analyses used to verify the effect of the modifications on cell wall structure.
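Klason lignin is reported as the ash-corrected acid-insoluble residue as a percentage of dry biomass; a minimal sketch of that arithmetic follows, with hypothetical masses chosen so the relative reduction falls in the 7-8.7% range reported above.

```python
def klason_lignin_pct(dry_sample_mg, acid_insoluble_residue_mg, ash_mg=0.0):
    """Klason lignin as % of dry biomass: the mass of acid-insoluble
    residue remaining after two-stage sulfuric acid hydrolysis,
    corrected for ash. Minimal arithmetic; masses below are hypothetical."""
    return 100.0 * (acid_insoluble_residue_mg - ash_mg) / dry_sample_mg

ctrl = klason_lignin_pct(300.0, 54.0, 1.5)   # wild-type stem sample
rnai = klason_lignin_pct(300.0, 49.5, 1.5)   # ZmCCR1_RNAi sample
print(f"control {ctrl:.1f}%, RNAi {rnai:.1f}%, "
      f"relative reduction {100 * (ctrl - rnai) / ctrl:.1f}%")
```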
Bioengineering, Issue 89, Zea mays, cinnamoyl-CoA reductase (CCR), dsRNAi, Klason lignin measurement, cell wall carbohydrate analysis, gas chromatography (GC)
51340
High Throughput Quantitative Expression Screening and Purification Applied to Recombinant Disulfide-rich Venom Proteins Produced in E. coli
Authors: Natalie J. Saez, Hervé Nozach, Marilyne Blemont, Renaud Vincentelli.
Institutions: Aix-Marseille Université, Commissariat à l'énergie atomique et aux énergies alternatives (CEA) Saclay, France.
Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, purifying proteins is sometimes challenging since many proteins are expressed in an insoluble form. When working with difficult or multiple targets it is therefore recommended to use high throughput (HTP) protein expression screening on a small scale (1-4 ml cultures) to quickly identify conditions for soluble expression. To cope with the various structural genomics programs of the lab, a quantitative (within a range of 0.1-100 mg/L culture of recombinant protein) and HTP protein expression screening protocol was implemented and validated on thousands of proteins. The protocols were automated with the use of a liquid handling robot but can also be performed manually without specialized equipment. Disulfide-rich venom proteins are gaining increasing recognition for their potential as therapeutic drug leads. They can be highly potent and selective, but their complex disulfide bond networks make them challenging to produce. As a member of the FP7 European Venomics project (www.venomics.eu), our challenge is to develop successful production strategies with the aim of producing thousands of novel venom proteins for functional characterization. Aided by the redox properties of disulfide bond isomerase DsbC, we adapted our HTP production pipeline for the expression of oxidized, functional venom peptides in the E. coli cytoplasm. The protocols are also applicable to the production of diverse disulfide-rich proteins. Here we demonstrate our pipeline applied to the production of animal venom proteins. With the protocols described herein it is likely that soluble disulfide-rich proteins will be obtained in as little as a week. Even from a small scale, there is the potential to use the purified proteins for validating the oxidation state by mass spectrometry, for characterization in pilot studies, or for sensitive micro-assays.
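Back-calculating an expression yield in mg per liter of culture from a small-scale elution is simple arithmetic; the sketch below assumes A280-derived concentrations and a hypothetical hit threshold, and is not the pipeline's exact scoring.

```python
def yield_mg_per_l(conc_mg_per_ml, eluate_ml, culture_ml):
    """Back-calculate expression yield (mg protein per liter of culture)
    from a small-scale IMAC elution. Generic arithmetic; the published
    pipeline quantifies yields across a 0.1-100 mg/L range."""
    return conc_mg_per_ml * eluate_ml * 1000.0 / culture_ml

# Hypothetical 96-well screen entries: (well, A280-derived mg/ml)
screen = [("A1", 0.02), ("A2", 0.45), ("A3", 0.008)]
for well, conc in screen:
    y = yield_mg_per_l(conc, eluate_ml=0.25, culture_ml=4.0)
    flag = "HIT" if y >= 1.0 else "low"     # threshold is an assumption
    print(f"{well}: {y:6.2f} mg/L  {flag}")
```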
Bioengineering, Issue 89, E. coli, expression, recombinant, high throughput (HTP), purification, auto-induction, immobilized metal affinity chromatography (IMAC), tobacco etch virus protease (TEV) cleavage, disulfide bond isomerase C (DsbC) fusion, disulfide bonds, animal venom proteins/peptides
51464
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
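Following the mass-balance logic described above, γ for one congener in one tank can be computed as the congener mass gained by the trout divided by the congener mass ingested with the prey; the sketch below uses that formulation with hypothetical concentrations and masses.

```python
def net_transfer_efficiency(c_start_ng_g, w_start_g, c_end_ng_g, w_end_g,
                            c_prey_ng_g, food_eaten_g):
    """Net trophic transfer efficiency (gamma) for one PCB congener in
    one tank: congener mass gained by the lake trout divided by the
    congener mass ingested with bloater prey. Follows the mass-balance
    logic described in the abstract; example values are hypothetical."""
    gained_ng = c_end_ng_g * w_end_g - c_start_ng_g * w_start_g
    ingested_ng = c_prey_ng_g * food_eaten_g
    return gained_ng / ingested_ng

# One tank, one congener (hypothetical numbers):
gamma = net_transfer_efficiency(c_start_ng_g=50.0, w_start_g=500.0,
                                c_end_ng_g=120.0, w_end_g=800.0,
                                c_prey_ng_g=80.0, food_eaten_g=1200.0)
print(f"gamma = {gamma:.2f}")   # ~0.74
```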
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
51496
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Authors: Angela J. Brandt, Gaston A. del Pino, Jean H. Burns.
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as independent of community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly requires understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, along with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
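Plant-soil feedback is often summarized as a log response ratio of performance on conspecific versus heterospecific soil; the sketch below uses that common metric, which is not necessarily the index used in this protocol, with hypothetical biomass data.

```python
import math
import statistics

def psf_index(biomass_conspecific, biomass_heterospecific):
    """Plant-soil feedback as a log response ratio: ln(performance on
    conspecific soil / performance on heterospecific soil). Negative
    values indicate harmful 'own' soil. A common metric in the PSF
    literature, not necessarily the index used in this protocol."""
    return math.log(biomass_conspecific / biomass_heterospecific)

# Hypothetical shoot biomass (g) of replicate plants per soil origin:
conspecific = [1.2, 0.9, 1.1, 0.8]
heterospecific = [1.6, 1.4, 1.8, 1.5]
psf = psf_index(statistics.mean(conspecific), statistics.mean(heterospecific))
print(f"PSF = {psf:.2f}")   # negative: plants do worse on 'own' soil
```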
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
51580
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
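The core-incubation release rate is computed from the change in overlying-water P, scaled by water volume, core area, and incubation time; a minimal sketch with hypothetical core dimensions follows.

```python
import math

def p_release_rate(delta_p_ug_l, water_volume_l, core_area_m2, days):
    """Sediment P release rate (mg P m^-2 day^-1) from the change in
    soluble reactive P in the water overlying an incubated core.
    Standard flux arithmetic; example dimensions are hypothetical."""
    delta_mass_mg = delta_p_ug_l * water_volume_l / 1000.0
    return delta_mass_mg / (core_area_m2 * days)

# 10 cm diameter core, 1.2 L overlying water, SRP rose 60 ug/L in 5 d:
area = math.pi * 0.05 ** 2                    # ~0.00785 m^2
print(f"{p_release_rate(60.0, 1.2, area, 5):.2f} mg P m^-2 d^-1")
```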
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
51617
A Next-generation Tissue Microarray (ngTMA) Protocol for Biomarker Studies
Authors: Inti Zlobec, Guido Suter, Aurel Perren, Alessandro Lugli.
Institutions: University of Bern.
Biomarker research relies on tissue microarrays (TMA). TMAs are produced by repeated transfer of small tissue cores from a ‘donor’ block into a ‘recipient’ block and then used for a variety of biomarker applications. The construction of conventional TMAs is labor intensive, imprecise, and time-consuming. Here, a protocol using next-generation Tissue Microarrays (ngTMA) is outlined. ngTMA is based on TMA planning and design, digital pathology, and automated tissue microarraying. The protocol is illustrated using an example of 134 metastatic colorectal cancer patients. Histological, statistical and logistical aspects are considered, such as the tissue type, specific histological regions, and cell types for inclusion in the TMA, the number of tissue spots, sample size, statistical analysis, and number of TMA copies. Histological slides for each patient are scanned and uploaded onto a web-based digital platform. There, they are viewed and annotated (marked) using a 0.6-2.0 mm diameter tool, multiple times using various colors to distinguish tissue areas. Donor blocks and 12 ‘recipient’ blocks are loaded into the instrument. Digital slides are retrieved and matched to donor block images. Repeated arraying of annotated regions is automatically performed resulting in an ngTMA. In this example, six ngTMAs are planned containing six different tissue types/histological zones. Two copies of the ngTMAs are desired. Three to four slides for each patient are scanned; 3 scan runs are necessary and performed overnight. All slides are annotated; different colors are used to represent the different tissues/zones, namely tumor center, invasion front, tumor/stroma, lymph node metastases, liver metastases, and normal tissue. 17 annotations/case are made; time for annotation is 2-3 min/case. 12 ngTMAs are produced containing 4,556 spots. Arraying time is 15-20 hr. Due to its precision, flexibility and speed, ngTMA is a powerful tool to further improve the quality of TMAs used in clinical and translational research.
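The spot count in this example follows directly from the stated design parameters (134 cases x 17 annotations x 2 copies); the per-spot arraying time below is an assumption chosen only to show how the reported 15-20 hr arraying time arises.

```python
# Planning arithmetic for the ngTMA example above: the spot count and
# an arraying-time estimate follow from the design parameters.
patients = 134
annotations_per_case = 17
copies = 2

spots = patients * annotations_per_case * copies
print(spots)                      # 4556, matching the abstract

minutes_per_spot = 0.2            # assumed ~12 s/spot, not a published figure
print(f"~{spots * minutes_per_spot / 60:.0f} hr arraying time")  # ~15 hr
```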
Medicine, Issue 91, tissue microarray, biomarkers, prognostic, predictive, digital pathology, slide scanning
51893
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
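The published system's analysis code is MATLAB-based; purely for illustration, here is a Python sketch of the same kind of operation, reducing a time-stamped event log to daily per-subject response counts. The (timestamp, subject, event) record format is hypothetical.

```python
from collections import Counter
from datetime import datetime

# Reduce a time-stamped behavioral event log to daily per-subject
# head-entry counts -- the sort of daily quantification the system's
# (MATLAB-based) analysis code performs. Record format is hypothetical.
log = [
    ("2014-03-01 02:15:00", "mouse7", "head_entry"),
    ("2014-03-01 02:15:04", "mouse7", "pellet_delivered"),
    ("2014-03-02 11:40:12", "mouse7", "head_entry"),
    ("2014-03-02 11:40:15", "mouse8", "head_entry"),
]

daily = Counter()
for ts, subject, event in log:
    if event == "head_entry":
        day = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").date()
        daily[(subject, day)] += 1

for (subject, day), n in sorted(daily.items()):
    print(subject, day, n)
```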
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
51047
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Many researchers across incredibly diverse foci are applying phylogenetics to their research questions. However, many researchers are new to the topic, and this presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it provides the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article could serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that could be incorporated into a classroom or teaching lab.
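As a taste of the final visualization step, here is a minimal sketch using Biopython's Bio.Phylo module (assuming Biopython is installed) to read and render a toy Newick tree; the pipeline above is tool-agnostic, so this is one option among many.

```python
# Minimal tree-visualization sketch with Biopython's Bio.Phylo.
# The Newick string is a toy example, not data from the article.
from io import StringIO
from Bio import Phylo

newick = "((human:0.1,chimp:0.1):0.05,(mouse:0.2,rat:0.2):0.1);"
tree = Phylo.read(StringIO(newick), "newick")
tree.ladderize()                 # order clades for readability
Phylo.draw_ascii(tree)           # text rendering; Phylo.draw() plots
```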
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
50975
Chemotactic Response of Marine Micro-Organisms to Micro-Scale Nutrient Layers
Authors: Justin R. Seymour, Marcos, Roman Stocker.
Institutions: MIT - Massachusetts Institute of Technology.
The degree to which planktonic microbes can exploit microscale resource patches will have considerable implications for oceanic trophodynamics and biogeochemical flux. However, to take advantage of nutrient patches in the ocean, swimming microbes must overcome the influences of physical forces including molecular diffusion and turbulent shear, which will limit the availability of patches and the ability of bacteria to locate them. Until recently, methodological limitations had precluded direct examinations of microbial behaviour within patchy habitats and realistic small-scale flow conditions. Hence, much of our current knowledge regarding microbial behaviour in the ocean has been procured from theoretical predictions. To obtain new information on microbial foraging behaviour in the ocean, we have applied soft lithographic fabrication techniques to develop 2 microfluidic devices, which we have used to create (i) microscale nutrient patches with dimensions and diffusive characteristics relevant to oceanic processes and (ii) microscale vortices with shear rates corresponding to those expected in the ocean. These microfluidic devices have permitted a first direct examination of microbial swimming and chemotactic behaviour within a heterogeneous and dynamic seascape. The combined use of epifluorescence and phase contrast microscopy allows direct examination of the physical dimensions and diffusive characteristics of nutrient patches, while observing the population-level aggregative response, in addition to the swimming behaviour of individual microbes. These experiments have revealed that some species of phytoplankton, heterotrophic bacteria and phagotrophic protists are adept at locating and exploiting diffusing microscale resource patches within very short time frames. We have also shown that up to moderate shear rates, marine bacteria are able to fight the flow and swim through their environment of their own accord. However, beyond a threshold high shear level, bacteria are aligned in the shear flow and are less capable of swimming without disturbance from the flow. Microfluidics represents a novel and inexpensive approach for studying aquatic microbial ecology, and due to its suitability for accurately creating realistic flow fields and substrate gradients at the microscale, is ideally applicable to examinations of microbial behaviour at the smallest scales of interaction. We therefore suggest that microfluidics represents a valuable tool for obtaining a better understanding of the ecology of microorganisms in the ocean.
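The relevant patch lifetimes can be estimated from the diffusive timescale tau ~ L^2/D; the sketch below is an order-of-magnitude scaling estimate only, assuming a typical small-solute diffusivity, and ignores the turbulent shear discussed above.

```python
# Order-of-magnitude lifetime of a microscale nutrient patch from the
# diffusive timescale tau ~ L^2 / D. A scaling estimate only; it
# ignores turbulent shear, which further shortens patch lifetimes.
D_SMALL_MOLECULE = 1e-9   # m^2/s, typical for small solutes in seawater

for L_um in (10, 100, 1000):
    L = L_um * 1e-6
    tau = L ** 2 / D_SMALL_MOLECULE
    print(f"L = {L_um:>4} um  ->  tau ~ {tau:.3g} s")
# 10 um patches last ~0.1 s, 1 mm patches ~1000 s: bacteria must
# respond within seconds to minutes to exploit them.
```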
Microbiology, Issue 4, microbial community, chemotaxis, microfluidics
203
Concentration of Metabolites from Low-density Planktonic Communities for Environmental Metabolomics using Nuclear Magnetic Resonance Spectroscopy
Authors: R. Craig Everroad, Seiji Yoshida, Yuuri Tsuboi, Yasuhiro Date, Jun Kikuchi, Shigeharu Moriya.
Institutions: RIKEN Advanced Science Institute, Yokohama City University, RIKEN Plant Science Center, Nagoya University.
Environmental metabolomics is an emerging field that is promoting new understanding in how organisms respond to and interact with the environment and each other at the biochemical level1. Nuclear magnetic resonance (NMR) spectroscopy is one of several technologies, including gas chromatography–mass spectrometry (GC-MS), with considerable promise for such studies. Advantages of NMR are that it is suitable for untargeted analyses, provides structural information and spectra can be queried in quantitative and statistical manners against recently available databases of individual metabolite spectra2,3. In addition, NMR spectral data can be combined with data from other omics levels (e.g. transcriptomics, genomics) to provide a more comprehensive understanding of the physiological responses of taxa to each other and the environment4,5,6. However, NMR is less sensitive than other metabolomic techniques, making it difficult to apply to natural microbial systems where sample populations can be low-density and metabolite concentrations low compared to metabolites from well-defined and readily extractable sources such as whole tissues, biofluids or cell-cultures. Consequently, the few direct environmental metabolomic studies of microbes performed to date have been limited to culture-based or easily defined high-density ecosystems such as host-symbiont systems, constructed co-cultures or manipulations of the gut environment where stable isotope labeling can be additionally used to enhance NMR signals7,8,9,10,11,12. Methods that facilitate the concentration and collection of environmental metabolites at concentrations suitable for NMR are lacking. Since recent attention has been given to the environmental metabolomics of organisms within the aquatic environment, where much of the energy and material flow is mediated by the planktonic community13,14, we have developed a method for the concentration and extraction of whole-community metabolites from planktonic microbial systems by filtration. Commercially available hydrophilic poly-1,1-difluoroethene (PVDF) filters are specially treated to completely remove extractables, which can otherwise appear as contaminants in subsequent analyses. These treated filters are then used to filter environmental or experimental samples of interest. Filters containing the wet sample material are lyophilized and aqueous-soluble metabolites are extracted directly for conventional NMR spectroscopy using a standardized potassium phosphate extraction buffer2. Data derived from these methods can be analyzed statistically to identify meaningful patterns, or integrated with other omics levels for comprehensive understanding of community and ecosystem function.
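As a sketch of the downstream statistics step (PCA, as listed in the keywords), the following applies scikit-learn PCA to autoscaled, binned spectra; the data are synthetic and the bin count is arbitrary.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Sketch of the downstream statistics step: PCA on binned 1D NMR
# spectra to look for community-level metabolic patterns. The data
# here are synthetic (12 samples x 200 chemical-shift bins).
rng = np.random.default_rng(0)
spectra = rng.lognormal(mean=0.0, sigma=1.0, size=(12, 200))
spectra[:6, 40:45] *= 3.0        # pretend one treatment shares a peak

X = StandardScaler().fit_transform(spectra)   # autoscale each bin
scores = PCA(n_components=2).fit_transform(X)
for i, (pc1, pc2) in enumerate(scores):
    group = "treatment" if i < 6 else "control"
    print(f"sample {i:2d} ({group}): PC1={pc1:6.2f} PC2={pc2:6.2f}")
```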
Molecular Biology, Issue 62, environmental metabolomics, metabolic profiling, microbial ecology, plankton, NMR spectroscopy, PCA
3163
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve our understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All predictions are tagged with a confidence score that indicates how accurate they are expected to be in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be supplied by users based on experimental evidence or biological insight, with the purpose of improving the quality of I-TASSER predictions. The server was ranked as the best program for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
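As an illustration of using the confidence score, the sketch below triages models by I-TASSER's C-score, which typically ranges from about -5 to 2 (higher is better); the -1.5 cutoff is a commonly cited rule of thumb for correct global topology, and parsing of the actual server output is omitted since it depends on the file format.

```python
# Triage I-TASSER models by confidence score (C-score). The C-score
# typically ranges from about -5 to 2, higher meaning more confident;
# C-score > -1.5 is a commonly cited rule of thumb for a model with
# correct global topology. The (name, score) pairs are hypothetical.
models = [("model1", 1.2), ("model2", -0.8), ("model3", -3.9)]

for name, c_score in models:
    verdict = "likely correct fold" if c_score > -1.5 else "low confidence"
    print(f"{name}: C-score {c_score:+.1f} -> {verdict}")
```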
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
3259
Establishment of Microbial Eukaryotic Enrichment Cultures from a Chemically Stratified Antarctic Lake and Assessment of Carbon Fixation Potential
Authors: Jenna M. Dolhi, Nicholas Ketchum, Rachael M. Morgan-Kiss.
Institutions: Miami University .
Lake Bonney is one of numerous permanently ice-covered lakes located in the McMurdo Dry Valleys, Antarctica. The perennial ice cover maintains a chemically stratified water column and, unlike other inland bodies of water, largely prevents external input of carbon and nutrients from streams. Biota are exposed to numerous environmental stresses, including year-round severe nutrient deficiency, low temperatures, extreme shade, hypersalinity, and 24-hour darkness during the winter 1. These extreme environmental conditions limit the biota in Lake Bonney almost exclusively to microorganisms 2. Single-celled microbial eukaryotes (called "protists") are important players in global biogeochemical cycling 3 and play important ecological roles in the cycling of carbon in the dry valley lakes, occupying both primary and tertiary roles in the aquatic food web. In the dry valley aquatic food web, protists that fix inorganic carbon (autotrophy) are the major producers of organic carbon for organotrophic organisms 4, 2. Phagotrophic or heterotrophic protists capable of ingesting bacteria and smaller protists act as the top predators in the food web 5. Lastly, an unknown proportion of the protist population is capable of combined mixotrophic metabolism 6, 7. Mixotrophy in protists involves the ability to combine photosynthetic capability with phagotrophic ingestion of prey microorganisms. This form of mixotrophy differs from mixotrophic metabolism in bacterial species, which generally involves uptake of dissolved carbon molecules. There are currently very few protist isolates from permanently ice-capped polar lakes, and studies of protist diversity and ecology in this extreme environment have been limited 8, 4, 9, 10, 5. A better understanding of protist metabolic versatility in the simple dry valley lake food web will aid in the development of models for the role of protists in the global carbon cycle. We employed an enrichment culture approach to isolate potentially phototrophic and mixotrophic protists from Lake Bonney. Sampling depths in the water column were chosen based on the location of primary production maxima and protist phylogenetic diversity 4, 11, as well as variability in major abiotic factors affecting protist trophic modes: shallow sampling depths are limited for major nutrients, while deeper sampling depths are limited by light availability. In addition, lake water samples were supplemented with multiple types of growth media to promote the growth of a variety of phototrophic organisms. RubisCO catalyzes the rate limiting step in the Calvin Benson Bassham (CBB) cycle, the major pathway by which autotrophic organisms fix inorganic carbon and provide organic carbon for higher trophic levels in aquatic and terrestrial food webs 12. In this study, we applied a radioisotope assay modified for filtered samples 13 to monitor maximum carboxylase activity as a proxy for carbon fixation potential and metabolic versatility in the Lake Bonney enrichment cultures.
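Converting acid-stable 14C incorporation to a carboxylase rate divides net DPM by the specific activity of the bicarbonate pool, incubation time, and sample volume; the sketch below shows that generic radioisotope-assay arithmetic with hypothetical values, not the exact calculation of the modified assay.

```python
def carboxylase_rate(dpm_sample, dpm_killed_blank,
                     specific_activity_dpm_per_nmol, minutes, ml_filtered):
    """Convert acid-stable 14C incorporation to a carboxylase activity
    (nmol C fixed ml^-1 min^-1), using the specific activity of the
    NaH14CO3 pool. Generic radioisotope-assay arithmetic; example
    values are hypothetical."""
    net_dpm = dpm_sample - dpm_killed_blank
    return net_dpm / (specific_activity_dpm_per_nmol * minutes * ml_filtered)

rate = carboxylase_rate(dpm_sample=5200.0, dpm_killed_blank=150.0,
                        specific_activity_dpm_per_nmol=20.0,
                        minutes=30.0, ml_filtered=5.0)
print(f"{rate:.2f} nmol C ml^-1 min^-1")
```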
Microbiology, Issue 62, Antarctic lake, McMurdo Dry Valleys, Enrichment cultivation, Microbial eukaryotes, RubisCO
3992
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
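A standard way to locate a category boundary along a morph continuum is to fit a logistic psychometric function to the proportion of "human" categorizations per morph step; the sketch below does this with SciPy on synthetic data, as one illustration of how CP along the DHL can be quantified.

```python
import numpy as np
from scipy.optimize import curve_fit

# Locating a category boundary along the DHL: fit a logistic
# psychometric function to the proportion of "human" responses at
# each morph step. Data below are synthetic; a steep slope around
# the boundary is the classic signature of categorical perception.
def logistic(x, x0, k):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph_step = np.arange(1, 12)                         # avatar -> human
p_human = np.array([.02, .03, .05, .08, .15, .40,
                    .75, .90, .95, .98, .99])

(x0, k), _ = curve_fit(logistic, morph_step, p_human, p0=[6.0, 1.0])
print(f"category boundary at morph step {x0:.2f}, slope {k:.2f}")
```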
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
4375
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Authors: Oswald J. Schmitz, Mark A. Bradford, Michael S. Strickland, Dror Hawlena.
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determines the rate of decomposition by microbial communities, as well as rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey, which in turn alters soil microbial functioning when the herbivore carcasses are decomposed6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9, because stress from predation risk increases herbivore basal energy demands, which in nutrient-limited systems forces herbivores to shift their consumption from N-rich resources that support growth and reproduction to C-rich carbohydrate resources that support heightened metabolism6. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer quality resources for the soil microbial pool, likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11. We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to induce stress in herbivores from predation risk, measure those stress responses, and measure the consequences for microbial decomposition. We use insights from a model grassland ecosystem comprising a hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
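The quantity at stake is the molar C:N ratio of herbivore biomass, computed from elemental-analyzer mass percentages; a minimal sketch follows with hypothetical compositions whose direction of change matches the prediction above.

```python
def molar_cn_ratio(pct_c, pct_n):
    """Molar C:N ratio from elemental-analyzer mass percentages."""
    return (pct_c / 12.011) / (pct_n / 14.007)

# Hypothetical grasshopper body composition, unstressed vs. stressed
# by predation risk (the abstract predicts a higher C:N under risk):
print(f"unstressed C:N = {molar_cn_ratio(45.0, 11.0):.2f}")
print(f"stressed   C:N = {molar_cn_ratio(46.0, 10.0):.2f}")
```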
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
50061
Determination of Microbial Extracellular Enzyme Activity in Waters, Soils, and Sediments using High Throughput Microplate Assays
Authors: Colin R. Jackson, Heather L. Tyler, Justin J. Millar.
Institutions: The University of Mississippi.
Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample.
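For a colorimetric assay, activity is computed from blank-corrected absorbance via a standard curve of the end product (e.g. p-nitrophenol); the sketch below shows that generic arithmetic with hypothetical values.

```python
def activity_nmol_g_hr(abs_sample, abs_blank, slope_abs_per_nmol,
                       hours, soil_g):
    """Potential enzyme activity (nmol substrate hydrolyzed g^-1 hr^-1)
    from microplate absorbance of a p-nitrophenol (pNP) end product.
    slope_abs_per_nmol comes from a pNP standard curve run on the same
    plate. Generic colorimetric-assay arithmetic; values hypothetical."""
    net_abs = abs_sample - abs_blank
    return net_abs / (slope_abs_per_nmol * hours * soil_g)

print(f"{activity_nmol_g_hr(0.62, 0.05, 0.004, 2.0, 0.25):.0f} "
      "nmol g^-1 hr^-1")
```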
Environmental Sciences, Issue 80, Environmental Monitoring, Ecological and Environmental Processes, Environmental Microbiology, Ecology, extracellular enzymes, freshwater microbiology, soil microbiology, microbial activity, enzyme activity
50399
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
50476
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically 50 mM sodium acetate or 50 mM Tris) is chosen so that its acid dissociation constant (pKa) best matches the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
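A widely used form of the fluorometric calculation corrects assay fluorescence for homogenate and substrate controls and for quenching by the soil slurry, then scales by an emission coefficient from a MUB standard curve; the sketch below is a simplified version of that calculation with hypothetical numbers, and exact corrections vary by protocol.

```python
def fluorometric_activity(f_assay, f_homogenate_ctrl, f_substrate_ctrl,
                          quench_coef, emission_coef_fl_per_nmol,
                          buffer_ml, aliquot_ml, hours, soil_dry_g):
    """Potential soil enzyme activity (nmol g^-1 hr^-1) from a
    MUB-linked fluorometric microplate assay: correct the assay
    fluorescence for homogenate and substrate controls and for
    quenching by the soil slurry, then scale by the emission
    coefficient from a MUB standard curve. A simplified sketch;
    exact corrections vary by protocol."""
    net_fl = (f_assay - f_homogenate_ctrl) / quench_coef - f_substrate_ctrl
    nmol = net_fl / emission_coef_fl_per_nmol
    return nmol * (buffer_ml / aliquot_ml) / (hours * soil_dry_g)

# Hypothetical well: 91 ml slurry, 0.2 ml aliquots, 3 hr, 1 g dry soil
rate = fluorometric_activity(293, 150, 90, 0.8, 45.0, 91.0, 0.2, 3.0, 1.0)
print(f"{rate:.0f} nmol g^-1 hr^-1")
```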
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
50961
Layers of Symbiosis - Visualizing the Termite Hindgut Microbial Community
Authors: Jared Leadbetter.
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter takes us for a nature walk through the diversity of life resident in the termite hindgut - a microenvironment containing 250 different species found nowhere else on Earth. Jared reveals that the symbiosis exhibited by this system is multi-layered and involves not only a relationship between the termite and its gut inhabitants, but also involves a complex web of symbiosis among the gut microbes themselves.
Microbiology, Issue 4, microbial community, symbiosis, hindgut
197
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In those cases, our algorithm displays the most relevant videos it can find, which can sometimes result in matched videos with only a slight relation.