Pubmed Article
Effectiveness of biodiversity surrogates for conservation planning: different measures of effectiveness generate a kaleidoscope of variation.
PLoS ONE
PUBLISHED: 03-31-2010
Conservation planners represent many aspects of biodiversity by using surrogates with spatial distributions readily observed or quantified, but tests of their effectiveness have produced varied and conflicting results. We identified four factors likely to have a strong influence on the apparent effectiveness of surrogates: (1) the choice of surrogate; (2) differences among study regions, which might be large and unquantified; (3) the test method, that is, how effectiveness is quantified; and (4) the test features that the surrogates are intended to represent. Analysis of an unusually rich dataset enabled us, for the first time, to disentangle these factors and to compare their individual and interacting influences. Using two data-rich regions, we estimated effectiveness using five alternative methods: two forms of incidental representation, two forms of species accumulation index, and irreplaceability correlation, to assess the performance of forest ecosystems and environmental units as surrogates for six groups of threatened species (the test features): mammals, birds, reptiles, frogs, plants, and all of these combined. Four methods tested the effectiveness of the surrogates by selecting areas for conservation of the surrogates and then estimating how effective those areas were at representing test features. One method measured the spatial match between conservation priorities for surrogates and test features. For methods that selected conservation areas, we measured effectiveness using two analytical approaches: (1) when representation targets for the surrogates were achieved (incidental representation), or (2) progressively as areas were selected (species accumulation index). We estimated the spatial correlation of conservation priorities using an index known as summed irreplaceability. In general, the effectiveness of surrogates for our taxa (mostly threatened species) was low, although environmental units tended to be more effective than forest ecosystems.
The surrogates were most effective for plants and mammals and least effective for frogs and reptiles. The five testing methods differed in their rankings of effectiveness of the two surrogates in relation to different groups of test features. There were differences between study areas in terms of the effectiveness of surrogates for different test feature groups. Overall, the effectiveness of the surrogates was sensitive to all four factors. This indicates the need for caution in generalizing surrogacy tests.
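The species accumulation index mentioned above summarizes how much better surrogate-based area selection performs than random selection, relative to optimal selection. As a minimal sketch, one common form of this index is shown below; the exact formulation used in the study may differ, and the numbers are illustrative, not from the paper:

```python
def species_accumulation_index(surrogate: float, random_: float, optimal: float) -> float:
    """Species accumulation index: SAI = (S - R) / (O - R).

    S, R, and O are summary statistics (e.g. areas under the species
    accumulation curve) for surrogate-based, random, and optimal area
    selection, respectively. A value near 1 means the surrogate performs
    almost as well as optimal selection; 0 means no better than random;
    negative values mean worse than random.
    """
    if optimal == random_:
        raise ValueError("optimal and random performance must differ")
    return (surrogate - random_) / (optimal - random_)

# Illustrative values: a surrogate recovering 70 species-units where
# random selection recovers 50 and optimal selection recovers 90.
print(species_accumulation_index(70, 50, 90))  # 0.5
```

A surrogate scoring below 0 under this index would, as the abstract notes for some taxa, be actively misleading for conservation planning.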
ABSTRACT
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
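The "software-guided setup of optimal experiment combinations" in the abstract was done with dedicated DoE software; as a minimal illustration of the underlying idea, the simplest design (a full factorial over a few two-level factors) can be enumerated with the standard library. The factor names and levels below are hypothetical, chosen only to mirror the kinds of parameters the abstract mentions:

```python
from itertools import product

# Hypothetical factors (names and levels illustrative, not from the study)
factors = {
    "promoter": ["35S", "double 35S"],
    "incubation_temp_C": [22, 25],
    "plant_age_d": [35, 42],
}

# Full-factorial design: every combination of factor levels becomes one run.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2 * 2 * 2 = 8 runs
```

Real DoE software starts from such an enumeration and then prunes it to a smaller optimal subset (e.g. fractional or D-optimal designs), which is what makes step-wise design augmentation tractable for many factors.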
24 Related JoVE Articles
Electrochemotherapy of Tumours
Authors: Gregor Sersa, Damijan Miklavcic.
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of electric pulse application. Drug uptake by delivery of electric pulses is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is otherwise impeded. Among the many drugs that have been tested so far, bleomycin and cisplatin have found their way from preclinical testing to clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of the treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated several times every few weeks with equal effectiveness each time. The treatment results in effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions
Quantification of Proteins Using Peptide Immunoaffinity Enrichment Coupled with Mass Spectrometry
Authors: Lei Zhao, Jeffrey R. Whiteaker, Matthew E. Pope, Eric Kuhn, Angela Jackson, N. Leigh Anderson, Terry W. Pearson, Steven A. Carr, Amanda G. Paulovich.
Institutions: Fred Hutchinson Cancer Research Center - FHCRC, University of Victoria, Broad Institute of MIT and Harvard, Plasma Proteome Institute.
There is a great need for quantitative assays for measuring proteins. Traditional sandwich immunoassays, largely considered the gold standard in quantitation, are associated with high cost and long lead times, and are fraught with drawbacks (e.g. heterophilic antibodies, autoantibody interference, the 'hook effect').1 An alternative technique is affinity enrichment of peptides coupled with quantitative mass spectrometry, commonly referred to as SISCAPA (Stable Isotope Standards and Capture by Anti-Peptide Antibodies).2 In this technique, affinity enrichment of peptides with stable isotope dilution and detection by selected/multiple reaction monitoring mass spectrometry (SRM/MRM-MS) provides quantitative measurement of peptides as surrogates for their respective proteins. SRM/MRM-MS is well established for accurate quantitation of small molecules3,4 and more recently has been adapted to measure the concentrations of proteins in plasma and cell lysates.5-7 To achieve quantitation of proteins, these larger molecules are digested to component peptides using an enzyme such as trypsin. One or more selected peptides whose sequence is unique to the target protein in that species (i.e. "proteotypic" peptides) are then enriched from the sample using anti-peptide antibodies and measured as quantitative stoichiometric surrogates for protein concentration in the sample. Hence, coupled to stable isotope dilution (SID) methods (i.e. a spiked-in stable isotope labeled peptide standard), SRM/MRM can be used to measure concentrations of proteotypic peptides as surrogates for quantification of proteins in complex biological matrices. The assays have several advantages compared to traditional immunoassays.
The reagents are relatively inexpensive to generate, the specificity for the analyte is excellent, the assays can be highly multiplexed, enrichment can be performed from neat plasma (no depletion required), and the technique is amenable to a wide array of proteins or modifications of interest.8-13 In this video we demonstrate the basic protocol as adapted to a magnetic bead platform.
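The stable isotope dilution calculation at the heart of this quantitation reduces to scaling the light-to-heavy peak-area ratio by the known amount of spiked heavy standard. A minimal sketch, with illustrative units and numbers (not from the protocol):

```python
def peptide_concentration(light_area: float, heavy_area: float,
                          spike_fmol: float) -> float:
    """Endogenous peptide amount by stable isotope dilution.

    The light (endogenous) to heavy (spiked standard) SRM/MRM peak-area
    ratio, multiplied by the known amount of heavy standard added, gives
    the endogenous peptide amount in the same units as the spike.
    """
    if heavy_area <= 0:
        raise ValueError("heavy standard peak area must be positive")
    return (light_area / heavy_area) * spike_fmol

# 50 fmol of heavy standard spiked in; a light/heavy area ratio of 0.4
# implies 20 fmol of endogenous peptide.
print(peptide_concentration(4.0e5, 1.0e6, 50.0))  # 20.0
```

Because the heavy standard co-elutes and co-fragments with the endogenous peptide, this ratio is robust to run-to-run variation in instrument response, which is what makes the peptide a reliable stoichiometric surrogate for the protein.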
Molecular Biology, Issue 53, Mass spectrometry, targeted assay, peptide, MRM, SISCAPA, protein quantitation
Intubation-mediated Intratracheal (IMIT) Instillation: A Noninvasive, Lung-specific Delivery System
Authors: Matthew B Lawrenz, Ramy A. Fodah, Maria G. Gutierrez, Jonathan Warawa.
Institutions: University of Louisville Medical School.
Respiratory disease studies typically involve the use of murine models as surrogate systems. However, there are significant physiologic differences between the murine and human respiratory systems, especially in their upper respiratory tracts (URT). In some models, these differences in the murine nasal cavity can have a significant impact on disease progression and presentation in the lower respiratory tract (LRT) when using intranasal instillation techniques, potentially limiting the usefulness of the mouse model to study these diseases. For these reasons, it would be advantageous to develop a technique to instill bacteria directly into the mouse lungs in order to study LRT disease in the absence of involvement of the URT. We have termed this lung-specific delivery technique intubation-mediated intratracheal (IMIT) instillation. This noninvasive technique minimizes the potential for instillation into the bloodstream, which can occur during more invasive traditional surgical intratracheal infection approaches, and limits the possibility of incidental digestive tract delivery. IMIT is a two-step process in which mice are first intubated, with an intermediate step to ensure correct catheter placement into the trachea, followed by insertion of a blunt needle into the catheter to mediate direct delivery of bacteria into the lung. This approach achieves >98% delivery efficacy to the lungs with excellent distribution of reagent throughout the lung. Thus, IMIT represents a novel approach to study LRT disease and therapeutic delivery directly into the lung, improving upon the ability to use mice as surrogates to study human respiratory disease. Furthermore, the accuracy and reproducibility of this delivery system also make it amenable to Good Laboratory Practice Standards (GLPS), as well as to delivery of a wide range of reagents which require high-efficiency delivery to the lung.
Medicine, Issue 93, Respiratory disease, intubation-mediated intratracheal (IMIT) instillation, therapeutic delivery, bacterial pneumonia, lower respiratory tract, mouse
Light/dark Transition Test for Mice
Authors: Keizo Takao, Tsuyoshi Miyakawa.
Institutions: Graduate School of Medicine, Kyoto University.
Although the mouse genome has been fully sequenced, the functions of most of its genes are not yet known. Gene-targeting techniques, however, can be used to delete or manipulate a specific gene in mice. The influence of a given gene on a specific behavior can then be determined by conducting behavioral analyses of the mutant mice. As a test for behavioral phenotyping of mutant mice, the light/dark transition test is one of the most widely used tests to measure anxiety-like behavior in mice. The test is based on the natural aversion of mice to brightly illuminated areas and on their spontaneous exploratory behavior in novel environments. The test is sensitive to anxiolytic drug treatment. The apparatus consists of a dark chamber and a brightly illuminated chamber. Mice are allowed to move freely between the two chambers. The number of entries into the bright chamber and the duration of time spent there are indices of bright-space anxiety in mice. To obtain phenotyping results of a strain of mutant mice that can be readily reproduced and compared with those of other mutants, the behavioral test methods should be as identical as possible between laboratories. The procedural differences that exist between laboratories, however, make it difficult to replicate or compare the results among laboratories. Here, we present our protocol for the light/dark transition test as a movie so that the details of the protocol can be demonstrated. In our laboratory, we have assessed more than 60 strains of mutant mice using the protocol shown in the movie. Those data will be disclosed as a part of a public database that we are now constructing. Visualization of the protocol will facilitate understanding of the details of the entire experimental procedure, allowing for standardization of the protocols used across laboratories and comparisons of the behavioral phenotypes of various strains of mutant mice assessed using this test.
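The two indices described, entries into the bright chamber and time spent there, can be derived from a simple log of chamber crossings. A minimal sketch, with an event format that is an illustrative assumption (the actual apparatus software is not specified in the abstract):

```python
def light_dark_indices(events, session_end_s):
    """Compute light/dark transition test indices from a chamber log.

    `events` is a chronological list of (time_s, chamber) tuples, where
    chamber is "light" or "dark" and the first event marks where the
    mouse starts. Returns (number of entries into the light chamber,
    total time spent in the light chamber in seconds).
    """
    time_in_light = 0.0
    # Pair each event with the next event (or session end) to get dwell times.
    for (t, chamber), (t_next, _) in zip(events, events[1:] + [(session_end_s, None)]):
        if chamber == "light":
            time_in_light += t_next - t
    # An "entry" is a dark -> light transition.
    entries = sum(1 for (_, prev), (_, cur) in zip(events, events[1:])
                  if prev == "dark" and cur == "light")
    return entries, time_in_light

# Mouse starts in the dark chamber of a 600 s session (times illustrative):
log = [(0, "dark"), (120, "light"), (200, "dark"), (450, "light")]
print(light_dark_indices(log, 600))  # (2, 230.0)
```

Keeping the raw crossing log rather than only the summary indices also allows later computation of related measures such as latency to first entry, which some laboratories report.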
Neuroscience, Issue 1, knockout mice, transgenic mice, behavioral test, phenotyping
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation whereby inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low abundance molecules with a resulting increase in sensitivity is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl.  Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
Assessment of Morphine-induced Hyperalgesia and Analgesic Tolerance in Mice Using Thermal and Mechanical Nociceptive Modalities
Authors: Khadija Elhabazi, Safia Ayachi, Brigitte Ilien, Frédéric Simonin.
Institutions: Université de Strasbourg.
Opioid-induced hyperalgesia and tolerance severely impact the clinical efficacy of opiates as pain relievers in animals and humans. The molecular mechanisms underlying both phenomena are not well understood and their elucidation should benefit from the study of animal models and from the design of appropriate experimental protocols. We describe here a methodological approach for inducing, recording and quantifying morphine-induced hyperalgesia as well as for demonstrating analgesic tolerance, using the tail-immersion and tail pressure tests in wild-type mice. As shown in the video, the protocol is divided into five sequential steps. Handling and habituation phases allow a safe determination of the basal nociceptive response of the animals. Chronic morphine administration induces significant hyperalgesia as shown by an increase in both thermal and mechanical sensitivity, whereas the comparison of analgesia time-courses after acute or repeated morphine treatment clearly indicates the development of tolerance manifested by a decline in analgesic response amplitude. This protocol may be similarly adapted to genetically modified mice in order to evaluate the role of individual genes in the modulation of nociception and morphine analgesia. It also provides a model system to investigate the effectiveness of potential therapeutic agents to improve opiate analgesic efficacy.
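Analgesia time-courses of this kind are commonly expressed as percent maximum possible effect (%MPE), normalizing the test latency to the animal's baseline and the cutoff imposed to avoid tissue damage. The abstract does not name the exact metric used, so this is a sketch of the standard calculation with illustrative latencies:

```python
def percent_mpe(baseline_s: float, test_s: float, cutoff_s: float) -> float:
    """Percent maximum possible effect for a withdrawal-latency test.

    %MPE = 100 * (test - baseline) / (cutoff - baseline).
    Positive values indicate analgesia; latencies that fall below the
    original baseline (as in opioid-induced hyperalgesia) give
    negative values.
    """
    if cutoff_s <= baseline_s:
        raise ValueError("cutoff must exceed the baseline latency")
    return 100.0 * (test_s - baseline_s) / (cutoff_s - baseline_s)

# Illustrative tail-immersion latencies: 3 s baseline, 9 s after acute
# morphine, 15 s cutoff.
print(percent_mpe(3.0, 9.0, 15.0))  # 50.0
```

Tolerance then appears as a decline in %MPE across repeated injections, while hyperalgesia appears as negative %MPE values measured before the daily morphine dose.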
Neuroscience, Issue 89, mice, nociception, tail immersion test, tail pressure test, morphine, analgesia, opioid-induced hyperalgesia, tolerance
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Authors: Andreas Florian Haas, Ben Knowles, Yan Wei Lim, Tracey McDole Somera, Linda Wegley Kelly, Mark Hatay, Forest Rohwer.
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients), and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a range of scientific disciplines, comprising both well-established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement; this has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. The protocols give easy-to-follow guidelines for conducting comprehensive surveys and discuss critical steps and potential caveats pertinent to each technique.
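Direct viral and microbial counts by epifluorescence microscopy are typically scaled from counts per microscope field up to abundance per milliliter of sample. A minimal sketch of that standard calculation, with illustrative numbers (the protocol's actual filter and field geometry may differ):

```python
def cells_per_ml(mean_count_per_field: float, filter_area_mm2: float,
                 field_area_mm2: float, volume_ml: float) -> float:
    """Direct-count abundance from epifluorescence microscopy.

    Scales the mean count per microscope field up to the whole effective
    filtration area, then divides by the volume of sample filtered.
    """
    if min(filter_area_mm2, field_area_mm2, volume_ml) <= 0:
        raise ValueError("areas and volume must be positive")
    total_cells = mean_count_per_field * (filter_area_mm2 / field_area_mm2)
    return total_cells / volume_ml

# Illustrative numbers: 25 cells per field, 200 mm^2 effective filtration
# area, 0.01 mm^2 field of view, 1 ml of seawater filtered.
print(round(cells_per_ml(25, 200, 0.01, 1.0)))  # 500000
```

The same scaling applies to SYBR-stained viral counts; only the stain and the typical abundances (roughly tenfold higher for viruses than for bacteria in seawater) change.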
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform a quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm) - marked mutant alleles from engineered Hfr (High frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan) - marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
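The scoring step, comparing double-mutant growth to the expectation from the single mutants, is often formalized with a multiplicative model. A minimal sketch (the in-house automated scoring system referenced in the abstract will differ in detail, and the fitness values below are illustrative):

```python
def gi_score(double_fitness: float, fitness_a: float, fitness_b: float) -> float:
    """Genetic interaction score under the multiplicative model.

    The expected double-mutant fitness is the product of the single-mutant
    fitnesses; the score epsilon = observed - expected. Significantly
    negative scores indicate aggravating (synthetic sick/lethal)
    interactions, significantly positive scores alleviating ones.
    """
    return double_fitness - fitness_a * fitness_b

# Illustrative colony-size fitness values (wild type = 1.0):
print(gi_score(0.2, 0.8, 0.9))  # strongly negative -> aggravating
print(gi_score(0.8, 0.8, 0.9))  # positive -> alleviating
```

In practice the observed fitnesses come from normalized colony sizes on the imaged plates, and statistical cutoffs separate genuine interactions from measurement noise before network construction.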
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Breathing-controlled Electrical Stimulation (BreEStim) for Management of Neuropathic Pain and Spasticity
Authors: Sheng Li.
Institutions: University of Texas Health Science Center at Houston, TIRR Memorial Hermann Hospital.
Electrical stimulation (EStim) refers to the application of electrical current to muscles or nerves in order to achieve functional and therapeutic goals. It has been extensively used in various clinical settings. Based upon recent discoveries related to the systemic effects of voluntary breathing and intrinsic physiological interactions among systems during voluntary breathing, a new EStim protocol, Breathing-controlled Electrical Stimulation (BreEStim), has been developed to augment the effects of electrical stimulation. In BreEStim, a single-pulse electrical stimulus is triggered and delivered to the target area when the airflow rate of an isolated voluntary inspiration reaches the threshold. BreEStim integrates intrinsic physiological interactions that are activated during voluntary breathing and has demonstrated excellent clinical efficacy. Two representative applications of BreEStim are reported with detailed protocols: management of post-stroke finger flexor spasticity and neuropathic pain in spinal cord injury.
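The core BreEStim logic, firing a single pulse when inspiratory airflow crosses a threshold, can be sketched offline on a sampled airflow trace. The sampling format, units, and refractory window below are illustrative assumptions; a real system reads the airflow sensor in real time:

```python
def breestim_triggers(airflow_lps, threshold_lps, refractory_samples):
    """Return sample indices at which a single stimulus pulse would fire.

    A pulse is triggered when inspiratory airflow crosses the threshold
    from below; a refractory window then suppresses re-triggering within
    the same inspiration, so each voluntary breath yields one pulse.
    """
    triggers = []
    last = -refractory_samples - 1
    for i in range(1, len(airflow_lps)):
        crossed = airflow_lps[i - 1] < threshold_lps <= airflow_lps[i]
        if crossed and i - last > refractory_samples:
            triggers.append(i)
            last = i
    return triggers

# Two inspirations (airflow in L/s, illustrative), threshold 0.5 L/s:
flow = [0.0, 0.2, 0.6, 0.8, 0.4, 0.1, 0.0, 0.3, 0.7, 0.5, 0.2]
print(breestim_triggers(flow, 0.5, 3))  # [2, 8]
```

Gating the stimulus to voluntary inspiration in this way is what couples the pulse to the systemic effects of voluntary breathing that the protocol is designed to exploit.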
Medicine, Issue 71, Neuroscience, Neurobiology, Anatomy, Physiology, Behavior, electrical stimulation, BreEStim, electrode, voluntary breathing, respiration, inspiration, pain, neuropathic pain, pain management, spasticity, stroke, spinal cord injury, brain, central nervous system, CNS, clinical, electromyogram, neuromuscular electrical stimulation
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
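Fractional anisotropy, the voxelwise metric compared throughout this analysis, is computed from the three eigenvalues of the diffusion tensor. A minimal sketch of the standard formula (eigenvalues illustrative):

```python
import math

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    """Fractional anisotropy from the three diffusion tensor eigenvalues.

    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||, ranging
    from 0 (isotropic diffusion) to 1 (diffusion along a single axis).
    """
    mean = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    if den == 0.0:
        return 0.0  # no diffusion signal at all
    return math.sqrt(1.5) * num / den

# Isotropic voxel -> FA = 0; coherent white-matter-like voxel -> high FA.
print(fractional_anisotropy(1.0, 1.0, 1.0))             # 0.0
print(round(fractional_anisotropy(1.7, 0.2, 0.2), 2))   # 0.87
```

Because FA is a rotation-invariant scalar, it survives spatial normalization more gracefully than the full tensor, which is why the directional inter-dependencies mentioned above need special handling only for the tensor and fiber-tracking steps.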
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Reduced Itraconazole Concentration and Duration Are Successful in Treating Batrachochytrium dendrobatidis Infection in Amphibians
Authors: Laura A. Brannelly.
Institutions: James Cook University.
Amphibians are experiencing the greatest decline of any vertebrate class, and a leading cause of these declines is a fungal pathogen, Batrachochytrium dendrobatidis (Bd), which causes the disease chytridiomycosis. Captive assurance colonies are important worldwide for threatened amphibian species and may be the only lifeline for those in critical threat of extinction. Maintaining disease-free colonies is a priority of captive managers, yet safe and effective treatments for all species and across life stages have not been identified. The most widely used chemotherapeutic treatment is itraconazole, although the dosage commonly used can be harmful to some individuals and species. We performed a clinical treatment trial to assess whether a lower and safer but still effective dose of itraconazole could be found to cure Bd infections. We found that by reducing the treatment concentration from 0.01% to 0.0025% and reducing the treatment duration from 11 to 6 days of 5 min baths, frogs could be cured of Bd infection with fewer side effects and less treatment-associated mortality.
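Preparing the reduced-concentration bath is a simple C1V1 = C2V2 dilution. A minimal sketch; the stock concentration and bath volume below are hypothetical, not from the study:

```python
def stock_volume_ml(stock_pct: float, target_pct: float, bath_ml: float) -> float:
    """Volume of itraconazole stock needed for a treatment bath.

    Simple C1 * V1 = C2 * V2 dilution, with concentrations expressed as
    percentages and volumes in milliliters.
    """
    if target_pct > stock_pct:
        raise ValueError("cannot dilute up to a higher concentration")
    return target_pct * bath_ml / stock_pct

# Hypothetical 1% stock diluted to the reduced 0.0025% dose in a
# 200 ml bath.
print(round(stock_volume_ml(1.0, 0.0025, 200.0), 3))  # 0.5
```

At these small stock volumes, pipetting accuracy dominates the dosing error, which matters when the whole point of the protocol is a fourfold reduction in concentration.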
Immunology, Issue 85, Batrachochytrium dendrobatidis, itraconazole, chytridiomycosis, captive assurance colonies, amphibian conservation
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
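The release rate from a laboratory core incubation follows directly from the change in water-column P mass over the sediment area and incubation time. A minimal sketch of that calculation, with illustrative numbers (not from the study):

```python
def p_release_rate(c_start_mg_l: float, c_end_mg_l: float,
                   water_volume_l: float, core_area_m2: float,
                   days: float) -> float:
    """Sediment P release rate (mg P m^-2 d^-1) from a core incubation.

    Rate = (change in overlying-water P mass) / (sediment area * time).
    Negative values indicate net P uptake by the sediment.
    """
    if core_area_m2 <= 0 or days <= 0:
        raise ValueError("core area and incubation time must be positive")
    delta_mass_mg = (c_end_mg_l - c_start_mg_l) * water_volume_l
    return delta_mass_mg / (core_area_m2 * days)

# Illustrative incubation: P rises from 0.02 to 0.10 mg/l in 2 l of
# overlying water above a 0.005 m^2 core over 8 days.
print(round(p_release_rate(0.02, 0.10, 2.0, 0.005, 8.0), 2))  # 4.0
```

Comparing such rates between aerobic and anaerobic incubations, or before and after an alum treatment, is exactly how the relative importance of internal loading and the effectiveness of a treatment are assessed with this approach.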
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
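Such a triage scheme can be pictured as a small decision rule mapping coarse data-set characteristics to one of the four segmentation approaches. The thresholds and rules below are purely illustrative assumptions, not the authors' published scheme:

```python
# Hypothetical triage rule: map coarse data-set characteristics to one of
# the four segmentation approaches. Thresholds are illustrative only.
def triage(signal_to_noise, has_characteristic_shape, fraction_of_volume):
    if signal_to_noise < 5 and not has_characteristic_shape:
        return "manual model building + visualization"
    if signal_to_noise < 5:
        return "manual tracing + surface rendering"
    if fraction_of_volume < 0.01:
        return "semi-automated segmentation + surface rendering"
    return "automated algorithm + surface rendering and quantification"
```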
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals, and mercury, as well as nutrients (phosphorous, nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there was little difference among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. The assay is also robust for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins, or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from a binary yes/no to more continuous variables such as the latency and duration of proboscis extension, can be used to test hypotheses.
We also discuss some pitfalls that researchers commonly encounter when using the procedure for the first time.
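Once trials are recorded, the measures mentioned above reduce to simple summaries. A hypothetical sketch, assuming each trial is stored as a (responded, latency_s, duration_s) tuple, which is our own representation, not part of the published protocol:

```python
# Hypothetical summary of PER trials recorded as
# (responded, latency_s, duration_s) tuples.
def summarize_per(trials):
    """Return (proportion responding, mean latency among responders)."""
    responders = [t for t in trials if t[0]]
    p_response = len(responders) / len(trials)
    mean_latency = (sum(t[1] for t in responders) / len(responders)
                    if responders else None)
    return p_response, mean_latency
```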
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
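At the heart of localization microscopy is estimating each molecule's position from its pixelated fluorescence spot far more precisely than the pixel size, with precision improving with photon count. A toy sketch using a simple photon-weighted centroid; real FPALM pipelines typically fit a 2D Gaussian, and the data layout here is our assumption:

```python
# Toy single-molecule localization: photon-weighted centroid of a spot.
# Real FPALM analysis typically fits a 2D Gaussian; the
# {(px, py): photon_count} layout is our illustrative assumption.
def localize(photons):
    """Centroid position estimate, in pixel coordinates."""
    total = sum(photons.values())
    x = sum(px * c for (px, _), c in photons.items()) / total
    y = sum(py * c for (_, py), c in photons.items()) / total
    return x, y
```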
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
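The first stage of such methods, estimating the dominant local orientation of tissue patterns with a bank of Gabor filters, can be illustrated with a toy example. The kernel parameters and the tiny synthetic "stripe" patch below are our assumptions for illustration, not the authors' implementation:

```python
import math

# Toy sketch of orientation analysis with a Gabor filter bank. Parameter
# values (sigma, wavelength, aspect ratio, kernel size) are illustrative.
def gabor_kernel(theta, sigma=2.0, lam=4.0, gamma=0.5, half=6):
    """Real Gabor kernel at orientation theta, as a {(x, y): weight} dict."""
    k = {}
    for y in range(-half, half + 1):
        for x in range(-half, half + 1):
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xp * xp + gamma * gamma * yp * yp)
                                / (2 * sigma * sigma))
            k[(x, y)] = envelope * math.cos(2 * math.pi * xp / lam)
    return k

def dominant_orientation(patch, n_angles=8):
    """Return the filter angle (radians) with the strongest response."""
    best_theta, best_resp = 0.0, float("-inf")
    for i in range(n_angles):
        theta = math.pi * i / n_angles
        kernel = gabor_kernel(theta)
        resp = abs(sum(w * patch.get(pos, 0.0) for pos, w in kernel.items()))
        if resp > best_resp:
            best_theta, best_resp = theta, resp
    return best_theta

# Synthetic patch: intensity stripes varying along x (orientation 0 rad)
stripes = {(x, y): math.cos(2 * math.pi * x / 4.0)
           for x in range(-6, 7) for y in range(-6, 7)}
```

Applied to the synthetic stripe patch, the filter matched to the stripe orientation produces the strongest response, which is the basis for building the orientation field analyzed by phase portraits.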
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer), is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions; thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light; so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. 
Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
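The activity calculation itself is straightforward: convert the fluorescence increase to moles of cleaved substrate via a standard curve, then normalize by dry soil mass and incubation time. A minimal sketch; the function name, units, and linear standard-curve factor are our illustrative choices:

```python
# Sketch of converting a fluorescence increase to potential enzyme
# activity. The linear standard-curve factor and units are assumptions.
def potential_activity(fluor_final, fluor_initial, nmol_per_fluor_unit,
                       dry_soil_g, incubation_hr):
    """Potential activity in nmol substrate cleaved g^-1 dry soil hr^-1."""
    nmol_released = (fluor_final - fluor_initial) * nmol_per_fluor_unit
    return nmol_released / dry_soil_g / incubation_hr
```

For example, a rise of 1,200 fluorescence units over a 3 hr incubation of 1 g dry soil, with 0.05 nmol per fluorescence unit from a standard curve, gives 20 nmol g⁻¹ hr⁻¹.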
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Predicting the Effectiveness of Population Replacement Strategy Using Mathematical Modeling
Authors: John Marshall, Koji Morikawa, Nicholas Manoukis, Charles Taylor.
Institutions: University of California, Los Angeles.
Charles Taylor and John Marshall explain the utility of mathematical modeling for evaluating the effectiveness of population replacement strategy. Insight is given into how computational models can provide information on the population dynamics of mosquitoes and the spread of transposable elements through A. gambiae subspecies. The ethical considerations of releasing genetically modified mosquitoes into the wild are discussed.
Cellular Biology, Issue 5, mosquito, malaria, population, replacement, modeling, infectious disease
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,5,12,20) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of the optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25.
In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with a multiobjective evolutionary algorithm SPEA226, and user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.
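The bookkeeping at the core of any such multiobjective search is Pareto dominance: a candidate allocation survives if no other candidate is at least as good on both objectives (cost and pollution, both minimized) and strictly better on one. A sketch of that logic, with a toy cost/pollution evaluation of our own invention; SPEA2's strength fitness and archive truncation are omitted:

```python
# Pareto-dominance core of a multiobjective search over conservation-
# practice allocations. The toy evaluate() (uniform practice costs,
# linear pollution reduction) is our own invention, not the SWAT model.
def evaluate(allocation):
    """Map a 0/1 practice allocation to (cost, pollution), both minimized."""
    n_practices = sum(allocation)
    cost = 2.0 * n_practices               # assumed cost per practice
    pollution = 10.0 - 1.5 * n_practices   # assumed baseline and reduction
    return (cost, pollution)

def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and
    strictly better somewhere (both objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Non-dominated subset: the cost/water-quality tradeoff frontier."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

Candidate allocations evaluated to (0, 10), (2, 8.5), and (10, 2.5) are mutually non-dominated and together form the tradeoff frontier, while a point like (5, 9) is dominated by (2, 8.5) and drops out.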
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. 
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
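The simplest multivariate pattern extraction, finding the covariance pattern that explains the most variance across regions, can be sketched with power iteration on the region-by-region covariance matrix. This is a pure-Python illustration of the general idea, not the specific method used in the article:

```python
# Sketch: extract the leading covariance pattern from region-by-subject
# data via power iteration. Illustrative of the general idea only.
def leading_pattern(data, iters=200):
    """data: list of regions, each a list of per-subject values.
    Returns the unit-norm first principal component across regions."""
    n, m = len(data), len(data[0])
    means = [sum(row) / m for row in data]
    centered = [[v - mu for v in row] for row, mu in zip(data, means)]
    # region-by-region covariance matrix
    cov = [[sum(centered[i][k] * centered[j][k] for k in range(m)) / (m - 1)
            for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

On toy data where region 2 tracks region 1 at twice the amplitude and region 3 is flat, the recovered pattern weights the two covarying regions in a 1:2 ratio and ignores the flat region, which is the "signature of a neural network" interpretation described above.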
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Choice and No-Choice Assays for Testing the Resistance of A. thaliana to Chewing Insects
Authors: Martin De Vos, Georg Jander.
Institutions: Cornell University.
Larvae of the small white cabbage butterfly (Pieris rapae) are a pest in agricultural settings. This caterpillar species feeds on plants in the cabbage family, which includes many crops such as cabbage, broccoli, and Brussels sprouts. The insects are reared on cabbage plants in the greenhouse. At least two cages are needed for rearing Pieris rapae: one for the larvae and one for the adult butterflies. To investigate the role of plant hormones and toxic plant chemicals in resistance to this insect pest, we demonstrate two experiments. First, we determine the role of jasmonic acid (JA, a plant hormone often implicated in resistance to insects) in resistance to the chewing insect Pieris rapae. Caterpillar growth can be compared on wild-type plants and on mutant plants impaired in JA production. This experiment is considered "No Choice", because larvae are forced to subsist on a single plant that either synthesizes JA or is deficient in it. Second, we demonstrate an experiment that investigates the role of glucosinolates, which are used as oviposition (egg-laying) signals. Here, we use wild-type and mutant Arabidopsis impaired in glucosinolate production in a "Choice" experiment, in which female butterflies are allowed to choose to lay their eggs on plants of either genotype. This video demonstrates the experimental setup for both assays as well as representative results.
Plant Biology, Issue 15, Annual Review, Plant Resistance, Herbivory, Arabidopsis thaliana, Pieris rapae, Caterpillars, Butterflies, Jasmonic Acid, Glucosinolates
The Structure of Skilled Forelimb Reaching in the Rat: A Movement Rating Scale
Authors: Ian Q Whishaw, Paul Whishaw, Bogdan Gorny.
Institutions: University of Lethbridge.
Skilled reaching for food is an evolutionarily ancient act and is displayed by many animal species, including those in the sister clades of rodents and primates. The video describes a test situation that allows filming of repeated acts of reaching for food by a rat that has been mildly food deprived. A rat is trained to reach through a slot in a holding box for a food pellet that it grasps and then places in its mouth for eating. Reaching is accomplished in the main by proximally driven movements of the limb, but distal limb movements are used for pronating the paw, grasping the food, and releasing the food into the mouth. Each reach is divided into at least 10 movements of the forelimb, and the reaching act is facilitated by postural adjustments. Each of the movements is described, and examples of the movements are given from a number of viewing perspectives. By rating each movement element on a 3-point scale, the reach can be quantified. A number of studies have demonstrated that the movement elements are altered by motor system damage, including damage to the motor cortex, basal ganglia, brainstem, and spinal cord.
The movements are also altered in neurological conditions that can be modeled in the rat, including Parkinson's disease and Huntington's disease. Thus, the rating scale is useful for quantifying motor impairments and the effectiveness of neural restoration and rehabilitation. Because the reaching act of the rat is very similar to that displayed by humans and nonhuman primates, the scale can also be used for comparative purposes. Experiments on animals were performed in accordance with the guidelines and regulations set forth by the University of Lethbridge Animal Care Committee and with the regulations of the Canadian Council on Animal Care.
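Tallying such a rating scale is mechanical: each movement element receives a score and the scores are summed. The element names and the 0/1/2 scoring convention below are illustrative placeholders, not the published scale:

```python
# Illustrative tally for a 10-element reach rating. Element names and
# the 0/1/2 convention are placeholders, not the published scale.
ELEMENTS = ["limb lift", "digits close", "aim", "advance", "digits open",
            "pronation", "grasp", "supination I", "supination II", "release"]

def score_reach(ratings):
    """Return (total score, percent of maximum) for one reach."""
    if len(ratings) != len(ELEMENTS):
        raise ValueError("one rating per movement element")
    if set(ratings) - {0, 1, 2}:
        raise ValueError("each rating must be 0, 1, or 2")
    total = sum(ratings)
    return total, 100.0 * total / (2 * len(ELEMENTS))
```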
Neuroscience, Issue 18, rat skilled reaching, rat reaching scale, rat, rat movement element rating scale, reaching elements
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
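A common way to implement this kind of abstract-to-video matching is TF-IDF term weighting with cosine similarity. The sketch below illustrates that general technique; it is our assumption for illustration, not a description of JoVE's actual algorithm:

```python
import math
from collections import Counter

# Illustrative TF-IDF + cosine-similarity matcher; not JoVE's algorithm.
def tf_idf_vectors(docs):
    """One sparse {term: weight} vector per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({term: (count / len(toks))
                        * math.log((1 + n) / (1 + df[term]))
                        for term, count in tf.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors; 0 if no overlap."""
    dot = sum(w * b.get(term, 0.0) for term, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Ranking all videos by similarity to an abstract and keeping the top 10 to 30 would produce a related-methods list; as the text notes, abstracts with no vocabulary overlap with any video will score near zero and can yield weakly related matches.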

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.