Pubmed Article
Weather indices for designing micro-insurance products for small-holder farmers in the tropics.
Agriculture is inherently risky. Drought is a particularly troublesome hazard with a documented adverse impact on agricultural development. A long history of decision-support tools has been developed to help farmers and policy makers manage risk. We offer a site-specific drought insurance methodology as a significant addition to this process. Drought insurance works by encapsulating the best available scientific estimate of drought probability and severity at a site within a single number: the insurance premium, which insurers offer to insurable parties in a transparent risk-sharing agreement. The proposed method is demonstrated in a case study for dry beans in Nicaragua.
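The premium logic described above can be sketched numerically. This is an illustrative sketch, not the authors' model: the payout tiers, drought probabilities, and loading factor below are hypothetical values for a weather-index contract.

```python
# Hedged sketch of a weather-index insurance premium: the actuarially fair
# premium is the expected payout, and a loading factor covers the insurer's
# costs and risk margin. All numbers are hypothetical illustrations.

def index_premium(payouts, probabilities, loading=0.2):
    """Premium = expected payout * (1 + loading)."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    expected_payout = sum(p * x for p, x in zip(probabilities, payouts))
    return expected_payout * (1.0 + loading)

# Hypothetical drought index: payout tiers triggered by low seasonal rainfall.
payouts = [0.0, 50.0, 200.0]        # USD/ha for no / moderate / severe drought
probabilities = [0.80, 0.15, 0.05]  # site-specific probabilities of each tier
premium = index_premium(payouts, probabilities)
print(premium)  # → 21.0 (expected payout 17.5 plus 20% loading)
```

The single premium number thus summarizes the site's entire estimated drought distribution, which is the transparency argument made above.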
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Published: 09-19-2012
Most of the choices we make have uncertain consequences. In some cases the probabilities of the different possible outcomes are precisely known, a condition termed "risk"; in other cases the probabilities cannot be estimated, a condition termed "ambiguity". While most people are averse to both risk and ambiguity1,2, the degree of these aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can differ greatly between individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial, subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective value that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; the ambiguous options thus remain ambiguous, and risk attitudes remain stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject.
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
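As an illustration of how choice-derived attitudes translate into subjective values, here is a minimal sketch of a parameterization of this kind (cf. Levy et al. 2010), in which alpha captures risk attitude and beta captures ambiguity attitude. The alpha and beta values below are hypothetical; in the study such parameters are fitted to each subject's actual choices.

```python
# Hedged sketch: subjective value (SV) of a lottery under risk and ambiguity,
#   SV = (p - beta * A / 2) * amount ** alpha
# where A is the fraction of the probability scale occluded (A = 0 for a
# purely risky lottery). alpha < 1 models risk aversion; beta > 0 models
# ambiguity aversion. Parameter values here are illustrative only.

def subjective_value(amount, p, ambiguity=0.0, alpha=0.8, beta=0.6):
    """Return the subjective value of a lottery for one hypothetical subject."""
    return (p - beta * ambiguity / 2.0) * amount ** alpha

risky = subjective_value(20.0, p=0.5)                 # known 50% chance of $20
ambig = subjective_value(20.0, p=0.5, ambiguity=0.5)  # half the scale occluded
print(risky > ambig)  # → True: an ambiguity-averse subject discounts ambiguity
```

Regressing trial-by-trial fMRI activation against such per-option SV estimates is what identifies the value-coding regions described above.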
Nerve Excitability Assessment in Chemotherapy-induced Neurotoxicity
Authors: Susanna B. Park, Cindy S-Y. Lin, Matthew C. Kiernan.
Institutions: University of New South Wales.
Chemotherapy-induced neurotoxicity is a serious consequence of cancer treatment that occurs with some of the most commonly used chemotherapies1,2. Chemotherapy-induced peripheral neuropathy produces symptoms of numbness and paraesthesia in the limbs and may progress to difficulties with fine motor skills and walking, leading to functional impairment. In addition to producing troubling symptoms, chemotherapy-induced neuropathy may limit treatment success, leading to dose reduction or early cessation of treatment. Neuropathic symptoms may persist long-term, leaving permanent nerve damage in patients with an otherwise good prognosis3. As chemotherapy is utilised more often as a preventative measure and survival rates increase, the importance of long-lasting and significant neurotoxicity will grow. There are no established neuroprotective or treatment options, and sensitive assessment methods are lacking. Appropriate assessment of neurotoxicity will be critical both as a prognostic factor and as a suitable endpoint for future trials of neuroprotective agents. Current methods to assess the severity of chemotherapy-induced neuropathy rely on clinician-based grading scales, which have been demonstrated to lack sensitivity to change and inter-observer objectivity4. Conventional nerve conduction studies provide information about compound action potential amplitude and conduction velocity, which are relatively non-specific measures and do not provide insight into ion channel function or resting membrane potential. Accordingly, prior studies have demonstrated that conventional nerve conduction studies are not sensitive to early change in chemotherapy-induced neurotoxicity4-6. In comparison, nerve excitability studies utilize threshold tracking techniques, which have been developed to enable assessment of ion channels, pumps and exchangers in vivo in large myelinated human axons7-9.
Nerve excitability techniques have been established as a tool to examine the development and severity of chemotherapy-induced neurotoxicity10-13. Comprising a number of excitability parameters, nerve excitability studies can be used to assess acute neurotoxicity arising immediately following infusion and the development of chronic, cumulative neurotoxicity. Nerve excitability techniques are feasible in the clinical setting, with each test requiring only 5-10 minutes to complete. Nerve excitability equipment is readily available commercially, and a portable system has been devised so that patients can be tested in situ in the infusion centre setting. In addition, these techniques can be adapted for use with multiple chemotherapies. In patients treated with oxaliplatin, a chemotherapy primarily utilised for colorectal cancer, nerve excitability techniques provide a method to identify patients at risk of neurotoxicity prior to the onset of chronic neuropathy. Nerve excitability studies have revealed the development of an acute Na+ channelopathy in motor and sensory axons10-13. Importantly, patients who demonstrated changes in excitability early in treatment were subsequently more likely to develop moderate to severe neurotoxicity11. However, across treatment, striking longitudinal changes were identified only in sensory axons, and these were able to predict clinical neurological outcome in 80% of patients10. These changes demonstrated a different pattern to those seen acutely following oxaliplatin infusion, and most likely reflect the development of significant axonal damage and membrane potential change in sensory nerves during the course of oxaliplatin treatment10. Significant abnormalities developed during early treatment, prior to any reduction in conventional measures of nerve function, suggesting that excitability parameters may provide a sensitive biomarker of neurotoxicity.
Neuroscience, Issue 62, Chemotherapy, Neurotoxicity, Neuropathy, Nerve excitability, Ion channel function, Oxaliplatin, oncology, medicine
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Antonie van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that can complicate the reaction and produce spurious results. When PCR fails, it can yield many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels; sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced into the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
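Since primer melting temperature (Tm) is central to choosing annealing conditions, the two simplest textbook approximations can be sketched as follows. These are rough first-pass estimates only; serious primer design typically relies on nearest-neighbor thermodynamic models instead.

```python
# Two classic rough Tm estimates for PCR primers:
# - the Wallace rule (2 °C per A/T, 4 °C per G/C), commonly used for
#   short oligos (under ~14 nt);
# - the basic GC-content formula, commonly used for longer primers.

def tm_wallace(primer):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), in °C."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

def tm_gc(primer):
    """GC-content formula: Tm = 64.9 + 41*(GC - 16.4)/N, in °C."""
    primer = primer.upper()
    gc = primer.count("G") + primer.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(primer)

print(tm_wallace("ACGTACGTACGT"))              # → 36 (6 A/T + 6 G/C)
print(round(tm_gc("ACGTACGTACGTACGTACGT"), 2)) # → 51.78 (20-mer, 50% GC)
```

A common rule of thumb is then to set the annealing temperature a few degrees below the lower Tm of the primer pair, and to keep the two primers' Tm values close to each other.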
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Non-radioactive in situ Hybridization Protocol Applicable for Norway Spruce and a Range of Plant Species
Authors: Anna Karlgren, Jenny Carlsson, Niclas Gyllenstrand, Ulf Lagercrantz, Jens F. Sundström.
Institutions: Uppsala University, Swedish University of Agricultural Sciences.
The high-throughput expression analysis technologies available today give scientists an overflow of expression profiles, but their resolution in terms of tissue-specific expression is limited by the difficulty of dissecting individual tissues. Expression data therefore need to be confirmed and complemented with expression patterns obtained by, e.g., in situ hybridization, a technique used to localize cell-specific mRNA expression. The in situ hybridization method is laborious, time-consuming and often requires extensive optimization depending on species and tissue. In situ experiments are particularly difficult to perform in woody species such as the conifer Norway spruce (Picea abies). Here we present a modified DIG in situ hybridization protocol, which is fast and applicable to a wide range of plant species including P. abies. With just a few adjustments, including altered RNase treatment and proteinase K concentration, we could use the protocol to study tissue-specific expression of homologous genes in male reproductive organs of one gymnosperm and two angiosperm species: P. abies, Arabidopsis thaliana and Brassica napus. The protocol worked equally well for all species and genes studied. AtAP3 and BnAP3 expression was observed in second- and third-whorl floral organs in A. thaliana and B. napus, and DAL13 in microsporophylls of male cones from P. abies. For P. abies, the proteinase K concentration used to permeabilize the tissues had to be increased to 3 μg/ml instead of 1 μg/ml, possibly due to more compact tissues and higher levels of phenolics and polysaccharides. For all species the RNase treatment was omitted because it reduced signal strength without a corresponding increase in specificity. By comparing tissue-specific expression patterns of homologous genes from both flowering plants and a coniferous tree, we demonstrate that the DIG in situ protocol presented here, with only minute adjustments, can be applied to a wide range of plant species.
Hence, the protocol avoids both extensive species-specific optimization and the laborious use of radioactively labeled probes in favor of DIG-labeled probes. We have chosen to illustrate the technically demanding steps of the protocol in our film. Anna Karlgren and Jenny Carlsson contributed equally to this study. Corresponding authors: Anna Karlgren and Jens F. Sundström.
Plant Biology, Issue 26, RNA, expression analysis, Norway spruce, Arabidopsis, rapeseed, conifers
Single Plane Illumination Module and Micro-capillary Approach for a Wide-field Microscope
Authors: Thomas Bruns, Sarah Schickinger, Herbert Schneckenburger.
Institutions: Aalen University.
A module for light sheet or single plane illumination microscopy (SPIM) is described which is easily adapted to an inverted wide-field microscope and optimized for 3-dimensional cell cultures, e.g., multi-cellular tumor spheroids (MCTS). The SPIM excitation module shapes and deflects the light such that the sample is illuminated by a light sheet perpendicular to the detection path of the microscope. The system is characterized by use of a rectangular capillary for holding (and in an advanced version also by a micro-capillary approach for rotating) the samples, by synchronous adjustment of the illuminating light sheet and the objective lens used for fluorescence detection as well as by adaptation of a microfluidic system for application of fluorescent dyes, pharmaceutical agents or drugs in small quantities. A protocol for working with this system is given, and some technical details are reported. Representative results include (1) measurements of the uptake of a cytostatic drug (doxorubicin) and its partial conversion to a degradation product, (2) redox measurements by use of a genetically encoded glutathione sensor upon addition of an oxidizing agent, and (3) initiation and labeling of cell necrosis upon inhibition of the mitochondrial respiratory chain. Differences and advantages of the present SPIM module in comparison with existing systems are discussed.
Physics, Issue 90, Fluorescence, light sheet, single plane illumination microscopy (SPIM), 3D cell cultures, rectangular capillary, microfluidics, multi-cellular tumor spheroids (MCTS), wide-field microscopy
Fabrication And Characterization Of Photonic Crystal Slow Light Waveguides And Cavities
Authors: Christopher Paul Reardon, Isabella H. Rey, Karl Welna, Liam O'Faolain, Thomas F. Krauss.
Institutions: University of St Andrews.
Slow light has been one of the hot topics in the photonics community over the past decade, generating great interest both from a fundamental point of view and for its considerable potential for practical applications. Slow-light photonic crystal waveguides, in particular, have played a major part and have been successfully employed to delay optical signals1-4 and to enhance both linear5-7 and nonlinear devices.8-11 Photonic crystal cavities achieve effects similar to those of slow-light waveguides, but over a reduced bandwidth. These cavities offer a high Q-factor/volume ratio for the realization of optically12 and electrically13 pumped ultra-low-threshold lasers and the enhancement of nonlinear effects.14-16 Furthermore, passive filters17 and modulators18-19 have been demonstrated, exhibiting ultra-narrow linewidth, high free spectral range and record-low energy consumption. To attain these exciting results, a robust, repeatable fabrication protocol must be developed. In this paper we take an in-depth look at our fabrication protocol, which employs electron-beam lithography for the definition of photonic crystal patterns and uses wet and dry etching techniques. Our optimised fabrication recipe results in photonic crystals that do not suffer from vertical asymmetry and exhibit very low sidewall roughness. We discuss the results of varying the etching parameters and the detrimental effects they can have on a device, leading to a diagnostic route that can be taken to identify and eliminate similar issues. The key to evaluating slow-light waveguides is the passive characterization of transmission and group index spectra.
Various methods have been reported, most notably resolving the Fabry-Perot fringes of the transmission spectrum20-21 and interferometric techniques.22-25 Here, we describe a direct, broadband measurement technique combining spectral interferometry with Fourier transform analysis.26 Our method stands out for its simplicity and power: we can characterise a bare photonic crystal with access waveguides, without the need for on-chip interference components, and the setup consists only of a Mach-Zehnder interferometer, with no moving parts or delay scans. When characterising photonic crystal cavities, techniques involving internal sources21 or external waveguides directly coupled to the cavity27 affect the performance of the cavity itself, thereby distorting the measurement. Here, we describe a novel, non-intrusive technique known as resonant scattering (RS), which makes use of a cross-polarised probe beam coupled out-of-plane into the cavity through an objective. The technique was first demonstrated by McCutcheon et al.28 and further developed by Galli et al.29
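The Fabry-Perot fringe method mentioned above reduces to a short calculation: for a waveguide of length L acting as a low-finesse cavity, the group index follows from the fringe spacing Δλ in the transmission spectrum as n_g = λ² / (2·L·Δλ). The sketch below applies that standard formula; the specific wavelength, fringe spacing, and waveguide length are hypothetical numbers, not values from this paper.

```python
# Group index from Fabry-Perot fringe spacing in a waveguide transmission
# spectrum: n_g = lambda^2 / (2 * L * delta_lambda). Inputs in convenient
# lab units (nm, nm, um); all values below are illustrative.

def group_index(wavelength_nm, fringe_spacing_nm, length_um):
    lam = wavelength_nm * 1e-9       # center wavelength [m]
    dlam = fringe_spacing_nm * 1e-9  # fringe spacing [m]
    L = length_um * 1e-6             # waveguide length [m]
    return lam**2 / (2 * L * dlam)

# Hypothetical: 1550 nm light, 0.4 nm fringe spacing, 100 um waveguide.
print(round(group_index(1550.0, 0.4, 100.0), 2))  # → 30.03
```

Tracking how the fringe spacing shrinks as the wavelength approaches the band edge is what maps out the group index spectrum of a slow-light waveguide.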
Physics, Issue 69, Optics and Photonics, Astronomy, light scattering, light transmission, optical waveguides, photonics, photonic crystals, Slow-light, Cavities, Waveguides, Silicon, SOI, Fabrication, Characterization
Genetic Manipulation in Δku80 Strains for Functional Genomic Analysis of Toxoplasma gondii
Authors: Leah M. Rommereim, Miryam A. Hortua Triana, Alejandra Falla, Kiah L. Sanders, Rebekah B. Guevara, David J. Bzik, Barbara A. Fox.
Institutions: The Geisel School of Medicine at Dartmouth.
Targeted genetic manipulation using homologous recombination is the method of choice for functional genomic analysis to obtain a detailed view of gene function and phenotype(s). The development of mutant strains with targeted gene deletions, targeted mutations, complemented gene function, and/or tagged genes provides powerful strategies for addressing gene function, particularly if these genetic manipulations can be efficiently targeted to the gene locus of interest using integration mediated by double-crossover homologous recombination. Due to very high rates of nonhomologous recombination, functional genomic analysis of Toxoplasma gondii has previously been limited by the absence of efficient methods for targeting gene deletions and gene replacements to specific genetic loci. Recently, we abolished the major pathway of nonhomologous recombination in type I and type II strains of T. gondii by deleting the gene encoding the KU80 protein1,2. The Δku80 strains behave normally during tachyzoite (acute) and bradyzoite (chronic) stages in vitro and in vivo and exhibit essentially a 100% frequency of homologous recombination. The Δku80 strains make functional genomic studies feasible on the single-gene as well as the genome scale1-4. Here, we report methods for using type I and type II Δku80Δhxgprt strains to advance gene targeting approaches in T. gondii. We outline efficient methods for generating gene deletions, gene replacements, and tagged genes by targeted insertion or deletion of the hypoxanthine-xanthine-guanine phosphoribosyltransferase (HXGPRT) selectable marker. The described gene targeting protocol can be used in a variety of ways in Δku80 strains to advance functional analysis of the parasite genome and to develop single strains that carry multiple targeted genetic manipulations. The application of this genetic method and subsequent phenotypic assays will reveal fundamental and unique aspects of the biology of T. gondii and of related significant human pathogens that cause malaria (Plasmodium sp.) and cryptosporidiosis (Cryptosporidium).
Infectious Diseases, Issue 77, Genetics, Microbiology, Infection, Medicine, Immunology, Molecular Biology, Cellular Biology, Biomedical Engineering, Bioengineering, Genomics, Parasitology, Pathology, Apicomplexa, Coccidia, Toxoplasma, Genetic Techniques, Gene Targeting, Eukaryota, Toxoplasma gondii, genetic manipulation, gene targeting, gene deletion, gene replacement, gene tagging, homologous recombination, DNA, sequencing
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal but low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, such as structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
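The source-reconstruction step can be sketched in its simplest Tikhonov-regularized minimum-norm form: given a lead-field matrix L (channels × sources) derived from the head model, the source estimate is x = Lᵀ(LLᵀ + λ²I)⁻¹y. The lead field below is random toy data standing in for a real head model, and the regularization value is arbitrary; practical pipelines add noise whitening and depth weighting on top of this.

```python
# Minimal minimum-norm estimate (MNE) sketch with a toy lead field.
# A real lead field comes from a boundary- or finite-element head model;
# here it is random data, used only to show the linear-algebra step.
import numpy as np

def minimum_norm(leadfield, eeg, lam=0.1):
    """x = L^T (L L^T + lam^2 I)^(-1) y  (Tikhonov-regularized MNE)."""
    n_ch = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam**2 * np.eye(n_ch)
    return leadfield.T @ np.linalg.solve(gram, eeg)

rng = np.random.default_rng(0)
L = rng.standard_normal((128, 2000))  # 128 channels, 2000 cortical sources
x_true = np.zeros(2000)
x_true[777] = 1.0                     # one simulated active source
y = L @ x_true                        # noiseless sensor-space topography
x_hat = minimum_norm(L, y)
print(int(np.argmax(np.abs(x_hat))))  # the simulated source dominates
```

Because the problem is underdetermined (far more sources than channels), the estimate is spatially smeared around the true source, which is why individualized or age-specific head models matter for localization accuracy in children.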
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages over traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols on these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials, including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
An Experimental Model to Study Tuberculosis-Malaria Coinfection upon Natural Transmission of Mycobacterium tuberculosis and Plasmodium berghei
Authors: Ann-Kristin Mueller, Jochen Behrends, Jannike Blank, Ulrich E. Schaible, Bianca E. Schneider.
Institutions: University Hospital Heidelberg, Research Center Borstel.
Coinfections naturally occur due to the geographic overlap of distinct types of pathogenic organisms. Concurrent infections most likely modulate the respective immune responses to each single pathogen and may thereby affect pathogenesis and disease outcome. Coinfected patients may also respond differently to anti-infective interventions. Coinfection between tuberculosis-causing mycobacteria and the malaria parasite Plasmodium, both of which are coendemic in many parts of sub-Saharan Africa, has not been studied in detail. In order to approach the challenging but scientifically and clinically highly relevant question of how malaria-tuberculosis coinfection modulates host immunity and the course of each disease, we established an experimental mouse model that allows us to dissect the immune responses elicited by both pathogens in the coinfected host. Of note, in order to mimic naturally acquired human infections as closely as possible, we infect mice with both pathogens by their natural routes of infection, i.e. aerosol and mosquito bite, respectively.
Infectious Diseases, Issue 84, coinfection, mouse, Tuberculosis, Malaria, Plasmodium berghei, Mycobacterium tuberculosis, natural transmission
Fast Micro-iontophoresis of Glutamate and GABA: A Useful Tool to Investigate Synaptic Integration
Authors: Christina Müller, Stefan Remy.
Institutions: University of Bonn, Deutsches Zentrum für Neurodegenerative Erkrankungen e.V. (DZNE).
One of the fundamental interests in neuroscience is to understand the integration of excitatory and inhibitory inputs along the very complex structure of the dendritic tree, which eventually leads to neuronal output of action potentials at the axon. The influence of diverse spatial and temporal parameters of specific synaptic input on neuronal output is currently under investigation, e.g. the distance-dependent attenuation of dendritic inputs, the location-dependent interaction of spatially segregated inputs, the influence of GABAergic inhibition on excitatory integration, linear and non-linear integration modes, and many more. With fast micro-iontophoresis of glutamate and GABA it is possible to precisely investigate the spatial and temporal integration of glutamatergic excitation and GABAergic inhibition. Critical technical requirements are a triggered fluorescent lamp, light-emitting diode (LED), or two-photon scanning microscope to visualize dendritic branches without introducing significant photo-damage of the tissue. Furthermore, it is very important to have a micro-iontophoresis amplifier that allows fast capacitance compensation of high-resistance pipettes. Another crucial point is that no transmitter is involuntarily released by the pipette during the experiment. Once established, this technique gives reliable and reproducible signals with high neurotransmitter and location specificity. Compared to glutamate and GABA uncaging, fast iontophoresis allows both transmitters to be used at the same time at very distant locations, without limitation to the field of view. There are also advantages over focal electrical stimulation of axons: with micro-iontophoresis the location of the input site is known with certainty, and only the neurotransmitter of interest is released.
However, it must be considered that micro-iontophoresis activates only the postsynapse; presynaptic aspects of neurotransmitter release are not resolved. In this article we demonstrate how to set up micro-iontophoresis in brain slice experiments.
Neuroscience, Issue 77, Neurobiology, Molecular Biology, Cellular Biology, Physiology, Biomedical Engineering, Biophysics, Biochemistry, biology (general), animal biology, Nervous System, Life Sciences (General), Neurosciences, brain slices, dendrites, inhibition, excitation, glutamate, GABA, micro-iontophoresis, iontophoresis, neurons, patch clamp, whole cell recordings
Prehospital Thrombolysis: A Manual from Berlin
Authors: Martin Ebinger, Sascha Lindenlaub, Alexander Kunz, Michal Rozanski, Carolin Waldschmidt, Joachim E. Weber, Matthias Wendt, Benjamin Winter, Philipp A. Kellner, Sabina Kaczmarek, Matthias Endres, Heinrich J. Audebert.
Institutions: Charité - Universitätsmedizin Berlin, Charité - Universitätsmedizin Berlin, Universitätsklinikum Hamburg - Eppendorf, Berliner Feuerwehr, STEMO-Consortium.
In acute ischemic stroke, time from symptom onset to intervention is a decisive prognostic factor. In order to reduce this time, prehospital thrombolysis at the emergency site would be preferable. However, apart from neurological expertise and laboratory investigations, a computed tomography (CT) scan is necessary to exclude hemorrhagic stroke prior to thrombolysis. Therefore, a specialized ambulance equipped with a CT scanner and a point-of-care laboratory was designed and constructed. Furthermore, a new stroke-identifying interview algorithm was developed and implemented in the Berlin emergency medical services. Since February 2011, the identification of suspected stroke in the dispatch center of the Berlin Fire Brigade prompts the deployment of this ambulance, a stroke emergency mobile (STEMO). On arrival, a neurologist experienced in stroke care and with additional training in emergency medicine performs a neurological examination. If stroke is suspected, a CT scan excludes intracranial hemorrhage. The CT scans are telemetrically transmitted to the neuroradiologist on call. If the patient's coagulation status is normal and the medical history reveals no contraindication, prehospital thrombolysis is applied according to current guidelines (intravenous recombinant tissue plasminogen activator, iv rtPA, alteplase, Actilyse). Thereafter, patients are transported to the nearest hospital with a certified stroke unit for further treatment and assessment of stroke aetiology. After a pilot phase, weeks were randomized into blocks either with or without STEMO care. The primary endpoint of this study is the time from alarm to the initiation of thrombolysis. We hypothesized that alarm-to-treatment time could be reduced by at least 20 min compared to regular care.
Medicine, Issue 81, Telemedicine, Emergency Medical Services, Stroke, Tomography, X-Ray Computed, Emergency Treatment, stroke, thrombolysis, prehospital, emergency medical services, ambulance
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorous, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there was little difference among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Using High Resolution Computed Tomography to Visualize the Three Dimensional Structure and Function of Plant Vasculature
Authors: Andrew J. McElrone, Brendan Choat, Dilworth Y. Parkinson, Alastair A. MacDowell, Craig R. Brodersen.
Institutions: U.S. Department of Agriculture, University of California - Davis, University of Western Sydney, Lawrence Berkeley National Lab, University of Florida.
High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem networks in three dimensions (3D) (e.g. Brodersen et al. 2010; 2011; 2012a,b). HRCT imaging is based on the same principles as medical CT systems, but a high intensity synchrotron x-ray source results in higher spatial resolution and decreased image acquisition time. Here, we demonstrate in detail how synchrotron-based HRCT (performed at the Advanced Light Source-LBNL Berkeley, CA, USA) in combination with Avizo software (VSG Inc., Burlington, MA, USA) is being used to explore plant xylem in excised tissue and living plants. This new imaging tool allows users to move beyond traditional static, 2D light or electron micrographs and study samples using virtual serial sections in any plane. An infinite number of slices in any orientation can be made on the same sample, a feature that is physically impossible using traditional microscopy methods. Results demonstrate that HRCT can be applied to both herbaceous and woody plant species, and to a range of plant organs (i.e. leaves, petioles, stems, trunks, roots). Figures presented here help demonstrate both a range of representative plant vascular anatomy and the type of detail extracted from HRCT datasets, including scans ranging from coast redwood (Sequoia sempervirens), walnut (Juglans spp.), oak (Quercus spp.), and maple (Acer spp.) tree saplings to sunflowers (Helianthus annuus), grapevines (Vitis spp.), and ferns (Pteridium aquilinum and Woodwardia fimbriata). Excised and dried samples from woody species are easiest to scan and typically yield the best images. However, recent improvements (i.e. more rapid scans and sample stabilization) have made it possible to use this visualization technique on green tissues (e.g. petioles) and in living plants.
On occasion, shrinkage of hydrated green plant tissues will cause images to blur; methods to avoid these issues are described. These recent advances with HRCT provide promising new insights into plant vascular function.
Plant Biology, Issue 74, Cellular Biology, Molecular Biology, Biophysics, Structural Biology, Physics, Environmental Sciences, Agriculture, botany, environmental effects (biological, animal and plant), plants, radiation effects (biological, animal and plant), CT scans, advanced visualization techniques, xylem networks, plant vascular function, synchrotron, x-ray micro-tomography, ALS 8.3.2, xylem, phloem, tomography, imaging
Fecal Microbiota Transplantation via Colonoscopy for Recurrent C. difficile Infection
Authors: Jessica R. Allegretti, Joshua R. Korzenik, Matthew J. Hamilton.
Institutions: Brigham and Women's Hospital.
Fecal Microbiota Transplantation (FMT) is a safe and highly effective treatment for recurrent and refractory C. difficile infection (CDI). Various methods of FMT administration have been reported in the literature, including nasogastric tube, upper endoscopy, enema, and colonoscopy. FMT via colonoscopy yields excellent cure rates and is well tolerated; patients find it an acceptable mode of delivery. At our Center, we have initiated a fecal transplant program for patients with recurrent or refractory CDI. We have developed a protocol using an iterative process of revision and have performed 24 fecal transplants on 22 patients, with success rates comparable to the current published literature. A systematic approach to patient and donor screening, preparation of stool, and delivery of the stool maximizes therapeutic success. Here we detail each step of the FMT protocol, which can be carried out at any endoscopy center with a high degree of safety and success.
Immunology, Issue 94, C. difficile, colonoscopy, fecal transplant, stool, diarrhea, microbiota
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting individual subjects to advance automatically from protocol to protocol. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
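The abstract above describes analyzing large time-stamped event records. As a hedged sketch (in Python rather than the authors' MATLAB-based language, with invented event names and data), the core idea of summarizing such a record can be illustrated as follows:

```python
# Hypothetical time-stamped event record: (seconds, event) pairs.
events = [
    (12.5, "hopper1_entry"), (13.1, "pellet"), (45.0, "hopper2_entry"),
    (46.2, "hopper2_entry"), (90.8, "hopper1_entry"),
]

def counts(record):
    """Tally how often each event type occurs in the record."""
    out = {}
    for _, ev in record:
        out[ev] = out.get(ev, 0) + 1
    return out

def intervals(record, event):
    """Inter-event intervals (s) for one event type, rounded for display."""
    times = [t for t, ev in record if ev == event]
    return [round(b - a, 3) for a, b in zip(times, times[1:])]

print(counts(events))
print(intervals(events, "hopper2_entry"))  # [1.2]
```

A daily "harvest" in the spirit of the described system would simply run such summaries over each subject's accumulated record.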
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements and involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to have the trainees implement adapted tango in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed, and 30 sec chair stand), safety, and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All of these characteristics need to be considered when deciding which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded stained samples, and focused ion beam and serial block face scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
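As a minimal sketch of the automated end of the segmentation spectrum described above (approach 3/4), the following toy example binarizes a synthetic 2D "image" by thresholding and then counts 4-connected features; the data and threshold are invented for illustration and are not from the study's data sets:

```python
# Synthetic intensity image; real data would be a large 3D volume.
image = [
    [0, 0, 9, 9, 0],
    [0, 0, 9, 0, 0],
    [7, 0, 0, 0, 8],
    [7, 7, 0, 0, 8],
]

def segment(img, threshold):
    """Binarize by threshold, then label 4-connected components."""
    h, w = len(img), len(img[0])
    mask = [[v >= threshold for v in row] for row in img]
    seen, features = set(), 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                features += 1
                stack = [(y, x)]
                while stack:  # flood-fill one connected component
                    cy, cx = stack.pop()
                    if not (0 <= cy < h and 0 <= cx < w):
                        continue
                    if (cy, cx) in seen or not mask[cy][cx]:
                        continue
                    seen.add((cy, cx))
                    stack += [(cy + 1, cx), (cy - 1, cx),
                              (cy, cx + 1), (cy, cx - 1)]
    return features

print(segment(image, threshold=5))  # 3 separate features
```

Surface rendering and quantitative analysis would then operate on the labeled components rather than the raw voxels.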
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals, including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors, such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations, and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
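As a hedged illustration of the DoE building blocks mentioned above, the sketch below constructs a two-level full factorial design and estimates main effects; the factor names and simulated responses are hypothetical placeholders, not the parameters or data of the study:

```python
from itertools import product

# Hypothetical factors, each at a low (-1) and high (+1) level.
factors = ["promoter", "incubation_temp", "leaf_age"]

def full_factorial(names):
    """Every combination of low/high settings: 2^k experimental runs."""
    return [dict(zip(names, levels))
            for levels in product((-1, +1), repeat=len(names))]

def main_effect(design, responses, name):
    """Mean response at the high level minus mean at the low level."""
    high = [r for run, r in zip(design, responses) if run[name] == +1]
    low = [r for run, r in zip(design, responses) if run[name] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

design = full_factorial(factors)  # 2^3 = 8 runs
# Simulated yields for illustration only.
responses = [10, 14, 11, 15, 9, 13, 10, 14]
print(len(design))                                    # 8
print(main_effect(design, responses, "incubation_temp"))  # 1.0
```

Software-guided DoE tools add fractional designs and step-wise augmentation on top of this basic idea, reducing the number of runs needed for many factors.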
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Measuring Spatial and Temporal Ca2+ Signals in Arabidopsis Plants
Authors: Xiaohong Zhu, Aaron Taylor, Shenyu Zhang, Dayong Zhang, Ying Feng, Gaimei Liang, Jian-Kang Zhu.
Institutions: Purdue University, Purdue University, Jiangsu Academy of Agricultural Sciences, Zhejiang University, Shanxi Academy of Agricultural Sciences, Chinese Academy of Sciences.
Developmental and environmental cues induce Ca2+ fluctuations in plant cells. Stimulus-specific spatial-temporal Ca2+ patterns are sensed by cellular Ca2+ binding proteins that initiate Ca2+ signaling cascades. However, we still know little about how stimulus-specific Ca2+ signals are generated. The specificity of a Ca2+ signal may be attributed to the sophisticated regulation of the activities of Ca2+ channels and/or transporters in response to a given stimulus. To identify these cellular components and understand their functions, it is crucial to use systems that allow a sensitive and robust recording of Ca2+ signals at both the tissue and cellular levels. Genetically encoded Ca2+ indicators that are targeted to different cellular compartments have provided a platform for live cell confocal imaging of cellular Ca2+ signals. Here we describe instructions for the use of two Ca2+ detection systems: aequorin-based FAS (film adhesive seedlings) luminescence Ca2+ imaging and Case12-based live cell confocal fluorescence Ca2+ imaging. Luminescence imaging using the FAS system provides a simple, robust and sensitive detection of spatial and temporal Ca2+ signals at the tissue level, while live cell confocal imaging using Case12 provides simultaneous detection of cytosolic and nuclear Ca2+ signals at high resolution.
Plant Biology, Issue 91, Aequorin, Case12, abiotic stress, heavy metal stress, copper ion, calcium imaging, Arabidopsis
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Principles of Site-Specific Recombinase (SSR) Technology
Authors: Frank Buchholz.
Institutions: Max Planck Institute for Molecular Cell Biology and Genetics, Dresden.
Site-specific recombinase (SSR) technology allows the manipulation of gene structure to explore gene function and has become an integral tool of molecular biology. Site-specific recombinases are proteins that bind to distinct DNA target sequences. The Cre/lox system was first described in bacteriophages during the 1980s. Cre recombinase is a Type I topoisomerase that catalyzes site-specific recombination of DNA between two loxP (locus of X-over P1) sites. The Cre/lox system does not require any cofactors. LoxP sequences contain distinct binding sites for Cre recombinases that surround a directional core sequence where recombination and rearrangement take place. When cells contain loxP sites and express the Cre recombinase, a recombination event occurs. Double-stranded DNA is cut at both loxP sites by the Cre recombinase, rearranged, and ligated ("scissors and glue"). Products of the recombination event depend on the relative orientation of the asymmetric sequences. SSR technology is frequently used as a tool to explore gene function. Here the gene of interest is flanked with Cre target sites loxP ("floxed"). Animals are then crossed with animals expressing the Cre recombinase under the control of a tissue-specific promoter. In tissues that express the Cre recombinase, it binds to target sequences and excises the floxed gene. Controlled gene deletion allows the investigation of gene function in specific tissues and at distinct time points. Analysis of gene function employing SSR technology (conditional mutagenesis) has significant advantages over traditional knockouts, where gene deletion is frequently lethal.
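The excision logic described above ("scissors and glue" between two directly repeated loxP sites) can be illustrated with a toy string operation. The 34-bp loxP sequence below is the canonical one (two 13-bp arms flanking an 8-bp directional core); the flanking "gene" sequences are made up for the example:

```python
# Canonical loxP: 13-bp arm + 8-bp directional core + 13-bp arm = 34 bp.
LOXP = "ATAACTTCGTATAATGTATGCTATACGAAGTTAT"

def cre_excise(dna):
    """Remove the segment between two directly repeated loxP sites,
    leaving a single loxP site behind ("scissors and glue")."""
    first = dna.find(LOXP)
    second = dna.find(LOXP, first + 1)
    if first == -1 or second == -1:
        return dna  # fewer than two sites: no recombination
    # The floxed segment plus one loxP copy is excised as a circle;
    # one loxP site remains in the chromosome.
    return dna[:first + len(LOXP)] + dna[second + len(LOXP):]

floxed = "AAAA" + LOXP + "GENEOFINTEREST" + LOXP + "TTTT"
print(cre_excise(floxed) == "AAAA" + LOXP + "TTTT")  # True
```

Inverted (head-to-head) loxP sites would instead flip the intervening segment, which this deliberately minimal sketch does not model.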
Cellular Biology, Issue 15, Molecular Biology, Site-Specific Recombinase, Cre recombinase, Cre/lox system, transgenic animals, transgenic technology
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Julie Gauthier, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Université de Montréal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. 
This is not what one would predict in diseases with reduced reproductive fitness, unless there were a high new mutation rate. Finally, the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining if the genetic basis of a disease may result in part from de novo mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
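The mutational-load argument above (a balance between selective loss of deleterious alleles and their replenishment by de novo mutation) can be made concrete with the classic back-of-the-envelope mutation-selection balance; the numbers below are illustrative, not estimates from the study:

```python
def equilibrium_frequency(mu, s):
    """Classic mutation-selection balance: at equilibrium the frequency of
    a deleterious class of alleles is roughly mu / s, where mu is the
    per-generation de novo mutation rate into the class and s is the
    selection coefficient (fitness reduction) against it."""
    return mu / s

# Illustrative values: a large mutational target (many genes) giving an
# aggregate de novo rate of 5e-3 per generation, with strong selection
# (50% fitness reduction), sustains a roughly constant ~1% prevalence.
print(equilibrium_frequency(mu=5e-3, s=0.5))  # 0.01
```

This is why a constant ~1% worldwide incidence despite reduced reproductive fitness is consistent with a substantial de novo contribution.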
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of a paper-based IV insulin protocol, analysis of usage data at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system.
By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was seen in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
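The metrics quoted above (percent of readings in target range, prevalence of hypoglycemia and hyperglycemia) reduce to simple arithmetic over a series of readings. As a hedged sketch with invented data, not measurements from WMC:

```python
# Hypothetical blood glucose readings (mg/dL).
readings = [95, 110, 145, 160, 62, 130, 175, 150, 120, 101]

def percent(readings, lo=None, hi=None):
    """Percent of readings within the optional [lo, hi] bounds (mg/dL)."""
    hits = [r for r in readings
            if (lo is None or r >= lo) and (hi is None or r <= hi)]
    return 100.0 * len(hits) / len(readings)

print(percent(readings, lo=70, hi=180))  # in target range: 90.0
print(percent(readings, hi=69))          # clinical hypoglycemia (BG < 70): 10.0
```

The same thresholds used in the abstract (40, 70, and 180 mg/dL) slot directly into the `lo`/`hi` bounds.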
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Measurement of Leaf Hydraulic Conductance and Stomatal Conductance and Their Responses to Irradiance and Dehydration Using the Evaporative Flux Method (EFM)
Authors: Lawren Sack, Christine Scoffoni.
Institutions: University of California, Los Angeles.
Water is a key resource, and the plant water transport system sets limits on maximum growth and drought tolerance. When plants open their stomata to achieve a high stomatal conductance (gs) to capture CO2 for photosynthesis, water is lost by transpiration1,2. Water evaporating from the airspaces is replaced from cell walls, in turn drawing water from the xylem of leaf veins, in turn drawing from xylem in the stems and roots. As water is pulled through the system, it experiences hydraulic resistance, creating tension throughout the system and a low leaf water potential (Ψleaf). The leaf itself is a critical bottleneck in the whole plant system, accounting for on average 30% of the plant hydraulic resistance3. Leaf hydraulic conductance (Kleaf = 1/ leaf hydraulic resistance) is the ratio of the water flow rate to the water potential gradient across the leaf, and summarizes the behavior of a complex system: water moves through the petiole and through several orders of veins, exits into the bundle sheath and passes through or around mesophyll cells before evaporating into the airspace and being transpired from the stomata. Kleaf is of strong interest as an important physiological trait to compare species, quantifying the effectiveness of the leaf structure and physiology for water transport, and a key variable to investigate for its relationship to variation in structure (e.g., in leaf venation architecture) and its impacts on photosynthetic gas exchange. Further, Kleaf responds strongly to the internal and external leaf environment3. Kleaf can increase dramatically with irradiance apparently due to changes in the expression and activation of aquaporins, the proteins involved in water transport through membranes4, and Kleaf declines strongly during drought, due to cavitation and/or collapse of xylem conduits, and/or loss of permeability in the extra-xylem tissues due to mesophyll and bundle sheath cell shrinkage or aquaporin deactivation5-10. 
Because Kleaf can constrain gs and photosynthetic rate across species in well-watered conditions and during drought, and thus limit whole-plant performance, it may help determine species distributions, especially as droughts increase in frequency and severity11-14. We present a simple method for simultaneous determination of Kleaf and gs on excised leaves. A transpiring leaf is connected by its petiole to tubing running to a water source on a balance. The loss of water from the balance is recorded to calculate the flow rate through the leaf. When steady-state transpiration (E, mmol • m-2 • s-1) is reached, gs is determined by dividing E by the vapor pressure deficit, and Kleaf by dividing E by the water potential driving force determined using a pressure chamber (Kleaf = E / −ΔΨleaf, MPa)15. This method can be used to assess Kleaf responses to different irradiances and the vulnerability of Kleaf to dehydration14,16,17.
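The arithmetic of the evaporative flux method described above can be sketched as follows; the numeric values are illustrative, not measurements from the protocol:

```python
def transpiration_rate(mass_loss_mg, seconds, leaf_area_m2):
    """E in mmol m-2 s-1 from balance mass loss (water: 18 mg per mmol)."""
    mmol = mass_loss_mg / 18.0
    return mmol / seconds / leaf_area_m2

def stomatal_conductance(E, vpd_kpa, atm_kpa=101.3):
    """gs in mol m-2 s-1: E divided by the mole-fraction vapor pressure
    deficit (VPD as a fraction of atmospheric pressure)."""
    return (E / 1000.0) / (vpd_kpa / atm_kpa)

def leaf_hydraulic_conductance(E, psi_leaf_mpa):
    """Kleaf = E / -delta_psi (mmol m-2 s-1 MPa-1), taking the water
    source as the 0 MPa reference so delta_psi equals psi_leaf."""
    return E / -psi_leaf_mpa

# Illustrative run: 36 mg lost over 100 s from a 20 cm^2 leaf,
# with a pressure-chamber psi_leaf of -1.0 MPa.
E = transpiration_rate(mass_loss_mg=36.0, seconds=100.0, leaf_area_m2=0.002)
print(round(E, 2))                                    # 10.0 mmol m-2 s-1
print(round(leaf_hydraulic_conductance(E, -1.0), 2))  # 10.0 mmol m-2 s-1 MPa-1
```

Repeating the measurement across irradiances or after bench dehydration yields the light-response and vulnerability curves mentioned in the abstract.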
Plant Biology, Issue 70, Molecular Biology, Physiology, Ecology, Biology, Botany, Leaf traits, hydraulics, stomata, transpiration, xylem, conductance, leaf hydraulic conductance, resistance, evaporative flux method, whole plant
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,5,12,20) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25.
In this application, we demonstrate a program that follows Rabotyagov et al.'s approach and integrates the modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226 and a user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
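The simulation-optimization loop described above can be caricatured in a few lines: candidate solutions are bit vectors (a conservation practice on or off for each field), and an evolutionary loop of Pareto selection plus mutation approaches the cost/pollution tradeoff frontier. Both objective functions below are invented linear stand-ins for SWAT output, and this is plain Pareto filtering rather than the full SPEA2 algorithm:

```python
import random

random.seed(7)
N_FIELDS = 8  # hypothetical fields, each with a practice on (1) or off (0)

def objectives(x):
    """(cost, pollution), both to be minimized; made-up surrogate model."""
    practices = sum(x)
    return (practices, N_FIELDS - practices + 0.5 * sum(x[:2]))

def dominates(a, b):
    """Objective pair a dominates b: no worse in both, better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(pop):
    """Nondominated candidates among the (deduplicated) population."""
    evs = {x: objectives(x) for x in set(pop)}
    return [x for x in evs if not any(dominates(evs[y], evs[x]) for y in evs)]

def mutate(x):
    """Flip one randomly chosen practice on/off."""
    i = random.randrange(N_FIELDS)
    return x[:i] + (1 - x[i],) + x[i + 1:]

pop = [tuple(random.randint(0, 1) for _ in range(N_FIELDS)) for _ in range(20)]
for _ in range(100):  # selection (Pareto filter) + mutation
    front = pareto_front(pop)
    pop = front + [mutate(random.choice(front)) for _ in range(10)]
print(sorted(set(objectives(x) for x in pareto_front(pop))))
```

In the real framework, each objective evaluation is a full SWAT run, which is what makes efficient evolutionary search (and algorithms like SPEA2, with recombination and archive truncation) worthwhile.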
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.