Many women undergo cesarean delivery without complications; however, some experience significant pain after cesarean section. Pain is associated with negative short-term and long-term effects on the mother. Before women undergo surgery, can we predict who is at risk of developing significant postoperative pain, and can we prevent or minimize its negative consequences? These are the fundamental questions that a team from the University of Washington, Stanford University, the Catholic University in Brussels, Belgium, Santa Joana Women's Hospital in São Paulo, Brazil, and Rambam Medical Center in Israel is currently evaluating in an international research collaboration. The ultimate goal of this project is to provide optimal pain relief during and after cesarean section by offering individualized anesthetic care to women who appear to be more 'susceptible' to pain after surgery.
A significant number of women experience moderate or severe acute postpartum pain after vaginal and cesarean deliveries [1], and 10-15% of women suffer chronic persistent pain after cesarean section [2]. With cesarean rates steadily increasing in the US [3] and already high in Brazil, this is bound to become a significant public health problem. When women are asked about their fears and expectations surrounding cesarean section, pain during and after the procedure is their greatest concern [4]. Individual variability in the severity of pain after vaginal or operative delivery is influenced by multiple factors, including sensitivity to pain, psychological factors, age, and genetics. The unique birth experience leads to unpredictable analgesic requirements, ranging from 'none at all' to 'very high' doses of pain medication. Pain after cesarean section is an excellent model for studying postoperative pain because the surgery is performed on otherwise young and healthy women. Attenuating pain during the acute phase is recommended because poorly controlled acute pain may lead to chronic pain disorders. The impact of developing persistent pain is immense, since it may impair not only a woman's ability to care for her child in the immediate postpartum period, but also her own well-being for a long period of time.
In a series of projects, an international research network is currently investigating the effect of pregnancy on pain modulation and ways to predict who will suffer severe acute pain, and potentially chronic pain, using simple pain tests and questionnaires in combination with genetic analysis. A relatively recent approach to investigating pain modulation is the psychophysical measure of Diffuse Noxious Inhibitory Control (DNIC). This pain-modulating process is the neurophysiological basis of the well-known phenomenon that 'pain inhibits pain' from remote areas of the body. The DNIC paradigm has recently evolved into a simple clinical test and has been shown to predict postoperative pain [5]. Since pregnancy is associated with decreased pain sensitivity and/or enhanced pain modulation, tests that probe pain modulation should provide a better understanding of the pathways involved in pregnancy-induced analgesia and may help predict pain outcomes during labor and delivery. For women delivering by cesarean section, a DNIC test performed before surgery, together with psychosocial questionnaires and genetic tests, should make it possible to identify women prone to severe post-cesarean pain and persistent pain. These clinical tests should allow anesthesiologists to offer women not only personalized care, with the promise of improved well-being and satisfaction, but also a reduction in the overall cost of perioperative and long-term care attributable to pain and suffering. On a larger scale, tests that explore pain modulation may become bedside screening tests to predict the development of pain disorders following surgery.
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen for its particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e., C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. The assay therefore controls for differences in substrate limitation, diffusion rates, and soil pH, detecting potential enzyme activity rates as a function of the difference in enzyme concentrations between samples.
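To make the conversion from raw fluorescence to activity concrete, the following sketch computes a potential activity rate from hypothetical plate-reader values; the function name, readings, and standard-curve slope are illustrative assumptions, not values from the protocol.

```python
# Sketch of converting raw fluorescence into a potential enzyme activity
# rate (nmol substrate cleaved g^-1 dry soil hr^-1). All numbers below
# are hypothetical placeholders.

def potential_activity(fluor_final, fluor_initial, standard_slope,
                       incubation_hr, soil_dry_mass_g):
    """Potential enzyme activity in nmol g^-1 hr^-1.

    standard_slope: fluorescence units per nmol of free dye (e.g. MUB),
    taken from a standard curve run in the same soil slurry so that
    sample-specific quenching is accounted for.
    """
    nmol_cleaved = (fluor_final - fluor_initial) / standard_slope
    return nmol_cleaved / (incubation_hr * soil_dry_mass_g)

# Example with made-up readings:
rate = potential_activity(fluor_final=5200.0, fluor_initial=400.0,
                          standard_slope=120.0,   # units per nmol MUB
                          incubation_hr=3.0, soil_dry_mass_g=1.0)
print(round(rate, 2))  # 13.33 nmol g^-1 hr^-1
```

Running the standard curve within the soil slurry, rather than in clean buffer, is what lets a calculation this simple remain valid across soils with different quenching behavior.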
Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e., colorimetric) assays, but they can suffer from interference caused by impurities and from the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions in which substrates are not limiting. Caution should be used when interpreting data from cross-site comparisons involving differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
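As a minimal illustration of the experiment enumeration that precedes software-guided design selection, the sketch below builds a two-level full-factorial design from three hypothetical factors; the factor names and levels are invented stand-ins, not the authors' actual parameter set, and DoE software would typically select a smaller optimal subset of these runs.

```python
# A two-level full-factorial enumeration, the baseline that DoE software
# prunes into an optimal design. Factor names/levels are hypothetical.
from itertools import product

factors = {
    "promoter": ["35S", "double 35S"],     # regulatory element (assumed levels)
    "leaf_age": ["young", "old"],          # plant development parameter
    "incubation_temp_C": [22, 25],         # incubation condition
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2^3 = 8 runs
```

The combinatorial growth visible here (2^k runs for k two-level factors) is exactly why the authors' strategy of splitting the problem into smaller modules and using software-guided optimal combinations matters.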
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Inducing Plasticity of Astrocytic Receptors by Manipulation of Neuronal Firing Rates
Institutions: University of California Riverside, University of California Riverside, University of California Riverside.
Close to two decades of research has established that astrocytes in situ and in vivo express numerous G protein-coupled receptors (GPCRs) that can be stimulated by neuronally released transmitter. However, the ability of astrocytic receptors to exhibit plasticity in response to changes in neuronal activity has received little attention. Here we describe a model system that can be used to globally scale up or down astrocytic group I metabotropic glutamate receptors (mGluRs) in acute brain slices. Included are methods on how to prepare parasagittal hippocampal slices, construct chambers suitable for long-term slice incubation, bidirectionally manipulate neuronal action potential frequency, load astrocytes and astrocyte processes with fluorescent Ca2+ indicator, and measure changes in astrocytic Gq GPCR activity by recording spontaneous and evoked astrocyte Ca2+ events using confocal microscopy. In essence, a "calcium roadmap" is provided for measuring plasticity of astrocytic Gq GPCRs. Applications of the technique to the study of astrocytes are discussed. Understanding how astrocytic receptor signaling is affected by changes in neuronal activity has important implications for normal synaptic function as well as for processes underlying neurological disorders and neurodegenerative disease.
Neuroscience, Issue 85, astrocyte, plasticity, mGluRs, neuronal Firing, electrophysiology, Gq GPCRs, Bolus-loading, calcium, microdomains, acute slices, Hippocampus, mouse
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome, from which viral escape occurs with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection and, if found within conserved regions such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set-point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated at acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods, which assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses in a CEM-based T cell line. This method was used to study Gag-MJ4 chimeric viruses derived from 149 Zambians acutely infected with subtype C, and has allowed the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis, such as set-point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as independent of community composition; however, biotic feedbacks such as plant-soil feedbacks (PSFs) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSFs for plant community assembly requires understanding the role of heterogeneity in PSFs, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (the rhizosphere and soil directly adjacent to it) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, along with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in assembly outcomes different from those produced by heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which segmentation approach to take.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
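Approach (4) can be illustrated with a toy example: global thresholding of a synthetic 3D volume followed by connected-component labeling. Real EM volumes require far more care (denoising, adaptive thresholds, morphological cleanup); this sketch, which assumes NumPy and SciPy are available, only shows the basic idea.

```python
# Toy automated segmentation: threshold a synthetic 3D volume, then label
# connected components and measure their sizes. Not a substitute for the
# custom-designed algorithms used on real EM data.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
volume = rng.normal(loc=100.0, scale=5.0, size=(40, 40, 40))  # background
volume[5:15, 5:15, 5:15] += 60.0     # bright feature 1
volume[25:35, 25:35, 25:35] += 60.0  # bright feature 2

mask = volume > 130.0                      # global intensity threshold
labels, n_features = ndimage.label(mask)   # 3D connected components
sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))
print(n_features)  # 2 segmented objects
```

Once features are labeled, the same label array feeds directly into surface rendering or quantitative analysis, which is why thresholding is often the first automated approach tried when features are bright and well separated.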
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Stabilizing Hepatocellular Phenotype Using Optimized Synthetic Surfaces
Institutions: University of Edinburgh, University of Edinburgh, University of Edinburgh.
Currently, one of the major limitations in cell biology is maintaining differentiated cell phenotype. Biological matrices are commonly used for culturing and maintaining primary and pluripotent stem cell derived hepatocytes. While biological matrices are useful, they permit only short-term culture of hepatocytes, limiting their widespread application. We have attempted to overcome these limitations using a synthetic polymer coating. Polymers represent one of the broadest classes of biomaterials and possess a wide range of mechanical, physical and chemical properties, which can be fine-tuned for purpose. Importantly, such materials can be scaled to quality-assured standards and display batch-to-batch consistency. This is essential if cells are to be expanded for high-throughput screening in the pharmaceutical testing industry or for cell-based therapy. Polyurethanes (PUs) are one group of materials that have shown promise in cell culture. Our recent progress in optimizing a polyurethane-coated surface for long-term culture of human hepatocytes displaying a stable phenotype is presented and discussed.
Chemistry, Issue 91, Pluripotent stem cell, polyurethane, polymer coating, p450 metabolism, stable phenotype, gamma irradiation, ultraviolet irradiation.
A Multi-Modal Approach to Assessing Recovery in Youth Athletes Following Concussion
Institutions: Holland Bloorview Kids Rehabilitation Hospital, University of Toronto, University of Toronto.
Concussion is one of the most commonly reported injuries among children and youth involved in sport. Following a concussion, youth can experience a range of short- and long-term neurobehavioral symptoms (somatic, cognitive, and emotional/behavioral) that can have a significant impact on participation in daily activities and pursuits of interest (e.g., school, sports, work, family/social life). Despite this, there remains a paucity of clinically driven research aimed specifically at exploring concussion in the youth sport population and, more specifically, at multi-modal approaches to measuring recovery. This article provides an overview of a novel, multi-modal approach to measuring recovery among youth athletes following concussion. The presented approach involves the use of both pre-injury/baseline testing and post-injury/follow-up testing to assess performance across a wide variety of domains (post-concussion symptoms, cognition, balance, strength, agility/motor skills, and resting-state heart rate variability). The goal of this research is to gain a more objective and accurate understanding of recovery following concussion in youth athletes (ages 10-18 years). Findings from this research can help inform the development and use of improved approaches to concussion management and rehabilitation specific to the youth sport community.
Medicine, Issue 91, concussion, children, youth, athletes, assessment, management, rehabilitation
Propagation of Homalodisca coagulata virus-01 via Homalodisca vitripennis Cell Culture
Institutions: University of Texas at Tyler, USDA ARS.
The glassy-winged sharpshooter (Homalodisca vitripennis) is a highly vagile and polyphagous insect found throughout the southwestern United States. These insects are the predominant vectors of Xylella fastidiosa (X. fastidiosa), a xylem-limited bacterium that is the causal agent of Pierce's disease (PD) of grapevine. Pierce's disease is economically damaging; thus, H. vitripennis has become a target for pathogen management strategies. A dicistrovirus identified as Homalodisca coagulata virus-01 (HoCV-01) has been associated with increased mortality in H. vitripennis populations. Because a host cell is required for HoCV-01 replication, cell culture provides a uniform environment for targeted replication that is logistically and economically valuable for biopesticide production. In this study, a system for large-scale propagation of H. vitripennis cells via tissue culture was developed, providing a viral replication mechanism. HoCV-01 was extracted from whole-body insects and used to inoculate cultured H. vitripennis cells at varying levels. The culture medium was removed every 24 hr for 168 hr, and RNA was extracted and analyzed with qRT-PCR. Cells were stained with trypan blue and counted under light microscopy to quantify cell survivability. Whole virus particles were extracted up to 96 hr after infection, the time point determined to precede total cell culture collapse. Cells were also subjected to fluorescent staining and viewed using confocal microscopy to investigate viral activity on F-actin attachment and nuclei integrity. The conclusion of this study is that H. vitripennis cells can be cultured and used for mass production of HoCV-01 at a level suitable for production of a biopesticide.
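The trypan blue survivability count reduces to a simple dye-exclusion calculation, sketched below with invented counts rather than data from the study.

```python
# Trypan blue dye exclusion: stained cells have compromised membranes and
# are scored dead; unstained cells are scored live. Counts are invented.
def percent_viability(unstained, stained):
    total = unstained + stained
    return 100.0 * unstained / total if total else 0.0

print(percent_viability(180, 20))  # 90.0
```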
Infection, Issue 91, Homalodisca vitripennis, Homalodisca coagulata virus-01, cell culture, Pierce’s disease of grapevine, Xylella fastidiosa, Dicistroviridae
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
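The core calculation, the slope of headspace concentration versus time scaled by chamber geometry and the ideal gas law, can be sketched as follows; the chamber dimensions and N2O readings are hypothetical, not from the protocol.

```python
# Static-chamber flux sketch: regress headspace concentration on time,
# then convert the slope to an areal flux. All values are hypothetical.
import numpy as np

t_min = np.array([0.0, 15.0, 30.0, 45.0])    # sampling times (min)
c_ppm = np.array([0.34, 0.42, 0.50, 0.58])   # N2O concentration (ppm)

slope_ppm_per_min = np.polyfit(t_min, c_ppm, 1)[0]  # linear fit slope

vol_L, area_m2 = 10.0, 0.05         # chamber headspace volume and footprint
T_K, P_atm, R = 298.15, 1.0, 0.08206  # ideal gas constant, L*atm/(mol*K)

# ppm/min -> mol/min in the headspace -> umol m^-2 hr^-1
mol_per_min = slope_ppm_per_min * 1e-6 * P_atm * vol_L / (R * T_K)
flux_umol_m2_hr = mol_per_min * 1e6 * 60.0 / area_m2
print(round(flux_umol_m2_hr, 2))  # 2.62
```

In practice the regression is also checked for linearity before the slope is trusted, since chamber saturation flattens the concentration curve at later time points.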
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Institutions: University of Rochester, University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young's modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g., the primary amine of lysine, the primary alcohol of serine, the secondary alcohol of threonine, and the phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton nuclear magnetic resonance spectroscopy (1H-NMR) and matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique that can be used to further tune macromer functionality and the resulting hydrogel's physical and chemical properties. Use of the synthesized products to form hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, hydrogel stiffness, and drug release.
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
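The ~10-30 nm localization precision quoted above is commonly estimated with the formula of Thompson, Larson & Webb (2002); the sketch below evaluates it for illustrative, assumed parameter values rather than for this particular optical setup.

```python
# Thompson-Larson-Webb estimate of single-molecule localization precision.
# Parameter values below are illustrative assumptions, not this setup's.
import math

def localization_precision(s_nm, N_photons, a_nm, b_photons):
    """Approximate 1D localization precision (nm).

    s_nm: PSF standard deviation; a_nm: pixel size; N_photons: detected
    photons; b_photons: background noise per pixel (std dev, photons).
    """
    s2 = s_nm ** 2
    return math.sqrt(s2 / N_photons
                     + a_nm ** 2 / (12 * N_photons)
                     + 8 * math.pi * s2 ** 2 * b_photons ** 2
                       / (a_nm ** 2 * N_photons ** 2))

# More detected photons -> better (smaller) precision:
print(round(localization_precision(120, 500, 100, 5), 1))   # ~9.1 nm
print(round(localization_precision(120, 5000, 100, 5), 1))  # ~1.9 nm
```

The photon-number dependence shown here is why bright, photostable PAFPs/PSFPs and dyes matter so much: precision improves roughly as the square root of detected photons until background dominates.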
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Window on a Microworld: Simple Microfluidic Systems for Studying Microbial Transport in Porous Media
Institutions: Vanderbilt University, Vanderbilt University, Vanderbilt University, Vanderbilt University, University of Connecticut, University of Connecticut.
Microbial growth and transport in porous media have important implications for the quality of groundwater and surface water, the recycling of nutrients in the environment, as well as directly for the transmission of pathogens to drinking water supplies. Natural porous media are composed of intricate physical topologies, varied surface chemistries, dynamic gradients of nutrients and electron acceptors, and a patchy distribution of microbes. These features vary substantially over a length scale of microns, making the results of macro-scale investigations of microbial transport difficult to interpret, and the validation of mechanistic models challenging. Here we demonstrate how simple microfluidic devices can be used to visualize microbial interactions with micro-structured habitats, to identify key processes influencing the observed phenomena, and to systematically validate predictive models. Simple, easy-to-use flow cells were constructed out of the transparent, biocompatible and oxygen-permeable material poly(dimethyl siloxane). Standard methods of photolithography were used to make micro-structured masters, and replica molding was used to cast micro-structured flow cells from the masters. The physical design of the flow cell chamber is adaptable to the experimental requirements: microchannels can vary from simple linear connections to complex topologies with feature sizes as small as 2 μm. Our modular EcoChip flow cell array features dozens of identical chambers and flow control by a gravity-driven flow module. We demonstrate that through use of EcoChip devices, physical structures and pressure heads can be held constant or varied systematically while the influence of surface chemistry, fluid properties, or the characteristics of the microbial population is investigated. Through transport experiments using a non-pathogenic, green fluorescent protein-expressing Vibrio bacterial strain, we illustrate the importance of habitat structure, flow conditions, and inoculum size on fundamental transport phenomena, and with real-time particle-scale observations, demonstrate that microfluidics offers a compelling view of a hidden world.
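Breakthrough curves from flow cells like these are often interpreted against classical colloid filtration theory, which the keywords above reference. As a sketch only (this is the standard clean-bed relation, not necessarily the authors' model, and all parameter values below are purely illustrative), the expected effluent fraction can be computed as:

```python
import math

def breakthrough_ratio(L, d_c, porosity, eta0, alpha):
    """Classical clean-bed colloid filtration theory:
    C/C0 = exp(-3 (1 - porosity) * alpha * eta0 * L / (2 * d_c)),
    where eta0 is the single-collector contact efficiency,
    alpha the attachment (sticking) efficiency, L the column
    length, and d_c the collector (grain) diameter."""
    return math.exp(-3.0 * (1.0 - porosity) * alpha * eta0 * L / (2.0 * d_c))

# Illustrative values: a 0.1 m bed of 0.5 mm grains, porosity 0.4
ratio = breakthrough_ratio(L=0.1, d_c=5e-4, porosity=0.4, eta0=0.01, alpha=0.1)
```

The exponential form makes clear why removal is so sensitive to grain size and sticking efficiency, which micro-structured habitats let one vary systematically.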
Microbiology, Issue 39, Microfluidic device, bacterial transport, porous media, colloid, biofilm, filtration theory, artificial habitat, micromodel, PDMS, GFP
An Investigation of the Effects of Sports-related Concussion in Youth Using Functional Magnetic Resonance Imaging and the Head Impact Telemetry System
Institutions: University of Toronto, University of Toronto, University of Toronto, Bloorview Kids Rehab, Toronto Rehab, Sunnybrook Health Sciences Centre, University of Toronto.
One of the most commonly reported injuries in children who participate in sports is concussion or mild traumatic brain injury (mTBI)1. Children and youth involved in organized sports such as competitive hockey are nearly six times more likely to suffer a severe concussion compared to children involved in other leisure physical activities2. While the most common cognitive sequelae of mTBI appear similar for children and adults, the recovery profile and breadth of consequences in children remain largely unknown2, as does the influence of pre-injury characteristics (e.g. gender) and injury details (e.g. magnitude and direction of impact) on long-term outcomes. Competitive sports, such as hockey, allow the rare opportunity to utilize a pre-post design: obtaining pre-injury data on youth characteristics and functioning before a concussion occurs, and relating these to outcomes following injury. Our primary goals are to refine pediatric concussion diagnosis and management based on research evidence that is specific to children and youth. To do this we use new, multi-modal and integrative approaches that will:
1. Evaluate the immediate effects of head trauma in youth
2. Monitor the resolution of post-concussion symptoms (PCS) and cognitive performance during recovery
3. Utilize new methods to verify brain injury and recovery
To achieve our goals, we have implemented the Head Impact Telemetry (HIT) System (Simbex; Lebanon, NH, USA). This system equips commercially available Easton S9 hockey helmets (Easton-Bell Sports; Van Nuys, CA, USA) with single-axis accelerometers designed to measure real-time head accelerations during contact sport participation3-5. By using telemetric technology, the magnitude of acceleration and location of all head impacts during sport participation can be objectively detected and recorded. We also use functional magnetic resonance imaging (fMRI) to localize and assess changes in neural activity specifically in the medial temporal and frontal lobes during the performance of cognitive tasks, since those are the cerebral regions most sensitive to concussive head injury6. Finally, we are acquiring structural imaging data sensitive to damage in brain white matter.
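The core idea of accelerometer-based impact telemetry is threshold-triggered event detection over the acceleration stream. The sketch below is a deliberately simplified stand-in: the 10 g trigger threshold and the peak-per-event scheme are illustrative assumptions, not the HIT System's proprietary onboard processing.

```python
def detect_impacts(samples, threshold_g=10.0):
    """Flag contiguous runs of head acceleration (in g) that exceed a
    trigger threshold, and report the peak value of each run as one
    recorded impact event."""
    impacts, current_peak = [], None
    for a in samples:
        if a >= threshold_g:
            # inside a supra-threshold event: track its running peak
            current_peak = a if current_peak is None else max(current_peak, a)
        elif current_peak is not None:
            # event just ended: record its peak magnitude
            impacts.append(current_peak)
            current_peak = None
    if current_peak is not None:
        impacts.append(current_peak)
    return impacts

# Hypothetical trace containing two supra-threshold events
trace = [1, 2, 15, 40, 22, 3, 1, 12, 18, 2]
peaks = detect_impacts(trace)  # [40, 18]
```

Real systems additionally timestamp each event and use sensor geometry to estimate the impact location on the helmet.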
Medicine, Issue 47, Mild traumatic brain injury, concussion, fMRI, youth, Head Impact Telemetry System
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Institutions: Georgia Health Sciences University, Georgia Health Sciences University, Georgia Health Sciences University, Palo Alto Research Center, Palo Alto Research Center, University of Minnesota.
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2.
Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7, 8). However, generally speaking, the existing methods have three broad types of shortcomings.
First, shape variations are generally imposed by the experimenter5,9,10, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints.
Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases.
Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms.
Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper.
We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have.
Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
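The evolutionary logic behind VP-style category generation can be conveyed with a toy sketch. Everything here is a hypothetical simplification: shapes are reduced to parameter vectors, mutation to Gaussian noise, and selection to random survival; the authors' actual VM/VP algorithms simulate embryogenesis and natural selection in far richer detail.

```python
import random

def virtual_phylogenesis(ancestor, generations=3, offspring=4,
                         survivors=2, sigma=0.1, seed=0):
    """Toy VP-style category generation: each generation, every surviving
    'embryo' (a shape-parameter vector) produces mutated offspring, and a
    subset survives. The final population forms one 'category' of related
    novel shapes descended from a common ancestor."""
    rng = random.Random(seed)
    population = [list(ancestor)]
    for _ in range(generations):
        children = [[g + rng.gauss(0.0, sigma) for g in parent]
                    for parent in population for _ in range(offspring)]
        population = rng.sample(children, min(survivors, len(children)))
    return population

# One category of 2 related novel shapes from a 3-parameter ancestor
category = virtual_phylogenesis([0.5, 1.2, -0.3])
```

Because category members share ancestry, their within-category similarity arises from the generative process itself rather than from externally imposed constraints, which is the first desideratum discussed above.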
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Habituation and Prepulse Inhibition of Acoustic Startle in Rodents
Institutions: University of Western Ontario.
The acoustic startle response is a protective response, elicited by a sudden and intense acoustic stimulus. Facial and skeletal muscles are activated within a few milliseconds, leading to a whole body flinch in rodents1. Although startle responses are reflexive responses that can be reliably elicited, they are not stereotypic. They can be modulated by emotions such as fear (fear-potentiated startle) and joy (joy-attenuated startle), by non-associative learning processes such as habituation and sensitization, and by other sensory stimuli through sensory gating processes (prepulse inhibition), turning startle responses into an excellent tool for assessing emotions, learning, and sensory gating (for review, see 2, 3). The primary pathway mediating startle responses is very short and well described, qualifying startle also as an excellent model for studying the underlying mechanisms of behavioural plasticity on a cellular/molecular level3.
Here we describe a method for assessing short-term habituation, long-term habituation and prepulse inhibition of acoustic startle responses in rodents. Habituation describes the decrease of the startle response magnitude upon repeated presentation of the same stimulus. Habituation within a testing session is called short-term habituation (STH) and is reversible upon a period of several minutes without stimulation. Habituation between testing sessions is called long-term habituation (LTH)4. Habituation is stimulus specific5. Prepulse inhibition is the attenuation of a startle response by a preceding non-startling sensory stimulus6. The interval between prepulse and startle stimulus can vary from 6 up to 2000 ms. The prepulse can be of any modality; however, acoustic prepulses are the most commonly used.
Habituation is a form of non-associative learning. It can also be viewed as a form of sensory filtering, since it reduces the organism's response to a non-threatening stimulus. Prepulse inhibition (PPI) was originally developed in human neuropsychiatric research as an operational measure for sensory gating7. PPI deficits may represent the interface of "psychosis and cognition", as they seem to predict cognitive impairment8-10. Both habituation and PPI are disrupted in patients suffering from schizophrenia11, and PPI disruptions have been shown to be, at least in some cases, amenable to treatment with mostly atypical antipsychotics12,13. However, other mental and neurodegenerative diseases are also accompanied by disruption of habituation and/or PPI, such as autism spectrum disorders (slower habituation), obsessive-compulsive disorder, Tourette's syndrome, Huntington's disease, Parkinson's disease, and Alzheimer's disease (PPI)11,14,15. Dopamine-induced PPI deficits are a commonly used animal model for the screening of antipsychotic drugs16, but PPI deficits can also be induced by many other psychotomimetic drugs, environmental modifications and surgical procedures.
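Both measures are conventionally reported as percent scores: PPI as the percent reduction of startle magnitude on prepulse+pulse trials relative to pulse-alone trials, and habituation as the percent decline of mean startle magnitude across a session. The sketch below uses these standard conventions; the startle magnitudes are illustrative numbers, not data from this protocol.

```python
def percent_ppi(pulse_alone, prepulse_pulse):
    """Percent prepulse inhibition: how much a preceding prepulse
    reduces the startle magnitude relative to pulse-alone trials."""
    return 100.0 * (pulse_alone - prepulse_pulse) / pulse_alone

def percent_habituation(first_block_mean, last_block_mean):
    """Short-term habituation: percent decline in mean startle
    magnitude from the first to the last block of a session."""
    return 100.0 * (first_block_mean - last_block_mean) / first_block_mean

ppi = percent_ppi(pulse_alone=850.0, prepulse_pulse=340.0)              # 60.0 %
hab = percent_habituation(first_block_mean=900.0, last_block_mean=450.0)  # 50.0 %
```

Reporting percent scores rather than raw magnitudes makes animals with different baseline startle reactivity comparable.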
Neuroscience, Issue 55, Startle responses, rat, mouse, sensory gating, sensory filtering, short-term habituation, long-term habituation, prepulse inhibition
Preparation of 3D Fibrin Scaffolds for Stem Cell Culture Applications
Institutions: University of Victoria, University of Victoria.
Stem cells are found in naturally occurring 3D microenvironments in vivo, which are often referred to as the stem cell niche1. Culturing stem cells inside of 3D biomaterial scaffolds provides a way to accurately mimic these microenvironments, providing an advantage over traditional 2D culture methods using polystyrene as well as a method for engineering replacement tissues2. While 2D tissue culture polystyrene has been used for the majority of cell culture experiments, 3D biomaterial scaffolds can more closely replicate the microenvironments found in vivo by enabling more accurate establishment of cell polarity in the environment and possessing biochemical and mechanical properties similar to soft tissue3.
A variety of naturally derived and synthetic biomaterial scaffolds have been investigated as 3D environments for supporting stem cell growth. While synthetic scaffolds can be synthesized to have a greater range of mechanical and chemical properties and often have greater reproducibility, natural biomaterials are often composed of proteins and polysaccharides found in the extracellular matrix and as a result contain binding sites for cell adhesion and readily support cell culture. Fibrin scaffolds, produced by polymerizing the protein fibrinogen obtained from plasma, have been widely investigated for a variety of tissue engineering applications both in vitro and in vivo4. Such scaffolds can be modified using a variety of methods to incorporate controlled release systems for delivering therapeutic factors5. Previous work has shown that such scaffolds can be used to successfully culture embryonic stem cells, and this scaffold-based culture system can be used to screen the effects of various growth factors on the differentiation of the stem cells seeded inside6,7.
This protocol details the process of polymerizing fibrin scaffolds from fibrinogen solutions using the enzymatic activity of thrombin. The process takes 2 days to complete, including an overnight dialysis step for the fibrinogen solution to remove citrates that inhibit polymerization. These detailed methods rely on fibrinogen concentrations determined to be optimal for embryonic and induced pluripotent stem cell culture. Other groups have further investigated fibrin scaffolds for a wide range of cell types and applications, demonstrating the versatility of this approach8-12.
Bioengineering, Issue 61, Extracellular matrix, stem cells, biomaterials, drug delivery, cell culture
Long-term Lethal Toxicity Test with the Crustacean Artemia franciscana
Institutions: Institute for Environmental Protection and Research, Regional Agency for Environmental Protection in Emilia-Romagna.
Our research activities target the use of biological methods for the evaluation of environmental quality, with particular reference to saltwater/brackish water and sediment. The choice of biological indicators must be based on reliable scientific knowledge and, possibly, on the availability of standardized procedures. In this article, we present a standardized protocol that uses the marine crustacean Artemia to evaluate the toxicity of chemicals and/or of marine environmental matrices. Scientists propose that the brine shrimp (Artemia) is a suitable candidate for the development of a standard bioassay for worldwide utilization. A number of papers have been published on the toxic effects of various chemicals and toxicants on brine shrimp (Artemia). The major advantage of this crustacean for toxicity studies is the overall availability of the dry cysts; these can be used immediately in testing, with no need for difficult cultivation1,2. Cyst-based toxicity assays are cheap, continuously available, simple and reliable, and are thus an important answer to routine needs of toxicity screening, for industrial monitoring requirements or for regulatory purposes3
. The proposed method uses mortality as an endpoint: the number of survivors is counted and the percentage of deaths calculated. Larvae are considered dead if they do not exhibit any internal or external movement during several seconds of observation4. This procedure was standardized by testing a reference substance (sodium dodecyl sulfate); some results are reported in this work. This article accompanies a video that describes the performance of procedural toxicity testing, showing all the steps related to the protocol.
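The endpoint calculation is simple arithmetic; the sketch below also applies Abbott's formula, a standard toxicology convention for correcting treatment mortality by the background mortality observed in the control. Abbott's correction and all counts here are illustrative additions, not values or steps stated in this abstract.

```python
def percent_mortality(dead, total):
    """Percent of test organisms scored as dead in one treatment."""
    return 100.0 * dead / total

def abbott_corrected(treatment_pct, control_pct):
    """Abbott's formula: treatment mortality corrected for the
    background mortality seen in the untreated control."""
    return 100.0 * (treatment_pct - control_pct) / (100.0 - control_pct)

# Hypothetical counts: 18 of 30 larvae dead in treatment, 10% control mortality
raw = percent_mortality(dead=18, total=30)           # 60.0 %
corrected = abbott_corrected(raw, control_pct=10.0)  # ~55.6 %
```

Corrected mortalities across a concentration series are what feed into LC50 estimation for a reference substance such as sodium dodecyl sulfate.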
Chemistry, Issue 62, Artemia franciscana, bioassays, chemical substances, crustaceans, marine environment
Establishment of Microbial Eukaryotic Enrichment Cultures from a Chemically Stratified Antarctic Lake and Assessment of Carbon Fixation Potential
Institutions: Miami University.
Lake Bonney is one of numerous permanently ice-covered lakes located in the McMurdo Dry Valleys, Antarctica. The perennial ice cover maintains a chemically stratified water column and, unlike other inland bodies of water, largely prevents external input of carbon and nutrients from streams. Biota are exposed to numerous environmental stresses, including year-round severe nutrient deficiency, low temperatures, extreme shade, hypersalinity, and 24-hour darkness during the winter1. These extreme environmental conditions limit the biota in Lake Bonney almost exclusively to microorganisms2.
Single-celled microbial eukaryotes (called "protists") are important players in global biogeochemical cycling3 and play important ecological roles in the cycling of carbon in the dry valley lakes, occupying both primary and tertiary roles in the aquatic food web. In the dry valley aquatic food web, protists that fix inorganic carbon (autotrophy) are the major producers of organic carbon for organotrophic organisms2,4. Phagotrophic or heterotrophic protists capable of ingesting bacteria and smaller protists act as the top predators in the food web5. Lastly, an unknown proportion of the protist population is capable of combined mixotrophic metabolism6,7. Mixotrophy in protists involves the ability to combine photosynthetic capability with phagotrophic ingestion of prey microorganisms. This form of mixotrophy differs from mixotrophic metabolism in bacterial species, which generally involves uptake of dissolved carbon molecules. There are currently very few protist isolates from permanently ice-capped polar lakes, and studies of protist diversity and ecology in this extreme environment have been limited4,5,8-10. A better understanding of protist metabolic versatility in the simple dry valley lake food web will aid in the development of models for the role of protists in the global carbon cycle.
We employed an enrichment culture approach to isolate potentially phototrophic and mixotrophic protists from Lake Bonney. Sampling depths in the water column were chosen based on the location of primary production maxima and protist phylogenetic diversity4,11, as well as on variability in the major abiotic factors affecting protist trophic modes: shallow sampling depths are nutrient limited, while deeper sampling depths are light limited. In addition, lake water samples were supplemented with multiple types of growth media to promote the growth of a variety of phototrophic organisms.
RubisCO catalyzes the rate-limiting step in the Calvin-Benson-Bassham (CBB) cycle, the major pathway by which autotrophic organisms fix inorganic carbon and provide organic carbon for higher trophic levels in aquatic and terrestrial food webs12. In this study, we applied a radioisotope assay modified for filtered samples13 to monitor maximum carboxylase activity as a proxy for carbon fixation potential and metabolic versatility in the Lake Bonney enrichment cultures.
Microbiology, Issue 62, Antarctic lake, McMurdo Dry Valleys, Enrichment cultivation, Microbial eukaryotes, RubisCO
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Mouse Short- and Long-term Locomotor Activity Analyzed by Video Tracking Software
Institutions: University of Illinois at Urbana-Champaign, University of Illinois at Urbana-Champaign.
Locomotor activity (LMA) is a simple and easily performed measurement of behavior in mice and other rodents. Improvements in video tracking software (VTS) have allowed it to be coupled to LMA testing, dramatically improving specificity and sensitivity when compared to the line-crossings method with manual scoring. In addition, VTS enables high-throughput experimentation. While similar to automated video tracking used for the open field test (OFT), LMA testing is unique in that it allows mice to remain in their home cage and does not utilize the anxiogenic stimulus of bright lighting during the active phase of the light-dark cycle. Traditionally, LMA has been used for short periods of time (minutes), while longer movement studies (hours to days) have often used implanted transmitters and biotelemetry. With the option of real-time tracking, long-term, like short-term, LMA testing can now be conducted using videography. Long-term LMA testing requires a specialized, but easily constructed, cage so that food and water (which are usually positioned on the cage top) do not obstruct videography. Importantly, videography and VTS allow for the quantification of parameters, such as the path of mouse movement, that are difficult or unfeasible to measure with line crossings and/or biotelemetry. In sum, LMA testing coupled to VTS affords a more complete description of mouse movement and the ability to examine locomotion over an extended period of time.
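One path parameter that VTS makes trivial but line crossings cannot capture is the total distance traveled, computed by summing the straight-line steps between successive tracked centroid positions. The sketch below is illustrative; the coordinates are hypothetical and would come from the tracking software's centroid output in practice.

```python
import math

def path_length(centroids):
    """Total distance traveled by the animal, summed over successive
    tracked centroid positions (x, y); units follow the tracker's
    pixel-to-distance calibration."""
    return sum(math.dist(p, q) for p, q in zip(centroids, centroids[1:]))

# Hypothetical 3-point track: a 3-4-5 step followed by a vertical step
track = [(0, 0), (3, 4), (3, 10)]
distance = path_length(track)  # 5 + 6 = 11 units
```

Sampling centroids at a fixed frame rate also yields instantaneous speed as step length divided by the inter-frame interval.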
Neuroscience, Issue 76, Behavior, Neurobiology, Anatomy, Physiology, Psychology, Animal, Exploratory Behavior, Behavioral Research, Psychoneuroimmunology, Locomotion, Neuroimmune, high throughput, sickness behavior, noninvasive, video recording, imaging, animal model
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a dispersed field of scientific disciplines, comprising already established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. The protocols give easy-to-follow guidelines for conducting comprehensive surveys and discuss critical steps and potential caveats pertinent to each technique.
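Direct viral and microbial counts from stained filters are converted to abundances by scaling the mean count per microscope field by the ratio of the effective filtration area to the field area, then dividing by the volume filtered. This is the standard epifluorescence conversion; the numeric values below are hypothetical, not measurements from these protocols.

```python
def abundance_per_ml(mean_count_per_field, filter_area_mm2,
                     field_area_mm2, volume_ml):
    """Convert mean counts per microscope field (e.g. SYBR- or
    DAPI-stained particles) to particles per ml of sample:
    scale by (effective filter area / field area), divide by
    the volume of sample filtered."""
    return mean_count_per_field * (filter_area_mm2 / field_area_mm2) / volume_ml

# Hypothetical example: 25 VLPs per field, 95 mm^2 effective filter
# area, 0.01 mm^2 field of view, 1 ml of seawater filtered
vlp_per_ml = abundance_per_ml(mean_count_per_field=25.0,
                              filter_area_mm2=95.0,
                              field_area_mm2=0.01,
                              volume_ml=1.0)
```

Counting enough fields to stabilize the mean is the critical step; the conversion itself is exact.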
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization.
Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA2 (ref. 26), and a user-specified set of conservation practices and their costs, to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
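The tradeoff frontier is, at its core, the set of non-dominated (cost, pollution) outcomes. The sketch below implements only that Pareto dominance filter over hypothetical candidate allocations; SPEA2 itself layers fitness assignment, an external archive, and density estimation on top of this basic notion, and the cost/pollution numbers are purely illustrative.

```python
def pareto_front(solutions):
    """Keep candidate watershed configurations, given as
    (cost, pollution) pairs, that are not dominated: no other
    candidate is at least as good on both objectives and
    strictly better on at least one (both minimized)."""
    front = []
    for s in solutions:
        dominated = any(
            o != s and o[0] <= s[0] and o[1] <= s[1]
            and (o[0] < s[0] or o[1] < s[1])
            for o in solutions
        )
        if not dominated:
            front.append(s)
    return sorted(front)

# Hypothetical candidates: (practice cost, simulated pollutant load)
candidates = [(100, 9.0), (120, 7.5), (110, 9.5), (150, 7.0), (130, 7.5)]
frontier = pareto_front(candidates)  # the cost/water-quality tradeoff curve
```

In the full framework, each candidate's pollution value would come from a SWAT run, which is why the model is an integral part of the optimization loop.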
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Measurement of Leaf Hydraulic Conductance and Stomatal Conductance and Their Responses to Irradiance and Dehydration Using the Evaporative Flux Method (EFM)
Institutions: University of California, Los Angeles.
Water is a key resource, and the plant water transport system sets limits on maximum growth and drought tolerance. When plants open their stomata to achieve a high stomatal conductance (gs) to capture CO2 for photosynthesis, water is lost by transpiration1,2. Water evaporating from the airspaces is replaced from cell walls, in turn drawing water from the xylem of leaf veins, in turn drawing from xylem in the stems and roots. As water is pulled through the system, it experiences hydraulic resistance, creating tension throughout the system and a low leaf water potential (Ψleaf). The leaf itself is a critical bottleneck in the whole plant system, accounting for on average 30% of the plant hydraulic resistance3. Leaf hydraulic conductance (Kleaf = 1/leaf hydraulic resistance) is the ratio of the water flow rate to the water potential gradient across the leaf, and summarizes the behavior of a complex system: water moves through the petiole and through several orders of veins, exits into the bundle sheath and passes through or around mesophyll cells before evaporating into the airspace and being transpired from the stomata. Kleaf is of strong interest as an important physiological trait to compare species, quantifying the effectiveness of the leaf structure and physiology for water transport, and a key variable to investigate for its relationship to variation in structure (e.g., in leaf venation architecture) and its impacts on photosynthetic gas exchange. Further, Kleaf responds strongly to the internal and external leaf environment3: Kleaf can increase dramatically with irradiance, apparently due to changes in the expression and activation of aquaporins, the proteins involved in water transport through membranes4, and Kleaf declines strongly during drought, due to cavitation and/or collapse of xylem conduits, and/or loss of permeability in the extra-xylem tissues due to mesophyll and bundle sheath cell shrinkage or aquaporin deactivation5-10. Because Kleaf can constrain gs and photosynthetic rate across species in well-watered conditions and during drought, and thus limit whole-plant performance, it may possibly determine species distributions, especially as droughts increase in frequency and severity11-14.
We present a simple method for simultaneous determination of Kleaf and gs on excised leaves. A transpiring leaf is connected by its petiole to tubing running to a water source on a balance. The loss of water from the balance is recorded to calculate the flow rate through the leaf. When steady-state transpiration (E, mmol • m-2 • s-1) is reached, gs is determined by dividing E by the vapor pressure deficit, and Kleaf by dividing E by the water potential driving force determined using a pressure chamber (Kleaf = E/ΔΨleaf). This method can be used to assess Kleaf responses to different irradiances and the vulnerability of Kleaf to dehydration.
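The steady-state calculations of the evaporative flux method reduce to two divisions of the measured flow rate. The sketch below follows the definitions given above (gs = E divided by the mole-fraction vapor pressure deficit; Kleaf = E divided by the driving force from source water at ~0 MPa to the measured Ψleaf); all numeric inputs are illustrative, not data from the protocol.

```python
def evaporative_flux_method(E, vpd_kpa, atm_kpa, psi_leaf_mpa):
    """EFM calculations from the steady-state transpiration rate E
    (mmol m-2 s-1) measured on the balance:
      gs    = E / (VPD / P_atm)   with VPD as a mole fraction,
      Kleaf = E / (0 - psi_leaf)  driving force from source water
                                  (~0 MPa) to the leaf water potential
                                  measured in the pressure chamber."""
    gs = E / (vpd_kpa / atm_kpa)        # mmol m-2 s-1
    k_leaf = E / (0.0 - psi_leaf_mpa)   # mmol m-2 s-1 MPa-1
    return gs, k_leaf

# Illustrative inputs: E = 2.0 mmol m-2 s-1, VPD = 1.5 kPa at
# 101.3 kPa atmospheric pressure, leaf water potential of -0.4 MPa
gs, k_leaf = evaporative_flux_method(E=2.0, vpd_kpa=1.5,
                                     atm_kpa=101.3, psi_leaf_mpa=-0.4)
```

Running the same calculation at different irradiances, or on progressively dehydrated leaves, yields the light response and vulnerability curves of Kleaf described above.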
Plant Biology, Issue 70, Molecular Biology, Physiology, Ecology, Biology, Botany, Leaf traits, hydraulics, stomata, transpiration, xylem, conductance, leaf hydraulic conductance, resistance, evaporative flux method, whole plant