Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that these can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is therefore crucial to understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound to a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the dye molecules are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically 50 mM sodium acetate or 50 mM Tris) is chosen so that its acid dissociation constant (pKa) best matches the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e., C-, N-, or P-rich) substrate. Using soil slurries in the assay minimizes limitations on enzyme and substrate diffusion. The assay thus controls for differences in substrate limitation, diffusion rates, and soil pH, so that the potential activity rates it detects reflect differences in enzyme concentration among samples.
Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e., colorimetric) assays, but they can suffer from interference caused by impurities and from the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method assesses only potential enzyme activities under laboratory conditions in which substrates are not limiting. Caution should therefore be used when interpreting cross-site comparisons across differing temperatures or soil types, as in situ soil type and temperature influence enzyme kinetics.
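The conversion from raw fluorescence readings to a potential activity rate can be sketched in a few lines. The function, the quench and emission coefficients, and all numbers below are hypothetical illustrations of the standard-curve approach, not values from the protocol:

```python
def potential_activity(assay_fluor, blank_fluor, quench_coef, emission_coef,
                       incubation_h, soil_g):
    """Potential enzyme activity in nmol g^-1 soil h^-1.

    Quench-corrected net fluorescence is converted to nmol of cleaved
    fluorophore via a standard-curve emission coefficient (fluorescence
    units per nmol), then normalized by incubation time and soil mass.
    """
    net_fluor = assay_fluor / quench_coef - blank_fluor
    nmol_cleaved = net_fluor / emission_coef
    return nmol_cleaved / (incubation_h * soil_g)

# Hypothetical readings for a 4-methylumbelliferone (MUB)-linked substrate:
rate = potential_activity(assay_fluor=12000, blank_fluor=500, quench_coef=0.8,
                          emission_coef=250, incubation_h=3, soil_g=1.0)
# about 19.3 nmol g^-1 h^-1
```

In practice the quench and emission coefficients come from soil-specific standard curves, since soil particles absorb a fraction of the emitted light.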
Nucleofection of Rodent Neuroblasts to Study Neuroblast Migration In vitro
Institutions: King's College London, King's College London.
The subventricular zone (SVZ) located in the lateral wall of the lateral ventricles plays a fundamental role in adult neurogenesis. In this restricted area of the brain, neural stem cells proliferate and constantly generate neuroblasts that migrate tangentially in chains along the rostral migratory stream (RMS) to reach the olfactory bulb (OB). Once in the OB, neuroblasts switch to radial migration and then differentiate into mature neurons able to incorporate into the preexisting neuronal network. Proper neuroblast migration is a fundamental step in neurogenesis, ensuring the correct functional maturation of newborn neurons. Given the ability of SVZ-derived neuroblasts to target injured areas in the brain, investigating the intracellular mechanisms underlying their motility will not only enhance the understanding of neurogenesis but may also promote the development of neuroregenerative strategies.
This manuscript describes a detailed protocol for the transfection of primary rodent RMS postnatal neuroblasts and the analysis of their motility using a 3D in vitro migration assay recapitulating their mode of migration observed in vivo. Both rat and mouse neuroblasts can be quickly and efficiently transfected via nucleofection with plasmid DNA, small hairpin (sh)RNA, or short interfering (si)RNA oligos targeting genes of interest. To analyze migration, nucleofected cells are reaggregated in 'hanging drops' and subsequently embedded in a three-dimensional matrix. Nucleofection per se does not significantly impair the migration of neuroblasts. Pharmacological treatment of nucleofected and reaggregated neuroblasts can also be performed to study the role of signaling pathways involved in neuroblast migration.
Neuroscience, Issue 81, Cellular Biology, Cell Migration Assays, Transfection, Neurogenesis, subventricular zone (SVZ), neural stem cells, rostral migratory stream (RMS), neuroblast, 3D migration assay, nucleofection
Assessing Signaling Properties of Ectodermal Epithelia During Craniofacial Development
Institutions: University of California San Francisco.
The accessibility of avian embryos has helped experimental embryologists understand the fates of cells during development and the role of tissue interactions that regulate patterning and morphogenesis of vertebrates (e.g., 1-4). Here, we illustrate a method that exploits this accessibility to test the signaling and patterning properties of ectodermal tissues during facial development. In these experiments, we create quail-chick5 or mouse-chick6 chimeras by transplanting the surface cephalic ectoderm that covers the upper jaw from quail or mouse onto either the same region or an ectopic region of chick embryos. The use of quail as donor tissue for transplantation into chicks was developed to take advantage of a nucleolar marker present in quail but not chick cells, thus allowing investigators to distinguish host and donor tissues7. Similarly, a repetitive element present in the mouse genome is expressed ubiquitously, which allows us to distinguish host and donor tissues in mouse-chick chimeras8. The use of mouse ectoderm as donor tissue will greatly extend our understanding of these tissue interactions, because it will allow us to test the signaling properties of ectoderm derived from various mutant embryos.
Developmental Biology, Issue 49, Quail-chick chimera, Ectoderm transplant, FEZ, Mouse-chick chimera
How to Create and Use Binocular Rivalry
Institutions: New York University, New York University, Princeton University, Princeton University.
Each of our eyes normally sees a slightly different image of the world around us. The brain can combine these two images into a single coherent representation. However, when the eyes are presented with images that are sufficiently different from each other, something interesting happens: rather than fusing the two images into a combined conscious percept, a pattern of perceptual alternations emerges in which one image dominates awareness while the other is suppressed; dominance alternates between the two images, typically every few seconds. This perceptual phenomenon is known as binocular rivalry. Binocular rivalry is considered useful for studying perceptual selection and awareness in both human and animal models, because unchanging visual input to each eye leads to alternations in visual awareness and perception. To create a binocular rivalry stimulus, all that is necessary is to present each eye with a different image at the same perceived location. There are several ways of doing this, but newcomers to the field are often unsure which method would best suit their specific needs. The purpose of this article is to describe a number of inexpensive and straightforward ways to create and use binocular rivalry. We detail methods that do not require expensive specialized equipment and describe each method's advantages and disadvantages. The methods described include the use of red-blue goggles, mirror stereoscopes, and prism goggles.
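The red-blue goggle approach can be sketched as follows: orthogonal sinusoidal gratings are placed in the red and blue channels of one image, so that through the goggles each eye receives a different grating at the same perceived location. The image size, spatial frequency, and PPM output format are illustrative choices, not specifications from the article:

```python
import math

def rivalry_anaglyph(size=128, cycles=8):
    """Red-blue anaglyph rivalry stimulus: a vertical sinusoidal grating in
    the red channel and a horizontal one in the blue channel. Viewed through
    red-blue goggles, each eye receives only one of the two orthogonal
    gratings. Returns rows of (R, G, B) pixel tuples."""
    img = []
    for y in range(size):
        row = []
        for x in range(size):
            red = int(127.5 * (1 + math.sin(2 * math.pi * cycles * x / size)))
            blue = int(127.5 * (1 + math.sin(2 * math.pi * cycles * y / size)))
            row.append((red, 0, blue))
        img.append(row)
    return img

def write_ppm(path, img):
    """Save the pixel grid as a binary PPM file, viewable in most image viewers."""
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (len(img[0]), len(img)))
        for row in img:
            for r, g, b in row:
                f.write(bytes((r, g, b)))

write_ppm("rivalry.ppm", rivalry_anaglyph())
```

Which grating reaches which eye depends on the filter arrangement of the goggles; matching the perceived contrast of the two channels usually requires per-observer adjustment.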
Neuroscience, Issue 45, Binocular rivalry, continuous flash suppression, vision, visual awareness, perceptual competition, unconscious processing, neuroimaging
Determining the Contribution of the Energy Systems During Exercise
Institutions: University of Sao Paulo, University of Sao Paulo, University of Sao Paulo, University of Sao Paulo.
One of the most important aspects of metabolic demand is the relative contribution of the energy systems to the total energy required for a given physical activity. Although some sports are relatively easy to reproduce in a laboratory (e.g., running and cycling), a number of sports are much more difficult to reproduce and study under controlled conditions. This method presents how to assess the differential contribution of the energy systems in sports that are difficult to mimic in controlled laboratory conditions. The concepts shown here can be adapted to virtually any sport.
The following physiological variables are needed: resting oxygen consumption, exercise oxygen consumption, post-exercise oxygen consumption, resting plasma lactate concentration, and post-exercise peak plasma lactate. To calculate the contribution of aerobic metabolism, use the oxygen consumption at rest and during exercise: with the trapezoidal method, calculate the area under the oxygen consumption curve during exercise and subtract the area corresponding to resting oxygen consumption. To calculate the contribution of alactic anaerobic metabolism, fit the post-exercise oxygen consumption curve to a mono- or bi-exponential model (whichever fits best), then use the terms of the fitted equation to calculate the alactic anaerobic metabolism as follows: ATP-CP metabolism = A1 (mL·s-1) × t1 (s). Finally, to calculate the contribution of the lactic anaerobic system, multiply the peak plasma lactate by 3 and by the athlete's body mass (the result in mL is then converted to L and into kJ).
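The three calculations can be sketched as below. The function names are illustrative, the 20.9 kJ/L caloric equivalent of oxygen is a commonly used value not stated in the text, and the lactic function optionally subtracts resting lactate (a common variant) while defaulting to the text's peak-only formula:

```python
def trapezoid_area(times_s, values):
    """Area under a sampled curve by the trapezoidal rule."""
    return sum((values[i] + values[i + 1]) / 2.0 * (times_s[i + 1] - times_s[i])
               for i in range(len(times_s) - 1))

def aerobic_contribution(times_s, vo2_ml_s, rest_vo2_ml_s):
    """Aerobic energy: exercise VO2 area minus the resting-rate area (mL O2)."""
    duration = times_s[-1] - times_s[0]
    return trapezoid_area(times_s, vo2_ml_s) - rest_vo2_ml_s * duration

def alactic_contribution(a1_ml_s, t1_s):
    """Alactic (ATP-CP) energy: amplitude A1 x time constant t1 of the fast
    component fitted to the post-exercise VO2 curve (mL O2)."""
    return a1_ml_s * t1_s

def lactic_contribution(peak_lactate_mM, body_mass_kg, rest_lactate_mM=0.0):
    """Lactic energy: lactate accumulation x 3 (mL O2 per mM per kg) x body mass.
    The protocol multiplies peak lactate by 3; pass rest_lactate_mM to use
    net (peak minus rest) accumulation instead."""
    return (peak_lactate_mM - rest_lactate_mM) * 3.0 * body_mass_kg

def ml_o2_to_kj(ml_o2, kj_per_l=20.9):
    """Convert an O2 equivalent to energy; 20.9 kJ/L is a commonly used
    caloric equivalent of oxygen (an assumption, not from the text)."""
    return ml_o2 / 1000.0 * kj_per_l
```

For example, a 70 kg athlete reaching 8 mM peak lactate contributes 8 × 3 × 70 = 1,680 mL O2 equivalents from the lactic system, about 35 kJ after conversion.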
The method can be used for both continuous and intermittent exercise. This is a very interesting approach, as it can be adapted to exercises and sports that are difficult to mimic in controlled environments. Also, this is the only available method capable of distinguishing the contributions of the three different energy systems. Thus, the method allows the study of sports with great similarity to real situations, providing desirable ecological validity to the study.
Physiology, Issue 61, aerobic metabolism, anaerobic alactic metabolism, anaerobic lactic metabolism, exercise, athletes, mathematical model
Multimodal Optical Microscopy Methods Reveal Polyp Tissue Morphology and Structure in Caribbean Reef Building Corals
Institutions: University of Illinois at Urbana-Champaign, University of Illinois at Urbana-Champaign, University of Illinois at Urbana-Champaign.
An integrated suite of imaging techniques has been applied to determine the three-dimensional (3D) morphology and cellular structure of polyp tissues comprising the Caribbean reef-building corals Montastraea annularis and M. faveolata. These approaches include fluorescence microscopy (FM), serial block face imaging (SBFI), and two-photon confocal laser scanning microscopy (TPLSM). SBFI provides deep tissue imaging after physical sectioning; it details the tissue surface texture and 3D visualization to tissue depths of more than 2 mm. Complementary FM and TPLSM yield ultra-high resolution images of tissue cellular structure. Results have: (1) identified previously unreported lobate tissue morphologies on the outer wall of individual coral polyps and (2) created the first surface maps of the 3D distribution and tissue density of chromatophores and algae-like dinoflagellate zooxanthellae endosymbionts. Spectral absorption peaks of 500 nm and 675 nm, respectively, suggest that M. annularis and M. faveolata contain similar types of chlorophyll and chromatophores. However, M. annularis and M. faveolata exhibit significant differences in the tissue density and 3D distribution of these key cellular components. This study focusing on imaging methods indicates that SBFI is extremely useful for analysis of large mm-scale samples of decalcified coral tissues. Complementary FM and TPLSM reveal subtle submillimeter-scale changes in cellular distribution and density in nondecalcified coral tissue samples. The TPLSM technique affords: (1) minimally invasive sample preparation, (2) superior optical sectioning ability, and (3) minimal light absorption and scattering, while still permitting deep tissue imaging.
Environmental Sciences, Issue 91, Serial block face imaging, two-photon fluorescence microscopy, Montastraea annularis, Montastraea faveolata, 3D coral tissue morphology and structure, zooxanthellae, chromatophore, autofluorescence, light harvesting optimization, environmental change
Avian Influenza Surveillance with FTA Cards: Field Methods, Biosafety, and Transportation Issues Solved
Institutions: Wageningen University, Linnaeus University, Simon Fraser University .
Avian influenza viruses (AIVs) infect many mammals, including humans1. These AIVs are diverse in their natural hosts, harboring almost all possible viral subtypes2. Human influenza pandemics originally stem from AIVs3. Many fatal human cases were reported during the H5N1 outbreaks of recent years. Lately, a new AIV-related strain swept through the human population, causing the 'swine flu epidemic'4. Although human trading and transportation activity seems to be responsible for the spread of highly pathogenic strains5, dispersal can also partly be attributed to wild birds6,7. However, the actual reservoir of all AIV strains is wild birds.
In reaction to this, and in the face of severe commercial losses in the poultry industry, large surveillance programs have been implemented globally to collect information on the ecology of AIVs and to install early warning systems to detect certain highly pathogenic strains8-12. Traditional virological methods require viruses to be intact and cultivated before analysis. This necessitates strict cold chains with deep freezers and heavy biosafety procedures during transport. Long-term surveillance is therefore usually restricted to a few field stations close to well-equipped laboratories. Remote areas cannot be sampled unless logistically cumbersome procedures are implemented. These problems have been recognized13,14, and the use of alternative storage and transport strategies (alcohols or guanidine) has been investigated15-17. Recently, Kraus et al. introduced a method to collect, store and transport AIV samples based on a special filter paper. FTA cards19 preserve RNA on a dry storage basis20 and render pathogens inactive upon contact21. That study showed that FTA cards can be used to detect AIV RNA by reverse-transcription PCR and that the resulting cDNA could be sequenced and virus genes determined.
In the study of Kraus et al., a laboratory isolate of AIV was used and samples were handled individually. In the extension presented here, faecal samples from wild birds caught in the duck trap at Ottenby Bird Observatory (SE Sweden) were tested directly, to illustrate the usefulness of the method under field conditions. The catching of ducks and sample collection by cloacal swabs are demonstrated. The current protocol up-scales the workflow from single-tube handling to a 96-well design. Although less sensitive than traditional methods, the FTA card method provides an excellent supplement to large surveillance schemes. It allows collection and analysis of samples from anywhere in the world, without the need to maintain a cold chain or to observe safety regulations for shipping hazardous reagents such as alcohol or guanidine.
Immunology, Issue 54, AI, Influenza A Virus, zoonoses, reverse transcription PCR, viral RNA, surveillance, duck trap, RNA preservation and storage, infection, mallard
Rapid Diagnosis of Avian Influenza Virus in Wild Birds: Use of a Portable rRT-PCR and Freeze-dried Reagents in the Field
Institutions: USGS Western Ecological Research Center, University of California, Davis, University of California, Davis, University of Minnesota , Science Applications International Corporation.
Wild birds have been implicated in the spread of highly pathogenic avian influenza (HPAI) of the H5N1 subtype, prompting surveillance along migratory flyways. Sampling of wild birds for avian influenza virus (AIV) is often conducted in remote regions, but results are often delayed because of the need to transport samples to a laboratory equipped for molecular testing. Real-time reverse transcriptase polymerase chain reaction (rRT-PCR) is a molecular technique that offers one of the most accurate and sensitive methods for diagnosis of AIV. The previously strict lab protocols needed for rRT-PCR are now being adapted for the field. The development of freeze-dried (lyophilized) reagents that do not require a cold chain, with sensitivity at the level of wet reagents, has made on-site remote testing a practical goal.
Here we present a method for the rapid diagnosis of AIV in wild birds using an rRT-PCR unit (Ruggedized Advanced Pathogen Identification Device or RAPID, Idaho Technologies, Salt Lake City, UT) that employs lyophilized reagents (Influenza A Target 1 Taqman; ASAY-ASY-0109, Idaho Technologies). The reagents contain all of the necessary components for testing at appropriate concentrations in a single tube: primers, probes, enzymes, buffers and internal positive controls, eliminating errors associated with improper storage or handling of wet reagents. The portable unit performs a screen for Influenza A by targeting the matrix gene and yields results in 2-3 hours. Genetic subtyping is also possible with H5 and H7 primer sets that target the hemagglutinin gene.
The system is suitable for use on cloacal and oropharyngeal samples collected from wild birds, as demonstrated here on a migratory shorebird species, the western sandpiper (Calidris mauri), captured in Northern California. Animal handling followed protocols approved by the Animal Care and Use Committee of the U.S. Geological Survey Western Ecological Research Center and permits of the U.S. Geological Survey Bird Banding Laboratory. The primary advantage of this technique is to expedite diagnosis of wild birds, increasing the chances of containing an outbreak in a remote location. On-site diagnosis would also prove useful for identifying and studying infected individuals in wild populations. The opportunity to collect information on host biology (immunological and physiological response to infection) and spatial ecology (migratory performance of infected birds) will provide insights into the extent to which wild birds can act as vectors for AIV over long distances.
Immunology, Issue 54, migratory birds, active surveillance, lyophilized reagents, avian influenza, H5N1
Analysis of Cell Migration within a Three-dimensional Collagen Matrix
Institutions: Witten/Herdecke University.
The ability to migrate is a hallmark of various cell types and plays a crucial role in several physiological processes, including embryonic development, wound healing, and immune responses. However, cell migration is also a key mechanism in cancer, enabling cancer cells to detach from the primary tumor and start metastatic spreading. Within the past years, various cell migration assays have been developed to analyze the migratory behavior of different cell types. Because the locomotory behavior of cells differs markedly between a two-dimensional (2D) and a three-dimensional (3D) environment, analyzing the migration of cells embedded within a 3D environment yields more meaningful cell migration data. The advantage of the described 3D collagen matrix migration assay is that cells are embedded within a physiological 3D network of collagen fibers, the major component of the extracellular matrix. Using time-lapse video microscopy, true cell migration is measured, allowing the determination of several migration parameters as well as their alterations in response to pro-migratory factors or inhibitors. Various cell types can be analyzed using this technique, including lymphocytes/leukocytes, stem cells, and tumor cells. Cell clusters or spheroids can also be embedded within the collagen matrix, followed by analysis of the emigration of single cells from the cluster/spheroid into the collagen lattice. We conclude that the 3D collagen matrix migration assay is a versatile method to analyze the migration of cells within a physiological-like 3D environment.
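From time-lapse tracks, commonly reported migration parameters can be computed as in this minimal sketch. The track format, parameter names, and toy numbers are illustrative assumptions, not the authors' analysis software:

```python
import math

def migration_parameters(track):
    """Summary migration metrics from one time-lapse track.

    track: list of (t_min, x_um, y_um) tuples sorted by time.
    Returns path length (um), net displacement (um), mean speed (um/min),
    and directionality (net/path; 1.0 = perfectly straight migration).
    """
    path = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(track, track[1:]):
        path += math.hypot(x1 - x0, y1 - y0)
    t0, xs, ys = track[0]
    t1, xe, ye = track[-1]
    net = math.hypot(xe - xs, ye - ys)
    elapsed = t1 - t0
    return {
        "path_um": path,
        "net_um": net,
        "speed_um_min": path / elapsed if elapsed else 0.0,
        "directionality": net / path if path else 0.0,
    }

# A toy track: two 10 min steps of 30 um and 40 um at a right angle.
params = migration_parameters([(0, 0, 0), (10, 30, 0), (20, 30, 40)])
```

Comparing these metrics between treated and control populations is how effects of pro-migratory factors or inhibitors are typically quantified.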
Bioengineering, Issue 92, cell migration, 3D collagen matrix, cell tracking
Analysis of Neural Crest Migration and Differentiation by Cross-species Transplantation
Institutions: Rice University .
Avian embryos provide a unique platform for studying many vertebrate developmental processes, due to the easy access of the embryos within the egg. Chimeric avian embryos, in which quail donor tissue is transplanted into a chick embryo in ovo, combine the power of indelible genetic labeling of cell populations with the ease of manipulation presented by the avian embryo.
Quail-chick chimeras are a classical tool for tracing migratory neural crest cells (NCCs)1-3. NCCs are a transient migratory population of cells in the embryo, which originate in the dorsal region of the developing neural tube4. They undergo an epithelial-to-mesenchymal transition and subsequently migrate to other regions of the embryo, where they differentiate into various cell types including cartilage5-13, neurons and glia21-32. NCCs are multipotent, and their ultimate fate is influenced by (1) the region of the neural tube in which they originate along the rostro-caudal axis of the embryo11,33-37, (2) signals from neighboring cells as they migrate38-44, and (3) the microenvironment of their ultimate destination within the embryo45,46. Tracing these cells from their point of origin at the neural tube to their final position and fate within the embryo provides important insight into the developmental processes that regulate patterning and organogenesis.
Transplantation of complementary regions of donor neural tube (homotopic grafting) or different regions of donor neural tube (heterotopic grafting) can reveal differences in pre-specification of NCCs along the rostro-caudal axis2,47. This technique can be further adapted to transplant a unilateral compartment of the neural tube, such that one side is derived from donor tissue and the contralateral side remains unperturbed in the host embryo, yielding an internal control within the same sample2,47. It can also be adapted for transplantation of brain segments in later embryos, after HH10, when the anterior neural tube has closed47.
Here we report techniques for generating quail-chick chimeras via neural tube transplantation, which allow for tracing of migratory NCCs derived from a discrete segment of the neural tube. Species-specific labeling of the donor-derived cells with the quail-specific QCPN antibody48-56 allows the researcher to distinguish donor and host cells at the experimental end point. This technique is straightforward, inexpensive, and has many applications, including fate-mapping, cell lineage tracing, and identifying pre-patterning events along the rostro-caudal axis45. Because of the ease of access to the avian embryo, the quail-chick graft technique may be combined with other manipulations, including but not limited to lens ablation40, injection of inhibitory molecules57,58, or genetic manipulation via electroporation of expression plasmids59-61, to identify the response of particular migratory streams of NCCs to perturbations in the embryo's developmental program. Furthermore, this grafting technique may also be used to generate other interspecific chimeric embryos, such as quail-duck chimeras to study NCC contributions to craniofacial morphogenesis, or mouse-chick chimeras to combine the power of mouse genetics with the ease of manipulation of the avian embryo62.
Neuroscience, Issue 60, Neural crest, chick, quail, chimera, fate map, cell migration, cell differentiation
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Assessment of Vascular Function in Patients With Chronic Kidney Disease
Institutions: University of Colorado, Denver, University of Colorado, Boulder.
Patients with chronic kidney disease (CKD) have significantly increased risk of cardiovascular disease (CVD) compared to the general population, and this is only partially explained by traditional CVD risk factors. Vascular dysfunction is an important non-traditional risk factor, characterized by vascular endothelial dysfunction (most commonly assessed as impaired endothelium-dependent dilation [EDD]) and stiffening of the large elastic arteries. While various techniques exist to assess EDD and large elastic artery stiffness, the most commonly used are brachial artery flow-mediated dilation (FMDBA) and aortic pulse-wave velocity (aPWV), respectively. Both of these noninvasive measures of vascular dysfunction are independent predictors of future cardiovascular events in patients with and without kidney disease. Patients with CKD demonstrate both impaired FMDBA and increased aPWV. While the exact mechanisms by which vascular dysfunction develops in CKD are incompletely understood, increased oxidative stress and a subsequent reduction in nitric oxide (NO) bioavailability are important contributors. Cellular changes in oxidative stress can be assessed by collecting vascular endothelial cells from the antecubital vein and measuring protein expression of markers of oxidative stress using immunofluorescence. We provide here a discussion of these methods to measure FMDBA, aPWV, and vascular endothelial cell protein expression.
Medicine, Issue 88, chronic kidney disease, endothelial cells, flow-mediated dilation, immunofluorescence, oxidative stress, pulse-wave velocity
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Institutions: The Johns Hopkins University.
In 2010, approximately 68,720 melanomas will be diagnosed in the US alone, with around 8,650 resulting in death1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Along with the surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which pose a compromise between effectiveness and accuracy versus ease of use and cost considerations. Details about these techniques and comparisons are available in the literature4.
Infrared (IR) imaging has been shown to be a useful method to diagnose the signs of certain diseases by measuring the local skin temperature. There is a large body of evidence showing that disease or deviation from normal functioning is accompanied by changes in body temperature, which in turn affect the temperature of the skin5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular deviations from normal conditions, often caused by disease. However, IR imaging has not been widely recognized in medicine due to the premature use of the technology7,8 several decades ago, when temperature measurement accuracy and spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s and 2000s. Advances in IR instrumentation, implementation of digital image processing algorithms, and dynamic IR imaging, which enables scientists to analyze not only the spatial but also the temporal thermal behavior of the skin9, allowed breakthroughs in the field.
In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost-effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma10-13. In this study, we show data obtained in a patient study in which patients possessing a pigmented lesion with a clinical indication for biopsy were selected for imaging. We compared the difference in thermal responses between healthy and malignant tissue and compared our data with biopsy results. We conclude that the increased metabolic activity of a melanoma lesion can be detected by dynamic infrared imaging.
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and collaboration from the participants greatly affects the quality of the data4.
Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, are not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, which could be useful in public health studies such as infectious disease transmission modeling.
The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation [5] involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius.
Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping [6] and density volume rendering [7]. We also include several other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical as well as visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
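As an illustration of the activity-space characterization step described above (a minimal sketch, not the authors' ArcGIS implementation), one simple measure, activity radius, can be derived from a cleaned GPS track as the maximum great-circle distance from the track's centroid:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def activity_radius(track):
    """Maximum distance (m) of any fix from the track centroid.

    `track` is a list of (lat, lon) tuples; a hypothetical input format
    chosen for this sketch.
    """
    lat_c = sum(p[0] for p in track) / len(track)
    lon_c = sum(p[1] for p in track) / len(track)
    return max(haversine_m(lat, lon, lat_c, lon_c) for lat, lon in track)
```

A track of two fixes 0.01 degrees of longitude apart at the equator yields a radius of roughly 556 m (half the point-to-point distance).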
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
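The flux calculation described above — change in headspace gas concentration over time, scaled by chamber geometry — can be sketched as follows. This is a generic illustration, not the authors' published calculation; the chamber dimensions, temperature, and pressure in the example are assumed values, and 1 ppm is taken as 1 µmol of trace gas per mol of air via the ideal gas law.

```python
def linear_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def chamber_flux(times_min, conc_ppm, vol_m3, area_m2,
                 temp_k=298.15, press_pa=101325.0):
    """Gas flux in umol m-2 hr-1 from headspace samples taken over time.

    Moles of air per m3 of headspace follow from the ideal gas law,
    n/V = P/(RT); the ppm-per-minute slope is converted to a molar
    flux by scaling with headspace height (V/A) and minutes per hour.
    """
    R = 8.314  # gas constant, J mol-1 K-1
    slope = linear_slope(times_min, conc_ppm)   # ppm per minute
    air_mol_m3 = press_pa / (R * temp_k)        # mol air per m3
    return slope * air_mol_m3 * (vol_m3 / area_m2) * 60.0
```

For example, a chamber of 0.01 m³ over 0.1 m² with concentrations rising 1 ppm per minute gives a flux of roughly 245 µmol m⁻² hr⁻¹.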
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of stained, resin-embedded samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
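The automated end of the segmentation spectrum mentioned above often starts from intensity thresholding followed by connected-component labeling. The sketch below illustrates that generic idea on a toy 2D slice; it is not the authors' custom algorithm, and real EM volumes would be 3D arrays processed with dedicated image-analysis libraries.

```python
from collections import deque

def threshold(image, level):
    """Binary mask: 1 where intensity meets or exceeds the threshold."""
    return [[1 if v >= level else 0 for v in row] for row in image]

def label_components(mask):
    """4-connected component labeling via breadth-first flood fill.

    Returns (label image, number of components found).
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not labels[i][j]:
                current += 1              # start a new region
                labels[i][j] = current
                q = deque([(i, j)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current
```

On a slice with two separate bright patches, thresholding and labeling recover two distinct regions.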
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program is maximized through targeted instructor and volunteer training and a structured detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Magnetic Tweezers for the Measurement of Twist and Torque
Institutions: Delft University of Technology.
Single-molecule techniques make it possible to investigate the behavior of individual biological molecules in solution in real time. These techniques include so-called force spectroscopy approaches such as atomic force microscopy, optical tweezers, flow stretching, and magnetic tweezers. Amongst these approaches, magnetic tweezers have distinguished themselves by their ability to apply torque while maintaining a constant stretching force. Here, it is illustrated how such a “conventional” magnetic tweezers experimental configuration can, through a straightforward modification of its field configuration to minimize the magnitude of the transverse field, be adapted to measure the degree of twist in a biological molecule. The resulting configuration is termed the freely-orbiting magnetic tweezers. Additionally, it is shown how further modification of the field configuration can yield a transverse field with a magnitude intermediate between that of the “conventional” magnetic tweezers and the freely-orbiting magnetic tweezers, which makes it possible to directly measure the torque stored in a biological molecule. This configuration is termed the magnetic torque tweezers. The accompanying video explains in detail how the conversion of conventional magnetic tweezers into freely-orbiting magnetic tweezers and magnetic torque tweezers can be accomplished, and demonstrates the use of these techniques. These adaptations maintain all the strengths of conventional magnetic tweezers while greatly expanding the versatility of this powerful instrument.
Bioengineering, Issue 87, magnetic tweezers, magnetic torque tweezers, freely-orbiting magnetic tweezers, twist, torque, DNA, single-molecule techniques
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
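The minimum-norm estimation mentioned in the keywords maps sensor data back to cortical sources through a regularized inverse of the forward (leadfield) model. A minimal sketch of the standard L2 minimum-norm formula is shown below; the leadfield, data, and regularization value are illustrative assumptions, not the paper's actual head models, which are built from structural MRI.

```python
import numpy as np

def minimum_norm_estimate(leadfield, eeg, lam=1e-2):
    """L2 minimum-norm inverse: sources = L^T (L L^T + lam*I)^-1 * eeg.

    leadfield: (channels x sources) forward model matrix L
    eeg:       (channels,) sensor measurements at one time point
    lam:       Tikhonov regularization weight (assumed value)
    """
    n_ch = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_ch)
    # Solve the regularized linear system rather than forming an explicit inverse
    return leadfield.T @ np.linalg.solve(gram, eeg)
```

With an identity leadfield and negligible regularization, the estimate simply reproduces the sensor pattern, which is a quick sanity check on the algebra.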
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
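The "software-guided setup of optimal experiment combinations" described above typically begins from an enumerated design space. As a minimal illustration (not the authors' DoE software, and with factor names and levels invented for the example), a full-factorial design over a few expression parameters can be generated like this:

```python
from itertools import product

def full_factorial(factors):
    """All combinations of factor levels, one dict per experimental run.

    `factors` maps factor name -> list of levels; the design size is the
    product of the level counts.
    """
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]
```

Three two-level factors (e.g. a hypothetical promoter choice, incubation temperature, and incubation duration) yield 2 x 2 x 2 = 8 runs; DoE software would then typically select a fraction of these or augment the design stepwise, as the abstract describes.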
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a dispersed field of scientific disciplines, comprising both established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques in particular, used for viral and bacterial community characterization, have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample-processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed by these protocols yield the minimum information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. The protocols give easy-to-follow guidelines for conducting comprehensive surveys and discuss critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
Protocol for Long Duration Whole Body Hyperthermia in Mice
Institutions: National Institute of Immunology, National Institute of Immunology.
Hyperthermia is a general term used to define an increase in core body temperature above normal. It is often used to describe the increased core body temperature observed during fever. The use of hyperthermia as an adjuvant has emerged as a promising procedure for tumor regression in the field of cancer biology. For this purpose, the most important requirement is to have reliable and uniform heating protocols. We have developed a protocol for whole body hyperthermia in mice. In this protocol, animals are exposed to cycles of hyperthermia for 90 min followed by a rest period of 15 min. During this period mice have easy access to food and water. High body temperature spikes during the first few hyperthermia exposure cycles are prevented by immobilizing the animal. Additionally, normal saline is administered in the first few cycles to minimize the effects of dehydration. This protocol can simulate fever-like conditions in mice for up to 12-24 hr. We used 8-12-week-old female BALB/cJ mice to demonstrate the protocol.
Medicine, Issue 66, Anatomy, Physiology, Mouse, Fever, Whole Body Hyperthermia, Temperature Spikes, core body temperature
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques [1,5-9]. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
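The covariance-based decomposition at the heart of many such multivariate methods can be illustrated with plain principal component analysis. This is a generic sketch for orientation only, not the specific technique applied to the ADNI data in the article; the "subjects x voxels" matrix shape is an assumption for the example.

```python
import numpy as np

def pca(data, n_components=2):
    """Principal component analysis via SVD of the mean-centered data.

    data: (subjects x voxels) matrix; returns per-subject scores and
    the component loadings (spatial patterns) over voxels.
    """
    centered = data - data.mean(axis=0)
    # SVD of the centered matrix gives the eigenvectors of the covariance
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]   # subject expressions
    components = vt[:n_components]                    # voxel-wise patterns
    return scores, components
```

On toy data where all "voxels" covary perfectly, the first component captures all the variance and its loadings weight both voxels equally, which is exactly the "signature of neural networks" interpretation: one pattern expressed to varying degrees across subjects.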
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Phase Contrast and Differential Interference Contrast (DIC) Microscopy
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
Phase-contrast microscopy is often used to produce contrast for transparent, non light-absorbing, biological specimens. The technique was discovered by Zernike, in 1942, who received the Nobel prize for his achievement. DIC microscopy, introduced in the late 1960s, has been popular in biomedical research because it highlights edges of specimen structural detail, provides high-resolution optical sections of thick specimens including tissue cells, eggs, and embryos and does not suffer from the phase halos typical of phase-contrast images. This protocol highlights the principles and practical applications of these microscopy techniques.
Basic protocols, Issue 18, Current Protocols Wiley, Microscopy, Phase Contrast, Differential Interference Contrast
Using Learning Outcome Measures to assess Doctoral Nursing Education
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric