An integrated suite of imaging techniques has been applied to determine the three-dimensional (3D) morphology and cellular structure of polyp tissues comprising the Caribbean reef-building corals Montastraea annularis and M. faveolata. These approaches include fluorescence microscopy (FM), serial block face imaging (SBFI), and two-photon confocal laser scanning microscopy (TPLSM). SBFI provides deep tissue imaging after physical sectioning; it details the tissue surface texture and 3D visualization to tissue depths of more than 2 mm. Complementary FM and TPLSM yield ultra-high-resolution images of tissue cellular structure. The results (1) identify previously unreported lobate tissue morphologies on the outer wall of individual coral polyps and (2) provide the first surface maps of the 3D distribution and tissue density of chromatophores and algae-like dinoflagellate zooxanthellae endosymbionts. Spectral absorption peaks at 500 nm and 675 nm suggest that M. annularis and M. faveolata contain similar types of chromatophores and chlorophyll, respectively. However, the two species exhibit significant differences in the tissue density and 3D distribution of these key cellular components. This methods-focused study indicates that SBFI is extremely useful for the analysis of large, mm-scale samples of decalcified coral tissues. Complementary FM and TPLSM reveal subtle submillimeter-scale changes in cellular distribution and density in nondecalcified coral tissue samples. The TPLSM technique affords (1) minimally invasive sample preparation, (2) superior optical sectioning ability, and (3) minimal light absorption and scattering, while still permitting deep tissue imaging.
A Standardized Obstacle Course for Assessment of Visual Function in Ultra Low Vision and Artificial Vision
Institutions: University of Pittsburgh, University of Pittsburgh.
We describe an indoor, portable, standardized course that can be used to evaluate obstacle avoidance in persons who have ultra-low vision. Six sighted controls and 36 completely blind but otherwise healthy adults, 29 male and 13 female (age range 19-85 years), were enrolled in one of three studies involving testing of the BrainPort sensory substitution device. Subjects were asked to navigate the course before and after BrainPort training. They completed a total of 837 course runs in two different locations. Means and standard deviations were calculated across control types, courses, lights, and visits. We used a linear mixed-effects model to compare categories in the PPWS (percent preferred walking speed) and percent-error data and to show that the course iterations were properly designed. The course is relatively inexpensive, simple to administer, and has been shown to be a feasible way to test mobility function. Data analysis demonstrates that, for both percent error and percent preferred walking speed, each of the three courses is distinct, while within each course the three iterations are equivalent. This allows the courses to be randomized during administration.
preferred walking speed (PWS)
course speed (CS)
percentage preferred walking speed (PPWS)
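The outcome measures defined above reduce to simple ratios: PPWS expresses course speed as a percentage of the subject's preferred walking speed, and percent error expresses obstacle contacts per obstacle encountered. A minimal sketch (function names are ours, not from the study):

```python
def ppws(course_speed, preferred_walking_speed):
    """Percent preferred walking speed: course speed (CS) as a % of PWS."""
    return 100.0 * course_speed / preferred_walking_speed

def percent_error(contact_errors, total_obstacles):
    """Percentage of obstacles contacted during a course run."""
    return 100.0 * contact_errors / total_obstacles
```

A PPWS near 100% means the subject walked the obstacle course at nearly their unobstructed pace.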
Medicine, Issue 84, Obstacle course, navigation assessment, BrainPort, wayfinding, low vision
Using an Automated 3D-tracking System to Record Individual and Shoals of Adult Zebrafish
Like many aquatic animals, zebrafish (Danio rerio) move in a 3D space, so a 3D recording system is preferable for studying their behavior. The presented automatic video-tracking system accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced when light passes from water to air. With this system it is possible to record both single adult zebrafish and groups. Before use, the system has to be calibrated. The system consists of three modules: Recording, Path Reconstruction, and Data Processing. Step-by-step protocols for calibration and for using the three modules are presented. Depending on the experimental setup, the system can be used to test neophobia, white aversion, social cohesion, motor impairments, novel object exploration, etc. It is especially promising as a first-step tool for studying the effects of drugs or mutations on basic behavioral patterns. The system provides information about the vertical and horizontal distribution of the zebrafish and about the xyz-components of kinematic parameters (such as locomotion, velocity, acceleration, and turning angle), and it provides the data necessary to calculate parameters for social cohesion when testing shoals.
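The xyz kinematic parameters mentioned (velocity, acceleration, turning angle) can be derived from a reconstructed 3D path by finite differences. A minimal illustration, assuming uniformly sampled (x, y, z) points; this is our sketch, not the article's Path Reconstruction code (acceleration follows by differencing the speeds in the same way):

```python
import math

def kinematics(points, dt):
    """Speed and turning angle from a 3D track sampled every dt seconds.

    points: list of (x, y, z) tuples at a uniform time step dt.
    Returns (speeds, turning_angles_deg).
    """
    # Velocity vectors between consecutive samples
    vel = [((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt)
           for (x0, y0, z0), (x1, y1, z1) in zip(points, points[1:])]
    speeds = [math.sqrt(vx * vx + vy * vy + vz * vz) for vx, vy, vz in vel]
    # Turning angle: angle between consecutive velocity vectors
    turns = []
    for v0, v1 in zip(vel, vel[1:]):
        dot = sum(a * b for a, b in zip(v0, v1))
        n0 = math.sqrt(sum(a * a for a in v0))
        n1 = math.sqrt(sum(a * a for a in v1))
        cos_t = max(-1.0, min(1.0, dot / (n0 * n1)))  # clamp rounding error
        turns.append(math.degrees(math.acos(cos_t)))
    return speeds, turns
```

For a fish swimming in a straight line the turning angles are zero; sharp maneuvers show up as large angles between successive velocity vectors.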
Behavior, Issue 82, neuroscience, Zebrafish, Danio rerio, anxiety, Shoaling, Pharmacology, 3D-tracking, MK801
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and their degree of collaboration greatly affects the quality of the data4.
Modern technologies such as GPS and mobile communications have made automatic collection of trajectory data possible. The data collected, however, are not ideal for modeling human space-time activities, limited as they are by the accuracy of existing devices. There is also no readily available tool for efficient processing of such data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, in public health studies such as infectious disease transmission modeling.
The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5
involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius.
Density estimation and visualization are used to examine large amounts of trajectory data in order to model hot spots and interactions. We demonstrate both density surface mapping6 and density volume rendering7. We also include several other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical and visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
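Density surface mapping of the kind demonstrated here can be approximated, at its simplest, by binning trajectory points onto a regular grid; the ArcGIS tools described are far more elaborate (kernel smoothing, volume rendering), so this is only an illustrative sketch with hypothetical parameter names:

```python
def density_surface(points, cell_size, nx, ny):
    """Count trajectory fixes per grid cell.

    points: iterable of (x, y) coordinates (e.g., projected GPS fixes).
    cell_size: side length of a square cell in the same units as x, y.
    nx, ny: grid dimensions. Returns a ny-by-nx list of counts.
    """
    grid = [[0] * nx for _ in range(ny)]
    for x, y in points:
        i, j = int(x // cell_size), int(y // cell_size)
        if 0 <= i < nx and 0 <= j < ny:  # ignore fixes outside the study area
            grid[j][i] += 1
    return grid
```

High-count cells correspond to activity hot spots; replacing the raw counts with a kernel density estimate would yield the smooth surfaces shown in the article.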
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
Localizing Protein in 3D Neural Stem Cell Culture: a Hybrid Visualization Methodology
Institutions: University of Ottawa, Carleton University.
The importance of 3-dimensional (3D) topography in influencing neural stem and progenitor cell (NPC) phenotype is widely acknowledged yet challenging to study. When dissociated from embryonic or post-natal brain, single NPCs will proliferate in suspension to form neurospheres. Daughter cells within these cultures spontaneously adopt distinct developmental lineages (neurons, oligodendrocytes, and astrocytes) over the course of expansion despite being exposed to the same extracellular milieu. This progression recapitulates many of the stages observed over the course of neurogenesis and gliogenesis in post-natal brain and is often used to study basic NPC biology within a controlled environment. Assessing the full impact of 3D topography and cellular positioning within these cultures on NPC fate is, however, difficult. To localize target proteins and identify NPC lineages by immunocytochemistry, free-floating neurospheres must be plated on a substrate or serially sectioned. This processing is required to ensure equivalent cell permeabilization and antibody access throughout the sphere. As a result, 2D epifluorescent images of cryosections or confocal reconstructions of 3D Z-stacks can only provide spatial information about cell position within discrete physical or digital 3D slices and do not visualize cellular position in the intact sphere. Here, to recapitulate the topography of the neurosphere culture and permit spatial analysis of protein expression throughout the entire culture, we present a protocol for isolation, expansion, and serial sectioning of post-natal hippocampal neurospheres suitable for epifluorescent or confocal immunodetection of target proteins. Connexin29 (Cx29) is analyzed as an example. Next, using a hybrid of graphics-editing and 3D-modelling software, rigorously applied to maintain biological detail, we describe how to re-assemble the 3D structural positioning of these images and digitally map labelled cells within the complete neurosphere.
This methodology enables visualization and analysis of the cellular position of target proteins and cells throughout the entire 3D culture topography and will facilitate a more detailed analysis of the spatial relationships between cells over the course of neurogenesis and gliogenesis in vitro.
Both Imbeault and Valenzuela contributed equally and should be considered joint first authors.
Neuroscience, Issue 46, neural stem cell, hippocampus, cryosectioning, 3D modelling, neurosphere, Maya, compositing
Soil Sampling and Isolation of Entomopathogenic Nematodes (Steinernematidae, Heterorhabditidae)
Institutions: University of Arizona.
Entomopathogenic nematodes (EPN) are a group of soil-inhabiting nematodes that parasitize a wide range of insects. These nematodes belong to two families: Steinernematidae and Heterorhabditidae. To date, more than 70 species have been described in the Steinernematidae and about 20 species in the Heterorhabditidae. The nematodes have a mutualistic partnership with Enterobacteriaceae bacteria, and together they act as a potent insecticidal complex that kills a wide range of insect species.
Herein, we first focus on the most common techniques for collecting EPN from soil. The second part of this presentation focuses on the insect-baiting technique, a widely used approach for isolating EPN from soil samples, and on the modified White trap technique, which is used to recover these nematodes from infected insects. These methods and techniques are key steps for the successful establishment of EPN cultures in the laboratory and also form the basis for other bioassays that use these nematodes as model organisms for research in other biological disciplines. The techniques shown in this presentation correspond to those performed and/or designed by members of the S. P. Stock laboratory as well as those described by various authors.
Environmental Sciences, Issue 89, Entomology, Nematology, Steinernema, Heterorhabditis, nematodes, soil sampling, insect-bait, modified White-trap
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determine the rate of decomposition by microbial communities as well as the rates of nitrogen (N) recycling and carbon (C) sequestration1,2
. Plant litter comprises the majority of detritus3
, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5
. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey that in turn alters soil microbial functioning when the herbivore carcasses are decomposed6
. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9
because stress from predation risk increases herbivore basal energy demands, which in nutrient-limited systems forces herbivores to shift their consumption from the N-rich resources that support growth and reproduction to the C-rich carbohydrate resources that fuel heightened metabolism6
. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7
. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10
, making them poorer quality resources for the soil microbial pool likely due to lower availability of labile N for microbial enzyme production6
. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11.
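The stoichiometric quantity at the center of this chain of interactions is the body C:N ratio, which can be computed as a molar ratio from elemental composition. A simple sketch (our illustration, not the authors' analysis code; atomic masses from standard tables):

```python
# Standard atomic masses of carbon and nitrogen (g/mol)
C_MASS, N_MASS = 12.011, 14.007

def cn_ratio(percent_c, percent_n):
    """Molar carbon-to-nitrogen ratio from elemental composition (% dry mass)."""
    return (percent_c / C_MASS) / (percent_n / N_MASS)
```

A stressed herbivore with the same body C but less N yields a higher C:N ratio, i.e., a poorer-quality resource for the soil microbial pool.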
We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to induce stress in herbivores from predation risk, measure those stress responses, and measure the consequences for microbial decomposition. We use insights from a model grassland ecosystem comprising a hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
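Analyses of time-stamped event records like those the system collects typically start from inter-event intervals (e.g., times between successive head entries into a hopper). A toy sketch in Python, though the system described is MATLAB-based and its actual data structures are richer:

```python
def inter_event_intervals(events, kind):
    """Intervals between successive events of one type.

    events: list of (timestamp, kind) tuples in ascending time order,
    e.g., ('poke' and 'feed' are hypothetical event names).
    """
    times = [t for t, k in events if k == kind]
    return [later - earlier for earlier, later in zip(times, times[1:])]
```

Distributions of such intervals underlie the timing and impulsivity measures the protocols extract, and the same event list preserves the full data trail from raw records to summary statistics.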
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g.
, signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
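As a minimal example of approach (4), automated segmentation, a binary threshold followed by connected-component labeling extracts contiguous features from image data; custom algorithms for real EM volumes are far more involved. Shown here on a 2D array for brevity (a 3D version adds two more neighbors per voxel):

```python
def segment(image, threshold):
    """Threshold a 2D intensity array and label 4-connected components.

    Returns (labels, count): labels is a same-shaped array of component
    ids (0 = background), count is the number of components found.
    """
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for sy in range(h):
        for sx in range(w):
            if image[sy][sx] >= threshold and labels[sy][sx] == 0:
                count += 1
                stack = [(sy, sx)]          # flood fill from this seed
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < h and 0 <= x < w
                            and labels[y][x] == 0
                            and image[y][x] >= threshold):
                        labels[y][x] = count
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels, count
```

This works when features are well separated in intensity (e.g., heavily stained structures); noisy or crowded data sets are precisely the cases where the triage scheme steers toward manual or semi-automated approaches.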
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
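Localization in FPALM amounts to estimating each molecule's position from its diffraction-limited spot far more precisely than the spot's width. The simplest estimator is an intensity-weighted centroid (practical pipelines typically fit a 2D Gaussian instead); a sketch, not the authors' software:

```python
def localize(spot):
    """Intensity-weighted centroid of a background-subtracted 2D pixel array.

    Returns (x, y) in pixel coordinates; multiply by the effective pixel
    size to get nanometers.
    """
    total = sum(v for row in spot for v in row)
    x = sum(v * j for row in spot for j, v in enumerate(row)) / total
    y = sum(v * i for i, row in enumerate(spot) for v in row) / total
    return x, y
```

Repeating this over thousands of sparsely activated molecules, each localized to ~10-30 nm, builds up the super-resolution image.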
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Integrated Field Lysimetry and Porewater Sampling for Evaluation of Chemical Mobility in Soils and Established Vegetation
Institutions: North Carolina State University, North Carolina State University.
Potentially toxic chemicals are routinely applied to land to meet growing demands on waste management and food production, but the fate of these chemicals is often not well understood. Here we demonstrate an integrated field lysimetry and porewater sampling method for evaluating the mobility of chemicals applied to soils and established vegetation. Lysimeters, open columns made of metal or plastic, are driven into bareground or vegetated soils. Porewater samplers, which are commercially available and use vacuum to collect percolating soil water, are installed at predetermined depths within the lysimeters. At prearranged times following chemical application to experimental plots, porewater is collected, and lysimeters, containing soil and vegetation, are exhumed. By analyzing chemical concentrations in the lysimeter soil, vegetation, and porewater, downward leaching rates, soil retention capacities, and plant uptake for the chemical of interest may be quantified. Because field lysimetry and porewater sampling are conducted under natural environmental conditions and with minimal soil disturbance, derived results project real-case scenarios and provide valuable information for chemical management. As chemicals are increasingly applied to land worldwide, the described techniques may be utilized to determine whether applied chemicals pose adverse effects to human health or the environment.
Environmental Sciences, Issue 89, Lysimetry, porewater, soil, chemical leaching, pesticides, turfgrass, waste
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as independent of community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly necessitates understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, along with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
Design and Construction of an Urban Runoff Research Facility
Institutions: Texas A&M University, The Scotts Miracle-Gro Company.
As the urban population increases, so does the area of irrigated urban landscape. Summer water use in urban areas can be 2-3x winter base line water use due to increased demand for landscape irrigation. Improper irrigation practices and large rainfall events can result in runoff from urban landscapes which has potential to carry nutrients and sediments into local streams and lakes where they may contribute to eutrophication. A 1,000 m2
facility was constructed which consists of 24 individual 33.6 m2
field plots, each equipped for measuring total runoff volumes with time and for collecting runoff subsamples at selected intervals for quantification of chemical constituents in the runoff water from simulated urban landscapes. Runoff volumes from the first and second trials had coefficient of variation (CV) values of 38.2 and 28.7%, respectively. CV values for runoff pH, EC, and Na concentration for both trials were all under 10%. Concentrations of DOC, TDN, DON, PO4
, and Ca2+
had CV values less than 50% in both trials. Overall, the results of testing performed after sod installation at the facility indicated good uniformity between plots for runoff volumes and chemical constituents. The large plot size is sufficient to include much of the natural variability and therefore provides better simulation of urban landscape ecosystems.
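The coefficient of variation values reported above can be computed directly from per-plot measurements. A sketch (the function name is ours; `stdev` is the sample standard deviation):

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation of replicate plot measurements, in percent."""
    return 100.0 * stdev(values) / mean(values)
```

Lower CV across the 24 plots indicates better plot-to-plot uniformity for that runoff constituent.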
Environmental Sciences, Issue 90, urban runoff, landscapes, home lawns, turfgrass, St. Augustinegrass, carbon, nitrogen, phosphorus, sodium
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2
proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Design and Operation of a Continuous 13C and 15N Labeling Chamber for Uniform or Differential, Metabolic and Structural, Plant Isotope Labeling
Institutions: Colorado State University, USDA-ARS, Colorado State University.
Tracing rare stable isotopes from plant material through the ecosystem provides the most sensitive information about ecosystem processes; from CO2
fluxes and soil organic matter formation to small-scale stable-isotope biomarker probing. Coupling multiple stable isotopes, such as 13C with 15N, 18O, or 2H, has the potential to reveal even more information about complex stoichiometric relationships during biogeochemical transformations. Isotope-labeled plant material has been used in various studies of litter decomposition and soil organic matter formation1-4
. From these and other studies, however, it has become apparent that structural components of plant material behave differently than metabolic components (i.e
. leachable low molecular weight compounds) in terms of microbial utilization and long-term carbon storage5-7
. The ability to study structural and metabolic components separately provides a powerful new tool for advancing the forefront of ecosystem biogeochemical studies. Here we describe a method for producing 13C and 15N labeled plant material that is either uniformly labeled throughout the plant or differentially labeled in structural and metabolic plant components.
Here, we present the construction and operation of a continuous 13C and 15N labeling chamber that can be modified to meet various research needs. Uniformly labeled plant material is produced by continuous labeling from seedling to harvest, while differential labeling is achieved by removing the growing plants from the chamber weeks prior to harvest. Representative results from growing Andropogon gerardii
Kaw demonstrate the system's ability to efficiently label plant material at the targeted levels. Through this method we have produced plant material with a 4.4 atom%13
C and 6.7 atom%15
N uniform plant label, or material that is differentially labeled by up to 1.29 atom%13
C and 0.56 atom%15
N in its metabolic and structural components (hot water extractable and hot water residual components, respectively). Challenges lie in maintaining proper temperature, humidity, CO2
concentration, and light levels in an airtight 13
atmosphere for successful plant production. This chamber description represents a useful research tool to effectively produce uniformly or differentially multi-isotope labeled plant material for use in experiments on ecosystem biogeochemical cycling.
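The atom% figures quoted above follow directly from the definition of isotopic abundance: the number of heavy-isotope atoms as a percentage of all atoms of that element. A minimal sketch in Python; the 4.4 atom% value is taken from the abstract, while the ~1.11 atom% natural abundance of 13C is a standard reference value, not a figure from this study:

```python
def atom_percent(n_heavy, n_light):
    """Atom% of the heavy isotope: heavy atoms / total atoms * 100."""
    return 100.0 * n_heavy / (n_heavy + n_light)

# Illustrative: 4.4 13C atoms per 95.6 12C atoms matches the uniform
# label reported above; excess is measured against natural abundance.
uniform = atom_percent(4.4, 95.6)   # 4.4 atom% 13C
natural = 1.11                      # approximate natural 13C abundance, atom%
excess = uniform - natural
print(f"{uniform:.2f} atom% 13C, {excess:.2f} atom% excess")
```

The same arithmetic applies to the 15N label, with 15N/14N in place of 13C/12C.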
Environmental Sciences, Issue 83, 13C, 15N, plant, stable isotope labeling, Andropogon gerardii, metabolic compounds, structural compounds, hot water extraction
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via
quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
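The first step described above, estimating per-pixel tissue orientation with a bank of Gabor filters, can be sketched generically. This is a minimal NumPy/SciPy illustration of the orientation-field idea, not the authors' implementation; kernel size, frequency, and bandwidth values are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=21, sigma_x=4.0, sigma_y=8.0, freq=0.15):
    """Real (cosine-phase) Gabor kernel with its wave vector along theta."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # along the wave vector
    yr = -x * np.sin(theta) + y * np.cos(theta)   # across the wave vector
    envelope = np.exp(-0.5 * (xr**2 / sigma_x**2 + yr**2 / sigma_y**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def orientation_field(image, n_angles=18):
    """Per-pixel dominant orientation from a bank of Gabor filters."""
    angles = np.linspace(0, np.pi, n_angles, endpoint=False)
    responses = np.stack([convolve(image, gabor_kernel(a)) for a in angles])
    best = np.abs(responses).argmax(axis=0)       # strongest filter per pixel
    return angles[best], np.abs(responses).max(axis=0)

# Synthetic oriented pattern: a sinusoidal grating at ~45 degrees
yy, xx = np.mgrid[0:64, 0:64].astype(float)
img = np.sin(2 * np.pi * 0.15 * (xx * np.cos(np.pi / 4) + yy * np.sin(np.pi / 4)))
theta_map, magnitude = orientation_field(img)
```

In the published method, node-like sites of radiating or intersecting orientations would then be located by fitting phase portraits to such an orientation field.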
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
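The core of a DoE workflow of this kind, coding factors, generating an experiment plan, and fitting a predictive model with interaction terms, can be sketched generically. The factor names below are hypothetical placeholders, not the parameters screened in the study, and the responses are simulated:

```python
import itertools
import numpy as np

# Three illustrative two-level factors in coded units (-1 / +1)
factors = ["promoter", "leaf_age", "incubation_temp"]  # hypothetical names
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

def model_matrix(X):
    """Intercept, main effects, and all two-factor interactions."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    for i, j in itertools.combinations(range(X.shape[1]), 2):
        cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

M = model_matrix(design)

# Simulated responses: factor 0 main effect = +2, interaction(0,1) = -1
rng = np.random.default_rng(0)
y = (10 + 2 * design[:, 0] - 1 * design[:, 0] * design[:, 1]
     + rng.normal(0, 0.1, len(design)))

# Least-squares fit; the full factorial design makes the columns orthogonal
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
```

The fitted coefficients recover the simulated effects, which is the sense in which such predictive equations describe the interconnectivity between parameters.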
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Introducing an Angle Adjustable Cutting Box for Analyzing Slice Shear Force in Meat
Institutions: Agriculture and Agri-Food Canada, Universidad de Córdoba, University of Nebraska.
Research indicates the fibre angle of the longissimus
muscle can vary, depending upon location within a steak and throughout the muscle. Instead of using the original fixed 45° or 90° cutting angle for testing shear force, a variable angle cutting box can be adjusted so the angles of the knives correspond to the fibre angle of each sample. Within 2 min after cooking to an internal temperature of 71 °C on an open-hearth grill set at 210 °C, a 1 cm by 5 cm core is cut from the steak, parallel to the muscle fibre direction, using 2 knife blades set 1 cm apart. This warm core is then subjected to the Slice Shear Force (SSF) protocol to evaluate meat texture. The use of the variable angle cutting box and the SSF protocol provides an accurate representation of the maximal shear force, as the slice and muscle fibres are consistently parallel. Therefore, the variable angle cutting box, in conjunction with the SSF protocol, can be used as a high-throughput technique to accurately evaluate meat tenderness in different locations of the longissimus
muscle and, potentially, in other muscles.
Biophysics, Issue 74, Anatomy, Physiology, Physics, Agricultural Sciences, Meat, beef, shear force, tenderness, Warner-Bratzler, muscle angle, fibre, fiber, tissue, animal science
Tomato Analyzer: A Useful Software Application to Collect Accurate and Detailed Morphological and Colorimetric Data from Two-dimensional Objects
Institutions: The Ohio State University.
Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner [1,2]. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section, and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, which was designed to collect color measurements from scanned images and allow scanning devices to be calibrated using color standards [3].
TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported as individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or output that displays the attribute values for each object in the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as performing in-depth analyses of the effect of key fruit shape genes on plant morphology. TA can also be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species, as well as other plant organs such as leaves and seeds, can be evaluated with TA.
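As a simple illustration of the kind of attribute TA computes, the basic fruit shape index (the ratio of fruit height to width) can be derived from a binary object mask. This is a generic sketch of the concept, not TA's actual implementation:

```python
import numpy as np

def fruit_shape_index(mask):
    """Basic fruit shape index: bounding-box height / width of a binary mask."""
    ys = np.any(mask, axis=1).nonzero()[0]   # rows containing the object
    xs = np.any(mask, axis=0).nonzero()[0]   # columns containing the object
    height = ys.max() - ys.min() + 1
    width = xs.max() - xs.min() + 1
    return height / width

# Synthetic elongated "fruit": an ellipse twice as tall as it is wide
yy, xx = np.mgrid[0:100, 0:100]
mask = ((yy - 50) / 40.0) ** 2 + ((xx - 50) / 20.0) ** 2 <= 1.0
print(round(fruit_shape_index(mask), 2))  # 1.98
```

TA's other attributes (blockiness, asymmetry, end-shape angles) are likewise geometric measurements on the segmented object boundary.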
Plant Biology, Issue 37, morphology, color, image processing, quantitative trait loci, software
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: a midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimate of the ideal midline is first computed based on the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for identifying the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the predictions shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions for or against invasive ICP monitoring.
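The prediction pipeline described, feature selection followed by SVM classification, can be sketched with scikit-learn on synthetic stand-in data. The study itself built its model in RapidMiner; the feature layout, labels, and data below are illustrative assumptions, not the clinical features or results:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in data: rows = patients, columns = candidate features
# (e.g. texture statistics, blood amount, age, injury severity score).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))
# Elevated-ICP label driven by two informative features plus noise
y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(0, 0.5, 200) > 0).astype(int)

model = make_pipeline(
    StandardScaler(),              # SVMs are sensitive to feature scale
    SelectKBest(f_classif, k=4),   # keep the most discriminative features
    SVC(kernel="rbf", C=1.0),
)
model.fit(X[:100], y[:100])
accuracy = model.score(X[100:], y[100:])
```

Holding out half the data for scoring, as here, is a simplification of the cross-validated evaluation such a model would normally receive.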
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Computer-Generated Animal Model Stimuli
Institutions: Macquarie University.
Communication between animals is diverse and complex. Animals may communicate using auditory, seismic, chemosensory, electrical, or visual signals. In particular, understanding the constraints on visual signal design for communication has been of great interest. Traditional methods for investigating animal interactions have used basic observational techniques, staged encounters, or physical manipulation of morphology. Less intrusive methods have tried to simulate conspecifics using crude playback tools, such as mirrors, still images, or models. As technology has become more advanced, video playback has emerged as another tool with which to examine visual communication (Rosenthal, 2000). To move one step further, the application of computer animation now allows researchers to specifically isolate the critical components necessary to elicit social responses from conspecifics, and to manipulate these features to control interactions. Here, I provide detail on how to create an animation using the Jacky dragon as a model, but this process may be adaptable to other species. In building the animation, I elected to use LightWave 3D to alter object morphology, add texture, install bones, and provide comparable weight shading that prevents exaggerated movement. The animation is then matched to selected motor patterns to replicate critical movement features. Finally, the sequence must be rendered into an individual clip for presentation. Although there are other adaptable techniques, this particular method has been demonstrated to be effective in eliciting both conspicuous and social responses in staged interactions.
Neuroscience, Issue 6, behavior, lizard, simulation, animation