JoVE Visualize
Related JoVE Video
 
PubMed Article
Resource quantity and quality determine the inter-specific associations between ecosystem engineers and resource users in a cavity-nest web.
PLoS ONE
PUBLISHED: 01-01-2013
While ecosystem engineering is a widespread structural force of ecological communities, the mechanisms underlying the inter-specific associations between ecosystem engineers and resource users are poorly understood. A proper knowledge of these mechanisms is, however, essential to understand how communities are structured. Previous studies suggest that increasing the quantity of resources provided by ecosystem engineers enhances populations of resource users. In a long-term study (1995-2011), we show that the quality of the resources (i.e. tree cavities) provided by ecosystem engineers is also a key feature that explains the inter-specific associations in a tree cavity-nest web. Red-naped sapsuckers (Sphyrapicus nuchalis) provided the most abundant cavities (52% of cavities, 0.49 cavities/ha). These cavities were less likely to be used than other cavity types by mountain bluebirds (Sialia currucoides), but provided numerous nest-sites (41% of nesting cavities) to tree swallows (Tachycineta bicolor). Swallows experienced low reproductive outputs in northern flicker (Colaptes auratus) cavities compared to those in sapsucker cavities (1.1 vs. 2.1 fledglings/nest), but the highly abundant flickers (33% of cavities, 0.25 cavities/ha) provided numerous suitable nest-sites for bluebirds (58%). The relative shortage of cavities supplied by hairy woodpeckers (Picoides villosus) and fungal/insect decay (<10% of cavities each, <0.09 cavities/ha) provided fewer breeding opportunities (<15% of nests), but represented high quality nest-sites for both bluebirds and swallows. Because both the quantity and quality of resources supplied by different ecosystem engineers may explain the amount of resources used by each resource user, conservation strategies may require different management actions to be implemented for the key ecosystem engineer of each resource user. We, therefore, urge the incorporation of both resource quantity and quality into models that assess community dynamics to improve conservation actions and our understanding of ecological communities based on ecosystem engineering.
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Published: 08-12-2013
ABSTRACT
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that, with conventional sensing and control techniques known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well, at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
22 Related JoVE Articles!
Terahertz Microfluidic Sensing Using a Parallel-plate Waveguide Sensor
Authors: Victoria Astley, Kimberly Reichel, Rajind Mendis, Daniel M. Mittleman.
Institutions: Rice University.
Refractive index (RI) sensing is a powerful noninvasive and label-free sensing technique for the identification, detection and monitoring of microfluidic samples with a wide range of possible sensor designs such as interferometers and resonators 1,2. Most of the existing RI sensing applications focus on biological materials in aqueous solutions in visible and IR frequencies, such as DNA hybridization and genome sequencing. At terahertz frequencies, applications include quality control, monitoring of industrial processes and sensing and detection applications involving nonpolar materials. Several potential designs for refractive index sensors in the terahertz regime exist, including photonic crystal waveguides 3, asymmetric split-ring resonators 4, and photonic band gap structures integrated into parallel-plate waveguides 5. Many of these designs are based on optical resonators such as rings or cavities. The resonant frequencies of these structures are dependent on the refractive index of the material in or around the resonator. By monitoring the shifts in resonant frequency the refractive index of a sample can be accurately measured and this in turn can be used to identify a material, monitor contamination or dilution, etc. The sensor design we use here is based on a simple parallel-plate waveguide 6,7. A rectangular groove machined into one face acts as a resonant cavity (Figures 1 and 2). When terahertz radiation is coupled into the waveguide and propagates in the lowest-order transverse-electric (TE1) mode, the result is a single strong resonant feature with a tunable resonant frequency that is dependent on the geometry of the groove 6,8. This groove can be filled with nonpolar liquid microfluidic samples which cause a shift in the observed resonant frequency that depends on the amount of liquid in the groove and its refractive index 9. Our technique has an advantage over other terahertz techniques in its simplicity, both in fabrication and implementation, since the procedure can be accomplished with standard laboratory equipment without the need for a clean room or any special fabrication or experimental techniques. It can also be easily expanded to multichannel operation by the incorporation of multiple grooves 10. In this video we will describe our complete experimental procedure, from the design of the sensor to the data analysis and determination of the sample refractive index.
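As a hedged illustration of the final data-analysis step described above, the short sketch below estimates a sample's refractive index from the measured resonant-frequency shift by interpolating against a calibration curve. The calibration values and the measured frequency are invented for the example and are not taken from the article.

import numpy as np

# Hypothetical calibration: resonant frequency of the filled groove (THz)
# measured (or simulated) for liquids of known refractive index.
calib_n    = np.array([1.00, 1.30, 1.40, 1.50, 1.60])       # refractive index
calib_freq = np.array([0.300, 0.262, 0.251, 0.241, 0.232])  # resonance (THz)

def refractive_index_from_resonance(f_measured_thz):
    """Interpolate the calibration curve to convert a measured resonant
    frequency into a sample refractive index (the frequency must fall
    within the calibrated range)."""
    # np.interp needs monotonically increasing x, so sort by frequency.
    order = np.argsort(calib_freq)
    return float(np.interp(f_measured_thz, calib_freq[order], calib_n[order]))

# Example: a groove filled with an unknown nonpolar liquid resonates at 0.247 THz.
print(refractive_index_from_resonance(0.247))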
Physics, Issue 66, Electrical Engineering, Computer Engineering, Terahertz radiation, sensing, microfluidic, refractive index sensor, waveguide, optical sensing
4304
Double Whole Mount in situ Hybridization of Early Chick Embryos
Authors: Delphine Psychoyos, Richard Finnell.
Institutions: Institute of Biosciences and Technology - Texas A&M Health Science Center, Texas A&M University (TAMU).
The chick embryo is a valuable tool in the study of early embryonic development. Its transparency, accessibility and ease of manipulation make it an ideal tool for studying gene expression in brain, neural tube, somite and heart primordia formation. This video demonstrates the different steps in 2-color whole mount in situ hybridization. First, the embryo is dissected from the egg and fixed in paraformaldehyde. Second, the embryo is processed for prehybridization. The embryo is then hybridized with two different probes, one coupled to DIG, and one coupled to FITC. Following overnight hybridization, the embryo is incubated with a DIG-coupled antibody. Color reaction for the DIG substrate is performed, and the region of interest appears blue. The embryo is then incubated with a FITC-coupled antibody. The embryo is processed for color reaction with FITC, and the region of interest appears red. Finally, the embryo is fixed and processed for photography and sectioning. A troubleshooting guide is also presented.
Developmental Biology, Issue 20, whole mount in situ hybridization, gene expression, chick embryo
904
Hollow Microneedle-based Sensor for Multiplexed Transdermal Electrochemical Sensing
Authors: Philip R. Miller, Shelby A. Skoog, Thayne L. Edwards, David R. Wheeler, Xiaoyin Xiao, Susan M. Brozik, Ronen Polsky, Roger J. Narayan.
Institutions: University of North Carolina and North Carolina State University, Sandia National Laboratories.
The development of a minimally invasive multiplexed monitoring system for rapid analysis of biologically-relevant molecules could offer individuals suffering from chronic medical conditions facile assessment of their immediate physiological state. Furthermore, it could serve as a research tool for analysis of complex, multifactorial medical conditions. In order for such a multianalyte sensor to be realized, it must be minimally invasive, sampling of interstitial fluid must occur without pain or harm to the user, and analysis must be rapid as well as selective. Initially developed for pain-free drug delivery, microneedles have been used to deliver vaccines and pharmacologic agents (e.g., insulin) through the skin.1-2 Since these devices access the interstitial space, microneedles that are integrated with microelectrodes can be used as transdermal electrochemical sensors. Selective detection of glucose, glutamate, lactate, hydrogen peroxide, and ascorbic acid has been demonstrated using integrated microneedle-electrode devices with carbon fibers, modified carbon pastes, and platinum-coated polymer microneedles serving as transducing elements.3-7,8 This microneedle sensor technology has enabled a novel and sophisticated analytical approach for in situ and simultaneous detection of multiple analytes. Multiplexing offers the possibility of monitoring complex microenvironments, which are otherwise difficult to characterize in a rapid and minimally invasive manner. For example, this technology could be utilized for simultaneous monitoring of extracellular levels of glucose, lactate and pH,9 which are important metabolic indicators of disease states7,10-14 (e.g., cancer proliferation) and exercise-induced acidosis.15
Bioengineering, Issue 64, Biomedical Engineering, Microneedle, Microneedle sensors, multiplexed detection, electrochemistry, stereolithography
4067
Shrinkage of Dental Composite in Simulated Cavity Measured with Digital Image Correlation
Authors: Jianying Li, Preetanjali Thakur, Alex S. L. Fok.
Institutions: University of Minnesota.
Polymerization shrinkage of dental resin composites can lead to restoration debonding or cracked tooth tissues in composite-restored teeth. In order to understand where and how shrinkage strain and stress develop in such restored teeth, Digital Image Correlation (DIC) was used to provide a comprehensive view of the displacement and strain distributions within model restorations that had undergone polymerization shrinkage. Specimens with model cavities were made of cylindrical glass rods with both diameter and length being 10 mm. The dimensions of the mesial-occlusal-distal (MOD) cavity prepared in each specimen measured 3 mm and 2 mm in width and depth, respectively. After filling the cavity with resin composite, the surface under observation was sprayed with first a thin layer of white paint and then fine black charcoal powder to create high-contrast speckles. Pictures of that surface were then taken before curing and 5 min after. Finally, the two pictures were correlated using DIC software to calculate the displacement and strain distributions. The resin composite shrank vertically towards the bottom of the cavity, with the top center portion of the restoration having the largest downward displacement. At the same time, it shrank horizontally towards its vertical midline. Shrinkage of the composite stretched the material in the vicinity of the “tooth-restoration” interface, resulting in cuspal deflections and high tensile strains around the restoration. Material close to the cavity walls or floor had direct strains mostly in the directions perpendicular to the interfaces. The sum of the two direct strain components showed a relatively uniform distribution around the restoration, and its magnitude was approximately equal to the volumetric shrinkage strain of the material.
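The displacement-to-strain step lends itself to a short numerical illustration. The sketch below, which assumes hypothetical displacement fields on a regular pixel grid rather than output from any particular DIC package, computes the two direct strain components and their sum with finite differences.

import numpy as np

# Hypothetical DIC output: horizontal (u) and vertical (v) displacement fields
# in mm, sampled on a regular grid with 0.05 mm spacing.
pixel_size = 0.05                              # mm per grid point
u = np.random.normal(0, 1e-4, (200, 200))      # stand-in for a real u-field
v = np.random.normal(0, 1e-4, (200, 200))      # stand-in for a real v-field

# Direct strains are the spatial derivatives of the displacement fields:
# eps_xx = du/dx, eps_yy = dv/dy (axis 0 = rows/y, axis 1 = columns/x).
du_dy, du_dx = np.gradient(u, pixel_size)
dv_dy, dv_dx = np.gradient(v, pixel_size)

eps_xx = du_dx
eps_yy = dv_dy
strain_sum = eps_xx + eps_yy   # quantity compared against the volumetric shrinkage strain

print(strain_sum.mean())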
Medicine, Issue 89, image processing, computer-assisted, polymer matrix composites, testing of materials (composite materials), dental composite restoration, polymerization shrinkage, digital image correlation, full-field strain measurement, interfacial debonding
51191
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules which are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score which indicates how accurate the predictions are expected to be without knowledge of the experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information could be collected by the users based on experimental evidence or biological insights with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as the best program for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
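Before submitting a sequence to the server it is worth checking that the input is a clean, single-chain amino acid sequence of acceptable length; the sketch below does this with the Python standard library only. The length limits shown (roughly 10-1500 residues) are an assumption used for illustration and should be checked against the server's current documentation.

VALID_AA = set("ACDEFGHIKLMNPQRSTVWY")
MIN_LEN, MAX_LEN = 10, 1500   # assumed limits; verify on the I-TASSER site

def read_fasta(path):
    """Return (header, sequence) for a single-record FASTA file."""
    header, seq = None, []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line.startswith(">"):
                header = line[1:]
            elif line:
                seq.append(line.upper())
    return header, "".join(seq)

def check_sequence(seq):
    problems = []
    if not MIN_LEN <= len(seq) <= MAX_LEN:
        problems.append(f"length {len(seq)} outside {MIN_LEN}-{MAX_LEN}")
    bad = set(seq) - VALID_AA
    if bad:
        problems.append(f"non-standard residues: {sorted(bad)}")
    return problems

header, seq = read_fasta("target.fasta")   # placeholder file name
print(header, check_sequence(seq) or "looks ready for submission")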
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
3259
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
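To make the pre/post comparison concrete, here is a minimal sketch of how the behavioral part of a modified Stroop analysis might look: it computes the mean reaction time for trials where a letter appears in its trained color versus an untrained color. The trial records are invented for the example; a real analysis would load the logged data from the experiment software.

# Each trial records the presented letter, its display color, whether that
# pairing matches the trained letter-color association, and the reaction time (s).
trials = [
    {"letter": "a", "color": "red",   "congruent": True,  "rt": 0.52, "correct": True},
    {"letter": "e", "color": "green", "congruent": False, "rt": 0.61, "correct": True},
    {"letter": "s", "color": "blue",  "congruent": True,  "rt": 0.49, "correct": True},
    {"letter": "t", "color": "red",   "congruent": False, "rt": 0.66, "correct": False},
    # ... remaining trials from the session
]

def mean_rt(trials, congruent):
    """Mean reaction time over correct trials of one congruency condition."""
    rts = [t["rt"] for t in trials if t["correct"] and t["congruent"] == congruent]
    return sum(rts) / len(rts) if rts else float("nan")

congruency_effect = mean_rt(trials, False) - mean_rt(trials, True)
print(f"Stroop-like congruency effect: {congruency_effect * 1000:.0f} ms")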
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
50893
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Many researchers, across incredibly diverse foci, are applying phylogenetics to their research question(s). However, many researchers are new to this topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user-guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. The vision for this article is that it can serve as a practical training tool for researchers embarking on phylogenetic studies and also as an educational resource that can be incorporated into a classroom or teaching lab.
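As a small companion to the pipeline, the sketch below shows the final step (inspecting a finished tree) using Biopython, one common option for this task. The Newick file name is a placeholder, and the upstream alignment and tree-building steps are assumed to have already been run with whichever programs the reader chose.

from Bio import Phylo   # Biopython; install with `pip install biopython`

# Load a tree produced earlier in the pipeline (e.g. by a maximum likelihood
# or Bayesian program) in Newick format. The file name is a placeholder.
tree = Phylo.read("my_genes.treefile", "newick")

tree.ladderize()                        # order clades for a tidier display
print("Number of taxa:", tree.count_terminals())
Phylo.draw_ascii(tree)                  # quick text rendering in the terminal

# Terminal (leaf) names can be checked against the original sequence labels.
for leaf in tree.get_terminals():
    print(leaf.name)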
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
50975
Genomic MRI - a Public Resource for Studying Sequence Patterns within Genomic DNA
Authors: Ashwin Prakash, Jason Bechtel, Alexei Fedorov.
Institutions: University of Toledo Health Science Campus.
Non-coding genomic regions in complex eukaryotes, including intergenic areas, introns, and untranslated segments of exons, are profoundly non-random in their nucleotide composition and consist of a complex mosaic of sequence patterns. These patterns include so-called Mid-Range Inhomogeneity (MRI) regions -- sequences 30-10000 nucleotides in length that are enriched by a particular base or combination of bases (e.g. (G+T)-rich, purine-rich, etc.). MRI regions are associated with unusual (non-B-form) DNA structures that are often involved in regulation of gene expression, recombination, and other genetic processes (Fedorova & Fedorov 2010). The existence of a strong fixation bias within MRI regions against mutations that tend to reduce their sequence inhomogeneity additionally supports the functionality and importance of these genomic sequences (Prakash et al. 2009). Here we demonstrate a freely available Internet resource -- the Genomic MRI program package -- designed for computational analysis of genomic sequences in order to find and characterize various MRI patterns within them (Bechtel et al. 2008). This package also allows generation of randomized sequences with various properties and level of correspondence to the natural input DNA sequences. The main goal of this resource is to facilitate examination of vast regions of non-coding DNA that are still scarcely investigated and await thorough exploration and recognition.
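The core MRI idea (mid-range stretches enriched in a chosen base combination) is easy to illustrate independently of the web package. The sketch below scans a sequence with a sliding window and reports windows whose (G+T) content exceeds a threshold; the window size and threshold are arbitrary illustration values, not the package's defaults.

def gt_rich_windows(seq, window=50, threshold=0.70):
    """Yield (start, end, fraction) for windows whose G+T content
    meets the threshold. Coordinates are 0-based, end-exclusive."""
    seq = seq.upper()
    for start in range(0, len(seq) - window + 1):
        chunk = seq[start:start + window]
        frac = (chunk.count("G") + chunk.count("T")) / window
        if frac >= threshold:
            yield start, start + window, frac

# Toy example; in practice the sequence would be read from a FASTA file.
sequence = "ATGTTGTGGTTTGTGGTCAGTTGTTTGGTGTTG" * 5
for start, end, frac in gt_rich_windows(sequence, window=30, threshold=0.65):
    print(f"{start}-{end}: G+T fraction {frac:.2f}")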
Genetics, Issue 51, bioinformatics, computational biology, genomics, non-randomness, signals, gene regulation, DNA conformation
2663
Nest Building as an Indicator of Health and Welfare in Laboratory Mice
Authors: Brianna N. Gaskill, Alicia Z. Karas, Joseph P. Garner, Kathleen R. Pritchett-Corning.
Institutions: Charles River, Tufts University, Stanford University, Stanford University.
The minimization and alleviation of suffering has moral and scientific implications. In order to mitigate this negative experience one must be able to identify when an animal is actually in distress. Pain, illness, or distress cannot be managed if unrecognized. Evaluation of pain or illness typically involves the measurement of physiologic and behavioral indicators which are either invasive or not suitable for large scale assessment. The observation of nesting behavior shows promise as the basis of a species appropriate cage-side assessment tool for recognizing distress in mice. Here we demonstrate the utility of nest building behavior in laboratory mice as an ethologically relevant indicator of welfare. The methods presented can be successfully used to identify thermal stressors, aggressive cages, sickness, and pain. Observation of nest building behavior in mouse colonies provides a refinement to health and well-being assessment on a day to day basis.
Behavior, Issue 82, Animal Structures, Surgical Procedures, Life Sciences (General), Behavioral Sciences, Mouse, Welfare assessment, Nest building
51012
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
50476
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
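For the semi-automated end of the spectrum described above, global thresholding followed by connected-component labeling is often the first thing to try. The following sketch uses scikit-image on a single 2D slice; the file name is a placeholder, and real tomographic or FIB-SEM/SBF-SEM volumes would typically need filtering and per-dataset tuning first.

from skimage import io, filters, measure

# Load one grayscale slice of the 3D data set (placeholder file name).
slice_img = io.imread("tomogram_slice_0420.tif", as_gray=True)

# Otsu's method picks a global threshold separating stained features
# from background; it works best on well-contrasted, evenly stained data.
threshold = filters.threshold_otsu(slice_img)
mask = slice_img > threshold

# Label connected components and report their sizes as a rough feature census.
labels = measure.label(mask)
regions = measure.regionprops(labels)
if regions:
    print(f"{len(regions)} connected features; largest = {max(r.area for r in regions)} px")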
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
51673
Fabrication of Silica Ultra High Quality Factor Microresonators
Authors: Ashley J. Maker, Andrea M. Armani.
Institutions: University of Southern California.
Whispering gallery resonant cavities confine light in circular orbits at their periphery.1-2 The photon storage lifetime in the cavity, quantified by the quality factor (Q) of the cavity, can be in excess of 500 ns for cavities with Q factors above 100 million. As a result of their low material losses, silica microcavities have demonstrated some of the longest photon lifetimes to date.1-2 Since a portion of the circulating light extends outside the resonator, these devices can also be used to probe the surroundings. This interaction has enabled numerous experiments in biology, such as single molecule biodetection and antibody-antigen kinetics, as well as discoveries in other fields, such as development of ultra-low-threshold microlasers, characterization of thin films, and cavity quantum electrodynamics studies.3-7 The two primary silica resonant cavity geometries are the microsphere and the microtoroid. Both devices rely on a carbon dioxide laser reflow step to achieve their ultra-high-Q factors (Q>100 million).1-2,8-9 However, there are several notable differences between the two structures. Silica microspheres are free-standing, supported by a single optical fiber, whereas silica microtoroids can be fabricated on a silicon wafer in large arrays using a combination of lithography and etching steps. These differences influence which device is optimal for a given experiment. Here, we present detailed fabrication protocols for both types of resonant cavities. While the fabrication of microsphere resonant cavities is fairly straightforward, the fabrication of microtoroid resonant cavities requires additional specialized equipment and facilities (cleanroom). Therefore, this additional requirement may also influence which device is selected for a given experiment.

Introduction

An optical resonator efficiently confines light at specific wavelengths, known as the resonant wavelengths of the device.1-2 The common figure of merit for these optical resonators is the quality factor or Q. This term describes the photon lifetime (τo) within the resonator, which is directly related to the resonator's optical losses. Therefore, an optical resonator with a high Q factor has low optical losses, long photon lifetimes, and very low photon decay rates (1/τo). As a result of the long photon lifetimes, it is possible to build up extremely large circulating optical field intensities in these devices. This very unique property has allowed these devices to be used as laser sources and integrated biosensors.10 A unique sub-class of resonators is the whispering gallery mode optical microcavity. In these devices, the light is confined in circular orbits at the periphery. Therefore, the field is not completely confined within the device, but evanesces into the environment. Whispering gallery mode optical cavities have demonstrated some of the highest quality factors of any optical resonant cavity to date.9,11 Therefore, these devices are used throughout science and engineering, including in fundamental physics studies and in telecommunications as well as in biodetection experiments.3-7,12 Optical microcavities can be fabricated from a wide range of materials and in a wide variety of geometries. A few examples include silica and silicon microtoroids, silicon, silicon nitride, and silica microdisks, micropillars, and silica and polymer microrings.13-17 The range in quality factor (Q) varies as dramatically as the geometry.
Although both geometry and high Q are important considerations in any field, in many applications, there is far greater leverage in boosting device performance through Q enhancement. Among the numerous options detailed previously, the silica microsphere and the silica microtoroid resonator have achieved some of the highest Q factors to date.1,9 Additionally, as a result of the extremely low optical loss of silica from the visible through the near-IR, both microspheres and microtoroids are able to maintain their Q factors over a wide range of testing wavelengths.18 Finally, because silica is inherently biocompatible, it is routinely used in biodetection experiments. In addition to material absorption, there are several other potential loss mechanisms, including surface roughness, radiation loss, and contamination loss.2 Through an optimization of the device size, it is possible to eliminate radiation losses, which arise from poor optical field confinement within the device. Similarly, by storing a device in an appropriately clean environment, contamination of the surface can be minimized. Therefore, in addition to material loss, surface scattering is the primary loss mechanism of concern.2,8 In silica devices, surface scattering is minimized by using a laser reflow technique, which melts the silica through surface tension induced reflow. While spherical optical resonators have been studied for many years, it is only with recent advances in fabrication technologies that researchers have been able to fabricate high quality silica optical toroidal microresonators (Q>100 million) on a silicon substrate, thus paving the way for integration with microfluidics.1 The present series of protocols details how to fabricate both silica microsphere and microtoroid resonant cavities. While silica microsphere resonant cavities are well-established, microtoroid resonant cavities were only recently invented.1 As many of the fundamental methods used to fabricate the microsphere are also used in the more complex microtoroid fabrication procedure, including both in a single protocol will enable researchers to more easily troubleshoot their experiments.
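The link between quality factor and photon lifetime quoted above follows from the standard relation τ = Q/ω = Qλ/(2πc). The short sketch below evaluates it for a few illustrative Q values at a common telecom test wavelength; the numbers are examples only, since the lifetime scales with the wavelength actually used.

import math

C = 299_792_458.0   # speed of light, m/s

def photon_lifetime(q_factor, wavelength_m):
    """Photon storage lifetime tau = Q / omega = Q * lambda / (2*pi*c), in seconds."""
    omega = 2 * math.pi * C / wavelength_m
    return q_factor / omega

for q in (1e8, 5e8, 1e9):
    tau = photon_lifetime(q, 1550e-9)   # 1550 nm test wavelength (illustrative)
    print(f"Q = {q:.0e}: tau = {tau * 1e9:.0f} ns")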
Materials Science, Issue 65, Chemical Engineering, Physics, Electrophysics, Biosensor, device fabrication, microcavity, optical resonator
4164
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
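A hedged sketch of the downstream calculation is given below: net fluorescence is converted to the amount of cleaved dye with a standard-curve emission coefficient and then normalized to dry soil mass and incubation time. The variable names and example numbers are placeholders; the exact correction terms (quench and homogenate controls) should follow the laboratory's own assay sheet.

def potential_enzyme_activity(net_fluorescence, emission_coef, buffer_vol_ml,
                              soil_vol_ml, incubation_h, dry_soil_g):
    """Approximate potential activity in nmol g^-1 dry soil h^-1.

    net_fluorescence : sample fluorescence minus controls (arbitrary units)
    emission_coef    : fluorescence units per nmol of free dye (standard curve)
    buffer_vol_ml    : total slurry volume the soil was dispersed in
    soil_vol_ml      : slurry volume pipetted into the assay well
    """
    nmol_in_well = net_fluorescence / emission_coef
    nmol_in_slurry = nmol_in_well * (buffer_vol_ml / soil_vol_ml)
    return nmol_in_slurry / (incubation_h * dry_soil_g)

# Illustrative numbers only.
print(potential_enzyme_activity(net_fluorescence=1200, emission_coef=85.0,
                                buffer_vol_ml=91.0, soil_vol_ml=0.2,
                                incubation_h=3.0, dry_soil_g=1.0))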
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
50961
Design and Operation of a Continuous 13C and 15N Labeling Chamber for Uniform or Differential, Metabolic and Structural, Plant Isotope Labeling
Authors: Jennifer L Soong, Dan Reuss, Colin Pinney, Ty Boyack, Michelle L Haddix, Catherine E Stewart, M. Francesca Cotrufo.
Institutions: Colorado State University, USDA-ARS, Colorado State University.
Tracing rare stable isotopes from plant material through the ecosystem provides the most sensitive information about ecosystem processes, from CO2 fluxes and soil organic matter formation to small-scale stable-isotope biomarker probing. Coupling multiple stable isotopes such as 13C with 15N, 18O or 2H has the potential to reveal even more information about complex stoichiometric relationships during biogeochemical transformations. Isotope labeled plant material has been used in various studies of litter decomposition and soil organic matter formation1-4. From these and other studies, however, it has become apparent that structural components of plant material behave differently than metabolic components (i.e. leachable low molecular weight compounds) in terms of microbial utilization and long-term carbon storage5-7. The ability to study structural and metabolic components separately provides a powerful new tool for advancing the forefront of ecosystem biogeochemical studies. Here we describe a method for producing 13C and 15N labeled plant material that is either uniformly labeled throughout the plant or differentially labeled in structural and metabolic plant components. Here, we present the construction and operation of a continuous 13C and 15N labeling chamber that can be modified to meet various research needs. Uniformly labeled plant material is produced by continuous labeling from seedling to harvest, while differential labeling is achieved by removing the growing plants from the chamber weeks prior to harvest. Representative results from growing Andropogon gerardii Kaw demonstrate the system's ability to efficiently label plant material at the targeted levels. Through this method we have produced plant material with a 4.4 atom% 13C and 6.7 atom% 15N uniform plant label, or material that is differentially labeled by up to 1.29 atom% 13C and 0.56 atom% 15N in its metabolic and structural components (hot water extractable and hot water residual components, respectively). Challenges lie in maintaining proper temperature, humidity, CO2 concentration, and light levels in an airtight 13C-CO2 atmosphere for successful plant production. This chamber description represents a useful research tool to effectively produce uniformly or differentially multi-isotope labeled plant material for use in experiments on ecosystem biogeochemical cycling.
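Two small calculations recur in this kind of labeling work: converting isotope amounts to atom%, and working out how much highly enriched gas to blend with natural-abundance CO2 to hit a target chamber enrichment. The sketch below shows both; the 99 atom% tank value and the natural-abundance figure are typical illustrative numbers, not measurements from this chamber.

def atom_percent(heavy, light):
    """Atom% of the heavy isotope, e.g. 13C / (12C + 13C) * 100."""
    return 100.0 * heavy / (heavy + light)

def enriched_fraction(target_atom_pct, natural_atom_pct=1.07, tank_atom_pct=99.0):
    """Fraction of CO2 that must come from the enriched tank (simple two-pool
    mixing) to reach the target atom% 13C in the chamber atmosphere."""
    return (target_atom_pct - natural_atom_pct) / (tank_atom_pct - natural_atom_pct)

print(atom_percent(heavy=4.4, light=95.6))        # 4.4 atom% example
print(enriched_fraction(target_atom_pct=4.4))     # ~0.034 of the CO2 from the tank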
Environmental Sciences, Issue 83, 13C, 15N, plant, stable isotope labeling, Andropogon gerardii, metabolic compounds, structural compounds, hot water extraction
51117
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
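The analysis routines described above are written in a MATLAB-based language; as a language-neutral illustration of the same idea, the sketch below (in Python, with invented field names) harvests a file of time-stamped events and tallies daily head entries per mouse, the kind of summary that would feed the daily progress graphs.

import csv
from collections import defaultdict
from datetime import datetime

# Assumed record format (one event per row): timestamp, mouse_id, event_type,
# e.g. 2014-03-02T14:05:31, M07, head_entry_hopper2
daily_entries = defaultdict(int)

with open("event_log.csv") as handle:
    for timestamp, mouse_id, event_type in csv.reader(handle):
        if event_type.strip().startswith("head_entry"):
            day = datetime.fromisoformat(timestamp.strip()).date()
            daily_entries[(mouse_id.strip(), day)] += 1

for (mouse_id, day), count in sorted(daily_entries.items()):
    print(f"{mouse_id} {day}: {count} head entries")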
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
51047
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no to more continuous variables like latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
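For the data-analysis side, here is a minimal sketch of how the binary and continuous PER measures mentioned above could be summarized across conditioning trials; the trial records are invented placeholders standing in for scored video or live observations.

# One record per bee per trial: did the proboscis extend, and with what latency (s)?
trials = [
    {"bee": 1, "trial": 1, "extended": False, "latency_s": None},
    {"bee": 1, "trial": 2, "extended": True,  "latency_s": 2.4},
    {"bee": 2, "trial": 1, "extended": True,  "latency_s": 3.1},
    {"bee": 2, "trial": 2, "extended": True,  "latency_s": 1.8},
    # ... remaining bees and trials
]

def summarize(trials, trial_number):
    subset = [t for t in trials if t["trial"] == trial_number]
    responders = [t for t in subset if t["extended"]]
    proportion = len(responders) / len(subset) if subset else float("nan")
    latencies = [t["latency_s"] for t in responders]
    mean_latency = sum(latencies) / len(latencies) if latencies else float("nan")
    return proportion, mean_latency

for n in (1, 2):
    p, lat = summarize(trials, n)
    print(f"trial {n}: {p:.0%} responded, mean latency {lat:.1f} s")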
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
51057
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
51242
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Authors: Oswald J. Schmitz, Mark A. Bradford, Michael S. Strickland, Dror Hawlena.
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determines the rate of decomposition by microbial communities as well as the rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey that in turn alters soil microbial functioning when the herbivore carcasses are decomposed6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9 because stress from predation risk increases herbivore basal energy demands that, in nutrient-limited systems, force herbivores to shift their consumption from N-rich resources, which support growth and reproduction, to C-rich carbohydrate resources, which support heightened metabolism6. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer quality resources for the soil microbial pool likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11. We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to induce stress in herbivores from predation risk, measure those stress responses, and measure the consequences for microbial decomposition. We use insights from a model grassland ecosystem comprising the hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
50061
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Authors: Andreas Florian Haas, Ben Knowles, Yan Wei Lim, Tracey McDole Somera, Linda Wegley Kelly, Mark Hatay, Forest Rohwer.
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well-standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients), and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods, which represent a dispersed field of scientific disciplines comprising already established protocols and some of the most recent techniques developed. In particular, metagenomic sequencing techniques used for viral and bacterial community characterization have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collect and process environmental samples. Parameters addressed with these protocols yield the minimum of information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. It gives easy-to-follow guidelines to conduct comprehensive surveys and discusses critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
52131
Fabrication And Characterization Of Photonic Crystal Slow Light Waveguides And Cavities
Authors: Christopher Paul Reardon, Isabella H. Rey, Karl Welna, Liam O'Faolain, Thomas F. Krauss.
Institutions: University of St Andrews.
Slow light has been one of the hot topics in the photonics community in the past decade, generating great interest both from a fundamental point of view and for its considerable potential for practical applications. Slow light photonic crystal waveguides, in particular, have played a major part and have been successfully employed for delaying optical signals1-4 and the enhancement of both linear5-7 and nonlinear devices.8-11 Photonic crystal cavities achieve similar effects to that of slow light waveguides, but over a reduced band-width. These cavities offer high Q-factor/volume ratio, for the realization of optically12 and electrically13 pumped ultra-low threshold lasers and the enhancement of nonlinear effects.14-16 Furthermore, passive filters17 and modulators18-19 have been demonstrated, exhibiting ultra-narrow line-width, high free-spectral range and record values of low energy consumption. To attain these exciting results, a robust repeatable fabrication protocol must be developed. In this paper we take an in-depth look at our fabrication protocol which employs electron-beam lithography for the definition of photonic crystal patterns and uses wet and dry etching techniques. Our optimised fabrication recipe results in photonic crystals that do not suffer from vertical asymmetry and exhibit very good edge-wall roughness. We discuss the results of varying the etching parameters and the detrimental effects that they can have on a device, leading to a diagnostic route that can be taken to identify and eliminate similar issues. The key to evaluating slow light waveguides is the passive characterization of transmission and group index spectra. Various methods have been reported, most notably resolving the Fabry-Perot fringes of the transmission spectrum20-21 and interferometric techniques.22-25 Here, we describe a direct, broadband measurement technique combining spectral interferometry with Fourier transform analysis.26 Our method stands out for its simplicity and power, as we can characterise a bare photonic crystal with access waveguides, without need for on-chip interference components, and the setup only consists of a Mach-Zehnder interferometer, with no need for moving parts and delay scans. When characterising photonic crystal cavities, techniques involving internal sources21 or external waveguides directly coupled to the cavity27 impact on the performance of the cavity itself, thereby distorting the measurement. Here, we describe a novel and non-intrusive technique that makes use of a cross-polarised probe beam and is known as resonant scattering (RS), where the probe is coupled out-of plane into the cavity through an objective. The technique was first demonstrated by McCutcheon et al.28 and further developed by Galli et al.29
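One of the characterization routes mentioned above, resolving the Fabry-Perot fringes of the transmission spectrum, reduces to a one-line formula: the group index follows from the fringe spacing as n_g = λ²/(2·L·Δλ), with L the length of the photonic crystal section. The sketch below applies it to invented fringe values and is not a substitute for the Fourier-transform interferometric method described in the paper.

def group_index(wavelength_nm, fringe_spacing_nm, length_um):
    """Group index from Fabry-Perot fringe spacing: n_g = lambda^2 / (2 * L * d_lambda)."""
    lam = wavelength_nm * 1e-9
    d_lam = fringe_spacing_nm * 1e-9
    length = length_um * 1e-6
    return lam ** 2 / (2 * length * d_lam)

# Illustrative numbers: fringes 0.40 nm apart around 1550 nm on an 80 um device.
print(f"n_g = {group_index(1550, 0.40, 80):.1f}")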
Physics, Issue 69, Optics and Photonics, Astronomy, light scattering, light transmission, optical waveguides, photonics, photonic crystals, Slow-light, Cavities, Waveguides, Silicon, SOI, Fabrication, Characterization
50216
Microfabrication of Chip-sized Scaffolds for Three-dimensional Cell cultivation
Authors: Stefan Giselbrecht, Eric Gottwald, Roman Truckenmueller, Christina Trautmann, Alexander Welle, Andreas Guber, Volker Saile, Thomas Gietzelt, Karl-Friedrich Weibezahn.
Institutions: Karlsruhe Research Centre, University of Twente, Institute for Heavy Ion Research, Karlsruhe Research Centre, Karlsruhe Research Centre.
Using microfabrication technologies is a prerequisite to create scaffolds of reproducible geometry and constant quality for three-dimensional cell cultivation. These technologies offer a wide spectrum of advantages not only for manufacturing but also for different applications. The size and shape of formed cell clusters can be influenced by the exact and reproducible architecture of the microfabricated scaffold and, therefore, the diffusion path length of nutrients and gases can be controlled.1 This is unquestionably a useful tool to prevent apoptosis and necrosis of cells due to an insufficient nutrient and gas supply or removal of cellular metabolites. Our polymer chip, called CellChip, has the outer dimensions of 2 x 2 cm with a central microstructured area. This area is subdivided into an array of up to 1156 microcontainers with a typical dimension of 300 µm edge length for the cubic design (cp- or cf-chip) or of 300 µm diameter and depth for the round design (r-chip).2 So far, hot embossing or micro injection moulding (in combination with subsequent laborious machining of the parts) was used for the fabrication of the microstructured chips. Basically, micro injection moulding is one of the only polymer-based replication techniques that, up to now, is capable of mass production of polymer microstructures.3 However, both techniques have certain unwanted limitations due to the processing of a viscous polymer melt with the generation of very thin walls or integrated through holes. In the case of the CellChip, thin bottom layers are necessary to perforate the polymer and provide small pores of defined size to supply cells with culture medium, e.g. by microfluidic perfusion of the containers. In order to overcome these limitations and to reduce the manufacturing costs, we have developed a new microtechnical approach on the basis of a down-scaled thermoforming process. For the manufacturing of highly porous and thin-walled polymer chips, we use a combination of heavy ion irradiation, microthermoforming and track etching. In this so-called "SMART" process (Substrate Modification And Replication by Thermoforming), thin polymer films are irradiated with energetic heavy projectiles of several hundred MeV, introducing so-called "latent tracks". Subsequently, the film, in a rubber-elastic state, is formed into three-dimensional parts without modifying or annealing the tracks. After the forming process, selective chemical etching finally converts the tracks into cylindrical pores of adjustable diameter.
Cellular Biology, Issue 15, SMART, microthermoforming, microfabrication, scaffolds, polymer
699
Microbial Communities in Nature and Laboratory - Interview
Authors: Edward F. DeLong.
Institutions: MIT - Massachusetts Institute of Technology.
Microbiology, Issue 4, microbial community, biofilm, genome
202

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
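As a generic illustration of how this kind of abstract-to-video matching can be done (a sketch only, not JoVE's actual algorithm), the snippet below ranks a small set of video descriptions against one PubMed abstract using TF-IDF vectors and cosine similarity; the texts are invented placeholders.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

pubmed_abstract = "Tree cavities provided by ecosystem engineers shape a cavity-nest web ..."
video_descriptions = [
    "Fabrication of silica ultra high quality factor microresonators ...",
    "Nest building as an indicator of health and welfare in laboratory mice ...",
    "Linking predation risk, herbivore stress and decomposition of plant litter ...",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([pubmed_abstract] + video_descriptions)

# Similarity of the abstract (row 0) to every video description (rows 1..n).
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
for description, score in sorted(zip(video_descriptions, scores),
                                 key=lambda pair: -pair[1]):
    print(f"{score:.3f}  {description[:60]}")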

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.