JoVE Visualize
Pubmed Article
May-June maximum temperature reconstruction from mean earlywood density in north central China and its linkages to the summer monsoon activities.
PUBLISHED: 01-01-2014
Cores of Pinus tabulaformis from Tianshui were subjected to densitometric analysis to obtain mean earlywood density data. Climate response analysis indicates that May-June maximum temperature is the main factor limiting the mean earlywood density (EWD) of Chinese pine trees in the Shimen Mountains. Based on the EWD chronology, we have reconstructed May-June maximum temperature from 1666 to 2008 for Tianshui, north central China. The reconstruction explains 40.1% of the actual temperature variance during the common period 1953-2008. The temperature reconstruction is representative of temperature conditions over a large area to the southeast and northwest of the sampling site. Preliminary analysis of links between large-scale climatic variation and the temperature reconstruction shows that there is a relationship between extremes in spring temperature and anomalous atmospheric circulation in the region. These results indicate that the mean earlywood density chronology of Pinus tabulaformis has the potential to reconstruct temperature variability even further into the past.
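This kind of reconstruction is, at its core, a linear calibration of temperature against the EWD chronology over the instrumental period; the reported explained variance (40.1%) is the calibration R². The sketch below shows the bookkeeping with hypothetical EWD and temperature series rather than the published Tianshui data.

```python
import numpy as np

# Hypothetical data: EWD chronology indices and observed May-June maximum
# temperatures for an overlapping calibration period (stand-ins, not the
# published series).
ewd_calib = np.array([0.92, 1.05, 0.98, 1.10, 0.88, 1.02, 0.95, 1.08])
tmax_calib = np.array([24.1, 26.0, 25.2, 26.8, 23.5, 25.7, 24.6, 26.3])

# Least-squares linear transfer function: Tmax = a * EWD + b
a, b = np.polyfit(ewd_calib, tmax_calib, 1)

# Variance explained by the calibration (R^2), analogous to the reported 40.1%.
pred = a * ewd_calib + b
ss_res = np.sum((tmax_calib - pred) ** 2)
ss_tot = np.sum((tmax_calib - tmax_calib.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

# Apply the transfer function to the full (pre-instrumental) chronology.
ewd_full = np.array([1.01, 0.94, 1.07, 0.99])   # hypothetical older ring years
tmax_reconstructed = a * ewd_full + b
print(f"R^2 = {r_squared:.3f}", tmax_reconstructed)
```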
Related JoVE Video
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Published: 06-30-2014
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
23 Related JoVE Articles!
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages in comparison to traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
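As a rough illustration of how the fluorescence readings are converted into potential activities, the sketch below follows the usual bookkeeping: subtract the soil and substrate controls, convert fluorescence to nmol of released fluorophore with a standard-curve slope made in soil slurry, then normalize by incubation time and dry soil mass. All numbers are hypothetical placeholders, and the published protocol should be consulted for the complete set of correction terms.

```python
# Hypothetical plate readings (relative fluorescence units, RFU).
assay_rfu = 1850.0            # soil slurry + MUB-labeled substrate
soil_control_rfu = 120.0      # soil slurry + buffer (soil background)
substrate_control_rfu = 90.0  # buffer + substrate (substrate background)
quench_rfu_per_nmol = 210.0   # slope of MUB standard curve prepared in soil slurry
                              # (accounts for quenching), RFU per nmol MUB

incubation_h = 3.0            # incubation time, hours
dry_soil_g = 0.75             # dry-weight soil represented in the assay well, grams

# Net fluorescence attributable to enzymatic cleavage of the substrate.
net_rfu = assay_rfu - soil_control_rfu - substrate_control_rfu

# Convert to nmol MUB released, then to a potential activity rate.
nmol_released = net_rfu / quench_rfu_per_nmol
activity = nmol_released / (incubation_h * dry_soil_g)   # nmol h^-1 g^-1 dry soil
print(f"potential activity ~ {activity:.2f} nmol h^-1 g^-1 dry soil")
```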
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy
Authors: Matthew Rames, Yadong Yu, Gang Ren.
Institutions: The Molecular Foundry.
Structure determination is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa1,2, a robust and high-throughput method with a nanometer resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach which has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol3. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution and high-contrast images are even favorable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through the method of electron tomography4,5. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples6. Considering that cryo-EM rarely succeeds in imaging proteins of less than 200 kDa and has yet to yield a published study involving the screening of over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics and mechanisms.
Environmental Sciences, Issue 90, small and asymmetric protein structure, electron microscopy, optimized negative staining
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
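As a toy illustration of the software-guided design step, the sketch below enumerates a two-level full factorial design for three hypothetical factors, fits a linear model with two-factor interactions by least squares, and ranks the effect sizes. It is a generic DoE skeleton under assumed factor names and responses, not the authors' actual design, factors, or software.

```python
import itertools
import numpy as np

factors = ["promoter", "temperature", "leaf_age"]   # hypothetical factors, coded -1/+1
runs = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

# Hypothetical measured responses (e.g., relative DsRed fluorescence) for the 8 runs.
response = np.array([1.0, 1.4, 0.9, 1.6, 1.2, 1.9, 1.1, 2.3])

# Model matrix: intercept, main effects, and two-factor interactions.
cols = [np.ones(len(runs))] + [runs[:, i] for i in range(len(factors))]
cols += [runs[:, i] * runs[:, j] for i, j in itertools.combinations(range(len(factors)), 2)]
X = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(X, response, rcond=None)
names = ["intercept"] + factors + [" x ".join(p) for p in itertools.combinations(factors, 2)]
for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
    print(f"{name:>24s}: {c:+.3f}")
```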
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Quantum State Engineering of Light with Continuous-wave Optical Parametric Oscillators
Authors: Olivier Morin, Jianli Liu, Kun Huang, Felippe Barbosa, Claude Fabre, Julien Laurat.
Institutions: Université Pierre et Marie Curie, Ecole Normale Supérieure, CNRS, East China Normal University, Universidade de São Paulo.
Engineering non-classical states of the electromagnetic field is a central quest for quantum optics1,2. Beyond their fundamental significance, such states are indeed the resources for implementing various protocols, ranging from enhanced metrology to quantum communication and computing. A variety of devices can be used to generate non-classical states, such as single emitters, light-matter interfaces or non-linear systems3. We focus here on the use of a continuous-wave optical parametric oscillator3,4. This system is based on a non-linear χ(2) crystal inserted inside an optical cavity, and it is now well known as a very efficient source of non-classical light, such as single-mode or two-mode squeezed vacuum depending on the crystal phase matching. Squeezed vacuum is a Gaussian state, as its quadrature distributions follow Gaussian statistics. However, it has been shown that a number of protocols require non-Gaussian states5. Generating such states directly is a difficult task and would require strong χ(3) non-linearities. Another procedure, probabilistic but heralded, consists in using a measurement-induced non-linearity via a conditional preparation technique operated on Gaussian states. Here, we detail this generation protocol for two non-Gaussian states, the single-photon state and a superposition of coherent states, using two differently phase-matched parametric oscillators as primary resources. This technique enables a high fidelity with the targeted state to be achieved, and generates the state in a well-controlled spatiotemporal mode.
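To make the heralding idea concrete, the sketch below works through the idealized photon-number bookkeeping for a two-mode squeezed vacuum (the output of a non-degenerate parametric oscillator below threshold): detecting a photon in the idler mode heralds a single photon in the signal mode. The squeezing parameter and the lossless on/off detector model are simplifying assumptions, not the experimental values used in the article.

```python
import numpy as np

# Two-mode squeezed vacuum: |psi> = sqrt(1 - lam**2) * sum_n lam**n |n, n>,
# with lam = tanh(r) for squeezing parameter r (idealized, lossless).
r = 0.3
lam = np.tanh(r)
n = np.arange(0, 20)
p_joint = (1 - lam**2) * lam**(2 * n)     # P(n photons in each mode)

# Herald with an on/off (non-photon-number-resolving) detector on the idler:
# any n >= 1 produces a click.
p_click = p_joint[1:].sum()

# Conditional signal state after a click; fidelity with the ideal single photon |1>.
p_signal_given_click = p_joint[1:] / p_click
fidelity_single_photon = p_signal_given_click[0]

print(f"heralding probability ~ {p_click:.4f}")
print(f"fidelity with |1> ~ {fidelity_single_photon:.4f}")   # -> 1 - lam**2 for small r
```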
Physics, Issue 87, Optics, Quantum optics, Quantum state engineering, Optical parametric oscillator, Squeezed vacuum, Single photon, Coherent state superposition, Homodyne detection
Reconstruction of 3-Dimensional Histology Volume and its Application to Study Mouse Mammary Glands
Authors: Rushin Shojaii, Stephanie Bacopulos, Wenyi Yang, Tigran Karavardanyan, Demetri Spyropoulos, Afshin Raouf, Anne Martel, Arun Seth.
Institutions: University of Toronto, Sunnybrook Research Institute, University of Toronto, Sunnybrook Research Institute, Medical University of South Carolina, University of Manitoba.
Histology volume reconstruction facilitates the study of 3D shape and volume change of an organ at the level of macrostructures made up of cells. It can also be used to investigate and validate novel techniques and algorithms in volumetric medical imaging and therapies. Creating 3D high-resolution atlases of different organs1,2,3 is another application of histology volume reconstruction. This provides a resource for investigating tissue structures and the spatial relationship between various cellular features. We present an image registration approach for histology volume reconstruction, which uses a set of optical blockface images. The reconstructed histology volume represents a reliable shape of the processed specimen with no propagated post-processing registration error. The Hematoxylin and Eosin (H&E) stained sections of two mouse mammary glands were registered to their corresponding blockface images using boundary points extracted from the edges of the specimen in histology and blockface images. The accuracy of the registration was visually evaluated. The alignment of the macrostructures of the mammary glands was also visually assessed at high resolution. This study delineates the different steps of this image registration pipeline, ranging from excision of the mammary gland through to 3D histology volume reconstruction. While 2D histology images reveal the structural differences between pairs of sections, 3D histology volume provides the ability to visualize the differences in shape and volume of the mammary glands.
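The boundary-point registration step can be illustrated with a standard least-squares rigid alignment (Kabsch/Procrustes): given corresponding boundary points extracted from a histology section and its blockface image, find the rotation and translation that best superimpose them. This is a generic sketch with made-up points, not the specific registration pipeline used in the article.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid (rotation + translation) alignment of 2D point sets.
    src, dst: (N, 2) arrays of corresponding boundary points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical boundary points: histology section (src) vs. blockface image (dst).
src = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 5.0], [0.0, 5.0], [5.0, 8.0]])
theta = np.deg2rad(12.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
dst = src @ R_true.T + np.array([2.5, -1.0])

R, t = rigid_register(src, dst)
residual = np.linalg.norm(src @ R.T + t - dst, axis=1).mean()
print(R, t, f"mean residual = {residual:.2e}")
```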
Bioengineering, Issue 89, Histology Volume Reconstruction, Transgenic Mouse Model, Image Registration, Digital Histology, Image Processing, Mouse Mammary Gland
Super-resolution Imaging of the Cytokinetic Z Ring in Live Bacteria Using Fast 3D-Structured Illumination Microscopy (f3D-SIM)
Authors: Lynne Turnbull, Michael P. Strauss, Andrew T. F. Liew, Leigh G. Monahan, Cynthia B. Whitchurch, Elizabeth J. Harry.
Institutions: University of Technology, Sydney.
Imaging of biological samples using fluorescence microscopy has advanced substantially with new technologies to overcome the resolution barrier of the diffraction of light, allowing super-resolution of live samples. There are currently three main types of super-resolution techniques – stimulated emission depletion (STED), single-molecule localization microscopy (including techniques such as PALM, STORM, and GSDIM), and structured illumination microscopy (SIM). While STED and single-molecule localization techniques show the largest increases in resolution, they have been slower to offer increased speeds of image acquisition. Three-dimensional SIM (3D-SIM) is a wide-field fluorescence microscopy technique that offers a number of advantages over both single-molecule localization and STED. Resolution is improved, with typical lateral and axial resolutions of 110 and 280 nm, respectively, and a depth of sampling of up to 30 µm from the coverslip, allowing for imaging of whole cells. Recent advancements in the technology (fast 3D-SIM) that increase the capture rate of raw images allow for fast capture of biological processes occurring in seconds, while significantly reducing photo-toxicity and photobleaching. Here we describe the use of one such method to image bacterial cells harboring the fluorescently-labelled cytokinetic FtsZ protein to show how cells are analyzed and the type of unique information that this technique can provide.
Molecular Biology, Issue 91, super-resolution microscopy, fluorescence microscopy, OMX, 3D-SIM, Blaze, cell division, bacteria, Bacillus subtilis, Staphylococcus aureus, FtsZ, Z ring constriction
Design and Construction of an Urban Runoff Research Facility
Authors: Benjamin G. Wherley, Richard H. White, Kevin J. McInnes, Charles H. Fontanier, James C. Thomas, Jacqueline A. Aitkenhead-Peterson, Steven T. Kelly.
Institutions: Texas A&M University, The Scotts Miracle-Gro Company.
As the urban population increases, so does the area of irrigated urban landscape. Summer water use in urban areas can be 2-3x winter baseline water use due to increased demand for landscape irrigation. Improper irrigation practices and large rainfall events can result in runoff from urban landscapes which has the potential to carry nutrients and sediments into local streams and lakes where they may contribute to eutrophication. A 1,000 m2 facility was constructed which consists of 24 individual 33.6 m2 field plots, each equipped for measuring total runoff volumes with time and collection of runoff subsamples at selected intervals for quantification of chemical constituents in the runoff water from simulated urban landscapes. Runoff volumes from the first and second trials had coefficient of variation (CV) values of 38.2 and 28.7%, respectively. CV values for runoff pH, EC, and Na concentration for both trials were all under 10%. Concentrations of DOC, TDN, DON, PO4-P, K+, Mg2+, and Ca2+ had CV values less than 50% in both trials. Overall, the results of testing performed after sod installation at the facility indicated good uniformity between plots for runoff volumes and chemical constituents. The large plot size is sufficient to include much of the natural variability and therefore provides better simulation of urban landscape ecosystems.
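The plot-to-plot uniformity figures quoted above are coefficients of variation; a minimal calculation on hypothetical per-plot runoff volumes looks like this:

```python
import numpy as np

# Hypothetical total runoff volumes (liters) from the 24 plots for one trial.
runoff_l = np.array([410, 520, 455, 390, 610, 480, 530, 445, 500, 470, 560, 430,
                     515, 495, 405, 585, 460, 540, 475, 425, 505, 550, 465, 490])

# Coefficient of variation: sample standard deviation relative to the mean.
cv_percent = runoff_l.std(ddof=1) / runoff_l.mean() * 100.0
print(f"coefficient of variation = {cv_percent:.1f}%")
```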
Environmental Sciences, Issue 90, urban runoff, landscapes, home lawns, turfgrass, St. Augustinegrass, carbon, nitrogen, phosphorus, sodium
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
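The internal-load estimate from a core incubation reduces to the slope of phosphorus mass accumulating in the overlying water versus time, normalized by the sediment surface area of the core. A minimal sketch with hypothetical numbers follows; sampling-replacement corrections and other protocol details are omitted.

```python
import numpy as np

# Hypothetical incubation time series for one sediment core.
days = np.array([0, 2, 4, 7, 10, 14])
srp_mg_per_l = np.array([0.010, 0.018, 0.027, 0.041, 0.055, 0.074])  # soluble reactive P

water_volume_l = 1.1          # overlying water volume in the core tube
core_diameter_m = 0.065       # inner diameter of the core tube
core_area_m2 = np.pi * (core_diameter_m / 2) ** 2

# P mass in the overlying water at each time point, then the release rate
# as the slope of a linear fit (mg P per day), normalized by sediment area.
p_mass_mg = srp_mg_per_l * water_volume_l
slope_mg_per_day, _ = np.polyfit(days, p_mass_mg, 1)
flux = slope_mg_per_day / core_area_m2            # mg P m^-2 d^-1
print(f"P release rate ~ {flux:.1f} mg P m^-2 d^-1")
```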
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
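For data with good contrast and little crowding, the semi-automated end of the triage (approach 3) can begin with something as simple as global thresholding followed by connected-component labeling, with manual curation afterwards. The sketch below shows that starting point on a synthetic volume; it assumes numpy and scipy are available and is not the custom-designed algorithms used in the article.

```python
import numpy as np
from scipy import ndimage

# Synthetic 3D volume: two bright blobs on a noisy background (a stand-in for
# a stained EM volume).
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 0.1, size=(64, 64, 64))
vol[20:30, 20:30, 20:30] += 1.0
vol[40:50, 10:18, 35:45] += 1.0

# Global threshold, then label connected components of the binary mask.
mask = vol > 0.5
labels, n_objects = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=np.arange(1, n_objects + 1))

print(f"{n_objects} objects; voxel counts: {sizes.astype(int)}")
print(f"fraction of volume occupied: {mask.mean():.4f}")
```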
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
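The ~10-30 nm precision quoted above comes from fitting each single-molecule image and propagating photon statistics. A commonly used estimate is the Thompson-Larson-Webb formula, sketched below with hypothetical acquisition parameters; the exact estimator used by a given FPALM analysis package may differ.

```python
import numpy as np

def localization_precision(sigma_psf_nm, pixel_nm, photons, background_rms):
    """Approximate localization precision (nm) per Thompson, Larson & Webb (2002).
    sigma_psf_nm: PSF standard deviation; pixel_nm: back-projected pixel size;
    photons: detected photons per molecule; background_rms: background noise (photons/pixel)."""
    s2, a2, N = sigma_psf_nm**2, pixel_nm**2, photons
    var = (s2 + a2 / 12.0) / N + (8.0 * np.pi * s2**2 * background_rms**2) / (a2 * N**2)
    return np.sqrt(var)

# Hypothetical values for a photoactivatable fluorescent protein imaged with FPALM.
print(f"{localization_precision(130.0, 100.0, photons=500, background_rms=8.0):.1f} nm")
```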
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, as the image is acquired in a very different way than normal, by building it up molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Processing the Loblolly Pine PtGen2 cDNA Microarray
Authors: W. Walter Lorenz, Yuan-Sheng Yu, Marta Simões, Jeffrey F. D. Dean.
Institutions: University of Georgia (UGA), Instituto Tecnologia Química e Biológica UNL, Av. da República.
PtGen2 is a 26,496-feature cDNA microarray containing amplified loblolly pine ESTs. The array is produced in our laboratory for use by researchers studying gene expression in pine and other conifer species. PtGen2 was developed as a result of our gene discovery efforts in loblolly pine, and is comprised of sequences identified primarily from root tissues, but also from needle and stem.1,2 PtGen2 has been tested by hybridizing different Cy-dye labeled conifer target cDNAs, using both amplified and non-amplified indirect labeling methods, and also tested with a number of hybridization and washing conditions. This video focuses on the handling and processing of slides before and after pre-hybridization, as well as after hybridization, using some modifications to procedures developed previously.3,4 Also included, in text form only, are the protocols used for the generation, labeling and clean-up of target cDNAs, as well as information on software used for downstream data processing. PtGen2 is printed with a proprietary print buffer that contains high concentrations of salt that can be difficult to remove completely. The slides are washed first in a warm SDS solution prior to pre-hybridization. After pre-hybridization, the slides are washed vigorously in several changes of water to completely remove any remaining salts. LifterSlips™ are then cleaned and positioned on the slides, and labeled cDNA is carefully loaded onto the microarray by way of capillary action, which provides even distribution of the sample across the slide and reduces the chance of bubble incorporation. Hybridization of targets to the array is done at 48°C in high humidity conditions. After hybridization, a series of standard washes are done at 53°C and room temperature for extended times. Processing PtGen2 slides using this technique reduces salt- and SDS-derived artifacts often seen when the array is processed less rigorously. Hybridizing targets derived from several different conifer RNA sources, this processing protocol yielded fewer artifacts, reduced background, and provided better consistency among different experimental groups of arrays.
Plant Biology, Issue 25, Loblolly pine, P. taeda, cDNA, microarray, slide processing
An Improved Method of RNA Isolation from Loblolly Pine (P. taeda L.) and Other Conifer Species
Authors: W. Walter Lorenz, Yuan-Sheng Yu, Jeffrey F. D. Dean.
Institutions: University of Georgia (UGA).
Tissues isolated from conifer species, particularly those belonging to the Pinaceae family, such as loblolly pine (Pinus taeda L.), contain high concentrations of phenolic compounds and polysaccharides that interfere with RNA purification. Isolation of high-quality RNA from these species requires rigorous tissue collection procedures in the field and the employment of an RNA isolation protocol comprised of multiple organic extraction steps in order to isolate RNA of sufficient quality for microarray and other genomic analyses. The isolation of high-quality RNA from field-collected loblolly pine samples can be challenging, but several modifications to standard tissue and RNA isolation procedures greatly improve results. The extent of general RNA degradation increases if samples are not properly collected and transported from the field, especially during large-scale harvests. Total RNA yields can be increased significantly by pulverizing samples in a liquid nitrogen freezer mill prior to RNA isolation, especially when samples come from woody tissues. This is primarily due to the presence of oxidizing agents, such as phenolic compounds, and polysaccharides that are both present at high levels in extracts from the woody tissues of most conifer species. If not removed, these contaminants can carry over, leading to problems, such as RNA degradation, that result in low yields and a poor-quality RNA sample. Carryover of phenolic compounds, as well as polysaccharides, can also reduce or even completely eliminate the activity of reverse transcriptase or other polymerases commonly used for cDNA synthesis. In particular, RNA destined to be used as template for double-stranded cDNA synthesis in the generation of cDNA libraries, single-stranded cDNA synthesis for PCR or qPCR, or for the synthesis of microarray target materials must be of the highest quality if researchers expect to obtain optimal results. RNA isolation techniques commonly employed for many other plant species are often insufficient in their ability to remove these contaminants from conifer samples and thus do not yield total RNA samples suitable for downstream manipulations. In this video we demonstrate methods for field collection of conifer tissues, beginning with the felling of a forty-year-old tree, to the harvesting of phloem, secondary xylem, and reaction wood xylem. We also demonstrate an RNA isolation protocol that has consistently yielded high-quality RNA for subsequent enzymatic manipulations.
Plant Biology, Issue 36, RNA isolation, loblolly pine, Pinus taeda, conifer, wood, xylem, phloem
Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method
Authors: Xueqi Liu, Hong-Wei Wang.
Institutions: Yale University.
Single particle electron microscopy (EM) reconstruction has recently become a popular tool to get the three-dimensional (3D) structure of large macromolecular complexes. Compared to X-ray crystallography, it has some unique advantages. First, single particle EM reconstruction does not need to crystallize the protein sample, which is the bottleneck in X-ray crystallography, especially for large macromolecular complexes. Secondly, it does not need large amounts of protein samples. Compared with the milligrams of protein necessary for crystallization, single particle EM reconstruction only needs several micro-liters of protein solution at nanomolar concentrations, using the negative staining EM method. However, apart from a few macromolecular assemblies with high symmetry, single particle EM is limited to relatively low resolution (worse than 1 nm) for many specimens, especially those without symmetry. This technique is also limited by the size of the molecules under study, i.e. generally a minimum of about 100 kDa for negatively stained specimens and 300 kDa for frozen-hydrated specimens. For a new sample of unknown structure, we generally use a heavy metal solution to embed the molecules by negative staining. The specimen is then examined in a transmission electron microscope to take two-dimensional (2D) micrographs of the molecules. Ideally, the protein molecules have a homogeneous 3D structure but exhibit different orientations in the micrographs. These micrographs are digitized and processed in computers as "single particles". Using two-dimensional alignment and classification techniques, homogenous molecules in the same views are clustered into classes. Their averages enhance the signal of the molecule's 2D shapes. After we assign the particles with the proper relative orientation (Euler angles), we will be able to reconstruct the 2D particle images into a 3D virtual volume. In single particle 3D reconstruction, an essential step is to correctly assign the proper orientation of each single particle. There are several methods to assign the view for each particle, including the angular reconstitution1 and random conical tilt (RCT) method2. In this protocol, we describe our practice in getting the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our protocol of electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample into a layer of Uranyl-Formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. Then the specimen is inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs that will be used later for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model by using the projection matching refinement method3.
Structural Biology, Issue 49, Electron microscopy, single particle three-dimensional reconstruction, exosome complex, negative staining
Structure of HIV-1 Capsid Assemblies by Cryo-electron Microscopy and Iterative Helical Real-space Reconstruction
Authors: Xin Meng, Gongpu Zhao, Peijun Zhang.
Institutions: University of Pittsburgh School of Medicine.
Cryo-electron microscopy (cryo-EM), combined with image processing, is an increasingly powerful tool for structure determination of macromolecular protein complexes and assemblies. In fact, single particle electron microscopy1 and two-dimensional (2D) electron crystallography2 have become relatively routine methodologies, and a large number of structures have been solved using these methods. At the same time, image processing and three-dimensional (3D) reconstruction of helical objects has rapidly developed, especially the iterative helical real-space reconstruction (IHRSR) method3, which uses single particle analysis tools in conjunction with helical symmetry. Many biological entities function in filamentous or helical forms, including actin filaments4, microtubules5, amyloid fibers6, tobacco mosaic viruses7, and bacterial flagella8. Because a 3D density map of a helical entity can be attained from a single projection image, compared to the many images required for the 3D reconstruction of a non-helical object, the IHRSR method makes structural analysis of such flexible and disordered helical assemblies attainable. In this video article, we provide detailed protocols for obtaining a 3D density map of a helical protein assembly (HIV-1 capsid9 is our example), including protocols for cryo-EM specimen preparation, low dose data collection by cryo-EM, indexing of helical diffraction patterns, and image processing and 3D reconstruction using IHRSR. Compared to other techniques, cryo-EM offers optimal specimen preservation under near native conditions. Samples are embedded in a thin layer of vitreous ice, by rapid freezing, and imaged in electron microscopes at liquid nitrogen temperature, under low dose conditions to minimize the radiation damage. Sample images are obtained under near native conditions at the expense of low signal and low contrast in the recorded micrographs. Fortunately, the process of helical reconstruction has largely been automated, with the exception of indexing the helical diffraction pattern. Here, we describe an approach to index helical structure and determine helical symmetries (helical parameters) from digitized micrographs, an essential step for 3D helical reconstruction. Briefly, we obtain an initial 3D density map by applying the IHRSR method. This initial map is then iteratively refined by introducing constraints for the alignment parameters of each segment, thus controlling their degrees of freedom. Further improvement is achieved by correcting for the contrast transfer function (CTF) of the electron microscope (amplitude and phase correction) and by optimizing the helical symmetry of the assembly.
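The helical symmetry that IHRSR imposes is fully specified by an axial rise and an azimuthal twist per subunit; successive subunits are generated by applying that screw operation repeatedly. The sketch below generates subunit positions for a hypothetical parameter pair, purely to show what "imposing helical symmetry" means geometrically; the values are not the refined HIV-1 capsid parameters.

```python
import numpy as np

def helical_lattice(rise_A, twist_deg, radius_A, n_subunits):
    """Cartesian coordinates of subunits related by a helical screw operation:
    subunit k sits at angle k*twist and height k*rise on a cylinder of given radius."""
    k = np.arange(n_subunits)
    phi = np.deg2rad(k * twist_deg)
    x = radius_A * np.cos(phi)
    y = radius_A * np.sin(phi)
    z = k * rise_A
    return np.column_stack([x, y, z])

# Hypothetical helical parameters (Angstroms / degrees), for illustration only.
coords = helical_lattice(rise_A=7.0, twist_deg=32.5, radius_A=220.0, n_subunits=12)
print(coords.round(1))
```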
Immunology, Issue 54, cryo-electron microscopy, helical indexing, helical real-space reconstruction, tubular assemblies, HIV-1 capsid
The ITS2 Database
Authors: Benjamin Merget, Christian Koetschan, Thomas Hackl, Frank Förster, Thomas Dandekar, Tobias Müller, Jörg Schultz, Matthias Wolf.
Institutions: University of Würzburg, University of Würzburg.
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the very variable ITS2 sequence, it confined this marker to low-level phylogenetics only. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8. The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11 accurately reannotated10. Following an annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum energy based fold12 (direct fold) results in a correct, four helix conformation. If this is not the case, the structure is predicted by homology modeling13. In homology modeling, an already known secondary structure is transferred to another ITS2 sequence, whose secondary structure was not able to fold correctly in a direct fold. The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14 search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16 and ProfDistS17 for multiple sequence-structure alignment calculation and Neighbor Joining18 tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure. In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
Genetics, Issue 61, alignment, internal transcribed spacer 2, molecular systematics, secondary structure, ribosomal RNA, phylogenetic tree, homology modeling, phylogeny
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
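The first step of the method, estimating the local orientation of tissue patterns, can be illustrated with a small Gabor filter bank: each pixel is assigned the orientation of the filter with the strongest response. The kernel parameters below (size, wavelength, number of orientations) are generic choices for illustration, not the filter design used in the article.

```python
import numpy as np

def gabor_kernel(theta, sigma=4.0, wavelength=8.0, size=21):
    """Real (cosine) Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def orientation_field(image, n_angles=12):
    """Per-pixel dominant orientation from the maximal Gabor response (FFT convolution)."""
    responses = []
    for theta in np.linspace(0, np.pi, n_angles, endpoint=False):
        kf = np.fft.rfft2(gabor_kernel(theta), s=image.shape)
        responses.append(np.abs(np.fft.irfft2(np.fft.rfft2(image) * kf, s=image.shape)))
    responses = np.stack(responses)
    best = responses.argmax(axis=0)
    return best * (np.pi / n_angles), responses.max(axis=0)

# Synthetic test image: oriented stripes (a stand-in for mammographic texture).
yy, xx = np.mgrid[0:128, 0:128]
img = np.sin(2 * np.pi * (xx * np.cos(0.6) + yy * np.sin(0.6)) / 8.0)
angles, magnitude = orientation_field(img)
print(f"median estimated orientation: {np.degrees(np.median(angles)):.1f} deg")
```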
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Setting Limits on Supersymmetry Using Simplified Models
Authors: Christian Gütschow, Zachary Marshall.
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
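The recasting procedure described above amounts to the following bookkeeping: for each signal region, compute the expected signal yield from the model's cross-section times integrated luminosity times the published acceptance and efficiency, and compare it with the observed model-independent upper limit on new-physics events in that region. The sketch below uses entirely made-up numbers for the cross-section, acceptances, and limits, purely to show the logic of a conservative exclusion.

```python
# Hypothetical signal regions with acceptance*efficiency values and observed
# 95% CL upper limits on the number of non-SM events (made-up numbers).
luminosity_fb = 20.3   # integrated luminosity in fb^-1

signal_regions = {
    #   name        A*eps   S95_obs (events)
    "SR-2jets": (0.012, 14.6),
    "SR-4jets": (0.031, 8.2),
    "SR-6jets": (0.048, 5.1),
}

def is_excluded(cross_section_fb, regions=signal_regions, lumi=luminosity_fb):
    """Conservative exclusion: the model point is excluded if its expected signal
    yield exceeds the observed upper limit in at least one signal region."""
    for name, (acc_eff, s95_obs) in regions.items():
        expected = cross_section_fb * lumi * acc_eff
        if expected > s95_obs:
            return True, name, expected
    return False, None, None

# A hypothetical model point, e.g. one (gluino mass, LSP mass) point of a simplified model.
print(is_excluded(cross_section_fb=95.0))
```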
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
X-ray Dose Reduction through Adaptive Exposure in Fluoroscopic Imaging
Authors: Steve Burion, Tobias Funk.
Institutions: Triple Ring Technologies.
X-ray fluoroscopy is widely used for image guidance during cardiac intervention. However, radiation dose in these procedures can be high, and this is a significant concern, particularly in pediatric applications. Pediatric procedures are in general much more complex than those performed on adults and thus are on average four to eight times longer1. Furthermore, children can undergo up to 10 fluoroscopic procedures by the age of 10, and have been shown to have a three-fold higher risk of developing fatal cancer throughout their life than the general population2,3. We have shown that radiation dose can be significantly reduced in adult cardiac procedures by using our scanning beam digital x-ray (SBDX) system4, a fluoroscopic imaging system that employs an inverse imaging geometry5,6 (Figure 1, Movie 1 and Figure 2). Instead of a single focal spot and an extended detector as used in conventional systems, our approach utilizes an extended X-ray source with multiple focal spots focused on a small detector. Our X-ray source consists of a scanning electron beam sequentially illuminating up to 9,000 focal spot positions. Each focal spot projects a small portion of the imaging volume onto the detector. In contrast to a conventional system where the final image is directly projected onto the detector, the SBDX uses a dedicated algorithm to reconstruct the final image from the 9,000 detector images. For pediatric applications, dose savings with the SBDX system are expected to be smaller than in adult procedures. However, the SBDX system allows for additional dose savings by implementing an electronic adaptive exposure technique. Key to this method is the multi-beam scanning technique of the SBDX system: rather than exposing every part of the image with the same radiation dose, we can dynamically vary the exposure depending on the opacity of the region exposed. Therefore, we can significantly reduce exposure in radiolucent areas and maintain exposure in more opaque regions. In our current implementation, the adaptive exposure requires user interaction (Figure 3). However, in the future, the adaptive exposure will be real time and fully automatic. We have performed experiments with an anthropomorphic phantom and compared measured radiation dose with and without adaptive exposure using a dose area product (DAP) meter. In the experiment presented here, we find a dose reduction of 30%.
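Conceptually, adaptive exposure means spending fewer X-ray pulses on focal-spot positions that project through radiolucent regions and full exposure where the object is opaque. The sketch below shows one simple way to derive per-region exposure factors from a previously measured transmission map; the scaling rule and all numbers are illustrative assumptions, not the SBDX implementation.

```python
import numpy as np

# Hypothetical transmission map from a previous frame: 1.0 = unattenuated
# (radiolucent), small values = opaque anatomy. One entry per scanned region.
transmission = np.array([[0.95, 0.90, 0.40, 0.35],
                         [0.92, 0.55, 0.20, 0.30],
                         [0.88, 0.50, 0.25, 0.45]])

target_counts = 200.0                              # desired detected counts per region
counts_at_full_exposure = 1000.0 * transmission    # counts if every region got full dose

# Exposure factor per region: just enough to reach the target, capped at full exposure.
exposure = np.clip(target_counts / counts_at_full_exposure, 0.0, 1.0)

# Relative dose compared with uniform full exposure over the whole field.
dose_saving = 1.0 - exposure.mean()
print(exposure.round(2))
print(f"dose reduction ~ {dose_saving * 100:.0f}%")
```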
Bioengineering, Issue 55, Scanning digital X-ray, fluoroscopy, pediatrics, interventional cardiology, adaptive exposure, dose savings
Determining 3D Flow Fields via Multi-camera Light Field Imaging
Authors: Tadd T. Truscott, Jesse Belden, Joseph R. Nielson, David J. Daily, Scott L. Thomson.
Institutions: Brigham Young University, Naval Undersea Warfare Center, Newport, RI.
In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture 1. Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3DPIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet.
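Synthetic aperture refocusing itself is conceptually simple: each camera's image is shifted in proportion to its baseline and the chosen focal depth, and the shifted images are averaged so that objects on that plane reinforce while off-plane objects blur out. The sketch below implements the integer-pixel version on synthetic images; real implementations use calibrated homographies and sub-pixel interpolation, so this is only a schematic illustration.

```python
import numpy as np

def refocus(images, camera_offsets, depth_shift):
    """Shift-and-add synthetic aperture refocusing (integer-pixel version).
    images: list of (H, W) arrays; camera_offsets: (N, 2) camera positions in the
    array plane; depth_shift: pixels of parallax per unit offset at the focal plane."""
    stack = []
    for img, (ox, oy) in zip(images, camera_offsets):
        dx, dy = int(round(ox * depth_shift)), int(round(oy * depth_shift))
        stack.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    return np.mean(stack, axis=0)

# Synthetic example: a bright particle seen with parallax by a 3x3 camera array.
rng = np.random.default_rng(1)
offsets = np.array([(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)], dtype=float)
true_parallax = 4.0   # pixels of shift per unit camera offset at the particle's depth
images = []
for ox, oy in offsets:
    img = rng.normal(0, 0.05, (64, 64))
    img[32 + int(oy * true_parallax), 32 + int(ox * true_parallax)] += 1.0
    images.append(img)

# Refocusing at the correct depth concentrates the particle; wrong depths blur it.
focal_stack = {d: refocus(images, offsets, -d).max() for d in (0.0, 2.0, 4.0, 6.0)}
print(focal_stack)
```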
Physics, Issue 73, Mechanical Engineering, Fluid Mechanics, Engineering, synthetic aperture imaging, light field, camera array, particle image velocimetry, three dimensional, vector fields, image processing, auto calibration, vocal chords, bubbles, flow, fluids
Born Normalization for Fluorescence Optical Projection Tomography for Whole Heart Imaging
Authors: Claudio Vinegoni, Daniel Razansky, Jose-Luiz Figueiredo, Lyuba Fexon, Misha Pivovarov, Matthias Nahrendorf, Vasilis Ntziachristos, Ralph Weissleder.
Institutions: Harvard Medical School, MGH - Massachusetts General Hospital, Technical University of Munich and Helmholtz Center Munich.
Optical projection tomography is a three-dimensional imaging technique that has recently been introduced as an imaging tool primarily in developmental biology and gene expression studies. The technique renders biological samples optically transparent by first dehydrating them in graded ethanol solutions and then placing them in a 2:1 mixture of benzyl alcohol and benzyl benzoate (BABB, or Murray's Clear solution) to clear. After the clearing process the scattering contribution in the sample is greatly reduced and becomes almost negligible, while the absorption contribution cannot be eliminated completely. When trying to reconstruct the fluorescence distribution within the sample under investigation, this absorption affects the reconstructions and leads, inevitably, to image artifacts and quantification errors. While absorption could be reduced further by leaving the sample in the clearing media for weeks or months, this would lead to progressive loss of fluorescence and to an unrealistically long sample processing time. This is true when reconstructing both exogenous contrast agents (molecular contrast agents) and endogenous contrast (e.g. reconstructions of genetically expressed fluorescent proteins).
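Born normalization addresses exactly this residual absorption: each fluorescence projection is divided by the corresponding excitation (intrinsic) projection acquired through the same sample, so that attenuation common to both channels largely cancels before reconstruction. The sketch below shows that per-projection ratio; the variable names, synthetic data, and the small regularizing constant are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def born_normalize(fluorescence_proj, excitation_proj, eps=1e-6):
    """Normalized Born ratio for one projection angle: the fluorescence measurement
    divided by the excitation measurement through the same sample, which cancels
    attenuation terms shared by the two channels."""
    return fluorescence_proj / (excitation_proj + eps)

# Hypothetical projections (2D detector images) at one rotation angle.
rng = np.random.default_rng(2)
excitation = np.clip(rng.normal(0.6, 0.1, (128, 128)), 0.05, None)    # transmitted excitation light
fluorescence = 0.3 * excitation * (1 + 0.5 * rng.random((128, 128)))  # attenuated emission

normalized = born_normalize(fluorescence, excitation)
print(normalized.mean(), normalized.std())
```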
Bioengineering, Issue 28, optical imaging, fluorescence imaging, optical projection tomography, born normalization, molecular imaging, heart imaging
Investigating the Microbial Community in the Termite Hindgut - Interview
Authors: Jared Leadbetter.
Institutions: California Institute of Technology - Caltech.
Jared Leadbetter explains why the termite-gut microbial community is an excellent system for studying the complex interactions between microbes. The symbiotic relationship existing between the host insect and lignocellulose-degrading gut microbes is explained, as well as the industrial uses of these microbes for degrading plant biomass and generating biofuels.
Microbiology, issue 4, microbial community, diversity

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
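JoVE has not published the details of its matching algorithm, but the general task of relating a PubMed abstract to a library of method videos can be illustrated with a standard bag-of-words similarity measure. The sketch below ranks hypothetical video descriptions against an abstract by TF-IDF cosine similarity; it is a generic illustration, not JoVE's actual system.

```python
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def tfidf_vectors(docs):
    """Build TF-IDF vectors (term -> weight dicts) for a small corpus of documents."""
    tokenized = [Counter(tokenize(d)) for d in docs]
    df = Counter(term for doc in tokenized for term in doc)   # document frequency
    n = len(docs)
    return [{t: tf * math.log((1 + n) / (1 + df[t])) for t, tf in doc.items()}
            for doc in tokenized]

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical corpus: one PubMed abstract plus three video descriptions.
abstract = "tree ring density reconstruction of summer temperature in north central china"
videos = ["dendrochronology sampling and ring density measurement of pine cores",
          "super resolution fluorescence microscopy of living cells",
          "soil enzyme assays for nutrient cycling studies"]

vecs = tfidf_vectors([abstract] + videos)
scores = sorted(((cosine(vecs[0], v), videos[i]) for i, v in enumerate(vecs[1:])), reverse=True)
for s, title in scores:
    print(f"{s:.3f}  {title}")
```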