JoVE Visualize
Related JoVE Video
 
PubMed Article
Exploring the Relationship between the Engineering and Physical Sciences and the Health and Life Sciences by Advanced Bibliometric Methods.
PLoS ONE
PUBLISHED: 01-01-2014
We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyzing the 'EPS-HLS interface' is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other: the advantages of working with textual data compensate for the limitations of working with citation relations, and vice versa. An important advantage of working with textual data is the in-depth qualitative insight it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; and mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant over the past decade.
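The citation-based part of this analysis can be illustrated with a minimal sketch (hypothetical data and thresholds, not the study's actual classification rules): each cluster of publications carries counts of EPS and HLS papers, and a cluster is counted as lying at the EPS-HLS interface when both fields contribute a substantial share.

    # Hypothetical illustration of flagging publication clusters at the EPS-HLS interface.
    # Cluster contents and the share threshold are invented for this example.
    clusters = [
        {"id": 1, "eps_pubs": 120, "hls_pubs": 15,  "other_pubs": 5},
        {"id": 2, "eps_pubs": 60,  "hls_pubs": 55,  "other_pubs": 10},
        {"id": 3, "eps_pubs": 4,   "hls_pubs": 300, "other_pubs": 20},
    ]

    def at_interface(cluster, min_share=0.25):
        """Treat a cluster as 'interface' if both EPS and HLS exceed a minimum share."""
        total = cluster["eps_pubs"] + cluster["hls_pubs"] + cluster["other_pubs"]
        return (cluster["eps_pubs"] / total >= min_share
                and cluster["hls_pubs"] / total >= min_share)

    interface_ids = [c["id"] for c in clusters if at_interface(c)]
    interface_pubs = sum(c["eps_pubs"] + c["hls_pubs"] for c in clusters if at_interface(c))
    all_pubs = sum(c["eps_pubs"] + c["hls_pubs"] for c in clusters)
    print(interface_ids, f"{100 * interface_pubs / all_pubs:.1f}% of EPS/HLS publications")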
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Published: 11-28-2014
ABSTRACT
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available its Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, and nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined to be adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
26 Related JoVE Articles!
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
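As a rough, hypothetical illustration of the kind of harvesting the data-analysis code performs (the real routines are MATLAB-based; the Python sketch below invents the record format), time-stamped head-entry events can be reduced to daily per-hopper counts for each subject:

    from collections import defaultdict

    # Hypothetical time-stamped event records: (seconds since start, subject, event label).
    events = [
        (3600.5,  "mouse01", "head_entry_hopper1"),
        (7200.2,  "mouse01", "head_entry_hopper2"),
        (90000.9, "mouse01", "head_entry_hopper1"),
    ]

    daily_counts = defaultdict(int)
    for t, subject, event in events:
        day = int(t // 86400)                      # 86,400 seconds per day
        daily_counts[(subject, day, event)] += 1

    for (subject, day, event), n in sorted(daily_counts.items()):
        print(subject, "day", day, event, n)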
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Recombination Effects in a Liquid Ionization Chamber Used for the Dosimetry of a Radiosurgical Accelerator
Authors: Antoine Wagner, Frederik Crop, Thomas Lacornerie, Nick Reynaert.
Institutions: Centre Oscar Lambret.
Most modern radiation therapy devices allow the use of very small fields, either through beamlets in Intensity-Modulated Radiation Therapy (IMRT) or via stereotactic radiotherapy where positioning accuracy allows delivering very high doses per fraction in a small volume of the patient. Dosimetric measurements on medical accelerators are conventionally realized using air-filled ionization chambers. However, in small beams these are subject to nonnegligible perturbation effects. This study focuses on liquid ionization chambers, which offer advantages in terms of spatial resolution and low fluence perturbation. Ion recombination effects are investigated for the microLion detector (PTW) used with the Cyberknife system (Accuray). The method consists of performing a series of water tank measurements at different source-surface distances, and applying corrections to the liquid detector readings based on simultaneous gaseous detector measurements. This approach facilitates isolating the recombination effects arising from the high density of the liquid sensitive medium and obtaining correction factors to apply to the detector readings. The main difficulty resides in achieving a sufficient level of accuracy in the setup to be able to detect small changes in the chamber response.
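The correction step can be sketched under simplifying assumptions: if the air-filled chamber is taken to be essentially free of recombination losses, the liquid-to-air reading ratio at each source-surface distance, normalized to the ratio at a reference distance, yields a relative recombination correction for the liquid detector. The readings below are invented, and the normalization only illustrates the idea rather than reproducing the published correction procedure.

    # Hypothetical paired readings (liquid microLion vs. air-filled reference chamber)
    # at several source-surface distances (SSD, in cm). All values are invented.
    readings = {
        80:  (10.20, 5.10),   # SSD: (liquid_reading_nC, air_reading_nC)
        100: (6.45, 3.26),
        120: (4.41, 2.25),
    }

    ref_ssd = 120                                   # lowest dose rate taken as reference
    ref_ratio = readings[ref_ssd][0] / readings[ref_ssd][1]

    for ssd, (liquid, air) in sorted(readings.items()):
        k_recomb = ref_ratio / (liquid / air)       # factor to apply to the liquid reading
        print(f"SSD {ssd} cm: relative recombination correction {k_recomb:.3f}")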
Physics, Issue 87, Radiation therapy, dosimetry, small fields, Cyberknife, liquid ionization, recombination effects
Flame Experiments at the Advanced Light Source: New Insights into Soot Formation Processes
Authors: Nils Hansen, Scott A. Skeen, Hope A. Michelsen, Kevin R. Wilson, Katharina Kohse-Höinghaus.
Institutions: Sandia National Laboratories, Lawrence Berkeley National Laboratory, Universität Bielefeld.
The following experimental protocols and the accompanying video are concerned with the flame experiments that are performed at the Chemical Dynamics Beamline of the Advanced Light Source (ALS) of the Lawrence Berkeley National Laboratory1-4. This video demonstrates how the complex chemical structures of laboratory-based model flames are analyzed using flame-sampling mass spectrometry with tunable synchrotron-generated vacuum-ultraviolet (VUV) radiation. This experimental approach combines isomer-resolving capabilities with high sensitivity and a large dynamic range5,6. The first part of the video describes experiments involving burner-stabilized, reduced-pressure (20-80 mbar) laminar premixed flames. A small hydrocarbon fuel was used for the selected flame to demonstrate the general experimental approach. It is shown how species’ profiles are acquired as a function of distance from the burner surface and how the tunability of the VUV photon energy is used advantageously to identify many combustion intermediates based on their ionization energies. For example, this technique has been used to study gas-phase aspects of the soot-formation processes, and the video shows how the resonance-stabilized radicals, such as C3H3, C3H5, and i-C4H5, are identified as important intermediates7. The work has been focused on soot formation processes, and, from the chemical point of view, this process is very intriguing because chemical structures containing millions of carbon atoms are assembled from a fuel molecule possessing only a few carbon atoms in just milliseconds. The second part of the video highlights a new experiment, in which an opposed-flow diffusion flame and synchrotron-based aerosol mass spectrometry are used to study the chemical composition of the combustion-generated soot particles4. The experimental results indicate that the widely accepted H-abstraction-C2H2-addition (HACA) mechanism is not the sole molecular growth process responsible for the formation of the observed large polycyclic aromatic hydrocarbons (PAHs).
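A simplified sketch of the identification idea: for one mass peak, the photoionization signal is scanned versus photon energy, the onset is estimated from where the signal rises above the noise, and the onset is compared with tabulated ionization energies. The scan data are simulated and the reference values are placeholders; real assignments rely on carefully tabulated ionization energies and calibrated energy scales.

    import numpy as np

    # Simulated photoionization efficiency scan for one mass peak (placeholder data).
    photon_energy_eV = np.linspace(8.0, 10.0, 41)
    signal = np.where(photon_energy_eV >= 8.7, (photon_energy_eV - 8.7) * 50, 0.0)
    signal += np.random.default_rng(0).normal(0, 0.5, signal.size)   # detector noise

    baseline = signal[:5]                              # points well below any onset
    threshold = baseline.mean() + 3 * baseline.std()
    onset = photon_energy_eV[np.argmax(signal > threshold)]

    # Placeholder reference ionization energies (eV); use standard tables for real work.
    reference_ie = {"C3H3 (propargyl)": 8.67, "C3H5 (allyl)": 8.14}
    matches = {name: ie for name, ie in reference_ie.items() if abs(ie - onset) < 0.15}
    print(f"estimated onset {onset:.2f} eV, candidate assignments: {matches}")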
Physics, Issue 87, Combustion, Flame, Energy Conversion, Mass Spectrometry, Photoionization, Synchrotron, Hydrocarbon, Soot, Aerosol, Isomer
Confocal Imaging of Confined Quiescent and Flowing Colloid-polymer Mixtures
Authors: Rahul Pandey, Melissa Spannuth, Jacinta C. Conrad.
Institutions: University of Houston.
The behavior of confined colloidal suspensions with attractive interparticle interactions is critical to the rational design of materials for directed assembly1-3, drug delivery4, improved hydrocarbon recovery5-7, and flowable electrodes for energy storage8. Suspensions containing fluorescent colloids and non-adsorbing polymers are appealing model systems, as the ratio of the polymer radius of gyration to the particle radius and concentration of polymer control the range and strength of the interparticle attraction, respectively. By tuning the polymer properties and the volume fraction of the colloids, colloid fluids, fluids of clusters, gels, crystals, and glasses can be obtained9. Confocal microscopy, a variant of fluorescence microscopy, allows an optically transparent and fluorescent sample to be imaged with high spatial and temporal resolution in three dimensions. In this technique, a small pinhole or slit blocks the emitted fluorescent light from regions of the sample that are outside the focal volume of the microscope optical system. As a result, only a thin section of the sample in the focal plane is imaged. This technique is particularly well suited to probe the structure and dynamics in dense colloidal suspensions at the single-particle scale: the particles are large enough to be resolved using visible light and diffuse slowly enough to be captured at typical scan speeds of commercial confocal systems10. Improvements in scan speeds and analysis algorithms have also enabled quantitative confocal imaging of flowing suspensions11-16,37. In this paper, we demonstrate confocal microscopy experiments to probe the confined phase behavior and flow properties of colloid-polymer mixtures. We first prepare colloid-polymer mixtures that are density- and refractive-index matched. Next, we report a standard protocol for imaging quiescent dense colloid-polymer mixtures under varying confinement in thin wedge-shaped cells. Finally, we demonstrate a protocol for imaging colloid-polymer mixtures during microchannel flow.
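Particle locating and linking in the recorded image series can be done with open-source trackers; the sketch below uses the trackpy package as one common option. The file pattern and parameters (particle diameter in pixels, search range, pixel size, frame rate) are placeholders that must be tuned to the actual data.

    import pims
    import trackpy as tp

    frames = pims.open('confocal_timeseries/*.tif')          # placeholder path
    features = tp.batch(frames, diameter=11, minmass=200)    # locate particles in each frame
    tracks = tp.link(features, search_range=5, memory=3)     # link detections into trajectories
    tracks = tp.filter_stubs(tracks, threshold=25)           # drop very short tracks

    # Example quantity for quiescent samples: ensemble mean-squared displacement.
    em = tp.emsd(tracks, mpp=0.1, fps=10)                    # microns per pixel, frames per second
    print(em.head())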
Chemistry, Issue 87, confocal microscopy, particle tracking, colloids, suspensions, confinement, gelation, microfluidics, image correlation, dynamics, suspension flow
A Method of Permeabilization of Drosophila Embryos for Assays of Small Molecule Activity
Authors: Matthew D. Rand.
Institutions: University of Rochester School of Medicine and Dentistry.
The Drosophila embryo has long been a powerful laboratory model for elucidating molecular and genetic mechanisms that control development. The ease of genetic manipulations with this model has supplanted pharmacological approaches that are commonplace in other animal models and cell-based assays. Here we describe recent advances in a protocol that enables application of small molecules to the developing fruit fly embryo. The method details steps to overcome the impermeability of the eggshell while maintaining embryo viability. Eggshell permeabilization across a broad range of developmental stages is achieved by application of a previously described d-limonene embryo permeabilization solvent (EPS1) and by aging embryos at reduced temperature (18 °C) prior to treatments. In addition, use of a far-red dye (CY5) as a permeabilization indicator is described, which is compatible with downstream applications involving standard red and green fluorescent dyes in live and fixed preparations. This protocol is applicable to studies using bioactive compounds to probe developmental mechanisms as well as for studies aimed at evaluating teratogenic or pharmacologic activity of uncharacterized small molecules.
Bioengineering, Issue 89, Drosophila embryo, embryo development, vitelline membrane, d-limonene, membrane permeabilization, teratogen, Rhodamine B, CY5, methylmercury
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
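As a small illustration of the semi-automated category (global thresholding followed by labeling and surface rendering), the sketch below uses scikit-image on a stand-in volume; real EM data sets typically also need filtering and manual curation as discussed above.

    import numpy as np
    from skimage import filters, measure

    # Stand-in 3D volume (z, y, x); replace with the actual image stack.
    volume = np.random.default_rng(0).normal(100, 10, (32, 64, 64))

    binary = volume > filters.threshold_otsu(volume)    # global threshold (semi-automated step)
    labels = measure.label(binary)                      # connected components = candidate features
    large = [p.label for p in measure.regionprops(labels) if p.area > 500]

    # Surface rendering input: a triangle mesh of the binarized volume.
    verts, faces, normals, values = measure.marching_cubes(binary.astype(float), level=0.5)
    print(len(large), "sizeable objects;", len(verts), "mesh vertices")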
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
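A condensed sketch of building a subject-specific head model from an individual structural MRI with the open-source MNE-Python tools, as one possible toolchain (the subject name and paths are placeholders, FreeSurfer reconstruction is assumed to have been run, and the generic conductivities shown may need age-appropriate adjustment):

    import mne

    subjects_dir = '/data/freesurfer_subjects'       # placeholder path
    subject = 'child_01'                             # placeholder subject

    # Derive scalp/skull/brain surfaces from the individual MRI (needs FreeSurfer output).
    mne.bem.make_watershed_bem(subject=subject, subjects_dir=subjects_dir, overwrite=True)

    # Three-layer boundary element model; conductivities are generic defaults.
    model = mne.make_bem_model(subject=subject, ico=4,
                               conductivity=(0.3, 0.006, 0.3),
                               subjects_dir=subjects_dir)
    bem = mne.make_bem_solution(model)

    # Cortical source space for the later minimum-norm source reconstruction.
    src = mne.setup_source_space(subject, spacing='oct6', subjects_dir=subjects_dir)
    print(bem, src)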
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
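The binarization and skeletonization step can be sketched with open-source tools as follows; this is a generic 2D illustration with scikit-image and invented data, not the exact workflow of the protocol.

    import numpy as np
    from scipy import ndimage
    from skimage import filters, morphology

    # Stand-in 2D confocal image of membrane staining; replace with real data.
    image = np.random.default_rng(1).random((512, 512))

    binary = image > filters.threshold_otsu(image)        # binarize the TATS staining
    binary = morphology.remove_small_objects(binary, 64)  # suppress noise specks
    skeleton = morphology.skeletonize(binary)             # one-pixel-wide network skeleton

    # Simple network metrics: total skeleton length (pixels) and branch-point count.
    neighbors = ndimage.convolve(skeleton.astype(int), np.ones((3, 3)), mode='constant')
    branch_points = skeleton & (neighbors >= 4)           # skeleton pixel with >= 3 neighbors
    print("skeleton pixels:", int(skeleton.sum()), "branch points:", int(branch_points.sum()))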
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Multimodal Optical Microscopy Methods Reveal Polyp Tissue Morphology and Structure in Caribbean Reef Building Corals
Authors: Mayandi Sivaguru, Glenn A. Fried, Carly A. H. Miller, Bruce W. Fouke.
Institutions: University of Illinois at Urbana-Champaign.
An integrated suite of imaging techniques has been applied to determine the three-dimensional (3D) morphology and cellular structure of polyp tissues comprising the Caribbean reef building corals Montastraea annularis and M. faveolata. These approaches include fluorescence microscopy (FM), serial block face imaging (SBFI), and two-photon confocal laser scanning microscopy (TPLSM). SBFI provides deep tissue imaging after physical sectioning; it details the tissue surface texture and 3D visualization to tissue depths of more than 2 mm. Complementary FM and TPLSM yield ultra-high resolution images of tissue cellular structure. Results have: (1) identified previously unreported lobate tissue morphologies on the outer wall of individual coral polyps and (2) created the first surface maps of the 3D distribution and tissue density of chromatophores and algae-like dinoflagellate zooxanthellae endosymbionts. Spectral absorption peaks of 500 nm and 675 nm, respectively, suggest that M. annularis and M. faveolata contain similar types of chlorophyll and chromatophores. However, M. annularis and M. faveolata exhibit significant differences in the tissue density and 3D distribution of these key cellular components. This study, focusing on imaging methods, indicates that SBFI is extremely useful for the analysis of large mm-scale samples of decalcified coral tissues. Complementary FM and TPLSM reveal subtle submillimeter scale changes in cellular distribution and density in nondecalcified coral tissue samples. The TPLSM technique affords: (1) minimally invasive sample preparation, (2) superior optical sectioning ability, and (3) minimal light absorption and scattering, while still permitting deep tissue imaging.
Environmental Sciences, Issue 91, Serial block face imaging, two-photon fluorescence microscopy, Montastraea annularis, Montastraea faveolata, 3D coral tissue morphology and structure, zooxanthellae, chromatophore, autofluorescence, light harvesting optimization, environmental change
Utilization of Microscale Silicon Cantilevers to Assess Cellular Contractile Function In Vitro
Authors: Alec S.T. Smith, Christopher J. Long, Christopher McAleer, Nathaniel Bobbitt, Balaji Srinivasan, James J. Hickman.
Institutions: University of Central Florida.
The development of more predictive and biologically relevant in vitro assays is predicated on the advancement of versatile cell culture systems which facilitate the functional assessment of the seeded cells. To that end, microscale cantilever technology offers a platform with which to measure the contractile functionality of a range of cell types, including skeletal, cardiac, and smooth muscle cells, through assessment of contraction induced substrate bending. Application of multiplexed cantilever arrays provides the means to develop moderate to high-throughput protocols for assessing drug efficacy and toxicity, disease phenotype and progression, as well as neuromuscular and other cell-cell interactions. This manuscript provides the details for fabricating reliable cantilever arrays for this purpose, and the methods required to successfully culture cells on these surfaces. Further description is provided on the steps necessary to perform functional analysis of contractile cell types maintained on such arrays using a novel laser and photo-detector system. The representative data provided highlights the precision and reproducible nature of the analysis of contractile function possible using this system, as well as the wide range of studies to which such technology can be applied. Successful widespread adoption of this system could provide investigators with the means to perform rapid, low cost functional studies in vitro, leading to more accurate predictions of tissue performance, disease development and response to novel therapeutic treatment.
Bioengineering, Issue 92, cantilever, in vitro, contraction, skeletal muscle, NMJ, cardiomyocytes, functional
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and other populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlies adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango in the community, through the trainees, for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Protocols for Implementing an Escherichia coli Based TX-TL Cell-Free Expression System for Synthetic Biology
Authors: Zachary Z. Sun, Clarmyra A. Hayes, Jonghyeon Shin, Filippo Caschera, Richard M. Murray, Vincent Noireaux.
Institutions: California Institute of Technology, Massachusetts Institute of Technology, University of Minnesota.
Ideal cell-free expression systems can theoretically emulate an in vivo cellular environment in a controlled in vitro platform.1 This is useful for expressing proteins and genetic circuits in a controlled manner as well as for providing a prototyping environment for synthetic biology.2,3 To achieve the latter goal, cell-free expression systems that preserve endogenous Escherichia coli transcription-translation mechanisms are able to more accurately reflect in vivo cellular dynamics than those based on T7 RNA polymerase transcription. We describe the preparation and execution of an efficient endogenous E. coli based transcription-translation (TX-TL) cell-free expression system that can produce equivalent amounts of protein as T7-based systems at a 98% cost reduction to similar commercial systems.4,5 The preparation of buffers and crude cell extract are described, as well as the execution of a three tube TX-TL reaction. The entire protocol takes five days to prepare and yields enough material for up to 3000 single reactions in one preparation. Once prepared, each reaction takes under 8 hr from setup to data collection and analysis. Mechanisms of regulation and transcription exogenous to E. coli, such as lac/tet repressors and T7 RNA polymerase, can be supplemented.6 Endogenous properties, such as mRNA and DNA degradation rates, can also be adjusted.7 The TX-TL cell-free expression system has been demonstrated for large-scale circuit assembly, exploring biological phenomena, and expression of proteins under both T7- and endogenous promoters.6,8 Accompanying mathematical models are available.9,10 The resulting system has unique applications in synthetic biology as a prototyping environment, or "TX-TL biomolecular breadboard."
Cellular Biology, Issue 79, Bioengineering, Synthetic Biology, Chemistry Techniques, Synthetic, Molecular Biology, control theory, TX-TL, cell-free expression, in vitro, transcription-translation, cell-free protein synthesis, synthetic biology, systems biology, Escherichia coli cell extract, biological circuits, biomolecular breadboard
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resulting concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
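The spectrophotometric concentration check is a direct Beer-Lambert calculation; a minimal sketch, assuming a standard 1 cm path length and an example absorbance reading:

    # Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)
    epsilon_280 = 4200.0     # L/(mol*cm) for triclosan at 280 nm (value given in the protocol)
    path_length = 1.0        # cm, assuming a standard cuvette
    absorbance_280 = 0.084   # hypothetical example reading

    concentration_M = absorbance_280 / (epsilon_280 * path_length)
    print(f"triclosan concentration: {concentration_M * 1e6:.1f} uM")   # 20.0 uM for this example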
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Concurrent Quantification of Cellular and Extracellular Components of Biofilms
Authors: Sharukh S. Khajotia, Kristin H. Smart, Mpala Pilula, David M. Thompson.
Institutions: University of Oklahoma Health Sciences Center, The Copperbelt University.
Confocal laser scanning microscopy (CLSM) is a powerful tool for investigation of biofilms. Very few investigations have successfully quantified concurrent distribution of more than two components within biofilms because: 1) selection of fluorescent dyes having minimal spectral overlap is complicated, and 2) quantification of multiple fluorochromes poses a multifactorial problem. Objectives: Report a methodology to quantify and compare concurrent 3-dimensional distributions of three cellular/extracellular components of biofilms grown on relevant substrates. Methods: The method consists of distinct, interconnected steps involving biofilm growth, staining, CLSM imaging, biofilm structural analysis and visualization, and statistical analysis of structural parameters. Biofilms of Streptococcus mutans (strain UA159) were grown for 48 hr on sterile specimens of Point 4 and TPH3 resin composites. Specimens were subsequently immersed for 60 sec in either Biotène PBF (BIO) or Listerine Total Care (LTO) mouthwashes, or water (control group; n=5/group). Biofilms were stained with fluorochromes for extracellular polymeric substances, proteins and nucleic acids before imaging with CLSM. Biofilm structural parameters calculated using ISA3D image analysis software were biovolume and mean biofilm thickness. Mixed models statistical analyses compared structural parameters between mouthwash and control groups (SAS software; α=0.05). Volocity software permitted visualization of 3D distributions of overlaid biofilm components (fluorochromes). Results: Mouthwash BIO produced biofilm structures that differed significantly from the control (p<0.05) on both resin composites, whereas LTO did not produce differences (p>0.05) on either product. Conclusions: This methodology efficiently and successfully quantified and compared concurrent 3D distributions of three major components within S. mutans biofilms on relevant substrates, thus overcoming two challenges to simultaneous assessment of biofilm components. This method can also be used to determine the efficacy of antibacterial/antifouling agents against multiple biofilm components, as shown using mouthwashes. Furthermore, this method has broad application because it facilitates comparison of 3D structures/architecture of biofilms in a variety of disciplines.
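The two structural parameters reported, biovolume and mean biofilm thickness, can be computed from a binarized confocal stack roughly as sketched below; the study itself used ISA3D, so this is only a generic illustration with assumed voxel dimensions and random stand-in data.

    import numpy as np

    # Stand-in binarized CLSM stack: axis 0 = z (slices), axes 1-2 = x, y.
    stack = np.random.default_rng(2).random((30, 256, 256)) > 0.7

    voxel_xy = 0.25      # um per pixel in x and y (assumed)
    voxel_z = 0.5        # um between slices (assumed)

    # Biovolume: biomass volume per substratum area (um^3 per um^2).
    substratum_area = stack.shape[1] * stack.shape[2] * voxel_xy ** 2
    biovolume = stack.sum() * voxel_xy ** 2 * voxel_z / substratum_area

    # Mean thickness: average height of the highest occupied slice over biomass-containing columns.
    occupied = stack.any(axis=0)
    top = (stack.shape[0] - np.argmax(stack[::-1], axis=0)) * voxel_z
    mean_thickness = top[occupied].mean()
    print(f"biovolume {biovolume:.2f} um^3/um^2, mean thickness {mean_thickness:.2f} um")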
Immunology, Issue 82, Extracellular Matrix, Streptococcus mutans, Dental Materials, Fluorescent Dyes, Composite Resins, Microscopy, Confocal, Permanent, Biofilms, Microbiological Phenomena, Streptococcus mutans, 3-dimensional structure, confocal laser scanning microscopy, fluorescent stains, dental biomaterials, dental resin composites, biofilm structural analysis, image analysis, image reconstruction
An Analytical Tool-box for Comprehensive Biochemical, Structural and Transcriptome Evaluation of Oral Biofilms Mediated by Mutans Streptococci
Authors: Marlise I. Klein, Jin Xiao, Arne Heydorn, Hyun Koo.
Institutions: University of Rochester Medical Center, Sichuan University, Glostrup Hospital, Glostrup, Denmark.
Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition 1, 2. In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci 3. The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate 3. Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and specific experimental conditions to demonstrate the usefulness and flexibility of the tool-box. The biofilm model is simple, reproducible and multiple replicates of a single experiment can be done simultaneously 4, 5. Moreover, it allows temporal evaluation, inclusion of various microbial species 5 and assessment of the effects of distinct experimental conditions (e.g. treatments 6; comparison of knockout mutants vs. parental strain 5; carbohydrates availability 7). Here, we describe two specific components of the tool-box, including (i) new software for microarray data mining/organization (MDV) and fluorescence imaging analysis (DUOSTAT), and (ii) in situ EPS-labeling. We also provide an experimental case showing how the tool-box can assist with biofilms analysis, data organization, integration and interpretation.
Microbiology, Issue 47, Extracellular matrix, polysaccharides, biofilm, mutans streptococci, glucosyltransferases, confocal fluorescence, microarray
Extraction of High Molecular Weight DNA from Microbial Mats
Authors: Benjamin S. Bey, Erin B. Fichot, R. Sean Norman.
Institutions: Arnold School of Public Health, University of South Carolina.
Successful and accurate analysis and interpretation of metagenomic data is dependent upon the efficient extraction of high-quality, high molecular weight (HMW) community DNA. However, environmental mat samples often pose difficulties to obtaining large concentrations of high-quality, HMW DNA. Hypersaline microbial mats contain high amounts of extracellular polymeric substances (EPS)1 and salts that may inhibit downstream applications of extracted DNA. Direct and harsh methods are often used in DNA extraction from refractory samples. These methods are typically used because the EPS in mats, an adhesive matrix, binds DNA2,3 during direct lysis. As a result of harsher extraction methods, DNA becomes fragmented into small sizes4,5,6. The DNA thus becomes inappropriate for large-insert vector cloning. In order to circumvent these limitations, we report an improved methodology to extract HMW DNA of good quality and quantity from hypersaline microbial mats. We employed an indirect method involving the separation of microbial cells from the background mat matrix through blending and differential centrifugation. A combination of mechanical and chemical procedures was used to extract and purify DNA from the extracted microbial cells. Our protocol yields approximately 2 μg of HMW DNA (35-50 kb) per gram of mat sample, with an A260/280 ratio of 1.6. Furthermore, amplification of 16S rRNA genes7 suggests that the protocol is able to minimize or eliminate any inhibitory effects of contaminants. Our results provide an appropriate methodology for the extraction of HMW DNA from microbial mats for functional metagenomic studies and may be applicable to other environmental samples from which DNA extraction is challenging.
Molecular Biology, Issue 53, Metagenomics, extracellular polymeric substances, DNA extraction, Microbial mats, hypersaline, extreme environment
Growth of Mycobacterium tuberculosis Biofilms
Authors: Kathleen Kulka, Graham Hatfull, Anil K. Ojha.
Institutions: University of Pittsburgh.
Mycobacterium tuberculosis, the etiologic agent of human tuberculosis, has an extraordinary ability to survive against environmental stresses including antibiotics. Although stress tolerance of M. tuberculosis is one of the likely contributors to the 6-month long chemotherapy of tuberculosis 1, the molecular mechanisms underlying this characteristic phenotype of the pathogen remain unclear. Many microbial species have evolved to survive in stressful environments by self-assembling into highly organized, surface attached, and matrix encapsulated structures called biofilms 2-4. Growth in communities appears to be a preferred survival strategy of microbes, and is achieved through genetic components that regulate surface attachment, intercellular communications, and synthesis of extracellular polymeric substances (EPS) 5,6. The tolerance to environmental stress is likely facilitated by EPS, and perhaps by the physiological adaptation of individual bacilli to heterogeneous microenvironments within the complex architecture of biofilms 7. In a series of recent papers we established that M. tuberculosis and Mycobacterium smegmatis have a strong propensity to grow in organized multicellular structures, called biofilms, which can tolerate more than 50 times the minimal inhibitory concentrations of the anti-tuberculosis drugs isoniazid and rifampicin 8-10. M. tuberculosis, however, intriguingly requires specific conditions to form mature biofilms, in particular a 9:1 ratio of headspace to media as well as limited exchange of air with the atmosphere 9. The requirement for specialized environmental conditions could possibly be linked to the fact that M. tuberculosis is an obligate human pathogen and thus has adapted to tissue environments. In this publication we demonstrate methods for culturing M. tuberculosis biofilms in a bottle and a 12-well plate format, which is convenient for bacteriological as well as genetic studies. We have described the protocol for an attenuated strain of M. tuberculosis, mc²7000, with deletions in the two loci, panCD and RD1, that are critical for in vivo growth of the pathogen 9. This strain can be safely used under BSL-2 containment for understanding the basic biology of the tuberculosis pathogen, thus avoiding the requirement of an expensive BSL-3 facility. The method can be extended, with appropriate modification of media, to grow biofilms of other culturable mycobacterial species. Overall, a uniform protocol for culturing mycobacterial biofilms will help investigators interested in studying the basic resilient characteristics of mycobacteria. In addition, a clear and concise method for growing mycobacterial biofilms will also help clinical and pharmaceutical investigators to test the efficacy of a potential drug.
Immunology, Issue 60, Mycobacterium tuberculosis, tuberculosis, drug tolerance, biofilms
Mapping Cortical Dynamics Using Simultaneous MEG/EEG and Anatomically-constrained Minimum-norm Estimates: an Auditory Attention Example
Authors: Adrian K.C. Lee, Eric Larson, Ross K. Maddox.
Institutions: University of Washington.
Magneto- and electroencephalography (MEG/EEG) are neuroimaging techniques that provide a high temporal resolution particularly suitable to investigate the cortical networks involved in dynamical perceptual and cognitive tasks, such as attending to different sounds in a cocktail party. Many past studies have employed data recorded at the sensor level only, i.e., the magnetic fields or the electric potentials recorded outside and on the scalp, and have usually focused on activity that is time-locked to the stimulus presentation. This type of event-related field / potential analysis is particularly useful when there are only a small number of distinct dipolar patterns that can be isolated and identified in space and time. Alternatively, by utilizing anatomical information, these distinct field patterns can be localized as current sources on the cortex. However, for a more sustained response that may not be time-locked to a specific stimulus (e.g., in preparation for listening to one of the two simultaneously presented spoken digits based on the cued auditory feature) or may be distributed across multiple spatial locations unknown a priori, the recruitment of a distributed cortical network may not be adequately captured by using a limited number of focal sources. Here, we describe a procedure that employs individual anatomical MRI data to establish a relationship between the sensor information and the dipole activation on the cortex through the use of minimum-norm estimates (MNE). This inverse imaging approach provides us a tool for distributed source analysis. For illustrative purposes, we will describe all procedures using FreeSurfer and MNE software, both freely available. We will summarize the MRI sequences and analysis steps required to produce a forward model that enables us to relate the expected field pattern caused by the dipoles distributed on the cortex onto the M/EEG sensors. Next, we will step through the necessary processes that facilitate us in denoising the sensor data from environmental and physiological contaminants. We will then outline the procedure for combining and mapping MEG/EEG sensor data onto the cortical space, thereby producing a family of time-series of cortical dipole activation on the brain surface (or "brain movies") related to each experimental condition. Finally, we will highlight a few statistical techniques that enable us to make scientific inference across a subject population (i.e., perform group-level analysis) based on a common cortical coordinate space.
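A compressed sketch of the forward-and-inverse pipeline described above, using the MNE-Python interface to the freely available MNE tools (the article itself uses the MNE and FreeSurfer command-line programs; the file names, coregistration transform, and parameter choices below are placeholders):

    import mne

    # Placeholder inputs: preprocessed evoked response, noise covariance, source space,
    # BEM solution, and the MRI-to-head coordinate transform from coregistration.
    evoked = mne.read_evokeds('subject01-ave.fif', condition=0)
    noise_cov = mne.read_cov('subject01-cov.fif')
    src = mne.read_source_spaces('subject01-oct6-src.fif')
    bem = mne.read_bem_solution('subject01-bem-sol.fif')

    fwd = mne.make_forward_solution(evoked.info, trans='subject01-trans.fif',
                                    src=src, bem=bem, meg=True, eeg=True)
    inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
    stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method='dSPM')

    # 'stc' holds the estimated dipole activation time course at each cortical location
    # (the "brain movie" for this condition).
    print(stc)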
Neuroscience, Issue 68, Magnetoencephalography, MEG, Electroencephalography, EEG, audition, attention, inverse imaging
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
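One standard way to quantify categorical perception along a morph continuum such as the DHL (not necessarily the exact analysis used here) is to fit a logistic function to the proportion of "human" categorizations at each morph step and read off the boundary location and slope; a small sketch with invented response data:

    import numpy as np
    from scipy.optimize import curve_fit

    # Invented data: morph level 0 (avatar) .. 10 (human) and proportion of "human" responses.
    morph_level = np.arange(0, 11)
    p_human = np.array([0.02, 0.03, 0.05, 0.10, 0.25, 0.55, 0.80, 0.92, 0.97, 0.99, 1.00])

    def logistic(x, boundary, slope):
        return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

    (boundary, slope), _ = curve_fit(logistic, morph_level, p_human, p0=[5.0, 1.0])
    print(f"category boundary near morph level {boundary:.2f}, slope {slope:.2f}")
    # A steep slope, with sharper discrimination near the boundary, is the classic CP signature.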
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Telomere Length and Telomerase Activity; A Yin and Yang of Cell Senescence
Authors: Mary Derasmo Axelrad, Temuri Budagov, Gil Atzmon.
Institutions: Albert Einstein College of Medicine.
Telomeres are repeating DNA sequences at the ends of the chromosomes that vary in length and in humans can reach 15,000 base pairs. The telomere serves as a bioprotective buffer against chromosome attrition at each cell division. At a certain length, telomeres become too short to allow replication, a process that may lead to chromosome instability or cell death. Telomere length is regulated by two opposing mechanisms: attrition and elongation. Attrition occurs as each cell divides. In contrast, elongation is partially modulated by the enzyme telomerase, which adds repeating sequences to the ends of the chromosomes. In this way, telomerase could possibly reverse an aging mechanism and rejuvenate cell viability. These are crucial elements in maintaining cell life and are used to assess cellular aging. In this manuscript we describe an accurate, rapid and inexpensive method to assess telomere length in multiple tissues and species. This method takes advantage of two key elements: the tandem repeat of the telomere sequence and the sensitivity of qRT-PCR in detecting differential copy numbers among tested samples. In addition, we describe a simple assay to assess telomerase activity as a complementary test for telomere length.
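The qPCR readout is commonly converted to a relative telomere length as a telomere-to-single-copy-gene (T/S) ratio; a minimal sketch of that calculation with invented Ct values (the exact normalization used in the protocol may differ):

    # Relative telomere length from qPCR Ct values (T/S ratio, delta-delta-Ct style).
    # All Ct values below are invented for illustration.
    sample = {"telomere_ct": 14.2, "single_copy_ct": 22.8}    # single-copy reference gene
    control = {"telomere_ct": 15.0, "single_copy_ct": 22.5}   # calibrator / reference DNA

    delta_sample = sample["telomere_ct"] - sample["single_copy_ct"]
    delta_control = control["telomere_ct"] - control["single_copy_ct"]
    ts_ratio = 2 ** -(delta_sample - delta_control)
    print(f"relative telomere length (T/S): {ts_ratio:.2f}")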
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biomedical Engineering, Genomics, Telomere length, telomerase activity, telomerase, telomeres, telomere, DNA, PCR, polymerase chain reaction, qRT-PCR, sequencing, aging, telomerase assay
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
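The first step, estimating the local orientation of tissue patterns with a bank of Gabor filters, can be sketched with scikit-image as follows; the input image, filter frequency, and number of orientations are assumptions for illustration only.

    import numpy as np
    from skimage import data, filters

    image = data.camera().astype(float)              # stand-in for a mammographic region

    n_orientations = 8
    frequency = 0.1                                  # cycles per pixel (assumed)
    responses = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        real, imag = filters.gabor(image, frequency=frequency, theta=theta)
        responses.append(np.hypot(real, imag))       # magnitude of the filter response

    responses = np.stack(responses)                  # (orientation, rows, cols)
    orientation_map = responses.argmax(axis=0) * np.pi / n_orientations
    print("dominant orientation map shape:", orientation_map.shape)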
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns across different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
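For reference, the voxelwise fractional anisotropy used throughout this kind of analysis is computed from the three eigenvalues of the diffusion tensor; a minimal sketch with illustrative eigenvalues:

    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        # FA from the three diffusion-tensor eigenvalues (any consistent units).
        num = np.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
        den = np.sqrt(2.0 * (l1 ** 2 + l2 ** 2 + l3 ** 2))
        return num / den

    # Illustrative eigenvalues in 10^-3 mm^2/s.
    print(fractional_anisotropy(1.7, 0.3, 0.3))    # strongly anisotropic voxel, FA ~ 0.80
    print(fractional_anisotropy(0.9, 0.8, 0.85))   # nearly isotropic voxel, FA ~ 0.06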
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Authors: David C. Shih, Kevin C. Ho, Kyle M. Melnick, Ronald A. Rensink, Tobias R. Kollmann, Edgardo S. Fortuno III.
Institutions: University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analysis power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Micro-scale Engineering for Cell Biology
Authors: Joel Voldman.
Institutions: MIT - Massachusetts Institute of Technology.
Cellular Biology, Issue 8, stem cells, tissue engineering, bioengineering
Experimental Approaches to Tissue Engineering
Authors: Ali Khademhosseini.
Institutions: Brigham and Women's Hospital.
Issue 7, Cell Biology, tissue engineering, microfluidics, stem cells

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply has no content relevant to the topic of a given abstract. In those cases, our algorithms still display the closest matches they can find, which can sometimes result in matched videos with only a slight relation.
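As an illustration only (JoVE's actual matching algorithm is not described here in detail), this kind of abstract-to-video text matching can be approximated with a simple bag-of-words similarity; the toy corpus below is invented.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Toy texts standing in for one PubMed abstract and a few JoVE video descriptions.
    abstract = ["bibliometric citation analysis of publications at the physics-life science interface"]
    videos = [
        "confocal imaging of colloid polymer mixtures",
        "citation network analysis of scientific publications",
        "growth of mycobacterium tuberculosis biofilms",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(abstract + videos)

    scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    for score, title in sorted(zip(scores, videos), reverse=True):
        print(f"{score:.2f}  {title}")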