JoVE Visualize
Related JoVE Video
Pubmed Article
Can Handheld Thermal Imaging Technology Improve Detection of Poachers in African Bushveldt?
PUBLISHED: 06-26-2015
Illegal hunting (poaching) is a global threat to wildlife. Anti-poaching initiatives are making increasing use of technology, such as infrared thermography (IRT), to support traditional foot and vehicle patrols. To date, the effectiveness of IRT for poacher location has not been tested under field conditions, where thermal signatures are often complex. Here, we test the hypothesis that IRT will increase the distance over which a poacher hiding in African scrub bushveldt can be detected relative to a conventional flashlight. We also test whether any increase in effectiveness is related to the cost and complexity of the equipment by comparing a comparatively expensive (22,000 USD) IRT device (HIRT) with a relatively inexpensive (2,000 USD) one (LIRT). To test these hypotheses we employ a controlled, fully randomised, double-blind procedure to find a poacher in nocturnal field conditions in African bushveldt. Each of our 27 volunteer observers walked three times along a pathway, using one detection technology on each pass in randomised order. They searched a prescribed search area of bushveldt within which the target was hiding. Hiding locations were pre-determined, randomised, and changed with each pass. Distances of first detection and positive detection were noted. All technologies could be used to detect the target. The average first detection distance with the flashlight was 37.3 m, improving by 19.8 m to 57.1 m using LIRT and by a further 11.2 m to 68.3 m using HIRT. Although detection distances were significantly greater for both IRTs compared to the flashlight, there was no significant difference between LIRT and HIRT. False detection rates were low and there was no significant association between technology and accuracy of detection. Although IRT technology should ideally be tested in the specific environment intended before significant investment is made, we conclude that IRT technology is promising for anti-poaching patrols and that, for this purpose, low-cost IRT units are as effective as units ten times more expensive.
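As an illustration of how such a repeated-measures comparison might be analyzed (the abstract does not state which statistical tests the authors used), the following Python sketch applies a Friedman test with pairwise Wilcoxon follow-ups to invented detection-distance data centred on the reported means:

```python
# Illustrative analysis of a repeated-measures detection-distance design
# (27 observers x 3 technologies). The data below are made up for the
# sketch; the abstract does not state which statistical tests were used.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_observers = 27
# Hypothetical first-detection distances (m), centred on the reported means.
flashlight = rng.normal(37.3, 10, n_observers)
lirt = rng.normal(57.1, 10, n_observers)
hirt = rng.normal(68.3, 10, n_observers)

# Omnibus test across the three within-subject conditions.
chi2, p_omnibus = stats.friedmanchisquare(flashlight, lirt, hirt)

# Pairwise follow-up comparisons (paired, non-parametric).
_, p_fl_lirt = stats.wilcoxon(flashlight, lirt)
_, p_fl_hirt = stats.wilcoxon(flashlight, hirt)
_, p_lirt_hirt = stats.wilcoxon(lirt, hirt)

print(f"Friedman p = {p_omnibus:.4f}")
print(f"flashlight vs LIRT p = {p_fl_lirt:.4f}, "
      f"flashlight vs HIRT p = {p_fl_hirt:.4f}, "
      f"LIRT vs HIRT p = {p_lirt_hirt:.4f}")
```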
Authors: Frank Bucholtz.
Published: 05-29-2008
Site-specific recombinase (SSR) technology allows the manipulation of gene structure to explore gene function and has become an integral tool of molecular biology. Site-specific recombinases are proteins that bind to distinct DNA target sequences. The Cre/lox system was first described in bacteriophages during the 1980s. Cre recombinase is a Type I topoisomerase that catalyzes site-specific recombination of DNA between two loxP (locus of X-over P1) sites. The Cre/lox system does not require any cofactors. LoxP sequences contain distinct binding sites for Cre recombinases that surround a directional core sequence where recombination and rearrangement take place. When cells contain loxP sites and express the Cre recombinase, a recombination event occurs. Double-stranded DNA is cut at both loxP sites by the Cre recombinase, rearranged, and ligated ("scissors and glue"). Products of the recombination event depend on the relative orientation of the asymmetric sequences. SSR technology is frequently used as a tool to explore gene function. Here, the gene of interest is flanked with loxP Cre target sites ("floxed"). Animals are then crossed with animals expressing the Cre recombinase under the control of a tissue-specific promoter. In tissues that express the Cre recombinase, it binds to target sequences and excises the floxed gene. Controlled gene deletion allows the investigation of gene function in specific tissues and at distinct time points. Analysis of gene function employing SSR technology (conditional mutagenesis) has significant advantages over traditional knock-outs, where gene deletion is frequently lethal.
27 Related JoVE Articles
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, here we describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. Here we describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
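To illustrate the localization step at the heart of FPALM-type analysis, here is a minimal Python sketch that fits a 2D Gaussian to a simulated single-molecule camera region of interest; the pixel size, photon counts and ROI dimensions are invented for the example, and real pipelines add thresholding, drift correction and rendering:

```python
# Minimal sketch of single-molecule localization: fit a 2D Gaussian to a
# small camera ROI containing one active fluorophore and take the fitted
# centre as the molecule's position. All numbers are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
            + offset).ravel()

# Simulate a 9x9 pixel ROI (100 nm pixels) with one emitter plus shot noise.
pixel_nm = 100.0
yy, xx = np.mgrid[0:9, 0:9]
true_x, true_y = 4.3, 3.7          # emitter position in pixels
roi = gauss2d((xx, yy), 200, true_x, true_y, 1.3, 10).reshape(9, 9)
roi = np.random.poisson(roi).astype(float)

# Fit and convert the localized position to nanometres.
p0 = (roi.max(), 4, 4, 1.5, roi.min())
popt, pcov = curve_fit(gauss2d, (xx, yy), roi.ravel(), p0=p0)
x_nm, y_nm = popt[1] * pixel_nm, popt[2] * pixel_nm
prec_x_nm = np.sqrt(pcov[1, 1]) * pixel_nm
print(f"localized at ({x_nm:.1f} nm, {y_nm:.1f} nm), "
      f"fit precision ~ {prec_x_nm:.1f} nm")
```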
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
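A minimal Python sketch of the underlying DoE idea follows, using an invented two-level full-factorial design for three hypothetical factors and a simple interaction model, rather than the authors' software-guided, fractional and augmented designs:

```python
# Sketch of a design-of-experiments workflow: build a two-level
# full-factorial design for three hypothetical factors (coded -1/+1) and
# fit a linear model with two-way interactions to a measured response.
# The factors and response values are invented for illustration.
import itertools
import numpy as np

levels = [-1, 1]
design = np.array(list(itertools.product(levels, repeat=3)))  # 8 runs

# Hypothetical measured response (e.g. fluorescent reporter yield).
y = np.array([3.1, 4.0, 2.8, 5.2, 3.5, 4.6, 3.0, 6.1])

# Model matrix: intercept, main effects, two-way interactions.
A, B, C = design.T
X = np.column_stack([np.ones(len(y)), A, B, C, A * B, A * C, B * C])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

terms = ["intercept", "A", "B", "C", "A:B", "A:C", "B:C"]
for name, c in zip(terms, coefs):
    print(f"{name:>9}: {c:+.3f}")
```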
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Scalable High Throughput Selection From Phage-displayed Synthetic Antibody Libraries
Authors: Shane Miersch, Zhijian Li, Rachel Hanna, Megan E. McLaughlin, Michael Hornsby, Tet Matsuguchi, Marcin Paduch, Annika Sääf, Jim Wells, Shohei Koide, Anthony Kossiakoff, Sachdev S. Sidhu.
Institutions: The Recombinant Antibody Network, University of Toronto, University of California, San Francisco at Mission Bay, The University of Chicago.
The demand for antibodies that fulfill the needs of both basic and clinical research applications is high and will dramatically increase in the future. However, it is apparent that traditional monoclonal technologies alone are not up to this task. This has led to the development of alternate methods to satisfy the demand for high quality and renewable affinity reagents to all accessible elements of the proteome. Toward this end, high throughput methods for conducting selections from phage-displayed synthetic antibody libraries have been devised for applications involving diverse antigens and optimized for rapid throughput and success. Herein, a protocol is described in detail that illustrates with video demonstration the parallel selection of Fab-phage clones from high diversity libraries against hundreds of targets using either a manual 96 channel liquid handler or automated robotics system. Using this protocol, a single user can generate hundreds of antigens, select antibodies to them in parallel and validate antibody binding within 6-8 weeks. Highlighted are: i) a viable antigen format, ii) pre-selection antigen characterization, iii) critical steps that influence the selection of specific and high affinity clones, and iv) ways of monitoring selection effectiveness and early stage antibody clone characterization. With this approach, we have obtained synthetic antibody fragments (Fabs) to many target classes including single-pass membrane receptors, secreted protein hormones, and multi-domain intracellular proteins. These fragments are readily converted to full-length antibodies and have been validated to exhibit high affinity and specificity. Further, they have been demonstrated to be functional in a variety of standard immunoassays including Western blotting, ELISA, cellular immunofluorescence, immunoprecipitation and related assays. This methodology will accelerate antibody discovery and ultimately bring us closer to realizing the goal of generating renewable, high quality antibodies to the proteome.
Immunology, Issue 95, Bacteria, Viruses, Amino Acids, Peptides, and Proteins, Nucleic Acids, Nucleotides, and Nucleosides, Life Sciences (General), phage display, synthetic antibodies, high throughput, antibody selection, scalable methodology
Reduced-gravity Environment Hardware Demonstrations of a Prototype Miniaturized Flow Cytometer and Companion Microfluidic Mixing Technology
Authors: William S. Phipps, Zhizhong Yin, Candice Bae, Julia Z. Sharpe, Andrew M. Bishara, Emily S. Nelson, Aaron S. Weaver, Daniel Brown, Terri L. McKay, DeVon Griffin, Eugene Y. Chan.
Institutions: DNA Medicine Institute, Harvard Medical School, NASA Glenn Research Center, ZIN Technologies.
Until recently, astronaut blood samples were collected in-flight, transported to earth on the Space Shuttle, and analyzed in terrestrial laboratories. If humans are to travel beyond low Earth orbit, a transition towards space-ready, point-of-care (POC) testing is required. Such testing needs to be comprehensive, easy to perform in a reduced-gravity environment, and unaffected by the stresses of launch and spaceflight. Countless POC devices have been developed to mimic laboratory scale counterparts, but most have narrow applications and few have demonstrable use in an in-flight, reduced-gravity environment. In fact, demonstrations of biomedical diagnostics in reduced gravity are limited altogether, making component choice and certain logistical challenges difficult to approach when seeking to test new technology. To help fill the void, we present a modular method for the construction and operation of a prototype blood diagnostic device and its associated parabolic flight test rig that meet the standards for flight-testing onboard a parabolic flight, reduced-gravity aircraft. The method first focuses on rig assembly for in-flight, reduced-gravity testing of a flow cytometer and a companion microfluidic mixing chip. Components are adaptable to other designs and some custom components, such as a microvolume sample loader and the micromixer, may be of particular interest. The method then shifts focus to flight preparation, by offering guidelines and suggestions to prepare for a successful flight test with regard to user training, development of a standard operating procedure (SOP), and other issues. Finally, in-flight experimental procedures specific to our demonstrations are described.
Cellular Biology, Issue 93, Point-of-care, prototype, diagnostics, spaceflight, reduced gravity, parabolic flight, flow cytometry, fluorescence, cell counting, micromixing, spiral-vortex, blood mixing
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
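As a simplified illustration (not the exact models described in the paper), the following Python sketch fits melting temperatures measured at several hypothetical ligand concentrations to a single-site binding isotherm to estimate an apparent dissociation constant:

```python
# Simplified sketch of extracting an apparent Kd from DSF data by fitting
# melting temperatures (Tm) at several ligand concentrations to a
# single-site isotherm: Tm = Tm0 + dTmax * [L] / (Kd_app + [L]).
# Concentrations and Tm values are invented; the paper describes more
# rigorous models, including cooperative and multi-site cases.
import numpy as np
from scipy.optimize import curve_fit

def single_site(L, tm0, d_tmax, kd):
    return tm0 + d_tmax * L / (kd + L)

ligand_uM = np.array([0, 5, 10, 25, 50, 100, 250, 500.0])
tm_C = np.array([52.0, 52.9, 53.6, 54.8, 55.6, 56.3, 56.9, 57.1])

popt, pcov = curve_fit(single_site, ligand_uM, tm_C, p0=(52, 5, 20))
tm0, d_tmax, kd_app = popt
kd_err = np.sqrt(np.diag(pcov))[2]
print(f"apparent Kd = {kd_app:.1f} +/- {kd_err:.1f} uM (Tm0 = {tm0:.1f} C)")
```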
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Universal Hand-held Three-dimensional Optoacoustic Imaging Probe for Deep Tissue Human Angiography and Functional Preclinical Studies in Real Time
Authors: Xosé Deán-Ben, Thomas Felix Fehm, Daniel Razansky.
Institutions: Helmholtz Zentrum München, Technische Universität München.
The exclusive combination of high optical contrast and excellent spatial resolution makes optoacoustics (photoacoustics) ideal for simultaneously attaining anatomical, functional and molecular contrast in deep optically opaque tissues. While enormous potential has been recently demonstrated in the application of optoacoustics for small animal research, vast efforts have also been undertaken in translating this imaging technology into clinical practice. We present here a newly developed optoacoustic tomography approach capable of delivering high resolution and spectrally enriched volumetric images of tissue morphology and function in real time. A detailed description of the experimental protocol for operating with the imaging system in both hand-held and stationary modes is provided and showcased for different potential scenarios involving functional and molecular studies in murine models and humans. The possibility for real time visualization in three dimensions along with the versatile handheld design of the imaging probe make the newly developed approach unique among the pantheon of imaging modalities used in today’s preclinical research and clinical practice.
Physiology, Issue 93, Optoacoustic tomography, photoacoustic imaging, hand-held probe, volumetric imaging, real-time tomography, five dimensional imaging, clinical imaging, functional imaging, molecular imaging, preclinical research
Quantitative Detection of Trace Explosive Vapors by Programmed Temperature Desorption Gas Chromatography-Electron Capture Detector
Authors: Christopher R. Field, Adam Lubrano, Morgan Woytowitz, Braden C. Giordano, Susan L. Rose-Pehrsson.
Institutions: U.S. Naval Research Laboratory, NOVA Research, Inc., U.S. Naval Research Laboratory, U.S. Naval Research Laboratory.
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples.
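A Python sketch of the quantitation step implied above follows, using a hypothetical linear calibration built from liquid-deposited standards to convert a vapor sample's peak area into collected analyte mass; the standard masses and peak areas are invented:

```python
# Sketch of calibration-based quantitation: solution standards deposited
# directly onto desorption tubes are run on the GC-ECD, and the resulting
# calibration curve converts a vapor sample's peak area into mass of
# analyte collected. All values below are hypothetical.
import numpy as np

std_mass_ng = np.array([0.5, 1.0, 2.5, 5.0, 10.0])        # deposited TNT mass
std_peak_area = np.array([1.2e4, 2.3e4, 5.9e4, 1.18e5, 2.31e5])

# Linear least-squares calibration: area = slope * mass + intercept.
slope, intercept = np.polyfit(std_mass_ng, std_peak_area, 1)

# Quantify a vapor sample collected on an identical sorbent tube.
sample_area = 7.4e4
sample_mass_ng = (sample_area - intercept) / slope
print(f"collected mass ~ {sample_mass_ng:.2f} ng")

# Dividing by the sampled air volume would give a vapor concentration,
# e.g. ng per liter of air drawn through the tube.
```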
Chemistry, Issue 89, Gas Chromatography (GC), Electron Capture Detector, Explosives, Quantitation, Thermal Desorption, TNT, RDX
Measuring Attentional Biases for Threat in Children and Adults
Authors: Vanessa LoBue.
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
Development of a Quantitative Recombinase Polymerase Amplification Assay with an Internal Positive Control
Authors: Zachary A. Crannell, Brittany Rohrman, Rebecca Richards-Kortum.
Institutions: Rice University.
It was recently demonstrated that recombinase polymerase amplification (RPA), an isothermal amplification platform for pathogen detection, may be used to quantify DNA sample concentration using a standard curve. In this manuscript, a detailed protocol for developing and implementing a real-time quantitative recombinase polymerase amplification assay (qRPA assay) is provided. Using HIV-1 DNA quantification as an example, the assembly of real-time RPA reactions, the design of an internal positive control (IPC) sequence, and co-amplification of the IPC and target of interest are all described. Instructions and data processing scripts for the construction of a standard curve using data from multiple experiments are provided, which may be used to predict the concentration of unknown samples or assess the performance of the assay. Finally, an alternative method for collecting real-time fluorescence data with a microscope and a stage heater as a step towards developing a point-of-care qRPA assay is described. The protocol and scripts provided may be used for the development of a qRPA assay for any DNA target of interest.
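To illustrate the standard-curve concept (with invented numbers; the paper supplies its own data-processing scripts), the following Python sketch fits threshold time against log10 of the starting copy number and inverts the fit to predict the concentration of an unknown sample:

```python
# Sketch of a real-time qRPA standard curve: the time at which each
# reaction's fluorescence crosses a threshold ("threshold time") is roughly
# linear in log10 of the starting copy number, so a line fitted to
# standards can predict unknown concentrations. All numbers are invented.
import numpy as np

log10_copies = np.array([2, 3, 4, 5, 6.0])            # standards (copies/reaction)
threshold_time_min = np.array([11.8, 10.1, 8.5, 6.9, 5.2])

slope, intercept = np.polyfit(log10_copies, threshold_time_min, 1)

def predict_copies(t_min):
    """Invert the standard curve for an unknown sample's threshold time."""
    return 10 ** ((t_min - intercept) / slope)

print(f"slope = {slope:.2f} min/log10, intercept = {intercept:.2f} min")
print(f"unknown at 7.6 min ~ {predict_copies(7.6):.0f} copies/reaction")
```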
Genetics, Issue 97, recombinase polymerase amplification, isothermal amplification, quantitative, diagnostic, HIV-1, viral load
Making Record-efficiency SnS Solar Cells by Thermal Evaporation and Atomic Layer Deposition
Authors: Rafael Jaramillo, Vera Steinmann, Chuanxi Yang, Katy Hartman, Rupak Chakraborty, Jeremy R. Poindexter, Mariela Lizet Castillo, Roy Gordon, Tonio Buonassisi.
Institutions: Massachusetts Institute of Technology, Massachusetts Institute of Technology, Harvard University, Massachusetts Institute of Technology, Harvard University.
Tin sulfide (SnS) is a candidate absorber material for Earth-abundant, non-toxic solar cells. SnS offers easy phase control and rapid growth by congruent thermal evaporation, and it absorbs visible light strongly. However, for a long time the record power conversion efficiency of SnS solar cells remained below 2%. Recently we demonstrated new certified record efficiencies of 4.36% using SnS deposited by atomic layer deposition, and 3.88% using thermal evaporation. Here the fabrication procedure for these record solar cells is described, and the statistical distribution of the fabrication process is reported. The standard deviation of efficiency measured on a single substrate is typically over 0.5%. All steps including substrate selection and cleaning, Mo sputtering for the rear contact (cathode), SnS deposition, annealing, surface passivation, Zn(O,S) buffer layer selection and deposition, transparent conductor (anode) deposition, and metallization are described. On each substrate we fabricate 11 individual devices, each with active area 0.25 cm2. Further, a system for high throughput measurements of current-voltage curves under simulated solar light, and external quantum efficiency measurement with variable light bias is described. With this system we are able to measure full data sets on all 11 devices in an automated manner and in minimal time. These results illustrate the value of studying large sample sets, rather than focusing narrowly on the highest performing devices. Large data sets help us to distinguish and remedy individual loss mechanisms affecting our devices.
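As an illustration of the figures of merit extracted in this kind of high-throughput J-V screening, the following Python sketch computes short-circuit current density, open-circuit voltage, fill factor and power conversion efficiency from a synthetic ideal-diode J-V curve (assumed 1-sun input of 100 mW/cm2; not measured SnS-device data):

```python
# Sketch of figure-of-merit extraction from a current density-voltage
# curve under simulated 1-sun illumination. The curve is synthetic.
import numpy as np

v = np.linspace(0, 0.45, 200)                      # volts
jsc, j0, n_kt = 20.0, 1.5e-3, 0.039                # mA/cm2, diode parameters
j = jsc - j0 * (np.exp(v / n_kt) - 1)              # ideal-diode J-V curve

power = j * v                                      # mW/cm2
p_max = power.max()
voc = np.interp(0.0, j[::-1], v[::-1])             # voltage where J crosses 0
ff = p_max / (jsc * voc)
pce = p_max / 100.0 * 100                          # % of 100 mW/cm2 input

print(f"Jsc = {jsc:.1f} mA/cm2, Voc = {voc:.3f} V, "
      f"FF = {ff:.2f}, PCE = {pce:.2f} %")
```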
Engineering, Issue 99, Solar cells, thin films, thermal evaporation, atomic layer deposition, annealing, tin sulfide
Morris Water Maze Test: Optimization for Mouse Strain and Testing Environment
Authors: Daniel S. Weitzner, Elizabeth B. Engler-Chiurazzi, Linda A. Kotilinek, Karen Hsiao Ashe, Miranda Nicole Reed.
Institutions: West Virginia University, West Virginia University, N. Bud Grossman Center for Memory Research and Care, University of Minnesota, N. Bud Grossman Center for Memory Research and Care, University of Minnesota, GRECC, VA Medical Center, West Virginia University.
The Morris water maze (MWM) is a commonly used task to assess hippocampal-dependent spatial learning and memory in transgenic mouse models of disease, including neurocognitive disorders such as Alzheimer’s disease. However, the background strain of the mouse model used can have a substantial effect on the observed behavioral phenotype, with some strains exhibiting superior learning ability relative to others. To ensure differences between transgene negative and transgene positive mice can be detected, identification of a training procedure sensitive to the background strain is essential. Failure to tailor the MWM protocol to the background strain of the mouse model may lead to under- or over- training, thereby masking group differences in probe trials. Here, a MWM protocol tailored for use with the F1 FVB/N x 129S6 background is described. This is a frequently used background strain to study the age-dependent effects of mutant P301L tau (rTg(TauP301L)4510 mice) on the memory deficits associated with Alzheimer’s disease. Also described is a strategy to re-optimize, as dictated by the particular testing environment utilized.
Behavior, Issue 100, Spatial learning, spatial reference memory, Morris water maze, Alzheimer’s disease, behavior, tau, hippocampal-dependent learning, rTg4510, Tg2576, strain background, transgenic mouse models
Scanning-probe Single-electron Capacitance Spectroscopy
Authors: Kathleen A. Walsh, Megan E. Romanowich, Morewell Gasseller, Irma Kuljanishvili, Raymond Ashoori, Stuart Tessmer.
Institutions: Michigan State University, Mercyhurst University, Saint Louis University, Massachusetts Institute of Technology.
The integration of low-temperature scanning-probe techniques and single-electron capacitance spectroscopy represents a powerful tool to study the electronic quantum structure of small systems - including individual atomic dopants in semiconductors. Here we present a capacitance-based method, known as Subsurface Charge Accumulation (SCA) imaging, which is capable of resolving single-electron charging while achieving sufficient spatial resolution to image individual atomic dopants. The use of a capacitance technique enables observation of subsurface features, such as dopants buried many nanometers beneath the surface of a semiconductor material1,2,3. In principle, this technique can be applied to any system to resolve electron motion below an insulating surface. As in other electric-field-sensitive scanned-probe techniques4, the lateral spatial resolution of the measurement depends in part on the radius of curvature of the probe tip. Using tips with a small radius of curvature can enable spatial resolution of a few tens of nanometers. This fine spatial resolution allows investigations of small numbers (down to one) of subsurface dopants1,2. The charge resolution depends greatly on the sensitivity of the charge detection circuitry; using high electron mobility transistors (HEMT) in such circuits at cryogenic temperatures enables a sensitivity of approximately 0.01 electrons/Hz½ at 0.3 K 5.
Physics, Issue 77, Biophysics, Molecular Biology, Cellular Biology, Microscopy, Scanning Probe, Nanotechnology, Physics, Electronics, acceptors (solid state), donors (solid state), Solid-State Physics, tunneling microscopy, scanning capacitance microscopy, subsurface charge accumulation imaging, capacitance spectroscopy, scanning probe microscopy, single-electron spectroscopy, imaging
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono, University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation, and more broadly, their allergic potential.
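Two calculations implied above can be sketched in a few lines of Python: confirming the triclosan concentration from A280 via the Beer-Lambert law (using the cited ε280 = 4,200 L/M/cm) and expressing β-hexosaminidase release as percent degranulation; the absorbance and fluorescence readings below are hypothetical:

```python
# (1) Beer-Lambert concentration check: A = epsilon * c * l.
A280 = 0.105                # measured absorbance, hypothetical
epsilon = 4200.0            # L / (mol * cm), as cited in the abstract
path_cm = 1.0
conc_M = A280 / (epsilon * path_cm)
print(f"triclosan ~ {conc_M * 1e6:.0f} uM")

# (2) Percent degranulation from microplate fluorescence, relative to
# spontaneous release and total (lysed) granule content. Values invented.
f_sample = 1850.0           # stimulated, triclosan-treated well
f_spontaneous = 420.0       # unstimulated well
f_total = 5600.0            # detergent-lysed well (total releasable content)
pct = 100 * (f_sample - f_spontaneous) / (f_total - f_spontaneous)
print(f"degranulation ~ {pct:.1f} % of total releasable content")
```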
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Authors: Kathleen Ongena, Chandreyee Das, Janet L. Smith, Sónia Gil, Grace Johnston.
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses. Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed1 count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of errors in hemocytometry include: uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation of hemocytometer filling rate2. To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, or flow cytometry1. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments1. The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection3 in a miniaturized format using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated, cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics, graphically displayed as a histogram.
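For comparison, the manual calculation that the Scepter automates can be sketched as follows (counts are hypothetical; each large square of a standard hemocytometer holds 0.1 µl, i.e. 1e-4 ml):

```python
# Converting hemocytometer square counts into a cell concentration:
# cells/mL = mean count per large square x dilution factor x 1e4.
counts = [42, 38, 45, 40]        # cells counted in four large squares
dilution_factor = 2              # e.g. 1:1 mix with trypan blue
mean_count = sum(counts) / len(counts)
cells_per_ml = mean_count * dilution_factor * 1e4
print(f"{cells_per_ml:.2e} cells/mL")

# Volume of suspension needed to seed 2e5 cells per well:
cells_needed = 2e5
print(f"seed volume: {cells_needed / cells_per_ml * 1000:.1f} uL per well")
```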
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
Comprehensive & Cost Effective Laboratory Monitoring of HIV/AIDS: an African Role Model
Authors: Denise Lawrie, George Janossy, Maarten Roos, Deborah K. Glencross.
Institutions: National Health Laboratory Services (NHLS-SA), University of Witwatersrand, Lightcurve Films.
We present a video about supporting anti-retroviral therapy (ART) with an apt laboratory service, representing a South African role model for economical large-scale diagnostic testing. In low-income countries, inexpensive ART has transformed the prospects for the survival of HIV-seropositive patients, but there are doubts about whether laboratory monitoring of ART is needed, and at what cost, in situations where the overall quality of pathology services can still be very low. The appropriate answer is to establish economically sound services with better coordination and stricter internal quality assessment than seen in western countries. This video, photographed on location in the National Health Laboratory Services (NHLS-SA) at the Witwatersrand University, Johannesburg, South Africa, provides such a coordinated scheme expanding the original 2-color CD4-CD45 PanLeucoGating strategy (PLG). Thus the six modules of the video presentation reveal the simplicity of a 4-color flow cytometric assay to combine haematological, immunological and virology-related tests in a single tube. These video modules are: (i) the set-up of instruments; (ii) sample preparations; (iii) testing absolute counts and monitoring quality for each sample by bead-count-rate; (iv) the haematological CD45 test for white cell counts and differentials; (v) the CD4 counts; and (vi) the activation of CD8+ T cells measured by CD38 display, a viral load related parameter. The potential cost-savings are remarkable. This arrangement is a prime example of the feasibility of performing > 800-1000 tests per day with a stricter quality control than that applied in western laboratories, and also with a transfer of technology to other laboratories within a NHLS-SA network. Expert advisors, laboratory managers and policy makers who carry the duty of making decisions about introducing modern medical technology are frequently not in a position to see the latest technical details as carried out in the large regional laboratories with huge burdens of workload. Hence this video shows details of these new developments.
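A sketch of the single-platform, bead-referenced absolute-count calculation that underlies this kind of assay follows; the event numbers, bead count and acquisition time are hypothetical and are not taken from the video:

```python
# Single-platform absolute counting with reference beads, as commonly used
# when CD4 counts are derived in the same tube that is quality-checked by
# bead-count-rate. All numbers are hypothetical.
cd4_events = 1250          # gated CD4+ T-cell events acquired
bead_events = 5000         # reference bead events acquired in the same run
beads_per_tube = 50000     # beads added to the tube (manufacturer value)
sample_volume_ul = 50.0    # volume of blood stained in the tube

cd4_per_ul = (cd4_events / bead_events) * (beads_per_tube / sample_volume_ul)
print(f"absolute CD4 count ~ {cd4_per_ul:.0f} cells/uL")

# Bead-count-rate QC: beads acquired per second should stay near the
# expected rate; a large deviation flags pipetting or flow problems.
acquisition_time_s = 120.0
print(f"bead count rate = {bead_events / acquisition_time_s:.1f} beads/s")
```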
Immunology, Issue 44, Human Immunodeficiency virus (HIV); CD4 lymphocyte count, white cell count, CD45, panleucogating, lymphocyte activation, CD38, HIV viral load, antiretroviral therapy (ART), internal quality control
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Authors: Cila Herman, Muge Pirtini Cetingul.
Institutions: The Johns Hopkins University.
In 2010 approximately 68,720 melanomas will be diagnosed in the US alone, with around 8,650 resulting in death1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid the diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Along with the surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which pose a compromise between effectiveness and accuracy versus ease of use and cost considerations. Details about these techniques and comparisons are available in the literature4. Infrared (IR) imaging was shown to be a useful method to diagnose the signs of certain diseases by measuring the local skin temperature. There is a large body of evidence showing that disease or deviation from normal functioning is accompanied by changes in the temperature of the body, which in turn affect the temperature of the skin5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular the deviation from normal conditions, often caused by disease. However, IR imaging has not been widely recognized in medicine due to the premature use of the technology7,8 several decades ago, when temperature measurement accuracy and the spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s-2000s. Advances in IR instrumentation, implementation of digital image processing algorithms and dynamic IR imaging, which enables scientists to analyze not only the spatial, but also the temporal thermal behavior of the skin9, allowed breakthroughs in the field. In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma10-13. In this study, we show data obtained in a patient study in which patients who possess a pigmented lesion with a clinical indication for biopsy are selected for imaging. We compared the difference in thermal responses between healthy and malignant tissue and compared our data with biopsy results. We concluded that the increased metabolic activity of the melanoma lesion can be detected by dynamic infrared imaging.
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
ampliPHOX Colorimetric Detection on a DNA Microarray for Influenza
Authors: Kevin R. Moulton, Amber W. Taylor, Kathy L. Rowlen, Erica D. Dawson.
Institutions: Inc.
DNA microarrays have emerged as a powerful tool for pathogen detection.1-5 For instance, many examples of the ability to type and subtype influenza virus have been demonstrated.6-11 The identification and subtyping of influenza on DNA microarrays has applications in both public health and the clinic for early detection, rapid intervention, and minimizing the impact of an influenza pandemic. Traditional fluorescence is currently the most commonly used microarray detection method. However, as microarray technology progresses towards clinical use,1 replacing expensive instrumentation with low cost detection technology exhibiting similar performance characteristics to fluorescence will make microarray assays more attractive and cost-effective. The ampliPHOX colorimetric detection technology is intended for research applications, and has a limit of detection within one order of magnitude of traditional fluorescence11, with a main advantage being an approximate ten-fold lower instrument cost compared to the confocal microarray scanners required for fluorescence microarray detection. Another advantage is the compact size of the instrument which allows for portability and flexibility, unlike traditional fluorescence instruments. Because the polymerization technology is not as inherently linear as fluorescence detection, however, it is best suited for lower density microarray applications in which a yes/no answer for the presence of a certain sequence is desired, such as for pathogen detection arrays. Currently the maximum spot density compatible with ampliPHOX detection is ~1800 spots/array. Because of the spot density limitations, higher density microarrays are not suitable for ampliPHOX detection. Here, we present ampliPHOX colorimetric detection technology as a method of signal amplification on a low density microarray developed for the detection and characterization of influenza viruses (FluChip). Although this protocol uses the FluChip (a DNA microarray) as one specific application of ampliPHOX detection, any microarray incorporating biotinylated target can be labeled and detected in a similar manner. The microarray design and biotinylation of the target to be captured are the responsibility of the user. Once the biotinylated target has been captured on the array, ampliPHOX detection can be performed by first tagging the array with a streptavidin-label conjugate (ampliTAG). Upon light exposure using the ampliPHOX Reader instrument, polymerization of a monomer solution (ampliPHY) occurs only in regions containing ampliTAG-labeled targets. The polymer formed can be subsequently stained with a non-toxic solution to improve visual contrast, followed by imaging and analysis using a simple software package (ampliVIEW). The entire FluChip assay from un-extracted sample to result can be performed in about 6 hours, and the ampliPHOX detection steps described above can be completed in about 30 min.
Immunology, Issue 52, microarrays, colorimetric detection, ampliPHOX, diagnostic, low-density, pathogen detection, influenza
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the sole means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein related targets with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
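The qPCR expression screening mentioned above can be summarized, for example, with the delta-delta-Ct method; the following sketch uses hypothetical Ct values and is not necessarily the authors' exact analysis pipeline:

```python
# Relative gene expression by the delta-delta-Ct (2^-ddCt) method with a
# reference gene. Ct values below are invented for illustration.
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression (treated vs control) by 2^-ddCt."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Example: a cytokine transcript after spreading depression vs sham slices.
print(f"fold change = {fold_change(24.1, 18.0, 27.3, 18.2):.2f}")
```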
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Authors: Thomas Bodmer, Angelika Ströhle.
Institutions: University of Bern, MCL Laboratories Inc..
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population1, and almost two million people are killed by TB each year.2 Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3 The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are a prerequisite for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2 Effective diagnosis of pulmonary TB requires the availability - on a global scale - of standardized, easy-to-use, and robust diagnostic tools that would allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR TB) and has been reported in > 95% of the MDR-TB isolates.4, 5 The rapid availability of reliable test results is likely to directly translate into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2 Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6, 7 meets the demands outlined above in a remarkable manner. It is a nucleic-acids amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene.8 It is designed for use with Cepheid's GeneXpert Dx System that integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, personal computer, barcode scanner, and preloaded software for running tests and viewing the results.9 It employs single-use disposable Xpert MTB/RIF cartridges that hold PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6 Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3, provided that it can be afforded. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
The Use of Thermal Infra-Red Imaging to Detect Delayed Onset Muscle Soreness
Authors: Hani H. Al-Nakhli, Jerrold S. Petrofsky, Michael S. Laymon, Lee S. Berk.
Institutions: Loma Linda University, Azusa Pacific University.
Delayed onset muscle soreness (DOMS), also known as exercise induced muscle damage (EIMD), is commonly experienced in individuals who have been physically inactive for prolonged periods of time and then begin an unexpected bout of exercise1-4, but can also occur in athletes who exercise beyond their normal limits of training5. The symptoms associated with this painful phenomenon can range from slight muscle tenderness to severe debilitating pain1,3,5. The intensity of these symptoms and the related discomfort increases within the first 24 hours following the termination of the exercise, and peaks between 24 to 72 hours post exercise1,3. For this reason, DOMS is one of the most common recurrent forms of sports injury that can affect an individual’s performance, and become intimidating for many1,4. For the last 3 decades, the DOMS phenomenon has gained a considerable amount of interest amongst researchers and specialists in exercise physiology, sports, and rehabilitation fields6. There has been a variety of published studies investigating this painful occurrence in regards to its underlying mechanisms, treatment interventions, and preventive strategies1-5,7-12. However, it is evident from the literature that DOMS is not an easy pathology to quantify, as there is a wide amount of variability between the measurement tools and methods used to quantify this condition6. It is obvious that no agreement has been made on one best evaluation measure for DOMS, which makes it difficult to verify whether a specific intervention really helps in decreasing the symptoms associated with this type of soreness. Thus, DOMS can be seen as somewhat ambiguous, because many studies depend on measuring soreness using a visual analog scale (VAS)10,13-15, which is a subjective rather than an objective measure. Even though needle biopsies of the muscle and blood levels of myofibre proteins might be considered a gold standard to some6, large variations in some of these blood proteins have been documented6,16, in addition to the high risks sometimes associated with invasive techniques. Therefore, in the current investigation, we tested a thermal infra-red (IR) imaging technique of the skin above the exercised muscle to detect the associated muscle soreness. Infra-red thermography has been used, and found to be successful in detecting different types of diseases and infections, since the 1950s17. Surprisingly, however, almost nothing has been done on DOMS and changes in skin temperature. The main purpose of this investigation was to examine changes in DOMS using this safe and non-invasive technique.
Medicine, Issue 59, DOMS, Imaging, Thermal, Infra-Red, Muscle, Soreness, Thermography
Detection of Invasive Pulmonary Aspergillosis in Haematological Malignancy Patients by using Lateral-flow Technology
Authors: Christopher Thornton, Gemma Johnson, Samir Agrawal.
Institutions: University of Exeter, Queen Mary University of London, St. Bartholomew's Hospital and The London NHS Trust.
Invasive pulmonary aspergillosis (IPA) is a leading cause of morbidity and mortality in haematological malignancy patients and hematopoietic stem cell transplant recipients1. Detection of IPA represents a formidable diagnostic challenge and, in the absence of a 'gold standard', relies on a combination of clinical data and microbiology and histopathology where feasible. Diagnosis of IPA must conform to the European Organization for Research and Treatment of Cancer and the National Institute of Allergy and Infectious Diseases Mycology Study Group (EORTC/MSG) consensus defining "proven", "probable", and "possible" invasive fungal diseases2. Currently, no nucleic acid-based tests have been externally validated for IPA detection and so polymerase chain reaction (PCR) is not included in current EORTC/MSG diagnostic criteria. Identification of Aspergillus in histological sections is problematic because of similarities in hyphal morphologies with other invasive fungal pathogens3, and proven identification requires isolation of the etiologic agent in pure culture. Culture-based approaches rely on the availability of biopsy samples, but these are not always accessible in sick patients, and do not always yield viable propagules for culture when obtained. An important feature in the pathogenesis of Aspergillus is angio-invasion, a trait that provides opportunities to track the fungus immunologically using tests that detect characteristic antigenic signature molecules in serum and bronchoalveolar lavage (BAL) fluids. This has led to the development of the Platelia enzyme immunoassay (GM-EIA) that detects Aspergillus galactomannan and a 'pan-fungal' assay (Fungitell test) that detects the conserved fungal cell wall component (1→3)-β-D-glucan, but not in the mucorales that lack this component in their cell walls1,4. Issues surrounding the accuracy of these tests1,4-6 have led to the recent development of next-generation monoclonal antibody (MAb)-based assays that detect surrogate markers of infection1,5. Thornton5 recently described the generation of an Aspergillus-specific MAb (JF5) using hybridoma technology and its use to develop an immuno-chromatographic lateral-flow device (LFD) for the point-of-care (POC) diagnosis of IPA. A major advantage of the LFD is its ability to detect activity since MAb JF5 binds to an extracellular glycoprotein antigen that is secreted during active growth of the fungus only5. This is an important consideration when using fluids such as lung BAL for diagnosing IPA since Aspergillus spores are a common component of inhaled air. The utility of the device in diagnosing IPA has been demonstrated using an animal model of infection, where the LFD displayed improved sensitivity and specificity compared to the Platelia GM and Fungitell (1→3)-β-D-glucan assays7. Here, we present a simple LFD procedure to detect Aspergillus antigen in human serum and BAL fluids. Its speed and accuracy provide a novel adjunct point-of-care test for diagnosis of IPA in haematological malignancy patients.
Immunology, Issue 61, Invasive pulmonary aspergillosis, acute myeloid leukemia, bone marrow transplant, diagnosis, monoclonal antibody, lateral-flow technology
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). However, studies of the affective valence of subjective responses to non-human characters of varying realism have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology of the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
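As a hedged illustration of the categorical-perception analysis the abstract alludes to (not the authors' actual pipeline), one common approach is to fit a logistic function to categorization responses along the morph continuum and take its inflection point as the category boundary; the morph levels and response proportions below are invented for demonstration:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Proportion of 'human' categorizations as a function of morph level."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical data: 11 morph levels from fully artificial (0) to fully human (1),
# with the observed proportion of trials categorized as 'human' at each level.
morph_levels = np.linspace(0.0, 1.0, 11)
p_human = np.array([0.02, 0.03, 0.05, 0.10, 0.25, 0.55, 0.80, 0.92, 0.96, 0.98, 0.99])

(x0, k), _ = curve_fit(logistic, morph_levels, p_human, p0=[0.5, 10.0])
print(f"Estimated category boundary at morph level {x0:.2f} (slope {k:.1f})")
# A steep slope around x0, combined with better discrimination of stimulus pairs
# that straddle x0 than of equally spaced within-category pairs, is the classic
# behavioral signature of categorical perception.
```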
Driving Simulation in the Clinic: Testing Visual Exploratory Behavior in Daily Life Activities in Patients with Visual Field Defects
Authors: Johanna Hamel, Antje Kraft, Sven Ohl, Sophie De Beukelaer, Heinrich J. Audebert, Stephan A. Brandt.
Institutions: Universitätsmedizin Charité, Universitätsmedizin Charité, Humboldt Universität zu Berlin.
Patients suffering from homonymous hemianopia after infarction of the posterior cerebral artery (PCA) report different degrees of constraint in daily life, despite similar visual deficits. We assume this could be due to variable development of compensatory strategies such as altered visual scanning behavior. Scanning compensatory therapy (SCT) is studied as part of visual training after infarction, alongside vision restoration therapy. SCT consists of learning to make larger eye movements into the blind field, enlarging the visual field of search, which has been proven to be the most useful strategy1, not only in natural search tasks but also in mastering daily life activities2. Nevertheless, in clinical routine it is difficult to identify individual levels and training effects of compensatory behavior, since this requires measurement of eye movements in a head-unrestrained condition. Studies have demonstrated that unrestrained head movements alter visual exploratory behavior compared to a head-restrained laboratory condition3. Martin et al.4 and Hayhoe et al.5 showed that behavior demonstrated in a laboratory setting cannot be assigned easily to a natural condition. Hence, our goal was to develop a study set-up that quickly uncovers different compensatory oculomotor strategies in a realistic testing situation: patients are tested in the clinical environment in a driving simulator. SILAB software (Wuerzburg Institute for Traffic Sciences GmbH (WIVW)) was used to program driving scenarios of varying complexity and to record the driver's performance. The software was combined with a head-mounted infrared video pupil tracker that records head and eye movements (EyeSeeCam, University of Munich Hospital, Clinical Neurosciences). The positioning of the patient in the driving simulator and the positioning, adjustment, and calibration of the camera are demonstrated. Typical performances of a patient with and without a compensatory strategy and of a healthy control are illustrated in this pilot study. Different oculomotor behaviors (frequency and amplitude of eye and head movements) are evaluated very quickly during the drive itself by dynamic overlay pictures indicating where the subject's gaze is located on the screen, and by analyzing the data. Compensatory gaze behavior in a patient leads to a driving performance comparable to a healthy control, while the performance of a patient without compensatory behavior is significantly worse. The eye- and head-movement data as well as driving performance are discussed with respect to different oculomotor strategies and, in a broader context, with respect to possible training effects throughout the testing session and implications for rehabilitation potential.
Medicine, Issue 67, Neuroscience, Physiology, Anatomy, Ophthalmology, compensatory oculomotor behavior, driving simulation, eye movements, homonymous hemianopia, stroke, visual field defects, visual field enlargement
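The abstract evaluates the frequency and amplitude of eye movements; as a minimal sketch of how such statistics could be derived from a recorded gaze trace, the code below applies a simple velocity-threshold saccade detector to hypothetical data (this is a generic illustration, not the EyeSeeCam or SILAB analysis itself):

```python
import numpy as np

def saccade_stats(gaze_deg: np.ndarray, fs: float, vel_thresh: float = 30.0):
    """Estimate saccade rate (per second) and mean amplitude (degrees) from a
    1-D horizontal gaze trace sampled at fs Hz.

    A sample is labeled saccadic when absolute eye velocity exceeds
    vel_thresh deg/s; consecutive saccadic samples are merged into one event.
    """
    velocity = np.abs(np.gradient(gaze_deg) * fs)          # deg/s
    saccadic = velocity > vel_thresh
    edges = np.diff(saccadic.astype(int))                   # run boundaries
    onsets = np.where(edges == 1)[0] + 1
    offsets = np.where(edges == -1)[0] + 1
    n = min(len(onsets), len(offsets))
    amplitudes = [abs(gaze_deg[offsets[i] - 1] - gaze_deg[onsets[i]]) for i in range(n)]
    duration_s = len(gaze_deg) / fs
    return n / duration_s, (float(np.mean(amplitudes)) if amplitudes else 0.0)

# Hypothetical usage with a synthetic 10 s trace sampled at 220 Hz:
fs = 220.0
t = np.arange(0, 10, 1 / fs)
gaze = 10.0 * np.round(np.sin(0.5 * np.pi * t))   # crude step-like gaze shifts
rate, mean_amp = saccade_stats(gaze, fs)
print(f"{rate:.2f} saccades/s, mean amplitude {mean_amp:.1f} deg")
```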
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
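As a rough illustration of the first analysis step described above, orientation analysis with Gabor filters, the following sketch builds a small real-valued Gabor filter bank and assigns each pixel the orientation of its strongest response. It is a generic implementation with placeholder kernel parameters, not the authors' code:

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(theta, wavelength=8.0, sigma=4.0, size=21):
    """Real (cosine) Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier

def orientation_field(image, n_orientations=18):
    """Per-pixel dominant orientation: the filter angle with the largest
    magnitude response, as in a Gabor-filter analysis of oriented texture."""
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    responses = np.stack(
        [np.abs(fftconvolve(image, gabor_kernel(t), mode="same")) for t in thetas]
    )
    return thetas[np.argmax(responses, axis=0)]

# Hypothetical usage on a synthetic oriented-texture patch:
img = np.sin(2 * np.pi * (0.1 * np.arange(128)[None, :] + 0.05 * np.arange(128)[:, None]))
dominant = orientation_field(img)
```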
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
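The workbench's own optimization stages are not reproduced here; purely to illustrate the general idea of sequence selection by minimizing an energy-like score over sequence space, the following is a toy single-mutation random search with a placeholder scoring function (both are hypothetical, not the Protein WISDOM method):

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def score(sequence: str) -> float:
    """Placeholder potential-energy-like score (lower is better).
    A real design workflow would evaluate a physics- or knowledge-based
    energy for the sequence threaded onto the structural template."""
    # Toy surrogate: penalize deviation from a mildly hydrophobic composition.
    hydrophobic = sum(aa in "AILMFVWY" for aa in sequence)
    return abs(hydrophobic / len(sequence) - 0.4)

def random_search(template_length: int, n_iter: int = 5000, seed: int = 0) -> str:
    """Single-mutation random search over sequence space, keeping improvements."""
    rng = random.Random(seed)
    best = "".join(rng.choice(AMINO_ACIDS) for _ in range(template_length))
    best_score = score(best)
    for _ in range(n_iter):
        pos = rng.randrange(template_length)
        candidate = best[:pos] + rng.choice(AMINO_ACIDS) + best[pos + 1:]
        s = score(candidate)
        if s < best_score:
            best, best_score = candidate, s
    return best

print(random_search(template_length=30))
```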
Thermal Measurement Techniques in Analytical Microfluidic Devices
Authors: Benyamin Davaji, Chung Hoon Lee.
Institutions: Marquette University.
Thermal measurement techniques have been used for many applications, such as thermal characterization of materials and chemical reaction detection. Micromachining techniques allow reduction of the thermal mass of fabricated structures and introduce the possibility of performing high-sensitivity thermal measurements in micro-scale and nano-scale devices. Combining thermal measurement techniques with microfluidic devices allows different analytical measurements to be performed with low sample consumption and reduced measurement time by integrating the miniaturized system on a single chip. The procedures of thermal measurement techniques for particle detection, material characterization, and chemical detection are introduced in this paper.
Engineering, Issue 100, Thermal Particle Detection, Thermal Wave Analysis, Heat Penetration Time, Thermal Time Constant, Enthalpy Assay, Thermal Conductivity and Specific Heat
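As a hedged example of one quantity named in the keywords, the thermal time constant, the sketch below estimates tau from a cooling curve by assuming a single-exponential decay and fitting log(T - T_ambient) linearly; the data are simulated and this is not the paper's measurement procedure:

```python
import numpy as np

def thermal_time_constant(t, temperature, t_ambient):
    """Estimate the thermal time constant tau from a cooling curve, assuming
    a single-exponential decay T(t) = T_ambient + dT * exp(-t / tau).

    Fitting is a linear regression on log(T - T_ambient); this is a
    simplified sketch, not the measurement procedure from the article.
    """
    excess = temperature - t_ambient
    valid = excess > 0                       # avoid log of non-positive values
    slope, _ = np.polyfit(t[valid], np.log(excess[valid]), 1)
    return -1.0 / slope

# Hypothetical cooling curve: tau = 0.8 s, ambient 25 C, initial excess 10 C.
t = np.linspace(0, 4, 200)
temp = 25.0 + 10.0 * np.exp(-t / 0.8) + np.random.normal(0, 0.02, t.size)
print(f"Estimated tau = {thermal_time_constant(t, temp, 25.0):.2f} s")
```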

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms display the most relevant videos available, which can sometimes result in matches that are only loosely related.