JoVE Visualize
Related JoVE Video
Pubmed Article
Revised lithostratigraphy of the Sonsela Member (Chinle Formation, Upper Triassic) in the Southern Part of Petrified Forest National Park, Arizona.
PUBLISHED: 01-25-2010
Recent revisions to the Sonsela Member of the Chinle Formation in Petrified Forest National Park have presented a three-part lithostratigraphic model based on unconventional correlations of sandstone beds. As a vertebrate faunal transition is recorded within this stratigraphic interval, these correlations, and the purported existence of a depositional hiatus (the Tr-4 unconformity) at about the same level, must be carefully re-examined.
Authors: Lauren Kurek, Maya L. Najarian, David A. Cremers, Rosemarie C. Chinni.
Published: 09-23-2013
The dependence of some LIBS detection capabilities on lower pulse energies (<100 mJ) and timing parameters was examined using synthetic silicate samples. These samples were used as simulants for soil and contained minor and trace elements commonly found in soil at a wide range of concentrations. For this study, over 100 calibration curves were prepared using different pulse energies and timing parameters; detection limits and sensitivities were determined from the calibration curves. Plasma temperatures were also measured using Boltzmann plots for the various energies and timing parameters tested. The electron density of the plasma was calculated using the full-width half maximum (FWHM) of the hydrogen line at 656.5 nm over the energies tested. Overall, the results indicate that the use of lower pulse energies and non-gated detection does not seriously compromise the analytical results. These results are very relevant to the design of field- and person-portable LIBS instruments.
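The detection limits mentioned above are derived from calibration curves. As a hedged illustration only (not the authors' code or data), the common 3-sigma criterion computes a limit of detection from the calibration slope and the standard deviation of blank replicates; all numbers below are invented.

```python
import numpy as np

# Illustrative calibration data: known concentrations (ppm) vs. background-corrected
# emission line intensity (arbitrary units). Values are invented for the example.
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0, 200.0])
intensity = np.array([2.1, 55.0, 131.0, 262.0, 518.0, 1040.0])

# Sensitivity = slope of the linear calibration curve (least-squares fit).
slope, intercept = np.polyfit(conc, intensity, 1)

# Standard deviation of repeated blank measurements (invented values).
blank_replicates = np.array([1.8, 2.4, 2.0, 2.3, 1.9])
sigma_blank = blank_replicates.std(ddof=1)

# Conventional 3-sigma limit of detection.
lod = 3.0 * sigma_blank / slope
print(f"sensitivity = {slope:.2f} counts/ppm, LOD = {lod:.2f} ppm")
```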
25 Related JoVE Articles!
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Authors: Marc N. Coutanche, Sharon L. Thompson-Schill.
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. fMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions-of-interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
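A minimal sketch of the informational connectivity idea described above, under simplifying assumptions: per-time-point MVP discriminability is estimated in each region with a simple leave-one-out prototype-correlation measure, and the two discriminability time courses are then correlated. The classifier choice and all variable names are illustrative stand-ins, not the authors' implementation.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_time, n_vox = 120, 50
labels = rng.integers(0, 2, n_time)              # condition label per time point (illustrative)
roi_a = rng.standard_normal((n_time, n_vox))     # multi-voxel patterns, region A
roi_b = rng.standard_normal((n_time, n_vox))     # multi-voxel patterns, region B

def discriminability(patterns, labels):
    """Per-time-point MVP discriminability: correlation with the correct condition
    prototype minus correlation with the incorrect one (prototypes built leave-one-out)."""
    evidence = np.empty(len(labels))
    idx = np.arange(len(labels))
    for t in idx:
        mask = idx != t
        protos = [patterns[mask & (labels == c)].mean(axis=0) for c in (0, 1)]
        r = [np.corrcoef(patterns[t], p)[0, 1] for p in protos]
        evidence[t] = r[labels[t]] - r[1 - labels[t]]
    return evidence

# Informational connectivity: correlate the two regions' discriminability time courses.
ic, p_val = pearsonr(discriminability(roi_a, labels), discriminability(roi_b, labels))
print(f"informational connectivity r = {ic:.2f} (p = {p_val:.3f})")
```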
An In Vitro System to Study Tumor Dormancy and the Switch to Metastatic Growth
Authors: Dalit Barkan, Jeffrey E. Green.
Institutions: University of Haifa, National Cancer Institute.
Recurrence of breast cancer often follows a long latent period in which there are no signs of cancer, and metastases may not become clinically apparent until many years after removal of the primary tumor and adjuvant therapy. A likely explanation of this phenomenon is that tumor cells have seeded metastatic sites, are resistant to conventional therapies, and remain dormant for long periods of time 1-4. The existence of dormant cancer cells at secondary sites has been described previously as quiescent solitary cells that neither proliferate nor undergo apoptosis 5-7. Moreover, these solitary cells have been shown to disseminate from the primary tumor at an early stage of disease progression 8-10 and reside growth-arrested in the patients' bone marrow, blood and lymph nodes 1,4,11. Therefore, understanding mechanisms that regulate dormancy or the switch to a proliferative state is critical for discovering novel targets and interventions to prevent disease recurrence. However, unraveling the mechanisms regulating the switch from tumor dormancy to metastatic growth has been hampered by the lack of available model systems. In vivo and ex vivo model systems to study metastatic progression of tumor cells have been described previously 1,12-14. However, these model systems have not provided real-time, high-throughput mechanistic insights into what triggers solitary dormant tumor cells to emerge and proliferate as metastatic disease. We have recently developed a 3D in vitro system to model the in vivo growth characteristics of cells that exhibit either dormant (D2.OR, MCF7, K7M2-AS.46) or proliferative (D2A1, MDA-MB-231, K7M2) metastatic behavior in vivo. We demonstrated that tumor cells that exhibit dormancy in vivo at a metastatic site remain quiescent when cultured in a three-dimensional (3D) basement membrane extract (BME), whereas cells highly metastatic in vivo readily proliferate in 3D culture after variable, but relatively short, periods of quiescence. Importantly, by utilizing the 3D in vitro model system we demonstrated for the first time that the ECM composition plays an important role in regulating whether dormant tumor cells will switch to a proliferative state, and we have confirmed this in in vivo studies 15-17. Hence, the model system described in this report provides an in vitro method to model tumor dormancy and study the transition to proliferative growth induced by the microenvironment.
Medicine, Issue 54, Tumor dormancy, cancer recurrence, metastasis, reconstituted basement membrane extract (BME), 3D culture, breast cancer
Deficient Pms2, ERCC1, Ku86, CcOI in Field Defects During Progression to Colon Cancer
Authors: Huy Nguyen, Cristy Loustaunau, Alexander Facista, Lois Ramsey, Nadia Hassounah, Hilary Taylor, Robert Krouse, Claire M. Payne, V. Liana Tsikitis, Steve Goldschmid, Bhaskar Banerjee, Rafael F. Perini, Carol Bernstein.
Institutions: University of Arizona, Tucson, AZ.
In carcinogenesis, the "field defect" is recognized clinically because of the high propensity of survivors of certain cancers to develop other malignancies of the same tissue type, often in a nearby location. Such field defects have been indicated in colon cancer. The molecular abnormalities that are responsible for a field defect in the colon should be detectable at high frequency in the histologically normal tissue surrounding a colonic adenocarcinoma or surrounding an adenoma with advanced neoplasia (well on the way to a colon cancer), but at low frequency in the colonic mucosa from patients without colonic neoplasia. Using immunohistochemistry, entire crypts within 10 cm on each side of colonic adenocarcinomas or advanced colonic neoplasias were found to be frequently reduced or absent in expression for two DNA repair proteins, Pms2 and/or ERCC1. Pms2 is a dual-role protein, active in DNA mismatch repair as well as needed in apoptosis of cells with excess DNA damage. ERCC1 is active in DNA nucleotide excision repair. The reduced or absent expression of both ERCC1 and Pms2 would create cells with both increased ability to survive (apoptosis resistance) and an increased level of mutability. The reduced or absent expression of both ERCC1 and Pms2 is likely an early step in progression to colon cancer. The DNA repair gene Ku86 (active in DNA non-homologous end joining) and Cytochrome c Oxidase Subunit I (involved in apoptosis) had each been reported to be decreased in expression in mucosal areas close to colon cancers. However, immunohistochemical evaluation of their levels of expression showed only low to modest frequencies of crypts to be deficient in their expression in a field defect surrounding colon cancer or surrounding advanced colonic neoplasia. We show here our method of evaluating crypts for expression of ERCC1, Pms2, Ku86 and CcOI. We show that the frequency of entire crypts deficient for Pms2 and ERCC1 is often as great as 70% to 95% in 20 cm long areas surrounding a colonic neoplasia, while the frequency of crypts deficient in Ku86 has a median value of 2% and the frequency of crypts deficient in CcOI has a median value of 16% in these areas. The entire colon is 150 cm long (about 5 feet) and has about 10 million crypts in its mucosal layer. The defect in Pms2 and ERCC1 surrounding a colon cancer thus may include 1 million crypts. It is from a defective crypt that colon cancer arises.
Cellular Biology, Issue 41, DNA Repair, Apoptosis, Field Defect, Colon Cancer, Pms2, ERCC1, Cytochrome c Oxidase Subunit I, Ku86, Immunohistochemistry, Cancer Resection
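The scale of the field defect quoted above follows from simple arithmetic on the figures given in the abstract (a 150 cm colon with about 10 million crypts, a 20 cm affected region, 70% to 95% of crypts deficient); a quick check:

```python
total_crypts = 10_000_000      # approximate crypts in the colonic mucosa (from the abstract)
colon_length_cm = 150.0
field_defect_cm = 20.0         # 10 cm on each side of the neoplasia

crypts_in_field = total_crypts * field_defect_cm / colon_length_cm
for frac in (0.70, 0.95):
    print(f"{frac:.0%} deficient -> {crypts_in_field * frac:,.0f} crypts")
# Roughly one million crypts fall in the deficient range, as stated in the abstract.
```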
A Technique to Screen American Beech for Resistance to the Beech Scale Insect (Cryptococcus fagisuga Lind.)
Authors: Jennifer L. Koch, David W. Carey.
Institutions: US Forest Service.
Beech bark disease (BBD) results in high levels of initial mortality, leaving behind survivor trees that are greatly weakened and deformed. The disease is initiated by feeding activities of the invasive beech scale insect, Cryptococcus fagisuga, which creates entry points for infection by one of the Neonectria species of fungus. Without scale infestation, there is little opportunity for fungal infection. Using scale eggs to artificially infest healthy trees in heavily BBD impacted stands demonstrated that these trees were resistant to the scale insect portion of the disease complex1. Here we present a protocol that we have developed, based on the artificial infestation technique by Houston2, which can be used to screen for scale-resistant trees in the field and in smaller potted seedlings and grafts. The identification of scale-resistant trees is an important component of management of BBD through tree improvement programs and silvicultural manipulation.
Environmental Sciences, Issue 87, Forestry, Insects, Disease Resistance, American beech, Fagus grandifolia, beech scale, Cryptococcus fagisuga, resistance, screen, bioassay
Technique for Studying Arthropod and Microbial Communities within Tree Tissues
Authors: Nicholas C Aflitto, Richard W Hofstetter, Reagan McGuire, David D Dunn, Kristen A Potter.
Institutions: Northern Arizona University, Acoustic Ecology Institute.
Phloem tissues of pine are habitats for many thousands of organisms. Arthropods and microbes use phloem and cambium tissues to seek mates, lay eggs, rear young, feed, or hide from natural enemies or harsh environmental conditions outside of the tree. Organisms that persist within the phloem habitat are difficult to observe given their location under bark. We provide a technique to preserve intact phloem and prepare it for experimentation with invertebrates and microorganisms. The apparatus is called a ‘phloem sandwich’ and allows for the introduction and observation of arthropods, microbes, and other organisms. This technique has resulted in a better understanding of the feeding behaviors, life-history traits, reproduction, development, and interactions of organisms within tree phloem. The strengths of this technique include the use of inexpensive materials, variability in sandwich size, flexibility to re-open the sandwich or introduce multiple organisms through drilled holes, and the preservation and maintenance of phloem integrity. The phloem sandwich is an excellent educational tool for scientific discovery in both K-12 science courses and university research laboratories.
Environmental Sciences, Issue 93, phloem sandwich, pine, bark beetles, mites, acoustics, phloem
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
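For the oscillation part of the technique above, a standard physical-pendulum relation recovers the moment of inertia of a suspended prosthesis from its oscillation period; the sketch below uses that textbook relation with invented numbers and assumes the center-of-mass distance has already been obtained (e.g., from the reaction board).

```python
import math

# Illustrative inputs (not measured values from the study):
mass_kg = 1.9            # prosthesis mass from a scale
period_s = 0.95          # mean oscillation period about the suspension axis
d_m = 0.18               # distance from suspension axis to center of mass (reaction board)
g = 9.81

# Physical pendulum: T = 2*pi*sqrt(I_pivot / (m*g*d))  =>  moment of inertia about the pivot
I_pivot = mass_kg * g * d_m * (period_s / (2 * math.pi)) ** 2

# Parallel-axis theorem gives the moment of inertia about the center of mass
I_cm = I_pivot - mass_kg * d_m ** 2
print(f"I_pivot = {I_pivot:.4f} kg*m^2, I_cm = {I_cm:.4f} kg*m^2")
```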
Laboratory Drop Towers for the Experimental Simulation of Dust-aggregate Collisions in the Early Solar System
Authors: Jürgen Blum, Eike Beitz, Mohtashim Bukhari, Bastian Gundlach, Jan-Hendrik Hagemann, Daniel Heißelmann, Stefan Kothe, Rainer Schräpler, Ingo von Borstel, René Weidling.
Institutions: Technische Universität Braunschweig.
For the purpose of investigating the evolution of dust aggregates in the early Solar System, we developed two vacuum drop towers in which fragile dust aggregates with sizes up to ~10 cm and porosities up to 70% can be collided. One of the drop towers is primarily used for very low impact speeds down to below 0.01 m/sec and makes use of a double release mechanism. Collisions are recorded in stereo-view by two high-speed cameras, which fall along the glass vacuum tube in the center-of-mass frame of the two dust aggregates. The other free-fall tower makes use of an electromagnetic accelerator that is capable of gently accelerating dust aggregates to up to 5 m/sec. In combination with the release of another dust aggregate to free fall, collision speeds up to ~10 m/sec can be achieved. Here, two fixed high-speed cameras record the collision events. In both drop towers, the dust aggregates are in free fall during the collision so that they are weightless and match the conditions in the early Solar System.
Physics, Issue 88, astrophysics, planet formation, collisions, granular matter, high-speed imaging, microgravity drop tower
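As a rough, back-of-the-envelope illustration (not taken from the article, and assuming a head-on geometry), free-fall kinematics relates drop height to impact speed, and adding the speed imparted by the electromagnetic accelerator gives a relative collision speed of the order quoted above:

```python
import math

g = 9.81

def free_fall_speed(drop_height_m):
    """Speed of an aggregate after falling from rest through drop_height_m (in vacuum)."""
    return math.sqrt(2 * g * drop_height_m)

# Illustrative numbers (not the towers' actual dimensions):
v_released = free_fall_speed(1.5)        # ~5.4 m/s after a 1.5 m drop
v_accelerated = 5.0                      # aggregate launched by the electromagnetic accelerator
print(f"relative collision speed ~ {v_released + v_accelerated:.1f} m/s")
```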
Lignin Down-regulation of Zea mays via dsRNAi and Klason Lignin Analysis
Authors: Sang-Hyuck Park, Rebecca Garlock Ong, Chuansheng Mei, Mariam Sticklen.
Institutions: University of Arizona, Michigan State University, The Institute for Advanced Learning and Research.
To facilitate the use of lignocellulosic biomass as an alternative bioenergy resource, during biological conversion processes, a pretreatment step is needed to open up the structure of the plant cell wall, increasing the accessibility of the cell wall carbohydrates. Lignin, a polyphenolic material present in many cell wall types, is known to be a significant hindrance to enzyme access. Reduction in lignin content to a level that does not interfere with the structural integrity and defense system of the plant might be a valuable step to reduce the costs of bioethanol production. In this study, we have genetically down-regulated one of the lignin biosynthesis-related genes, cinnamoyl-CoA reductase (ZmCCR1), via a double-stranded RNA interference technique. The ZmCCR1_RNAi construct was integrated into the maize genome using the particle bombardment method. Transgenic maize plants grew normally as compared to the wild-type control plants, without interference with biomass growth or defense mechanisms, with the exception of brown coloration in the leaf mid-ribs, husks, and stems of the transgenic plants. The microscopic analyses, in conjunction with the histological assay, revealed that the leaf sclerenchyma fibers were thinned but the structure and size of other major vascular system components were not altered. The lignin content in the transgenic maize was reduced by 7-8.7%, the crystalline cellulose content was increased in response to lignin reduction, and hemicelluloses remained unchanged. These analyses may indicate that carbon flow might have been shifted from lignin biosynthesis to cellulose biosynthesis. This article delineates the procedures used to down-regulate the lignin content in maize via RNAi technology, and the cell wall compositional analyses used to verify the effect of the modifications on the cell wall structure.
Bioengineering, Issue 89, Zea mays, cinnamoyl-CoA reductase (CCR), dsRNAi, Klason lignin measurement, cell wall carbohydrate analysis, gas chromatography (GC)
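A hedged sketch of the gravimetric step of a Klason (acid-insoluble) lignin determination, the measurement named in the title: the residue remaining after two-stage acid hydrolysis is weighed, ash-corrected, and expressed as a percentage of the dry biomass. The masses below are invented, not data from the study.

```python
# Illustrative masses in grams (not data from the study):
dry_biomass = 0.3000               # extractive-free sample before hydrolysis
crucible = 28.1520                 # tared filtering crucible
crucible_plus_residue = 28.2080    # after hydrolysis, filtration, and drying
crucible_plus_ash = 28.1532        # after ashing the residue in a muffle furnace

acid_insoluble_residue = crucible_plus_residue - crucible
ash = crucible_plus_ash - crucible

# Klason (acid-insoluble) lignin, ash-corrected, as percent of dry biomass
klason_lignin_pct = 100.0 * (acid_insoluble_residue - ash) / dry_biomass
print(f"Klason lignin = {klason_lignin_pct:.1f}% of dry weight")
```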
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation, and more broadly, their allergic potential.
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
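The abstract states that the triclosan concentration in buffer is confirmed by UV-Vis using ε280 = 4,200 L/(mol·cm). A quick Beer-Lambert calculation illustrates that step; the absorbance reading and path length below are invented for the example.

```python
epsilon = 4200.0      # molar absorptivity of triclosan at 280 nm, L/(mol*cm), from the abstract
path_cm = 1.0         # standard cuvette path length (assumption)
absorbance = 0.042    # illustrative measured A280 (blank-subtracted)

# Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)
conc_molar = absorbance / (epsilon * path_cm)
print(f"triclosan concentration ~ {conc_molar * 1e6:.1f} uM")
```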
Fabrication, Densification, and Replica Molding of 3D Carbon Nanotube Microstructures
Authors: Davor Copic, Sei Jin Park, Sameh Tawfick, Michael De Volder, A. John Hart.
Institutions: University of Michigan , IMEC, Belgium.
The introduction of new materials and processes to microfabrication has, in large part, enabled many important advances in microsystems, lab-on-a-chip devices, and their applications. In particular, capabilities for cost-effective fabrication of polymer microstructures were transformed by the advent of soft lithography and other micromolding techniques 1, 2, and this led to a revolution in applications of microfabrication to biomedical engineering and biology. Nevertheless, it remains challenging to fabricate microstructures with well-defined nanoscale surface textures, and to fabricate arbitrary 3D shapes at the micro-scale. Robustness of master molds and maintenance of shape integrity are especially important for achieving high-fidelity replication of complex structures and preserving their nanoscale surface texture. The combination of hierarchical textures and heterogeneous shapes is a profound challenge to existing microfabrication methods that largely rely upon top-down etching using fixed mask templates. On the other hand, the bottom-up synthesis of nanostructures such as nanotubes and nanowires can offer new capabilities to microfabrication, in particular by taking advantage of the collective self-organization of nanostructures, and local control of their growth behavior with respect to microfabricated patterns. Our goal is to introduce vertically aligned carbon nanotubes (CNTs), which we refer to as CNT "forests", as a new microfabrication material. We present details of a suite of related methods recently developed by our group: fabrication of CNT forest microstructures by thermal CVD from lithographically patterned catalyst thin films; self-directed elastocapillary densification of CNT microstructures; and replica molding of polymer microstructures using CNT composite master molds. In particular, our work shows that self-directed capillary densification ("capillary forming"), which is performed by condensation of a solvent onto the substrate with CNT microstructures, significantly increases the packing density of CNTs. This process enables directed transformation of vertical CNT microstructures into straight, inclined, and twisted shapes, which have robust mechanical properties exceeding those of typical microfabrication polymers. This in turn enables formation of nanocomposite CNT master molds by capillary-driven infiltration of polymers. The replica structures exhibit the anisotropic nanoscale texture of the aligned CNTs, and can have walls with sub-micron thickness and aspect ratios exceeding 50:1. Integration of CNT microstructures in fabrication offers further opportunity to exploit the electrical and thermal properties of CNTs, and diverse capabilities for chemical and biochemical functionalization 3.
Mechanical Engineering, Issue 65, Physics, Carbon nanotube, microstructure, fabrication, molding, transfer, polymer
Design and Operation of a Continuous 13C and 15N Labeling Chamber for Uniform or Differential, Metabolic and Structural, Plant Isotope Labeling
Authors: Jennifer L Soong, Dan Reuss, Colin Pinney, Ty Boyack, Michelle L Haddix, Catherine E Stewart, M. Francesca Cotrufo.
Institutions: Colorado State University, USDA-ARS.
Tracing rare stable isotopes from plant material through the ecosystem provides the most sensitive information about ecosystem processes, from CO2 fluxes and soil organic matter formation to small-scale stable-isotope biomarker probing. Coupling multiple stable isotopes such as 13C with 15N, 18O or 2H has the potential to reveal even more information about complex stoichiometric relationships during biogeochemical transformations. Isotope labeled plant material has been used in various studies of litter decomposition and soil organic matter formation1-4. From these and other studies, however, it has become apparent that structural components of plant material behave differently than metabolic components (i.e. leachable low molecular weight compounds) in terms of microbial utilization and long-term carbon storage5-7. The ability to study structural and metabolic components separately provides a powerful new tool for advancing the forefront of ecosystem biogeochemical studies. Here we describe a method for producing 13C and 15N labeled plant material that is either uniformly labeled throughout the plant or differentially labeled in structural and metabolic plant components. We present the construction and operation of a continuous 13C and 15N labeling chamber that can be modified to meet various research needs. Uniformly labeled plant material is produced by continuous labeling from seedling to harvest, while differential labeling is achieved by removing the growing plants from the chamber weeks prior to harvest. Representative results from growing Andropogon gerardii Kaw demonstrate the system's ability to efficiently label plant material at the targeted levels. Through this method we have produced plant material with a 4.4 atom% 13C and 6.7 atom% 15N uniform plant label, or material that is differentially labeled by up to 1.29 atom% 13C and 0.56 atom% 15N in its metabolic and structural components (hot water extractable and hot water residual components, respectively). Challenges lie in maintaining proper temperature, humidity, CO2 concentration, and light levels in an airtight 13C-CO2 atmosphere for successful plant production. This chamber description represents a useful research tool to effectively produce uniformly or differentially multi-isotope labeled plant material for use in experiments on ecosystem biogeochemical cycling.
Environmental Sciences, Issue 83, 13C, 15N, plant, stable isotope labeling, Andropogon gerardii, metabolic compounds, structural compounds, hot water extraction
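A small sketch of the atom percent bookkeeping behind label levels like those reported above. The isotope fractions fed in are invented (only the metabolic-structural difference is chosen to echo the 1.29 atom% figure in the abstract); this is not the authors' data workflow.

```python
def atom_percent(heavy, light):
    """Atom percent of the heavy isotope, e.g. atom% 13C = 100 * 13C / (12C + 13C)."""
    return 100.0 * heavy / (heavy + light)

# Illustrative isotope amounts (arbitrary units, not real measurements)
whole_plant = atom_percent(heavy=4.4, light=95.6)          # ~4.4 atom% 13C, uniform label
metabolic_pool = atom_percent(heavy=2.49, light=97.51)     # hot-water extractable fraction
structural_pool = atom_percent(heavy=1.20, light=98.80)    # hot-water residual fraction

print(f"uniform label: {whole_plant:.2f} atom% 13C")
print(f"metabolic vs structural difference: {metabolic_pool - structural_pool:.2f} atom% 13C")
```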
Recapitulation of an Ion Channel IV Curve Using Frequency Components
Authors: John R. Rigby, Steven Poelzing.
Institutions: University of Utah.
INTRODUCTION: Presently, there are no established methods to measure multiple ion channel types simultaneously and decompose the measured current into portions attributable to each channel type. This study demonstrates how impedance spectroscopy may be used to identify specific frequencies that highly correlate with the steady state current amplitude measured during voltage clamp experiments. The method involves inserting a noise function containing specific frequencies into the voltage step protocol. In the work presented, a model cell is used to demonstrate that no high correlations are introduced by the voltage clamp circuitry, and also that the noise function itself does not introduce any high correlations when no ion channels are present. This validation is necessary before the technique can be applied to preparations containing ion channels. The purpose of the protocol presented is to demonstrate how to characterize the frequency response of a single ion channel type to a noise function. Once specific frequencies have been identified in an individual channel type, they can be used to reproduce the steady state current voltage (IV) curve. Frequencies that highly correlate with one channel type and minimally correlate with other channel types may then be used to estimate the current contribution of multiple channel types measured simultaneously. METHODS: Voltage clamp measurements were performed on a model cell using a standard voltage step protocol (-150 to +50 mV, 5mV steps). Noise functions containing equal magnitudes of 1-15 kHz frequencies (zero to peak amplitudes: 50 or 100mV) were inserted into each voltage step. The real component of the Fast Fourier transform (FFT) of the output signal was calculated with and without noise for each step potential. The magnitude of each frequency as a function of voltage step was correlated with the current amplitude at the corresponding voltages. RESULTS AND CONCLUSIONS: In the absence of noise (control), magnitudes of all frequencies except the DC component correlated poorly (|R|<0.5) with the IV curve, whereas the DC component had a correlation coefficient greater than 0.999 in all measurements. The quality of correlation between individual frequencies and the IV curve did not change when a noise function was added to the voltage step protocol. Likewise, increasing the amplitude of the noise function also did not increase the correlation. Control measurements demonstrate that the voltage clamp circuitry by itself does not cause any frequencies above 0 Hz to highly correlate with the steady-state IV curve. Likewise, measurements in the presence of the noise function demonstrate that the noise function does not cause any frequencies above 0 Hz to correlate with the steady-state IV curve when no ion channels are present. Based on this verification, the method can now be applied to preparations containing a single ion channel type with the intent of identifying frequencies whose amplitudes correlate specifically with that channel type.
Biophysics, Issue 48, Ion channel, Kir2.1, impedance spectroscopy, frequency response, voltage clamp, electrophysiology
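A minimal sketch of the analysis described above, using synthetic traces instead of recorded voltage-clamp data: the real FFT of the current during each voltage step is computed, and the magnitude of each frequency component across steps is correlated with the steady-state IV relation. The sampling rate, window length, and injected 3 kHz component are all illustrative assumptions.

```python
import numpy as np

fs = 50_000                              # sampling rate, Hz (illustrative)
t = np.arange(0, 0.1, 1 / fs)            # 100 ms analysis window per step
steps_mV = np.arange(-150, 55, 5)        # -150 to +50 mV in 5 mV steps

# Synthetic "recordings": a linear steady-state IV curve plus a 3 kHz component whose
# (non-negative) amplitude tracks the steady-state current, plus a little noise.
rng = np.random.default_rng(1)
iv_true = 0.05 * steps_mV                                  # arbitrary steady-state IV (nA)
amp = 0.1 * (iv_true - iv_true.min())
traces = (iv_true[:, None]
          + amp[:, None] * np.sin(2 * np.pi * 3000 * t)
          + 0.01 * rng.standard_normal((len(steps_mV), len(t))))

spectra = np.abs(np.fft.rfft(traces, axis=1))              # magnitude spectrum per step
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Correlate each frequency's magnitude across steps with the steady-state IV curve.
iv_measured = traces.mean(axis=1)
corr = np.array([np.corrcoef(spectra[:, k], iv_measured)[0, 1] for k in range(len(freqs))])

best = np.argmax(np.abs(corr[1:])) + 1                     # skip the DC component at index 0
print(f"best non-DC frequency: {freqs[best]:.0f} Hz, |R| = {abs(corr[best]):.3f}")
```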
Two Types of Assays for Detecting Frog Sperm Chemoattraction
Authors: Lindsey A. Burnett, Nathan Tholl, Douglas E. Chandler.
Institutions: University of Illinois, Urbana-Champaign, Arizona State University.
Sperm chemoattraction in invertebrates can be sufficiently robust that one can place a pipette containing the attractive peptide into a sperm suspension and microscopically visualize sperm accumulation around the pipette1. Sperm chemoattraction in vertebrates such as frogs, rodents and humans is more difficult to detect and requires quantitative assays. Such assays are of two major types - assays that quantitate sperm movement to a source of chemoattractant, so-called sperm accumulation assays, and assays that actually track the swimming trajectories of individual sperm. Sperm accumulation assays are relatively rapid, allowing tens or hundreds of assays to be done in a single day, thereby allowing dose response curves and time courses to be carried out relatively rapidly. These types of assays have been used extensively to characterize many well established chemoattraction systems - for example, neutrophil chemotaxis to bacterial peptides and sperm chemotaxis to follicular fluid. Sperm tracking assays can be more labor intensive but offer additional data on how chemoattractants actually alter the swimming paths that sperm take. This type of assay is needed to demonstrate the orientation of sperm movement relative to the chemoattractant gradient axis and to visualize characteristic turns or changes in orientation that bring the sperm closer to the egg. Here we describe methods used for each of these two types of assays. The sperm accumulation assay utilized is called a "two-chamber" assay. Amphibian sperm are placed in a tissue culture plate insert with a polycarbonate filter floor having 12 μm diameter pores. Inserts with sperm are placed into tissue culture plate wells containing buffer, and a chemoattractant is carefully pipetted into the bottom well where the floor meets the wall (see Fig. 1). After incubation, the top insert containing the sperm reservoir is carefully removed, and sperm in the bottom chamber that have passed through the membrane are removed, pelleted and then counted by hemocytometer or flow cytometer. The sperm tracking assay utilizes a Zigmond chamber originally developed for observing neutrophil chemotaxis and modified for observation of sperm by Giojalas and coworkers2,3. The chamber consists of a thick glass slide into which two vertical troughs have been machined. These are separated by a 1 mm wide observation platform. After application of a cover glass, sperm are loaded into one trough, the chemoattractant agent into the other, and movement of individual sperm is visualized by video microscopy. Video footage is then analyzed using software to identify two-dimensional cell movements in the x-y plane as a function of time (xyt data sets) that form the trajectory of each sperm.
Developmental Biology, Issue 58, Sperm chemotaxis, fertilization, sperm accumulation assay, sperm tracking assay, sperm motility, Xenopus laevis, egg jelly
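A small sketch of how the two readouts described above might be summarized numerically; the counts, the example trajectory, and the straightness metric are illustrative choices, not the authors' analysis.

```python
import math

# --- Accumulation assay (illustrative counts from a hemocytometer) ---
control_count = 1.2e4          # sperm crossing the filter toward buffer alone
attractant_count = 3.6e4       # sperm crossing toward a chemoattractant source
print(f"accumulation ratio (attractant/control) = {attractant_count / control_count:.1f}")

# --- Tracking assay: straightness and net x-displacement from one xyt trajectory ---
# x is taken to increase toward the chemoattractant trough (assumption for this example);
# each tuple is (x_um, y_um, t_s).
track = [(0.0, 0.0, 0.0), (5.0, 1.0, 0.1), (11.0, 3.0, 0.2), (18.0, 2.0, 0.3), (26.0, 4.0, 0.4)]
path = sum(math.dist(track[i][:2], track[i + 1][:2]) for i in range(len(track) - 1))
net = math.dist(track[0][:2], track[-1][:2])
print(f"straightness = {net / path:.2f}, net displacement toward attractant = {track[-1][0] - track[0][0]:.1f} um")
```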
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Contextual and Cued Fear Conditioning Test Using a Video Analyzing System in Mice
Authors: Hirotaka Shoji, Keizo Takao, Satoko Hattori, Tsuyoshi Miyakawa.
Institutions: Fujita Health University, Core Research for Evolutionary Science and Technology (CREST), National Institutes of Natural Sciences.
The contextual and cued fear conditioning test is one of the behavioral tests that assesses the ability of mice to learn and remember an association between environmental cues and aversive experiences. In this test, mice are placed into a conditioning chamber and are given pairings of a conditioned stimulus (an auditory cue) and an aversive unconditioned stimulus (an electric footshock). After a delay time, the mice are exposed to the same conditioning chamber and a differently shaped chamber with presentation of the auditory cue. Freezing behavior during the test is measured as an index of fear memory. To analyze the behavior automatically, we have developed a video analyzing system using the ImageFZ application software program, which is available as a free download. Here, to show the details of our protocol, we demonstrate our procedure for the contextual and cued fear conditioning test in C57BL/6J mice using the ImageFZ system. In addition, we validated our protocol and the video analyzing system performance by comparing freezing time measured by the ImageFZ system or a photobeam-based computer measurement system with that scored by a human observer. As shown in our representative results, the data obtained by ImageFZ were similar to those analyzed by a human observer, indicating that the behavioral analysis using the ImageFZ system is highly reliable. The present movie article provides detailed information regarding the test procedures and will promote understanding of the experimental situation.
Behavior, Issue 85, Fear, Learning, Memory, ImageFZ program, Mouse, contextual fear, cued fear
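ImageFZ itself is distributed by the authors, but the underlying measurement is conceptually simple: a mouse is scored as freezing when successive video frames show almost no pixel change for longer than a minimum bout duration. The sketch below implements that idea on synthetic frames; the frame rate, threshold, and bout length are assumptions, not ImageFZ's actual parameters.

```python
import numpy as np

def freezing_time(frames, fps=4.0, pixel_thresh=20, min_bout_s=2.0):
    """Total time (s) spent freezing: consecutive frames whose summed absolute pixel
    change stays below pixel_thresh for at least min_bout_s seconds."""
    diffs = np.abs(np.diff(frames.astype(np.int32), axis=0)).sum(axis=(1, 2))
    still = diffs < pixel_thresh
    total_frames, run = 0, 0
    for is_still in list(still) + [False]:     # trailing False flushes the last run
        if is_still:
            run += 1
        else:
            if run >= min_bout_s * fps:
                total_frames += run
            run = 0
    return total_frames / fps

# Synthetic demo: 60 s of 4 fps "video" of a motionless arena with one movement burst.
rng = np.random.default_rng(2)
frames = np.zeros((240, 32, 32), dtype=np.uint8)
frames[100:120] = rng.integers(0, 255, (20, 32, 32), dtype=np.uint8)   # movement epoch
print(f"freezing time ~ {freezing_time(frames):.1f} s")
```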
Simultaneous Scalp Electroencephalography (EEG), Electromyography (EMG), and Whole-body Segmental Inertial Recording for Multi-modal Neural Decoding
Authors: Thomas C. Bulea, Atilla Kilicarslan, Recep Ozdemir, William H. Paloski, Jose L. Contreras-Vidal.
Institutions: National Institutes of Health, University of Houston.
Recent studies support the involvement of supraspinal networks in control of bipedal human walking. Part of this evidence encompasses studies, including our previous work, demonstrating that gait kinematics and limb coordination during treadmill walking can be inferred from the scalp electroencephalogram (EEG) with reasonably high decoding accuracies. These results provide impetus for development of non-invasive brain-machine-interface (BMI) systems for use in restoration and/or augmentation of gait, a primary goal of rehabilitation research. To date, studies examining EEG decoding of activity during gait have been limited to treadmill walking in a controlled environment. However, to be practically viable a BMI system must be applicable for use in everyday locomotor tasks such as over ground walking and turning. Here, we present a novel protocol for non-invasive collection of brain activity (EEG), muscle activity (electromyography (EMG)), and whole-body kinematic data (head, torso, and limb trajectories) during both treadmill and over ground walking tasks. By collecting these data in an uncontrolled environment, insight can be gained regarding the feasibility of decoding unconstrained gait and surface EMG from scalp EEG.
Behavior, Issue 77, Neuroscience, Neurobiology, Medicine, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Electroencephalography, EEG, Electromyography, EMG, electroencephalograph, gait, brain-computer interface, brain machine interface, neural decoding, over-ground walking, robotic gait, brain, imaging, clinical techniques
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
High-speed Particle Image Velocimetry Near Surfaces
Authors: Louise Lu, Volker Sick.
Institutions: University of Michigan.
Multi-dimensional and transient flows play a key role in many areas of science, engineering, and health sciences but are often not well understood. The complex nature of these flows may be studied using particle image velocimetry (PIV), a laser-based imaging technique for optically accessible flows. Though many forms of PIV exist that extend the technique beyond the original planar two-component velocity measurement capabilities, the basic PIV system consists of a light source (laser), a camera, tracer particles, and analysis algorithms. The imaging and recording parameters, the light source, and the algorithms are adjusted to optimize the recording for the flow of interest and obtain valid velocity data. Common PIV investigations measure two-component velocities in a plane at a few frames per second. However, recent developments in instrumentation have facilitated high-frame rate (> 1 kHz) measurements capable of resolving transient flows with high temporal resolution. Therefore, high-frame rate measurements have enabled investigations on the evolution of the structure and dynamics of highly transient flows. These investigations play a critical role in understanding the fundamental physics of complex flows. A detailed description for performing high-resolution, high-speed planar PIV to study a transient flow near the surface of a flat plate is presented here. Details for adjusting the parameter constraints such as image and recording properties, the laser sheet properties, and processing algorithms to adapt PIV for any flow of interest are included.
Physics, Issue 76, Mechanical Engineering, Fluid Mechanics, flow measurement, fluid heat transfer, internal flow in turbomachinery (applications), boundary layer flow (general), flow visualization (instrumentation), laser instruments (design and operation), Boundary layer, micro-PIV, optical laser diagnostics, internal combustion engines, flow, fluids, particle, velocimetry, visualization
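The core of planar PIV processing is cross-correlating small interrogation windows between two frames and converting the displacement of the correlation peak into a velocity. A minimal single-window sketch on synthetic images is shown below; real processing adds sub-pixel peak fitting, multi-pass refinement, and vector validation, and the pulse separation and magnification used here are invented.

```python
import numpy as np
from scipy.signal import fftconvolve

# Synthetic particle-image pair: frame B is frame A shifted by a known pixel displacement.
rng = np.random.default_rng(3)
frame_a = rng.random((64, 64))
true_shift = (3, -2)                               # (rows, cols) in pixels
frame_b = np.roll(frame_a, true_shift, axis=(0, 1))

def window_displacement(a, b):
    """Integer-pixel displacement of the cross-correlation peak between two windows."""
    a = a - a.mean()
    b = b - b.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # cross-correlation via FFT
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return np.array(peak) - (np.array(a.shape) - 1)     # zero lag sits at index N-1

dt = 1e-3            # time between laser pulses, s (illustrative)
pixel_m = 20e-6      # meters per pixel (illustrative magnification)
dy, dx = window_displacement(frame_a, frame_b)
print(f"displacement = ({dy}, {dx}) px -> velocity = ({dy * pixel_m / dt:.3f}, {dx * pixel_m / dt:.3f}) m/s")
```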
Setting Limits on Supersymmetry Using Simplified Models
Authors: Christian Gütschow, Zachary Marshall.
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
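The final sentence describes the recasting recipe: for a model point, the expected signal yield in a signal region is the cross section times the integrated luminosity times the published acceptance and efficiency, and that yield is compared with the reported upper limit on non-Standard-Model events. A toy version of that bookkeeping, with all numbers invented:

```python
# Recasting sketch: compare predicted signal yields with a reported 95% CL upper limit
# on beyond-Standard-Model events in each signal region. All numbers are invented.
lumi_fb = 20.3                     # integrated luminosity, fb^-1

signal_regions = {
    #  name: (acceptance, efficiency, observed 95% CL upper limit on signal events)
    "SR-A": (0.12, 0.70, 12.0),
    "SR-B": (0.05, 0.65,  4.5),
}

def excluded(cross_section_fb, regions=signal_regions, lumi=lumi_fb):
    """A model point is excluded if any region predicts more signal than its upper limit."""
    for name, (acc, eff, limit) in regions.items():
        expected = cross_section_fb * lumi * acc * eff
        print(f"{name}: expected {expected:.1f} events (limit {limit:.1f})")
        if expected > limit:
            return True
    return False

print("excluded" if excluded(cross_section_fb=10.0) else "not excluded")
```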
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Authors: Noah S. Philip, S. Louisa Carpenter, Lawrence H. Sweet.
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD). Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices and medial temporal regions. Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
Utilizing Transcranial Magnetic Stimulation to Study the Human Neuromuscular System
Authors: David A. Goss, Richard L. Hoffman, Brian C. Clark.
Institutions: Ohio University.
Transcranial magnetic stimulation (TMS) has been in use for more than 20 years 1, and has grown exponentially in popularity over the past decade. While the use of TMS has expanded to the study of many systems and processes during this time, the original application and perhaps one of the most common uses of TMS involves studying the physiology, plasticity and function of the human neuromuscular system. Single pulse TMS applied to the motor cortex excites pyramidal neurons transsynaptically 2 (Figure 1) and results in a measurable electromyographic response that can be used to study and evaluate the integrity and excitability of the corticospinal tract in humans 3. Additionally, recent advances in magnetic stimulation now allow for partitioning of cortical versus spinal excitability 4,5. For example, paired-pulse TMS can be used to assess intracortical facilitatory and inhibitory properties by combining a conditioning stimulus and a test stimulus at different interstimulus intervals 3,4,6-8. In this video article we will demonstrate the methodological and technical aspects of these techniques. Specifically, we will demonstrate single-pulse and paired-pulse TMS techniques as applied to the flexor carpi radialis (FCR) muscle as well as the erector spinae (ES) musculature. Our laboratory studies the FCR muscle as it is of interest to our research on the effects of wrist-hand cast immobilization on reduced muscle performance 6,9, and we study the ES muscles due to these muscles' clinical relevance to low back pain 8. With this stated, we should note that TMS has been used to study many muscles of the hand, arm and legs, and should reiterate that our demonstrations in the FCR and ES muscle groups are only selected examples of TMS being used to study the human neuromuscular system.
Medicine, Issue 59, neuroscience, muscle, electromyography, physiology, TMS, strength, motor control, sarcopenia, dynapenia, lumbar
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from using a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels using the paper protocol to those of the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
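The outcome measures quoted above (time in the target range and the prevalence of hypoglycemia and hyperglycemia) reduce to simple counting over blood glucose readings; a sketch with invented readings, using the thresholds given in the abstract and an assumed target range:

```python
# Illustrative blood glucose readings (mg/dL); thresholds follow the abstract.
readings = [210, 185, 160, 142, 128, 118, 105, 99, 96, 110, 88, 74, 68, 131, 150]

target_low, target_high = 70, 180        # example target range (assumption)
n = len(readings)
in_range    = sum(target_low <= bg <= target_high for bg in readings) / n
severe_hypo = sum(bg < 40 for bg in readings) / n
hypo        = sum(bg < 70 for bg in readings) / n
hyper       = sum(bg > 180 for bg in readings) / n

print(f"in target range: {in_range:.0%}, hypoglycemia: {hypo:.0%}, "
      f"severe hypoglycemia: {severe_hypo:.0%}, hyperglycemia: {hyper:.0%}")
```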
Collecting And Measuring Wound Exudate Biochemical Mediators In Surgical Wounds
Authors: Brendan Carvalho, David J Clark, David Yeomans, Martin S Angst.
Institutions: Stanford University School of Medicine .
We describe a methodology by which we are able to collect and measure biochemical inflammatory and nociceptive mediators at the surgical wound site. Collecting site-specific biochemical markers is important to understand the relationship between levels in serum and the surgical wound, determine any associations between mediator release, pain, analgesic use and other outcomes of interest, and evaluate the effect of systemic and peripheral drug administration on surgical wound biochemistry. This methodology has been applied to healthy women undergoing elective cesarean delivery with spinal anesthesia. We have measured wound exudate and serum mediators at the same time intervals as patients' pain scores and analgesic consumption for up to 48 hours post-cesarean delivery. Using this methodology we have been able to detect various biochemical mediators including nerve growth factor (NGF), prostaglandin E2 (PG-E2), substance P, IL-1β, IL-2, IL-4, IL-6, IL-7, IL-8, IL-10, IL-12, IL-13, IL-17, TNFα, INFγ, G-CSF, GM-CSF, MCP-1 and MIP-1β. Studies applying this human surgical wound bioassay have found no correlations between wound and serum cytokine concentrations or their time-release profiles (J Pain. 2008; 9(7):650-7).1 We also documented the utility of the technique to identify drug-mediated changes in wound cytokine content (Anesth Analg 2010; 111:1452-9).2
Medicine, Issue 68, Biochemistry, Anatomy, Physiology, Cytokines, Cesarean Section, Wound Healing, Wounds and Injuries, Surgical Procedures, Operative, Surgical wound, Exudate, cytokines, Substance P, Interleukin 10, Interleukin 6, Nerve growth factor, Prostaglandin E2, Cesarean, Analgesia
Light/dark Transition Test for Mice
Authors: Keizo Takao, Tsuyoshi Miyakawa.
Institutions: Graduate School of Medicine, Kyoto University.
Although the entire mouse genome has been sequenced, we do not yet know the functions of most of its genes. Gene-targeting techniques, however, can be used to delete or manipulate a specific gene in mice. The influence of a given gene on a specific behavior can then be determined by conducting behavioral analyses of the mutant mice. As a test for behavioral phenotyping of mutant mice, the light/dark transition test is one of the most widely used tests to measure anxiety-like behavior in mice. The test is based on the natural aversion of mice to brightly illuminated areas and on their spontaneous exploratory behavior in novel environments. The test is sensitive to anxiolytic drug treatment. The apparatus consists of a dark chamber and a brightly illuminated chamber. Mice are allowed to move freely between the two chambers. The number of entries into the bright chamber and the duration of time spent there are indices of bright-space anxiety in mice. To obtain phenotyping results of a strain of mutant mice that can be readily reproduced and compared with those of other mutants, the behavioral test methods should be as identical as possible between laboratories. The procedural differences that exist between laboratories, however, make it difficult to replicate or compare results among laboratories. Here, we present our protocol for the light/dark transition test as a movie so that the details of the protocol can be demonstrated. In our laboratory, we have assessed more than 60 strains of mutant mice using the protocol shown in the movie. Those data will be disclosed as part of a public database that we are now constructing. Visualization of the protocol will facilitate understanding of the details of the entire experimental procedure, allowing for standardization of the protocols used across laboratories and comparisons of the behavioral phenotypes of various strains of mutant mice assessed using this test.
Neuroscience, Issue 1, knockout mice, transgenic mice, behavioral test, phenotyping

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
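The page does not spell out the matching algorithm. A common way to relate an abstract to a library of method descriptions is bag-of-words similarity; the TF-IDF cosine-similarity sketch below is purely illustrative of that kind of approach and is not a description of JoVE's actual system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for video descriptions (invented text).
videos = {
    "fear conditioning in mice": "mice fear memory freezing conditioning chamber footshock",
    "zebrafish kidney assay": "zebrafish kidney nephron regeneration tubule labeling",
    "PIV flow imaging": "particle image velocimetry laser flow velocity imaging",
}
abstract = "freezing behavior of mice was scored after contextual fear conditioning"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(list(videos.values()) + [abstract])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

# Rank videos by similarity to the abstract.
for title, score in sorted(zip(videos, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {title}")
```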