Pubmed Article
Measurement of ulnar variance on uncalibrated digital radiographic images.
Hand (N Y)
PUBLISHED: 10-27-2009
Uncalibrated digital radiographs used in multicenter trials hinder quantitative measures such as articular step and ulnar variance. This investigation tested the reliability of alternative measures of ulnar variance that are scaled to the length of the capitate. A sample of 30 sets of radiographs from patients enrolled in a prospective study of operative treatment of fractures of the distal radius was blinded and randomized. Five observers measured the ulnar variance (UV) and longitudinal length of the capitate (CH) on two separate occasions with greater than 2 weeks between measurements. During each measurement session, the observers made the measurements on both a calibrated and a noncalibrated workstation. The ratio of the ulnar variance to the length of the capitate was calculated (UV/CH ratio). Paired t-tests were used to compare the two rounds of measurements for both methods. Intra- and interobserver reliability was assessed by Pearson product-moment correlation coefficients. The ratios were compared using analysis of variance with a Bonferroni correction. The intraobserver reliability was excellent for each of the three variables (UV, CH, UV/CH ratio) for each workstation. The interobserver reliability of the UV/CH ratios obtained for each workstation was moderate to excellent as judged by the Pearson correlations between observers. The Bland-Altman method indicated a mean difference in UV/CH between calibrated and uncalibrated measurement techniques of 0.002, with limits of agreement of -0.11 to 0.11. Measurements of ulnar variance that are scaled to the length of the capitate may be useful measures of deformity in studies that utilize uncalibrated digital radiographs.
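The Bland-Altman comparison described above can be sketched numerically. This is an illustrative Python sketch, not the study's analysis code; the paired UV/CH values below are hypothetical, and the limits of agreement use the conventional mean difference ± 1.96 SD.

```python
# Bland-Altman agreement between calibrated and uncalibrated UV/CH ratios.
# Example values are hypothetical; the study reports a mean difference of
# 0.002 with limits of agreement of -0.11 to 0.11.
import statistics

def bland_altman(calibrated, uncalibrated):
    """Return (mean difference, lower limit, upper limit) for paired data."""
    diffs = [c - u for c, u in zip(calibrated, uncalibrated)]
    mean_diff = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return mean_diff, mean_diff - 1.96 * sd, mean_diff + 1.96 * sd

# UV/CH = ulnar variance divided by capitate length, both measured in
# pixels, so the ratio is unit-free and needs no image calibration.
cal = [0.10, 0.05, -0.02, 0.08, 0.01]
uncal = [0.09, 0.06, -0.01, 0.07, 0.02]
mean_diff, lo, hi = bland_altman(cal, uncal)
```

Because the ratio cancels the unknown pixel scale, the same computation applies on either workstation.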
Authors: Thomas Z. Thompson, Farres Obeidin, Alisa A. Davidoff, Cody L. Hightower, Christohper Z. Johnson, Sonya L. Rice, Rebecca-Lyn Sokolove, Brandon K. Taylor, John M. Tuck, William G. Pearson, Jr..
Published: 05-06-2014
Characterizing hyolaryngeal movement is important to dysphagia research. Prior methods require multiple measurements to obtain one kinematic measurement, whereas coordinate mapping of hyolaryngeal mechanics using the Modified Barium Swallow (MBS) uses one set of coordinates to calculate multiple variables of interest. For demonstration purposes, ten kinematic measurements were generated from one set of coordinates to determine differences in swallowing two different bolus types. Calculations of hyoid excursion against the vertebrae and mandible are correlated to determine the importance of axes of reference. To demonstrate the coordinate mapping methodology, 40 MBS studies were randomly selected from a dataset of healthy normal subjects with no known swallowing impairment. A 5 ml thin-liquid bolus swallow and a 5 ml pudding swallow were measured for each subject. Nine coordinates, mapping the cranial base, mandible, vertebrae, and elements of the hyolaryngeal complex, were recorded at the frames of minimum and maximum hyolaryngeal excursion. Coordinates were mathematically converted into ten variables of hyolaryngeal mechanics. Inter-rater reliability was evaluated by intraclass correlation coefficients (ICC). Two-tailed t-tests were used to evaluate differences in kinematics by bolus viscosity. Hyoid excursion measurements against different axes of reference were correlated. Inter-rater reliability among six raters for the 18 coordinates ranged from ICC = 0.90 - 0.97. A slate of ten kinematic measurements was compared by subject between the six raters. One outlier was rejected, and the mean of the remaining reliability scores was ICC = 0.91 (95% CI, 0.84 - 0.96). Two-tailed t-tests with Bonferroni corrections comparing ten kinematic variables (5 ml thin-liquid vs. 5 ml pudding swallows) showed statistically significant differences in hyoid excursion, superior laryngeal movement, and pharyngeal shortening (p < 0.005).
Pearson correlations of hyoid excursion measurements from two different axes of reference were r = 0.62, r² = 0.38 (thin-liquid) and r = 0.52, r² = 0.27 (pudding). Obtaining landmark coordinates is a reliable method for generating multiple kinematic variables from videofluoroscopic images, useful in dysphagia research.
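The coordinate-to-kinematics conversion can be illustrated with a minimal sketch: hyoid excursion as the Euclidean distance between the hyoid's position at the frames of minimum and maximum excursion, optionally scaled by an anatomical reference length. This is a simplified illustration; the function name and normalization are assumptions, not the authors' exact variable definitions.

```python
import math

def hyoid_excursion(p_min, p_max, reference_length=1.0):
    """Distance the hyoid travels between the frames of minimum and maximum
    excursion. Coordinates are (x, y) pixel positions from the MBS video;
    dividing by an anatomical reference length (e.g. a vertebral segment)
    normalizes for subject size and image scale."""
    return math.dist(p_min, p_max) / reference_length

# Hypothetical hyoid coordinates from the two frames of interest:
d = hyoid_excursion((120.0, 340.0), (123.0, 336.0))
```

The other nine kinematic variables would be derived from the same nine landmark coordinates in an analogous way, which is the point of recording one coordinate set per frame.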
Nanomoulding of Functional Materials, a Versatile Complementary Pattern Replication Method to Nanoimprinting
Authors: Corsin Battaglia, Karin Söderström, Jordi Escarré, Franz-Josef Haug, Matthieu Despeisse, Christophe Ballif.
Institutions: Ecole Polytechnique Fédérale de Lausanne (EPFL), University of California, Berkeley.
We describe a nanomoulding technique which allows low-cost nanoscale patterning of functional materials, materials stacks and full devices. Nanomoulding combined with layer transfer enables the replication of arbitrary surface patterns from a master structure onto the functional material. Nanomoulding can be performed on any nanoimprinting setup and can be applied to a wide range of materials and deposition processes. In particular we demonstrate the fabrication of patterned transparent zinc oxide electrodes for light trapping applications in solar cells.
Materials Science, Issue 71, Nanotechnology, Mechanical Engineering, Electrical Engineering, Computer Sciences, Physics, dielectrics (electronic application), light emitting diodes (LED), lithography (circuit fabrication), nanodevices (electronic), optoelectronics (applications), photoelectric devices, semiconductor devices, solar cells (electrical design), Surface patterning, nanoimprinting, nanomoulding, transfer moulding, functional materials, transparent conductive oxides, microengineering, photovoltaics
The Tail Suspension Test
Authors: Adem Can, David T. Dao, Chantelle E. Terrillion, Sean C. Piantadosi, Shambhu Bhat, Todd D. Gould.
Institutions: University of Maryland School of Medicine, Tulane University School of Medicine, University of Maryland.
The tail-suspension test is a mouse behavioral test useful for screening potential antidepressant drugs and for assessing other manipulations that are expected to affect depression-related behaviors. Mice are suspended by their tails with tape, in such a position that they cannot escape or hold on to nearby surfaces. During this test, typically six minutes in duration, the resulting escape-oriented behaviors are quantified. The tail-suspension test is a valuable tool in drug discovery for high-throughput screening of prospective antidepressant compounds. Here, we describe the details required for implementation of this test, with additional emphasis on potential problems that may occur and how to avoid them. We also offer a solution to tail climbing behavior, a common problem that renders this test useless in some mouse strains, such as the widely used C57BL/6. Specifically, we prevent tail climbing behaviors by passing mouse tails through a small plastic cylinder prior to suspension. Finally, we detail how to manually score the behaviors that are manifested in this test.
Neuroscience, Issue 59, animal models, behavioral analysis, neuroscience, neurobiology, mood disorder, depression, mood stabilizer, antidepressant
From Fast Fluorescence Imaging to Molecular Diffusion Law on Live Cell Membranes in a Commercial Microscope
Authors: Carmine Di Rienzo, Enrico Gratton, Fabio Beltram, Francesco Cardarelli.
Institutions: Scuola Normale Superiore, Istituto Italiano di Tecnologia, University of California, Irvine.
It has become increasingly evident that the spatial distribution and the motion of membrane components like lipids and proteins are key factors in the regulation of many cellular functions. However, due to the fast dynamics and the tiny structures involved, a very high spatio-temporal resolution is required to catch the real behavior of molecules. Here we present the experimental protocol for studying the dynamics of fluorescently-labeled plasma-membrane proteins and lipids in live cells with high spatio-temporal resolution. Notably, this approach does not need to track each molecule; instead, it calculates population behavior using all molecules in a given region of the membrane. The starting point is fast imaging of a given region on the membrane. Afterwards, a complete spatio-temporal autocorrelation function is calculated by correlating acquired images at increasing time delays (for example, every 2, 3, …, n repetitions). It can be demonstrated that the width of the peak of the spatial autocorrelation function increases with increasing time delay as a function of particle movement due to diffusion. Therefore, fitting the series of autocorrelation functions enables extraction of the actual protein mean square displacement from imaging (iMSD), here presented in the form of apparent diffusivity vs. average displacement. This yields a quantitative view of the average dynamics of single molecules with nanometer accuracy. By using a GFP-tagged variant of the transferrin receptor (TfR) and an ATTO488-labeled 1-palmitoyl-2-hydroxy-sn-glycero-3-phosphoethanolamine (PPE), it is possible to observe the spatio-temporal regulation of protein and lipid diffusion on µm-sized membrane regions in the micro-to-millisecond time range.
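The core computation, correlating image pairs at increasing time lags, can be sketched as follows. This is a simplified illustration of the spatio-temporal correlation step (FFT-based, periodic boundary conditions assumed), not the authors' full iMSD fitting pipeline.

```python
import numpy as np

def spatial_correlation(img_a, img_b):
    """FFT-based spatial cross-correlation of two frames, normalized to its
    peak value. For identical frames the peak sits at the image center."""
    fa, fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    corr = np.fft.fftshift(np.fft.ifft2(fa * np.conj(fb)).real)
    return corr / corr.max()

def correlation_vs_lag(frames, lag):
    """Average spatial correlation over all frame pairs separated by `lag`
    repetitions; for a diffusing species the peak broadens as the lag grows,
    which is what the iMSD fit exploits."""
    pairs = [spatial_correlation(frames[i], frames[i + lag])
             for i in range(len(frames) - lag)]
    return np.mean(pairs, axis=0)

# A single bright pixel correlates with itself: peak at the center.
img = np.zeros((8, 8))
img[4, 4] = 1.0
corr = spatial_correlation(img, img)
```

Fitting a Gaussian to each lag's peak and plotting its variance against the lag would then give the iMSD curve described above.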
Bioengineering, Issue 92, fluorescence, protein dynamics, lipid dynamics, membrane heterogeneity, transient confinement, single molecule, GFP
Accuracy in Dental Medicine, A New Way to Measure Trueness and Precision
Authors: Andreas Ender, Albert Mehl.
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify many procedures. The main interest is to verify impression methods, as they serve as the base for dental restorations. The current limitation of many reference scanners is a lack of accuracy when scanning large objects such as full dental arches, or a limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus variation scanning technique, was evaluated with regard to highest local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions. Also, different model materials were verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision in the case of full arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects like single tooth surfaces can be scanned with an even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many dental research fields. The different magnification levels, combined with a high local and general accuracy, can be used to assess changes ranging from single teeth or restorations up to the full arch.
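Trueness and precision as used here can be illustrated with a toy computation: trueness as the mean deviation of test scans from the reference scan, precision as the mean deviation between repeated test scans. This is a hedged sketch on 1-D lists of corresponding surface distances (µm), standing in for full 3-D surface comparisons.

```python
import itertools
import statistics

def trueness(test_scans, reference):
    """Mean absolute deviation of each test scan from the reference scan.
    Each scan is reduced here to a list of corresponding surface points."""
    devs = [statistics.mean(abs(t - r) for t, r in zip(scan, reference))
            for scan in test_scans]
    return statistics.mean(devs)

def precision(test_scans):
    """Mean absolute deviation between all pairs of repeated test scans;
    high precision means repeated scans agree with each other even if they
    all deviate from the reference (low trueness)."""
    devs = [statistics.mean(abs(a - b) for a, b in zip(s1, s2))
            for s1, s2 in itertools.combinations(test_scans, 2)]
    return statistics.mean(devs)
```

In real use the scans would first be aligned (best-fit registration) before the point-wise deviations are taken.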
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Authors: Karin Hauffen, Eugene Bart, Mark Brady, Daniel Kersten, Jay Hegdé.
Institutions: Georgia Health Sciences University, Palo Alto Research Center, University of Minnesota.
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2. Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter5,9,10, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. 
Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Tumor Treating Field Therapy in Combination with Bevacizumab for the Treatment of Recurrent Glioblastoma
Authors: Ayman I. Omar.
Institutions: Southern Illinois University School of Medicine.
A novel device that employs TTF therapy has recently been developed and is currently in use for the treatment of recurrent glioblastoma (rGBM). It was FDA approved in April 2011 for the treatment of patients 22 years or older with rGBM. The device delivers alternating electric fields and is programmed to ensure maximal tumor cell kill1. Glioblastoma is the most common type of glioma and has an estimated incidence of approximately 10,000 new cases per year in the United States alone2. This tumor is particularly resistant to treatment and is uniformly fatal, especially in the recurrent setting3-5. Prior to the approval of the TTF System, the only FDA approved treatment for rGBM was bevacizumab6. Bevacizumab is a humanized monoclonal antibody targeted against the vascular endothelial growth factor (VEGF) protein that drives tumor angiogenesis7. By blocking the VEGF pathway, bevacizumab can result in a significant radiographic response (pseudoresponse), improve progression-free survival, and reduce corticosteroid requirements in rGBM patients8,9. Bevacizumab, however, failed to prolong overall survival in a recent phase III trial26. A pivotal phase III trial (EF-11) demonstrated comparable overall survival between physicians' choice chemotherapy and TTF Therapy, but better quality of life was observed in the TTF arm10. There is currently an unmet need to develop novel approaches designed to prolong overall survival and/or improve quality of life in this unfortunate patient population. One appealing approach would be to combine the two currently approved treatment modalities, namely bevacizumab and TTF Therapy. These two treatments are currently approved as monotherapy11,12, but their combination has never been evaluated in a clinical trial. We have developed an approach for combining those two treatment modalities and treated 2 rGBM patients.
Here we describe a detailed methodology outlining this novel treatment protocol and present representative data from one of the treated patients.
Medicine, Issue 92, Tumor Treating Fields, TTF System, TTF Therapy, Recurrent Glioblastoma, Bevacizumab, Brain Tumor
Electric Cell-substrate Impedance Sensing for the Quantification of Endothelial Proliferation, Barrier Function, and Motility
Authors: Robert Szulcek, Harm Jan Bogaard, Geerten P. van Nieuw Amerongen.
Institutions: Institute for Cardiovascular Research, VU University Medical Center.
Electric Cell-substrate Impedance Sensing (ECIS) is an in vitro impedance measuring system to quantify the behavior of cells within adherent cell layers. To this end, cells are grown in special culture chambers on top of opposing, circular gold electrodes. A constant small alternating current is applied between the electrodes and the potential across them is measured. The insulating properties of the cell membrane create a resistance towards the electrical current flow, resulting in an increased electrical potential between the electrodes. Measuring cellular impedance in this manner allows the automated study of cell attachment, growth, morphology, function, and motility. Although the ECIS measurement itself is straightforward and easy to learn, the underlying theory is complex, and selection of the right settings and correct analysis and interpretation of the data are not self-evident. Yet, a clear protocol describing the individual steps from the experimental design to preparation, realization, and analysis of the experiment is not available. In this article the basic measurement principle as well as possible applications, experimental considerations, advantages, and limitations of the ECIS system are discussed. A guide is provided for the study of cell attachment, spreading and proliferation; quantification of cell behavior in a confluent layer, with regard to barrier function, cell motility, and quality of cell-cell and cell-substrate adhesions; and quantification of wound healing and cellular responses to vasoactive stimuli. Representative results are discussed based on human microvascular endothelial cells (MVEC) and human umbilical vein endothelial cells (HUVEC), but are applicable to all adherently growing cells.
Bioengineering, Issue 85, ECIS, Impedance Spectroscopy, Resistance, TEER, Endothelial Barrier, Cell Adhesions, Focal Adhesions, Proliferation, Migration, Motility, Wound Healing
Magnetic Tweezers for the Measurement of Twist and Torque
Authors: Jan Lipfert, Mina Lee, Orkide Ordu, Jacob W. J. Kerssemakers, Nynke H. Dekker.
Institutions: Delft University of Technology.
Single-molecule techniques make it possible to investigate the behavior of individual biological molecules in solution in real time. These techniques include so-called force spectroscopy approaches such as atomic force microscopy, optical tweezers, flow stretching, and magnetic tweezers. Amongst these approaches, magnetic tweezers have distinguished themselves by their ability to apply torque while maintaining a constant stretching force. Here, it is illustrated how such a “conventional” magnetic tweezers experimental configuration can, through a straightforward modification of its field configuration to minimize the magnitude of the transverse field, be adapted to measure the degree of twist in a biological molecule. The resulting configuration is termed the freely-orbiting magnetic tweezers. Additionally, it is shown how further modification of the field configuration can yield a transverse field with a magnitude intermediate between that of the “conventional” magnetic tweezers and the freely-orbiting magnetic tweezers, which makes it possible to directly measure the torque stored in a biological molecule. This configuration is termed the magnetic torque tweezers. The accompanying video explains in detail how the conversion of conventional magnetic tweezers into freely-orbiting magnetic tweezers and magnetic torque tweezers can be accomplished, and demonstrates the use of these techniques. These adaptations maintain all the strengths of conventional magnetic tweezers while greatly expanding the versatility of this powerful instrument.
Bioengineering, Issue 87, magnetic tweezers, magnetic torque tweezers, freely-orbiting magnetic tweezers, twist, torque, DNA, single-molecule techniques
Utility of Dissociated Intrinsic Hand Muscle Atrophy in the Diagnosis of Amyotrophic Lateral Sclerosis
Authors: Parvathi Menon, Steve Vucic.
Institutions: Westmead Hospital, University of Sydney, Australia.
The split hand phenomenon refers to predominant wasting of thenar muscles and is an early and specific feature of amyotrophic lateral sclerosis (ALS). A novel split hand index (SI) was developed to quantify the split hand phenomenon, and its diagnostic utility was assessed in ALS patients. The split hand index was derived by dividing the product of the compound muscle action potential (CMAP) amplitude recorded over the abductor pollicis brevis and first dorsal interosseous muscles by the CMAP amplitude recorded over the abductor digiti minimi muscle. In order to assess the diagnostic utility of the split hand index, ALS patients were prospectively assessed and their results were compared to neuromuscular disorder patients. The split hand index was significantly reduced in ALS when compared to neuromuscular disorder patients (P<0.0001). Limb-onset ALS patients exhibited the greatest reduction in the split hand index, and a value of 5.2 or less reliably differentiated ALS from other neuromuscular disorders. Consequently, the split hand index appears to be a novel diagnostic biomarker for ALS, perhaps facilitating an earlier diagnosis.
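The index as described reduces to one line of arithmetic; a minimal sketch follows (CMAP amplitudes in mV; the 5.2 cutoff is the value reported above, and the function names are ours):

```python
def split_hand_index(cmap_apb, cmap_fdi, cmap_adm):
    """Split hand index: the product of the CMAP amplitudes over the abductor
    pollicis brevis (APB) and first dorsal interosseous (FDI), divided by the
    CMAP amplitude over the abductor digiti minimi (ADM)."""
    return (cmap_apb * cmap_fdi) / cmap_adm

def suggests_als(si, cutoff=5.2):
    """In the study, an index of 5.2 or less reliably differentiated ALS
    from other neuromuscular disorders."""
    return si <= cutoff

# Hypothetical amplitudes (mV): wasted thenar muscles, preserved hypothenar.
si = split_hand_index(2.0, 5.0, 4.0)
```

Because thenar (APB, FDI) wasting lowers the numerator while the hypothenar (ADM) amplitude is relatively preserved, the index falls preferentially in ALS.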
Medicine, Issue 85, Amyotrophic Lateral Sclerosis (ALS), dissociated muscle atrophy, hypothenar muscles, motor neuron disease, split-hand index, thenar muscles
Digital Inline Holographic Microscopy (DIHM) of Weakly-scattering Subjects
Authors: Camila B. Giuliano, Rongjing Zhang, Laurence G. Wilson.
Institutions: Harvard University, Universidade Estadual Paulista.
Weakly-scattering objects, such as small colloidal particles and most biological cells, are frequently encountered in microscopy. Indeed, a range of techniques have been developed to better visualize these phase objects; phase contrast and DIC are among the most popular methods for enhancing contrast. However, recording position and shape in the out-of-imaging-plane direction remains challenging. This report introduces a simple experimental method to accurately determine the location and geometry of objects in three dimensions, using digital inline holographic microscopy (DIHM). Broadly speaking, the accessible sample volume is defined by the camera sensor size in the lateral direction, and the illumination coherence in the axial direction. Typical sample volumes range from 200 µm x 200 µm x 200 µm using LED illumination, to 5 mm x 5 mm x 5 mm or larger using laser illumination. This illumination light is configured so that plane waves are incident on the sample. Objects in the sample volume then scatter light, which interferes with the unscattered light to form interference patterns perpendicular to the illumination direction. This image (the hologram) contains the depth information required for three-dimensional reconstruction, and can be captured on a standard imaging device such as a CMOS or CCD camera. The Rayleigh-Sommerfeld back propagation method is employed to numerically refocus microscope images, and a simple imaging heuristic based on the Gouy phase anomaly is used to identify scattering objects within the reconstructed volume. This simple but robust method results in an unambiguous, model-free measurement of the location and shape of objects in microscopic samples.
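Numerical refocusing of this kind is commonly implemented with the angular spectrum method, one standard discrete form of Rayleigh-Sommerfeld propagation. The sketch below is illustrative (scalar field, square pixels, evanescent components suppressed), not the authors' exact code.

```python
import numpy as np

def refocus(field, z, wavelength, dx):
    """Propagate a 2-D complex field a distance z using the angular spectrum
    method. dx is the pixel pitch; z, wavelength, and dx must share one
    length unit. z = 0 returns the field unchanged."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)          # spatial frequencies along x
    fy = np.fft.fftfreq(ny, d=dx)          # and along y (square pixels)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # Zero out evanescent components (arg < 0) rather than let them blow up.
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))
```

Scanning z through the sample volume and searching the reconstructed stack for the Gouy phase signature then localizes the scatterers, as described above.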
Basic Protocol, Issue 84, holography, digital inline holographic microscopy (DIHM), Microbiology, microscopy, 3D imaging, Streptococcus bacteria
Clinical Assessment of Spatiotemporal Gait Parameters in Patients and Older Adults
Authors: Julia F. Item-Glatthorn, Nicola A. Maffiuletti.
Institutions: Schulthess Clinic.
Spatial and temporal characteristics of human walking are frequently evaluated to identify possible gait impairments, mainly in orthopedic and neurological patients1-4, but also in healthy older adults5,6. The quantitative gait analysis described in this protocol is performed with a recently-introduced photoelectric system (see Materials table) which has the potential to be used in the clinic because it is portable, easy to set up (no subject preparation is required before a test), and does not require maintenance and sensor calibration. The photoelectric system consists of series of high-density floor-based photoelectric cells with light-emitting and light-receiving diodes that are placed parallel to each other to create a corridor, and are oriented perpendicular to the line of progression7. The system simply detects interruptions in light signal, for instance due to the presence of feet within the recording area. Temporal gait parameters and 1D spatial coordinates of consecutive steps are subsequently calculated to provide common gait parameters such as step length, single limb support and walking velocity8, whose validity against a criterion instrument has recently been demonstrated7,9. The measurement procedures are very straightforward; a single patient can be tested in less than 5 min and a comprehensive report can be generated in less than 1 min.
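The spatiotemporal parameters mentioned can be derived from little more than the times and 1-D positions of successive foot contacts along the corridor. The sketch below is a generic illustration (units and names are ours), not the vendor's algorithm.

```python
def gait_parameters(contacts):
    """contacts: list of (time_s, position_cm) for successive heel strikes
    along the line of progression. Returns mean step length (cm), mean step
    time (s), and walking velocity (cm/s)."""
    steps = list(zip(contacts, contacts[1:]))
    lengths = [abs(p2 - p1) for (_, p1), (_, p2) in steps]
    times = [t2 - t1 for (t1, _), (t2, _) in steps]
    mean_length = sum(lengths) / len(lengths)
    mean_time = sum(times) / len(times)
    return mean_length, mean_time, mean_length / mean_time

# Hypothetical heel strikes: one every 0.5 s, 60 cm apart.
length, step_time, velocity = gait_parameters(
    [(0.0, 0.0), (0.5, 60.0), (1.0, 120.0)])
```

Single limb support and other temporal phases would additionally require the foot-off times, which the photocell grid also records.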
Medicine, Issue 93, gait analysis, walking, floor-based photocells, spatiotemporal, elderly, orthopedic patients, neurological patients
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. 
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Contextual and Cued Fear Conditioning Test Using a Video Analyzing System in Mice
Authors: Hirotaka Shoji, Keizo Takao, Satoko Hattori, Tsuyoshi Miyakawa.
Institutions: Fujita Health University, Core Research for Evolutionary Science and Technology (CREST), National Institutes of Natural Sciences.
The contextual and cued fear conditioning test is one of the behavioral tests that assess the ability of mice to learn and remember an association between environmental cues and aversive experiences. In this test, mice are placed into a conditioning chamber and are given pairings of a conditioned stimulus (an auditory cue) and an aversive unconditioned stimulus (an electric footshock). After a delay, the mice are exposed to the same conditioning chamber and to a differently shaped chamber with presentation of the auditory cue. Freezing behavior during the test is measured as an index of fear memory. To analyze the behavior automatically, we have developed a video analyzing system using the ImageFZ application software program, which is available as a free download. Here, to show the details of our protocol, we demonstrate our procedure for the contextual and cued fear conditioning test in C57BL/6J mice using the ImageFZ system. In addition, we validated our protocol and the video analyzing system performance by comparing freezing time measured by the ImageFZ system or a photobeam-based computer measurement system with that scored by a human observer. As shown in our representative results, the data obtained by ImageFZ were similar to those analyzed by a human observer, indicating that behavioral analysis using the ImageFZ system is highly reliable. The present video article provides detailed information regarding the test procedures and will promote understanding of the experimental situation.
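Automated freezing scoring of this kind typically thresholds inter-frame pixel change and requires immobility to persist for a minimum duration. The sketch below is a generic illustration of that idea; the thresholds and names are assumptions, not the ImageFZ implementation.

```python
import numpy as np

def freezing_time(frames, diff_thresh, min_frames, fps):
    """Seconds spent freezing: runs of at least `min_frames` consecutive
    frames whose mean absolute inter-frame pixel difference falls below
    `diff_thresh` count as freezing."""
    moving = [np.mean(np.abs(b.astype(float) - a.astype(float))) >= diff_thresh
              for a, b in zip(frames, frames[1:])]
    total = run = 0
    for m in moving:
        if not m:
            run += 1            # extend the current immobile run
        else:
            if run >= min_frames:
                total += run    # close a long-enough immobile run
            run = 0
    if run >= min_frames:
        total += run            # video may end mid-freeze
    return total / fps
```

The minimum-duration rule mimics human scoring, which ignores momentary pauses shorter than a second or two.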
Behavior, Issue 85, Fear, Learning, Memory, ImageFZ program, Mouse, contextual fear, cued fear
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g., by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L mol⁻¹ cm⁻¹)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
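Two small calculations in this protocol can be sketched: the Beer-Lambert concentration check using the stated extinction coefficient, and a typical degranulation normalization. The normalization formula is a common convention, shown here as an assumption rather than the paper's exact equation.

```python
def triclosan_concentration(a280, path_cm=1.0, epsilon=4200.0):
    """Molar concentration from absorbance at 280 nm via the Beer-Lambert
    law, c = A / (epsilon * l), using the reported extinction coefficient
    of 4,200 L/(mol*cm)."""
    return a280 / (epsilon * path_cm)

def percent_degranulation(f_sample, f_spont, f_total):
    """Beta-hexosaminidase release as a percentage of the total (lysate)
    fluorescence, corrected for spontaneous release. This normalization is
    a common convention, not necessarily the paper's exact formula."""
    return 100.0 * (f_sample - f_spont) / (f_total - f_spont)
```

For example, an absorbance of 0.42 in a 1 cm cuvette corresponds to a 100 µM triclosan stock.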
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals, including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors, such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations, and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Quantification of Orofacial Phenotypes in Xenopus
Authors: Allyson E. Kennedy, Amanda J. Dickinson.
Institutions: Virginia Commonwealth University.
Xenopus has become an important tool for dissecting the mechanisms governing craniofacial development and defects. A method to quantify orofacial development will allow more rigorous analysis of orofacial phenotypes upon perturbation with substances that genetically or molecularly manipulate gene expression or protein function. Using two-dimensional images of the embryonic heads, traditional size dimensions, such as orofacial width, height, and area, are measured. In addition, a roundness measure of the embryonic mouth opening is used to describe the shape of the mouth. Geometric morphometrics of these two-dimensional images is also performed to provide a more sophisticated view of changes in the shape of the orofacial region. Landmarks are assigned to specific points in the orofacial region and coordinates are created. A principal component analysis is used to reduce the landmark coordinates to principal components that then discriminate between the treatment groups. These results are displayed as a scatter plot in which individuals with similar orofacial shapes cluster together. It is also useful to perform a discriminant function analysis, which statistically compares the positions of the landmarks between two treatment groups. This analysis is displayed on a transformation grid where changes in landmark position are viewed as vectors. A grid is superimposed on these vectors so that a warping pattern is displayed to show where landmark positions have changed significantly. Shape changes in the discriminant function analysis are based on a statistical measure, and therefore can be evaluated by a p-value. This analysis is simple and accessible, requiring only a stereoscope and freeware, and thus will be a valuable research and teaching resource.
Developmental Biology, Issue 93, Orofacial quantification, geometric morphometrics, Xenopus, orofacial development, orofacial defects, shape changes, facial dimensions
Measurement of Tension Release During Laser Induced Axon Lesion to Evaluate Axonal Adhesion to the Substrate at Piconewton and Millisecond Resolution
Authors: Massimo Vassalli, Michele Basso, Francesco Difato.
Institutions: National Research Council of Italy, Università di Firenze, Istituto Italiano di Tecnologia.
The formation of functional connections in a developing neuronal network is influenced by extrinsic cues. The neurite growth of developing neurons is subject to chemical and mechanical signals, but the mechanisms by which neurons sense and respond to mechanical signals are poorly understood. Elucidating the role of forces in cell maturation will enable the design of scaffolds that can promote cell adhesion and cytoskeletal coupling to the substrate, and therefore improve the capacity of different neuronal types to regenerate after injury. Here, we describe a method to perform simultaneous force spectroscopy measurements during laser-induced cell lesion. We measure tension release in the partially lesioned axon by simultaneous interferometric tracking of an optically trapped probe adhered to the membrane of the axon. Our experimental protocol detects the tension release with piconewton sensitivity, and the dynamics of the tension release at millisecond time resolution. Therefore, it offers a high-resolution method to study how the mechanical coupling between cells and substrates can be modulated by pharmacological treatment and/or by distinct mechanical properties of the substrate.
Bioengineering, Issue 75, Biophysics, Neuroscience, Cellular Biology, Biomedical Engineering, Engineering (General), Life Sciences (General), Physics (General), Axon, tension release, Laser dissector, optical tweezers, force spectroscopy, neurons, neurites, cytoskeleton, adhesion, cell culture, microscopy
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) to demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) to contrast the effects of the proposed technique and of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring the inertial properties of known geometrical solids. When direct measurements of the inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity, compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia estimates on study outcomes. For stance, either of the two inertial models investigated in our study would likely lead to similar outcomes in an inverse dynamics assessment.
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability and improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Minimal Erythema Dose (MED) Testing
Authors: Carolyn J. Heckman, Rachel Chandler, Jacqueline D. Kloss, Amy Benson, Deborah Rooney, Teja Munshi, Susan D. Darlow, Clifford Perlis, Sharon L. Manne, David W. Oslin.
Institutions: Fox Chase Cancer Center, University of Pennsylvania, Drexel University, The Cancer Institute of New Jersey.
Ultraviolet radiation (UV) therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct MED testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, which requires observation hours after exposure, or resorting to informal trial-and-error testing, with its risks of under- or overdosing. However, some alternative methods are discussed.
Medicine, Issue 75, Anatomy, Physiology, Dermatology, Analytical, Diagnostic, Therapeutic Techniques, Equipment, Health Care, Minimal erythema dose (MED) testing, skin sensitivity, ultraviolet radiation, spectrophotometry, UV exposure, psoriasis, acne, eczema, clinical techniques
Concentration Determination of Nucleic Acids and Proteins Using the Micro-volume Bio-spec Nano Spectrophotometer
Authors: Suja Sukumaran.
Institutions: Scientific Instruments.
Nucleic acid quantitation procedures have advanced significantly in the last three decades. More and more, molecular biologists require consistent small-volume analysis of nucleic acid samples for their experiments. The BioSpec-nano provides a potential solution to the problems of inaccurate, non-reproducible results inherent in current DNA quantitation methods, via specialized optics and a sensitive PDA detector. The BioSpec-nano also has automated functionality such that mounting, measurement, and cleaning are done by the instrument, thereby eliminating the tedious, repetitive, and inconsistent placement of the fiber optic element and manual cleaning. In this study, data are presented on the quantification of DNA and protein, as well as on measurement reproducibility and accuracy. Automated sample contact and rapid scanning allow measurement in three seconds, resulting in excellent throughput. Data analysis is carried out using the built-in features of the software. The formula used for calculating DNA concentration is: Sample Concentration = DF · (OD260 − OD320) · NACF (1), where DF = sample dilution factor and NACF = nucleic acid concentration factor. The nucleic acid concentration factor is set in accordance with the analyte selected1. Protein concentration results can be expressed as μg/mL or as moles/L by entering ε280 and molecular weight values, respectively. When residue values for Tyr, Trp, and cysteine (S-S bond) are entered in the e280Calc tab, the extinction coefficient is calculated as ε280 = 5,500 × (Trp residues) + 1,490 × (Tyr residues) + 125 × (cysteine S-S bonds). The ε280 value is used by the software for concentration calculation. In addition to concentration determination of nucleic acids and protein, the BioSpec-nano can be used as an ultra-micro-volume spectrophotometer for many other analytes or as a standard spectrophotometer using 5 mm pathlength cells.
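The two formulas quoted above can be expressed directly in code. The sketch below is illustrative, not the BioSpec-nano software itself; the NACF default of 50 ng/µL per absorbance unit is the conventional value for double-stranded DNA and is an assumption here, since the instrument sets NACF according to the analyte selected.

```python
# Sketch of the DNA-concentration and extinction-coefficient formulas
# quoted in the abstract. Function names are illustrative; the NACF
# default of 50 (dsDNA convention) is an assumption.

def dna_concentration(od260, od320, dilution_factor, nacf=50.0):
    """Sample concentration = DF * (OD260 - OD320) * NACF (ng/uL)."""
    return dilution_factor * (od260 - od320) * nacf

def epsilon_280(n_trp, n_tyr, n_cystine):
    """epsilon280 = 5500*Trp + 1490*Tyr + 125*(cystine S-S bonds), in L/(mol*cm)."""
    return 5500 * n_trp + 1490 * n_tyr + 125 * n_cystine

# Hypothetical readings: a 1:10 dilution with OD260 = 0.50, OD320 = 0.02
print(dna_concentration(od260=0.50, od320=0.02, dilution_factor=10))  # ~240 ng/uL
# Hypothetical protein with 2 Trp, 8 Tyr, and 1 disulfide bond
print(epsilon_280(n_trp=2, n_tyr=8, n_cystine=1))  # 23045
```

The OD320 subtraction serves as a simple background correction, which is why it appears inside the parentheses rather than as a separate step.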
Molecular Biology, Issue 48, Nucleic acid quantitation, protein quantitation, micro-volume analysis, label quantitation
Deep Neuromuscular Blockade Leads to a Larger Intraabdominal Volume During Laparoscopy
Authors: Astrid Listov Lindekaer, Henrik Halvor Springborg, Olav Istre.
Institutions: Aleris-Hamlet Hospitals, Soeborg, Denmark.
Shoulder pain is a commonly reported symptom following laparoscopic procedures such as myomectomy or hysterectomy, and recent studies have shown that lowering the insufflation pressure during surgery may reduce the risk of post-operative pain. In this pilot study, a method is presented for measuring the intra-abdominal space available to the surgeon during laparoscopy, in order to examine whether the relaxation produced by deep neuromuscular blockade can increase the working surgical space sufficiently to permit a reduction in the CO2 insufflation pressure. Using the laparoscopic grasper, the distance from the promontory to the skin is measured at two different insufflation pressures: 8 mm Hg and 12 mm Hg. After the initial measurements, a neuromuscular blocking agent (rocuronium) is administered to the patient and the intra-abdominal volume is measured again. Pilot data collected from 15 patients show that the intra-abdominal space at 8 mm Hg with blockade is comparable to the intra-abdominal space measured at 12 mm Hg without blockade. The impact of neuromuscular blockade was not correlated with patient height, weight, BMI, or age. Thus, using neuromuscular blockade to maintain a steady volume while reducing insufflation pressure may produce improved patient outcomes.
Medicine, Issue 76, Anatomy, Physiology, Neurobiology, Surgery, gynecology, laparoscopy, deep neuromuscular blockade, reversal, rocuronium, sugammadex, laparoscopic surgery, clinical techniques, surgical techniques
Non-invasive Assessment of Microvascular and Endothelial Function
Authors: Cynthia Cheng, Constantine Daskalakis, Bonita Falkner.
Institutions: Thomas Jefferson University.
The authors have utilized capillaroscopy and forearm blood flow techniques to investigate the role of microvascular dysfunction in the pathogenesis of cardiovascular disease. Capillaroscopy is a non-invasive, relatively inexpensive methodology for directly visualizing the microcirculation. Percent capillary recruitment is assessed by dividing the increase in capillary density induced by postocclusive reactive hyperemia (postocclusive reactive hyperemia capillary density minus baseline capillary density) by the maximal capillary density (observed during passive venous occlusion). Percent perfused capillaries represents the proportion of all capillaries present that are perfused (functionally active), and is calculated by dividing the postocclusive reactive hyperemia capillary density by the maximal capillary density. Both percent capillary recruitment and percent perfused capillaries reflect the number of functional capillaries. The forearm blood flow (FBF) technique provides accepted non-invasive measures of endothelial function: the ratio FBFmax/FBFbase is computed as an estimate of vasodilation by dividing the mean of the four FBFmax values by the mean of the four FBFbase values. Forearm vascular resistance at maximal vasodilation (FVRmax) is calculated as the mean arterial pressure (MAP) divided by FBFmax. Both the capillaroscopy and forearm techniques are readily acceptable to patients and can be learned quickly. The microvascular and endothelial function measures obtained using the methodologies described in this paper may have future utility in clinical cardiovascular risk-reduction strategies. As we have published reports demonstrating that microvascular and endothelial dysfunction are found in the initial stages of hypertension, including prehypertension, these measures may eventually aid in the early identification, risk stratification, and prevention of end-stage vascular pathology, with its potentially fatal consequences.
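The derived measures described above are simple ratios. They can be sketched as follows, with hypothetical example values; the function names are illustrative and not part of the authors' protocol.

```python
# Sketch of the capillaroscopy and forearm-blood-flow (FBF) measures
# described in the abstract. All input values below are hypothetical
# examples, not patient data.

def pct_capillary_recruitment(baseline, hyperemia, maximal):
    """(post-occlusive hyperemia density - baseline density) / maximal density, as %."""
    return 100.0 * (hyperemia - baseline) / maximal

def pct_perfused_capillaries(hyperemia, maximal):
    """post-occlusive hyperemia density / maximal density, as %."""
    return 100.0 * hyperemia / maximal

def vasodilation_ratio(fbf_max_vals, fbf_base_vals):
    """FBFmax/FBFbase: mean of the four maximal flows over mean of the four baselines."""
    return (sum(fbf_max_vals) / len(fbf_max_vals)) / (sum(fbf_base_vals) / len(fbf_base_vals))

def fvr_max(map_mmhg, fbf_max):
    """Forearm vascular resistance at maximal vasodilation: MAP / FBFmax."""
    return map_mmhg / fbf_max

print(pct_capillary_recruitment(baseline=40, hyperemia=52, maximal=60))  # 20.0 %
print(pct_perfused_capillaries(hyperemia=52, maximal=60))  # ~86.7 %
print(vasodilation_ratio([28, 30, 32, 30], [3, 3, 3, 3]))  # 10.0
print(fvr_max(map_mmhg=93, fbf_max=30))  # 3.1
```

Note that capillary recruitment and perfused-capillary percentages share the same denominator (maximal density during venous occlusion), which is what makes the two measures directly comparable.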
Medicine, Issue 71, Anatomy, Physiology, Immunology, Pharmacology, Hematology, Diseases, Health Care, Life sciences, Microcirculation, endothelial dysfunction, capillary density, microvascular function, blood vessels, capillaries, capillary, venous occlusion, circulation, experimental therapeutics, capillaroscopy
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction to multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.