Generating correlation matrices based on the boundaries of their coefficients.
Correlation coefficients among multiple variables are commonly described in the form of matrices. Applications of such correlation matrices can be found in many fields, such as finance, engineering, statistics, and medicine. This article proposes an efficient way to sequentially obtain the theoretical bounds of correlation coefficients together with an algorithm to generate n × n correlation matrices using any bounded random variables. Interestingly, the correlation matrices generated by this method using uniform random variables as an example produce more extreme relationships among the variables than other methods, which might be useful for modeling complex biological systems where rare cases are very important.
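The sequential-bounds idea can be sketched for the 3 × 3 case, where the admissible interval for the last coefficient has a closed form (a minimal illustration under that assumption, not the authors' general n × n algorithm):

```python
import numpy as np

def random_corr3(rng):
    """Generate a random 3 x 3 correlation matrix by sampling each new
    coefficient uniformly within its theoretical bounds.

    With r12 and r13 fixed, positive semidefiniteness requires
    r23 in [r12*r13 - s, r12*r13 + s], where s = sqrt((1-r12^2)(1-r13^2)).
    """
    r12, r13 = rng.uniform(-1.0, 1.0, size=2)
    s = np.sqrt((1.0 - r12**2) * (1.0 - r13**2))
    r23 = rng.uniform(r12 * r13 - s, r12 * r13 + s)
    return np.array([[1.0, r12, r13],
                     [r12, 1.0, r23],
                     [r13, r23, 1.0]])
```

Every matrix produced this way is symmetric with unit diagonal and non-negative eigenvalues; extending to n × n requires recomputing the bounds after each new coefficient is fixed, which is what the sequential procedure described above provides.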
Authors: Thomas Z. Thompson, Farres Obeidin, Alisa A. Davidoff, Cody L. Hightower, Christopher Z. Johnson, Sonya L. Rice, Rebecca-Lyn Sokolove, Brandon K. Taylor, John M. Tuck, William G. Pearson, Jr.
Published: 05-06-2014
Characterizing hyolaryngeal movement is important to dysphagia research. Prior methods require multiple measurements to obtain one kinematic measurement, whereas coordinate mapping of hyolaryngeal mechanics using Modified Barium Swallow (MBS) uses one set of coordinates to calculate multiple variables of interest. For demonstration purposes, ten kinematic measurements were generated from one set of coordinates to determine differences in swallowing two different bolus types. Calculations of hyoid excursion against the vertebrae and against the mandible are correlated to determine the importance of the axes of reference. To demonstrate the coordinate mapping methodology, 40 MBS studies were randomly selected from a dataset of healthy subjects with no known swallowing impairment. A 5 ml thin-liquid swallow and a 5 ml pudding swallow were measured for each subject. Nine coordinates, mapping the cranial base, mandible, vertebrae, and elements of the hyolaryngeal complex, were recorded at the frames of minimum and maximum hyolaryngeal excursion. Coordinates were mathematically converted into ten variables of hyolaryngeal mechanics. Inter-rater reliability was evaluated by intraclass correlation coefficients (ICC). Two-tailed t-tests were used to evaluate differences in kinematics by bolus viscosity. Hyoid excursion measurements against different axes of reference were correlated. Inter-rater reliability among six raters for the 18 coordinates (nine per frame, at two frames) ranged from ICC = 0.90 - 0.97. A slate of ten kinematic measurements was compared by subject between the six raters. One outlier was rejected, and the mean of the remaining reliability scores was ICC = 0.91 (95% CI: 0.84 - 0.96). Two-tailed t-tests with Bonferroni corrections comparing ten kinematic variables (5 ml thin-liquid vs. 5 ml pudding swallows) showed statistically significant differences in hyoid excursion, superior laryngeal movement, and pharyngeal shortening (p < 0.005).
Pearson correlations of hyoid excursion measurements from two different axes of reference were r = 0.62, r² = 0.38 (thin-liquid) and r = 0.52, r² = 0.27 (pudding). Obtaining landmark coordinates is a reliable method for generating multiple kinematic variables from videofluoroscopic images, useful in dysphagia research.
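The core conversion, from one set of landmark coordinates to a kinematic variable, amounts to projecting the hyoid displacement onto an anatomical axis of reference. A minimal sketch (the landmark names and 2D pixel coordinates are illustrative, not the authors' notation):

```python
import numpy as np

def hyoid_excursion(hyoid_rest, hyoid_max, axis_origin, axis_point):
    """Resolve hyoid displacement between two MBS frames into components
    along and perpendicular to an axis defined by two reference landmarks
    (e.g. vertebral points), so that head motion does not contaminate the
    measurement."""
    axis = np.asarray(axis_point, float) - np.asarray(axis_origin, float)
    axis /= np.linalg.norm(axis)
    disp = np.asarray(hyoid_max, float) - np.asarray(hyoid_rest, float)
    along = float(disp @ axis)                           # excursion along the axis
    perp = float(axis[0] * disp[1] - axis[1] * disp[0])  # signed 2D cross product
    return along, perp
```

Because the reported correlations differ by axis of reference (r = 0.62 vs. r = 0.52), the choice of the two reference landmarks is part of the measurement definition, not an implementation detail.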
Easy Measurement of Diffusion Coefficients of EGFP-tagged Plasma Membrane Proteins Using k-Space Image Correlation Spectroscopy
Authors: Eva C. Arnspang, Jennifer S. Koffman, Saw Marlar, Paul W. Wiseman, Lene N. Nejsum.
Institutions: Aarhus University, McGill University.
Lateral diffusion and compartmentalization of plasma membrane proteins are tightly regulated in cells; thus, studying these processes will reveal new insights into plasma membrane protein function and regulation. Recently, k-Space Image Correlation Spectroscopy (kICS)1 was developed to enable routine measurements of diffusion coefficients directly from images of fluorescently tagged plasma membrane proteins, while avoiding systematic biases introduced by probe photophysics. Although the theoretical basis for the analysis is complex, the method can be implemented by nonexperts using freely available code to measure diffusion coefficients of proteins. kICS calculates a time correlation function from a fluorescence microscopy image stack after Fourier transformation of each image to reciprocal (k-) space. Subsequently, circular averaging, a natural logarithm transform, and linear fits to the correlation function yield the diffusion coefficient. This paper provides a step-by-step guide to the image analysis and measurement of diffusion coefficients via kICS. First, a high frame rate image sequence of a fluorescently labeled plasma membrane protein is acquired using a fluorescence microscope. Then, a region of interest (ROI) avoiding intracellular organelles, moving vesicles, or protruding membrane regions is selected. The ROI stack is imported into the freely available code, and several defined parameters (see Method section) are set for kICS analysis. The program then generates a "slope of slopes" plot from the k-space time correlation functions, and the diffusion coefficient is calculated from the slope of the plot. Below is a step-by-step kICS procedure to measure the diffusion coefficient of a membrane protein, using the renal water channel aquaporin-3 tagged with EGFP as a canonical example.
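The final fitting step has a simple form: for free diffusion the circularly averaged correlation decays as exp(-D k² τ), so two successive linear fits recover D. A sketch of that fitting step only, applied to synthetic correlation values rather than real image data:

```python
import numpy as np

def kics_diffusion(corr, k2, taus):
    """Recover D from a circularly averaged k-space time correlation.

    corr[i, j] is the correlation at squared wavenumber k2[i] and lag
    taus[j].  ln(corr) vs k2 is a line of slope -D*tau for each lag; the
    "slope of slopes" fitted against tau then yields -D.
    """
    slopes = [np.polyfit(k2, np.log(corr[:, j]), 1)[0] for j in range(len(taus))]
    return -np.polyfit(taus, slopes, 1)[0]

# synthetic correlation with a known D = 0.25 (arbitrary units)
k2 = np.linspace(0.1, 2.0, 20)
taus = np.arange(1.0, 6.0)
corr = np.exp(-0.25 * np.outer(k2, taus))
```

On real data the low-k², short-lag region of the correlation function is the reliable part of the fit; the freely available analysis code exposes the corresponding cutoff parameters.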
Biophysics, Issue 87, Amino Acids, Peptides and Proteins, Computer Programming and Software, Diffusion coefficient, Aquaporin-3, k-Space Image Correlation Spectroscopy, Analysis
Dithranol as a Matrix for Matrix Assisted Laser Desorption/Ionization Imaging on a Fourier Transform Ion Cyclotron Resonance Mass Spectrometer
Authors: Cuong H. Le, Jun Han, Christoph H. Borchers.
Institutions: University of Victoria.
Mass spectrometry imaging (MSI) determines the spatial localization and distribution patterns of compounds on the surface of a tissue section, mainly using MALDI (matrix assisted laser desorption/ionization)-based analytical techniques. New matrices for small-molecule MSI, which can improve the analysis of low-molecular weight (MW) compounds, are needed. These matrices should provide increased analyte signals while decreasing MALDI background signals. In addition, the use of ultrahigh-resolution instruments, such as Fourier transform ion cyclotron resonance (FTICR) mass spectrometers, has the ability to resolve analyte signals from matrix signals, and this can partially overcome many problems associated with the background originating from the MALDI matrix. The reduction in the intensities of the metastable matrix clusters by FTICR MS can also help to overcome some of the interferences associated with matrix peaks on other instruments. High-resolution instruments such as the FTICR mass spectrometers are advantageous as they can produce distribution patterns of many compounds simultaneously while still providing confidence in chemical identifications. Dithranol (DT; 1,8-dihydroxy-9,10-dihydroanthracen-9-one) has previously been reported as a MALDI matrix for tissue imaging. In this work, a protocol for the use of DT for MALDI imaging of endogenous lipids from the surfaces of mammalian tissue sections, by positive-ion MALDI-MS, on an ultrahigh-resolution hybrid quadrupole FTICR instrument has been provided.
Basic Protocol, Issue 81, eye, molecular imaging, chemistry technique, analytical, mass spectrometry, matrix assisted laser desorption/ionization (MALDI), tandem mass spectrometry, lipid, tissue imaging, bovine lens, dithranol, matrix, FTICR (Fourier Transform Ion Cyclotron Resonance)
Recapitulation of an Ion Channel IV Curve Using Frequency Components
Authors: John R. Rigby, Steven Poelzing.
Institutions: University of Utah.
INTRODUCTION: Presently, there are no established methods to measure multiple ion channel types simultaneously and decompose the measured current into portions attributable to each channel type. This study demonstrates how impedance spectroscopy may be used to identify specific frequencies that highly correlate with the steady state current amplitude measured during voltage clamp experiments. The method involves inserting a noise function containing specific frequencies into the voltage step protocol. In the work presented, a model cell is used to demonstrate that no high correlations are introduced by the voltage clamp circuitry, and also that the noise function itself does not introduce any high correlations when no ion channels are present. This validation is necessary before the technique can be applied to preparations containing ion channels. The purpose of the protocol presented is to demonstrate how to characterize the frequency response of a single ion channel type to a noise function. Once specific frequencies have been identified in an individual channel type, they can be used to reproduce the steady state current voltage (IV) curve. Frequencies that highly correlate with one channel type and minimally correlate with other channel types may then be used to estimate the current contribution of multiple channel types measured simultaneously. METHODS: Voltage clamp measurements were performed on a model cell using a standard voltage step protocol (-150 to +50 mV, 5 mV steps). Noise functions containing equal magnitudes of 1-15 kHz frequencies (zero-to-peak amplitudes: 50 or 100 mV) were inserted into each voltage step. The real component of the Fast Fourier transform (FFT) of the output signal was calculated with and without noise for each step potential. The magnitude of each frequency as a function of voltage step was correlated with the current amplitude at the corresponding voltages.
RESULTS AND CONCLUSIONS: In the absence of noise (control), magnitudes of all frequencies except the DC component correlated poorly (|R|<0.5) with the IV curve, whereas the DC component had a correlation coefficient greater than 0.999 in all measurements. The quality of correlation between individual frequencies and the IV curve did not change when a noise function was added to the voltage step protocol. Likewise, increasing the amplitude of the noise function also did not increase the correlation. Control measurements demonstrate that the voltage clamp circuitry by itself does not cause any frequencies above 0 Hz to highly correlate with the steady-state IV curve. Likewise, measurements in the presence of the noise function demonstrate that the noise function does not cause any frequencies above 0 Hz to correlate with the steady-state IV curve when no ion channels are present. Based on this verification, the method can now be applied to preparations containing a single ion channel type with the intent of identifying frequencies whose amplitudes correlate specifically with that channel type.
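The correlation step described above can be sketched as follows: take the real component of the FFT of each step's current trace and correlate each frequency bin, across voltage steps, with the steady-state current. Array shapes and the synthetic "control" traces below are illustrative assumptions, not the recorded data:

```python
import numpy as np

def frequency_iv_correlation(traces, steady_state):
    """|R| between each FFT bin's real component and the steady-state IV
    curve.  traces: (n_steps, n_samples) current recordings, one row per
    voltage step; bin 0 of the result is the DC component."""
    spectra = np.fft.rfft(traces, axis=1).real
    return np.abs([np.corrcoef(spectra[:, f], steady_state)[0, 1]
                   for f in range(spectra.shape[1])])

# synthetic control case: traces carry only the steady-state level plus noise
rng = np.random.default_rng(2)
ss = np.linspace(-150.0, 50.0, 41)              # one current value per step
traces = ss[:, None] + 0.01 * rng.standard_normal((41, 64))
r = frequency_iv_correlation(traces, ss)
```

As in the control measurements reported above, only the DC bin should approach |R| = 1 in this case; bins that correlate highly only when a given channel type is present are the ones usable for decomposition.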
Biophysics, Issue 48, Ion channel, Kir2.1, impedance spectroscopy, frequency response, voltage clamp, electrophysiology
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Authors: Mosmi Surati, Matthew Robinson, Suvobroto Nandi, Leonardo Faoro, Carley Demchuk, Rajani Kanteti, Benjamin Ferguson, Tara Gangadhar, Thomas Hensing, Rifat Hasina, Aliya Husain, Mark Ferguson, Theodore Karrison, Ravi Salgia.
Institutions: University of Chicago, Northshore University Health Systems.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written in a standard operating manual to ensure consistency of data annotation. Using one protocol for prospective tissue banking and another for retrospective banking, tumor and normal tissue samples were collected from patients who consented to these protocols. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and linked to the clinical information for the patients described within it. The tables were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
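The clinical-to-laboratory linkage is an ordinary relational join, and can be illustrated outside Access with any SQL engine. The table and column names below are hypothetical, not the project's schema; Python's built-in sqlite3 stands in for the Access relationships function:

```python
import sqlite3

# Hypothetical miniature of the clinical/laboratory linkage.  The project
# itself used Microsoft Access, but the query logic is the same join.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE clinical   (patient_id INTEGER PRIMARY KEY, diagnosis TEXT);
CREATE TABLE proteomics (sample_id  INTEGER PRIMARY KEY,
                         patient_id INTEGER REFERENCES clinical(patient_id),
                         marker TEXT, level REAL);
INSERT INTO clinical   VALUES (1, 'adenocarcinoma'), (2, 'squamous cell');
INSERT INTO proteomics VALUES (10, 1, 'MET', 2.4), (11, 2, 'MET', 1.1);
""")
rows = con.execute("""
    SELECT c.patient_id, c.diagnosis, p.marker, p.level
    FROM clinical c JOIN proteomics p USING (patient_id)
    ORDER BY c.patient_id
""").fetchall()
```

The result of such a query is exactly what gets exported for statistical analysis: one row per linked clinical/laboratory observation.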
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
Time Multiplexing Super Resolving Technique for Imaging from a Moving Platform
Authors: Asaf Ilovitsh, Shlomo Zach, Zeev Zalevsky.
Institutions: Bar-Ilan University, Kfar Saba, Israel.
We propose a method for increasing the resolution of an object and overcoming the diffraction limit of an optical system installed on top of a moving imaging system, such as an airborne platform or satellite. The resolution improvement is obtained in a two-step process. First, three low-resolution, differently defocused images are captured and the optical phase is retrieved using an improved iterative Gerchberg-Saxton based algorithm. The retrieved phase allows the field to be numerically back-propagated to the aperture plane. Second, the imaging system is shifted and the first step is repeated. The optical fields obtained at the aperture plane are combined, and a synthetically increased lens aperture is generated along the direction of movement, yielding higher imaging resolution. The method resembles a well-known approach from the microwave regime called Synthetic Aperture Radar (SAR), in which the antenna size is synthetically increased along the platform propagation direction. The proposed method is demonstrated through a laboratory experiment.
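The phase-retrieval core is the classic Gerchberg-Saxton loop, alternating amplitude constraints between the object plane and its Fourier (aperture) plane. This is a textbook sketch of that loop, not the authors' improved multi-defocus variant:

```python
import numpy as np

def gerchberg_saxton(amp_obj, amp_four, iters=200, seed=0):
    """Find a phase consistent with known amplitudes in the object plane
    (amp_obj) and in its Fourier plane (amp_four) by alternating
    projections between the two domains."""
    rng = np.random.default_rng(seed)
    field = amp_obj * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, amp_obj.shape))
    for _ in range(iters):
        F = np.fft.fft(field)
        F = amp_four * np.exp(1j * np.angle(F))         # impose Fourier amplitude
        field = np.fft.ifft(F)
        field = amp_obj * np.exp(1j * np.angle(field))  # impose object amplitude
    return field
```

The Fourier-domain error of this iteration is non-increasing, which is what makes phase retrieval usable inside the capture-shift-combine procedure described above.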
Physics, Issue 84, Superresolution, Fourier optics, Remote Sensing and Sensors, Digital Image Processing, optics, resolution
Rapid Point-of-Care Assay of Enoxaparin Anticoagulant Efficacy in Whole Blood
Authors: Mario A. Inchiosa Jr., Suryanarayana Pothula, Keshar Kubal, Vajubhai T. Sanchala, Iris Navarro.
Institutions: New York Medical College.
There is a need for a clinical assay to determine the extent to which a patient's blood is effectively anticoagulated by the low-molecular-weight heparin (LMWH) enoxaparin. There are also urgent clinical situations in which it would be important to determine this rapidly. The present assay is designed to accomplish this. We only assayed human blood samples that were spiked with known concentrations of enoxaparin. The essential feature of the present assay is the quantification of the efficacy of enoxaparin in a patient's blood sample by degrading it to complete inactivity with heparinase. Two blood samples were drawn into Vacutainer tubes (Becton Dickinson; Franklin Lakes, NJ) that were spiked with enoxaparin; one sample was digested with heparinase for 5 min at 37 °C, while the other represented the patient's baseline anticoagulated status. The percent shortening of clotting time in the heparinase-treated sample, as compared to the baseline state, yielded the anticoagulant contribution of enoxaparin. We used the portable, battery-operated Hemochron 801 apparatus for measurements of clotting times (International Technidyne Corp., Edison, NJ). The apparatus has two thermostatically controlled (37 °C) assay tube wells. We conducted the assays in two types of assay cartridges that are available from the manufacturer of the instrument. One cartridge was modified to increase its sensitivity. We removed the kaolin from the FTK-ACT cartridge by extensive rinsing with distilled water, leaving only the glass surface of the tube, and perhaps the detection magnet, as activators. We called this our minimally activated assay (MAA). The use of a minimally activated assay has been studied by us and others.2-4 The second cartridge studied was an activated partial thromboplastin time (aPTT) assay (A104). This was used as supplied by the manufacturer. The thermostated wells of the instrument were used for both the heparinase digestion and coagulation assays.
The assay can be completed within 10 min. The MAA assay showed robust changes in clotting time after heparinase digestion of enoxaparin over a typical clinical concentration range. At 0.2 anti-Xa I.U. of enoxaparin per ml of blood sample, heparinase digestion caused an average decrease of 9.8% (20.4 sec) in clotting time; at 1.0 I.U. per ml of enoxaparin there was a 41.4% decrease (148.8 sec). This report only presents the experimental application of the assay; its value in a clinical setting must still be established.
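The assay's arithmetic is a single percent-shortening computation. With the values reported above at 0.2 anti-Xa I.U./ml (a 20.4 sec, 9.8% shortening), the implied baseline clotting time is about 208 sec; the baseline value itself is an inference, not a figure from the text. A minimal sketch:

```python
def enoxaparin_contribution(baseline_s, heparinase_s):
    """Anticoagulant contribution of enoxaparin, expressed as the
    absolute (seconds) and percent shortening of clotting time after
    heparinase digestion, relative to the undigested baseline sample."""
    delta = baseline_s - heparinase_s
    return delta, 100.0 * delta / baseline_s
```

The same two-sample comparison is what makes the test point-of-care: both the digestion and the two clotting-time measurements run in the instrument's thermostated wells within about 10 min.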
Medicine, Issue 68, Immunology, Physiology, Pharmacology, low-molecular-weight-heparin, low-molecular-weight-heparin assay, LMWH point-of-care assay, anti-Factor-Xa activity, enoxaparin, heparinase, whole blood, assay
Clinical Assessment of Spatiotemporal Gait Parameters in Patients and Older Adults
Authors: Julia F. Item-Glatthorn, Nicola A. Maffiuletti.
Institutions: Schulthess Clinic.
Spatial and temporal characteristics of human walking are frequently evaluated to identify possible gait impairments, mainly in orthopedic and neurological patients1-4, but also in healthy older adults5,6. The quantitative gait analysis described in this protocol is performed with a recently-introduced photoelectric system (see Materials table) which has the potential to be used in the clinic because it is portable, easy to set up (no subject preparation is required before a test), and does not require maintenance or sensor calibration. The photoelectric system consists of a series of high-density floor-based photoelectric cells with light-emitting and light-receiving diodes that are placed parallel to each other to create a corridor, and are oriented perpendicular to the line of progression7. The system simply detects interruptions in the light signal, for instance due to the presence of feet within the recording area. Temporal gait parameters and 1D spatial coordinates of consecutive steps are subsequently calculated to provide common gait parameters such as step length, single limb support, and walking velocity8, whose validity against a criterion instrument has recently been demonstrated7,9. The measurement procedures are very straightforward; a single patient can be tested in less than 5 min and a comprehensive report can be generated in less than 1 min.
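Once the photocell interruptions have been reduced to contact positions and times, the common spatiotemporal parameters are successive differences. A generic sketch under that assumption; the system's own processing pipeline is not described in the text:

```python
def gait_parameters(positions_m, times_s):
    """Spatiotemporal gait parameters from consecutive foot-contact events.

    positions_m: 1D positions of successive contacts along the walkway (m).
    times_s: corresponding contact times (s).  Step lengths and durations
    are successive differences; walking velocity is total distance over
    total time.
    """
    lengths = [p2 - p1 for p1, p2 in zip(positions_m, positions_m[1:])]
    durations = [t2 - t1 for t1, t2 in zip(times_s, times_s[1:])]
    velocity = (positions_m[-1] - positions_m[0]) / (times_s[-1] - times_s[0])
    return lengths, durations, velocity
```

Single-limb-support time additionally requires pairing each foot's contact and lift-off events, which the photocell corridor records in the same 1D coordinate stream.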
Medicine, Issue 93, gait analysis, walking, floor-based photocells, spatiotemporal, elderly, orthopedic patients, neurological patients
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. 
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and the ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
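Scoring the imaged colonies typically rests on the multiplicative model implied above: the expected double-mutant fitness is the product of the single-mutant fitnesses, with negative deviations aggravating and positive ones alleviating. A minimal scoring sketch with invented fitness values, not the in-house image-processing pipeline:

```python
def gi_score(w_double, w_a, w_b):
    """Genetic-interaction score under the multiplicative model.
    w_* are relative fitnesses (wild type = 1.0); score < 0 indicates an
    aggravating (negative) GI, score > 0 an alleviating (positive) GI."""
    return w_double - w_a * w_b

# synthetic lethality/sickness: double mutant far below the expectation
aggravating = gi_score(0.05, 0.9, 0.9)
# same-pathway masking: the second mutation adds no further growth defect
alleviating = gi_score(0.70, 0.7, 0.7)
```

In practice, colony sizes are first normalized for plate and positional effects before fitnesses enter a score like this one.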
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
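The starting point of any DoE setup is the factor space itself; the snippet below enumerates a full factorial design, the combinatorial space from which DoE software then selects an optimal subset of runs. Factor names and levels are invented for illustration, not the study's actual parameters:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels as a list of runs.
    factors: dict mapping factor name -> list of levels."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# hypothetical three-factor design for transient expression
runs = full_factorial({"promoter": ["35S", "nos"],
                       "temperature_C": [22, 25],
                       "incubation_d": [3, 5]})
```

Splitting the problem into modules, as the text describes, keeps each factorial small enough that software-guided reduction and step-wise design augmentation remain tractable.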
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes and includes discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole-brain-based and tract-based DTI analysis.
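The voxelwise metric named above, fractional anisotropy, is computed from the three eigenvalues of the fitted diffusion tensor with the standard formula FA = √(3/2) · √Σ(λᵢ − λ̄)² / √Σλᵢ²:

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the three diffusion tensor eigenvalues.
    FA is 0 for isotropic diffusion (all eigenvalues equal) and approaches
    1 for diffusion restricted to a single direction."""
    lam = np.array([l1, l2, l3], float)
    md = lam.mean()                       # mean diffusivity
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    return np.sqrt(1.5) * num / den
```

Voxelwise group comparison and TFAS both reduce to statistics over maps of this scalar, which is why preserving directional information during spatial normalization matters for the tract-based analyses.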
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of using the proposed technique versus intact-limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of the inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity, rather than inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia estimates on study outcomes. For stance, either of the two inertial models investigated in our study would likely lead to similar outcomes in an inverse dynamics assessment.
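The oscillation technique rests on the physical-pendulum relation T = 2π·sqrt(I_pivot / (m·g·d)); combined with the parallel-axis theorem, this yields the moment of inertia about the center of mass. A minimal sketch of that calculation (function name and interface are illustrative, not from the paper):

```python
import math

def moment_of_inertia_cm(period_s, mass_kg, dist_pivot_to_cm_m, g=9.81):
    """Estimate the moment of inertia about the center of mass from
    the measured oscillation period of the segment swinging about a
    pivot. Uses T = 2*pi*sqrt(I_pivot / (m*g*d)) solved for I_pivot,
    then removes the pivot offset via the parallel-axis theorem."""
    i_pivot = (mass_kg * g * dist_pivot_to_cm_m * period_s ** 2
               / (4.0 * math.pi ** 2))
    return i_pivot - mass_kg * dist_pivot_to_cm_m ** 2
```

Mass and the pivot-to-center-of-mass distance come from the reaction board portion of the protocol; the period comes from timing small-amplitude oscillations.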
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Non-invasive Optical Measurement of Cerebral Metabolism and Hemodynamics in Infants
Authors: Pei-Yi Lin, Nadege Roche-Labarbe, Mathieu Dehaes, Stefan Carp, Angela Fenoglio, Beniamino Barbieri, Katherine Hagan, P. Ellen Grant, Maria Angela Franceschini.
Institutions: Massachusetts General Hospital, Harvard Medical School, Université de Caen Basse-Normandie, Boston Children's Hospital, Harvard Medical School, ISS, Inc.
Perinatal brain injury remains a significant cause of infant mortality and morbidity, but there is not yet an effective bedside tool that can accurately screen for brain injury, monitor its evolution, or assess response to therapy. The energy used by neurons is derived largely from tissue oxidative metabolism, and neural hyperactivity and cell death are reflected by corresponding changes in cerebral oxygen metabolism (CMRO2). Measures of CMRO2 thus reflect neuronal viability and provide critical diagnostic information, making CMRO2 an ideal target for bedside measurement of brain health. Brain-imaging techniques such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT) yield measures of cerebral glucose and oxygen metabolism, but they require the administration of radionuclides, so they are used in only the most acute cases. Continuous-wave near-infrared spectroscopy (CWNIRS) provides non-invasive, non-ionizing measures of hemoglobin oxygen saturation (SO2) as a surrogate for cerebral oxygen consumption. However, SO2 is a less-than-ideal surrogate for cerebral oxygen metabolism because it is influenced by both oxygen delivery and consumption. Furthermore, SO2 measurements are not sensitive enough to detect brain injury hours after the insult [1,2], because oxygen consumption and delivery reach equilibrium after acute transients [3]. We investigated the possibility of using more sophisticated NIRS optical methods to quantify cerebral oxygen metabolism at the bedside in healthy and brain-injured newborns. More specifically, we combined the frequency-domain NIRS (FDNIRS) measure of SO2 with the diffuse correlation spectroscopy (DCS) measure of a blood flow index (CBFi) to yield an index of CMRO2 (CMRO2i) [4,5]. With the combined FDNIRS/DCS system we are able to quantify cerebral metabolism and hemodynamics.
This represents an improvement over CWNIRS for assessing brain health, brain development, and response to therapy in neonates. Moreover, this method adheres to all neonatal intensive care unit (NICU) policies on infection control and institutional policies on laser safety. Future work will seek to integrate the two instruments to reduce acquisition time at the bedside and to implement real-time feedback on data quality to reduce the rate of data rejection.
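The CMRO2 index follows the Fick principle: oxygen consumption scales with blood flow times the arterio-venous oxygen difference. The sketch below is a simplified illustration of that relationship, not the authors' exact formula; the `beta` parameter is an assumption here, lumping together hemoglobin concentration and venous-weighting constants that the real calculation handles explicitly.

```python
def cmro2_index(cbf_index, sao2, so2, beta=1.0):
    """Fick-principle sketch of a CMRO2 index: flow index times the
    arterial-to-measured saturation difference.
    sao2 and so2 are fractions in [0, 1]; beta is an assumed lumped
    scaling constant (hemoglobin, venous weighting)."""
    if not (0.0 <= so2 <= sao2 <= 1.0):
        raise ValueError("expect 0 <= SO2 <= SaO2 <= 1")
    return beta * cbf_index * (sao2 - so2)
```

Because both CBFi and the saturation difference are relative quantities, the result is an index suited to tracking change over time rather than an absolute metabolic rate.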
Medicine, Issue 73, Developmental Biology, Neurobiology, Neuroscience, Biomedical Engineering, Anatomy, Physiology, Near infrared spectroscopy, diffuse correlation spectroscopy, cerebral hemodynamic, cerebral metabolism, brain injury screening, brain health, brain development, newborns, neonates, imaging, clinical techniques
Observing and Quantifying Fibroblast-mediated Fibrin Gel Compaction
Authors: Aribet M. De Jesús, Edward A. Sander.
Institutions: University of Iowa.
Cells embedded in collagen and fibrin gels attach and exert traction forces on the fibers of the gel. These forces can lead to local and global reorganization and realignment of the gel microstructure. This process proceeds in a complex manner that is dependent in part on the interplay between the location of the cells, the geometry of the gel, and the mechanical constraints on the gel. To better understand how these variables produce global fiber alignment patterns, we use time-lapse differential interference contrast (DIC) microscopy coupled with an environmentally controlled bioreactor to observe the compaction process between geometrically spaced explants (clusters of fibroblasts). The images are then analyzed with a custom image processing algorithm to obtain maps of the strain. The information obtained from this technique can be used to probe the mechanobiology of various cell-matrix interactions, which has important implications for understanding processes in wound healing, disease development, and tissue engineering applications.
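The authors' image-processing algorithm is custom, but the basic step of turning a measured displacement field into a strain map can be illustrated with finite differences. A minimal one-dimensional sketch (generic, not the published algorithm):

```python
def strain_xx(u_x, dx=1.0):
    """Engineering normal strain along x from a 1-D row of measured
    x-displacements (e.g. tracked texture points in successive DIC
    frames), via central differences with one-sided stencils at the
    edges. dx is the spacing between sample points."""
    n = len(u_x)
    eps = [0.0] * n
    for i in range(n):
        if i == 0:                       # forward difference at left edge
            eps[i] = (u_x[1] - u_x[0]) / dx
        elif i == n - 1:                 # backward difference at right edge
            eps[i] = (u_x[-1] - u_x[-2]) / dx
        else:                            # central difference in the interior
            eps[i] = (u_x[i + 1] - u_x[i - 1]) / (2.0 * dx)
    return eps
```

Extending this to two dimensions (and to shear components) gives the kind of strain map used to visualize gel compaction between explants.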
Bioengineering, Issue 83, Fibrin, bioreactor, compaction, anisotropy, time-lapse microscopy, mechanobiology
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
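To make the sequence-selection stage concrete, here is a deliberately tiny toy: it scores every sequence over an allowed alphabet with a user-supplied energy function and keeps the lowest-energy candidates. Protein WISDOM itself uses integer optimization over a physics-based potential rather than brute force; everything below (names, interface, energy function) is illustrative only.

```python
from itertools import product

def select_sequences(positions, alphabet, energy, top_k=3):
    """Toy sequence selection: enumerate every sequence of the given
    length over the allowed alphabet, score each with the supplied
    energy function, and return the top_k lowest-energy candidates
    as (sequence, energy) pairs. Exhaustive search is only feasible
    for tiny design problems; real pipelines optimize instead."""
    scored = sorted(product(alphabet, repeat=positions),
                    key=lambda seq: energy(seq))
    return [("".join(s), energy(s)) for s in scored[:top_k]]
```

In the real workflow the surviving sequences would then be passed on to the fold-specificity and binding-affinity stages for re-ranking.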
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
A Comprehensive Protocol for Manual Segmentation of the Medial Temporal Lobe Structures
Authors: Matthew Moore, Yifan Hu, Sarah Woo, Dylan O'Hearn, Alexandru D. Iordan, Sanda Dolcos, Florin Dolcos.
Institutions: University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign.
The present paper describes a comprehensive protocol for manual tracing of the set of brain regions comprising the medial temporal lobe (MTL): amygdala, hippocampus, and the associated parahippocampal regions (perirhinal, entorhinal, and parahippocampal proper). Unlike most other available tracing protocols, which typically focus on certain MTL areas (e.g., amygdala and/or hippocampus), the integrative perspective adopted by the present tracing guidelines allows for clear localization of all MTL subregions. By integrating information from a variety of sources, including extant tracing protocols separately targeting various MTL structures, histological reports, and brain atlases, complemented by illustrative visual materials, the present protocol provides an accurate, intuitive, and convenient guide for understanding the MTL anatomy. The need for such tracing guidelines is also emphasized by illustrating possible differences between automatic and manual segmentation protocols. This knowledge can be applied toward research involving not only structural MRI investigations but also structural-functional colocalization and fMRI signal extraction from anatomically defined ROIs, in healthy and clinical groups alike.
Neuroscience, Issue 89, Anatomy, Segmentation, Medial Temporal Lobe, MRI, Manual Tracing, Amygdala, Hippocampus, Perirhinal Cortex, Entorhinal Cortex, Parahippocampal Cortex
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques [1,5-9]. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also yield greater statistical power than univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. Set against these advantages is the high barrier of entry to the use of multivariate approaches, which has prevented more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
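The core multivariate idea — summarizing activation as covariance patterns across regions rather than voxel by voxel — can be sketched as a covariance matrix plus power iteration to extract the dominant spatial pattern. This is a generic illustration in plain Python, not any specific neuroimaging toolbox:

```python
def covariance(ts):
    """Sample covariance matrix of region-by-time data, given as a
    list of equal-length rows (one row of activation values per region)."""
    n_reg, n_t = len(ts), len(ts[0])
    means = [sum(row) / n_t for row in ts]
    return [[sum((ts[i][k] - means[i]) * (ts[j][k] - means[j])
                 for k in range(n_t)) / (n_t - 1)
             for j in range(n_reg)] for i in range(n_reg)]

def leading_pattern(cov, iters=200):
    """Power iteration: the dominant eigenvector of the covariance
    matrix is the strongest shared activation pattern across regions."""
    v = [1.0] * len(cov)
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(len(v)))
             for i in range(len(v))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

The entries of the resulting vector weight each region's contribution to the covariance pattern, which is what gives multivariate results their network-signature interpretation.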
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X
What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.