Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science, resulting in a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be entered directly into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who practice in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could also be adapted to out-of-clinic settings, such as the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Fast and Accurate Exhaled Breath Ammonia Measurement
Institutions: St. Luke's University Hospital, Johns Hopkins School of Medicine, Johns Hopkins University.
This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), which uses a quantum cascade laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels.
Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive.
The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations.
Medicine, Issue 88, Breath, ammonia, breath measurement, breath analysis, QEPAS, volatile organic compound
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses.
Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed [1] count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of error in hemocytometry include: uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation of hemocytometer filling rate [2].
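The abstract does not spell out the arithmetic behind hemocytometer counts; for reference, the standard calculation (each large square of a Neubauer chamber holds 0.1 µl) can be sketched as follows, with purely illustrative cell numbers and dilution factor:

```python
def hemocytometer_concentration(total_cells, squares_counted, dilution_factor=1.0):
    """Each large hemocytometer square covers 1 mm^2 at 0.1 mm depth (0.1 µl),
    so the mean count per square x 10^4 gives cells per ml of diluted sample;
    multiplying by the dilution factor recovers the original concentration."""
    mean_per_square = total_cells / squares_counted
    return mean_per_square * dilution_factor * 1e4

# illustrative numbers: 220 cells counted over 4 squares, diluted 1:2 in trypan blue
concentration = hemocytometer_concentration(220, 4, dilution_factor=2)  # cells/ml
```

Counting too few squares inflates exactly the sampling errors listed above, which is why protocols typically require 100 or more cells spread over several squares.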
To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, and flow cytometry [1]. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments [1].
The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection [3] in a miniaturized format, using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated, cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics graphically displayed as a histogram.
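In Coulter (impedance-based) sizing, the amplitude of the resistive pulse is proportional to the particle's displaced volume, and the reported diameter is that of the equivalent sphere. A minimal sketch of that last conversion (our own illustration, not the Scepter's actual firmware):

```python
import math

def equivalent_diameter_um(volume_fl):
    """Invert V = (4/3) * pi * (d/2)**3 to get the equivalent spherical
    diameter in µm from a measured volume in femtoliters (1 fl = 1 µm^3)."""
    return (6.0 * volume_fl / math.pi) ** (1.0 / 3.0)

# a 10 µm sphere displaces ~523.6 fl, so the inverse should recover ~10 µm
d = equivalent_diameter_um(523.6)
```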
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
Magnetic Resonance Derived Myocardial Strain Assessment Using Feature Tracking
Institutions: Cincinnati Children's Hospital Medical Center (CCHMC), Imaging Systems GmbH, Advanced Medical Imaging Development SRL, The Christ Hospital.
Purpose: An accurate and practical method to measure parameters such as strain in myocardial tissue is of great clinical value, since it has been shown that strain is a more sensitive and earlier marker of contractile dysfunction than the frequently used ejection fraction (EF). Current technologies for CMR are time consuming and difficult to implement in clinical practice. Feature tracking is a technology that can bring more automation and robustness to the quantitative analysis of medical images, with less time consumption than comparable methods.
Methods: An automatic or manual input in a single phase serves as an initialization from which the system starts to track the displacement of individual patterns representing anatomical structures over time. A special feature of this method is that the images do not need to be manipulated in any way beforehand, such as by tagging of CMR images.
Results: The method is very well suited to tracking muscular tissue, thereby allowing quantitative evaluation of the myocardium and also of blood flow.
Conclusions: This new method offers a robust and time saving procedure to quantify myocardial tissue and blood with displacement, velocity and deformation parameters on regular sequences of CMR imaging. It therefore can be implemented in clinical practice.
Medicine, Issue 48, feature tracking, strain, displacement, CMR
Flat-floored Air-lifted Platform: A New Method for Combining Behavior with Microscopy or Electrophysiology on Awake Freely Moving Rodents
Institutions: University of Helsinki, Neurotar LTD, University of Eastern Finland, University of Helsinki.
It is widely acknowledged that the use of general anesthetics can undermine the relevance of electrophysiological or microscopical data obtained from a living animal's brain. Moreover, the lengthy recovery from anesthesia limits the frequency of repeated recording/imaging episodes in longitudinal studies. Hence, new methods that would allow stable recordings from non-anesthetized behaving mice are expected to advance the fields of cellular and cognitive neurosciences. Existing solutions range from mere physical restraint to more sophisticated approaches, such as linear and spherical treadmills used in combination with computer-generated virtual reality. Here, a novel method is described where a head-fixed mouse can move around an air-lifted mobile homecage and explore its environment under stress-free conditions. This method allows researchers to perform behavioral tests (e.g., learning, habituation, or novel object recognition) simultaneously with two-photon microscopic imaging and/or patch-clamp recordings, all combined in a single experiment. This video-article describes the use of the awake animal head fixation device (mobile homecage), demonstrates the procedures of animal habituation, and exemplifies a number of possible applications of the method.
Empty Value, Issue 88, awake, in vivo two-photon microscopy, blood vessels, dendrites, dendritic spines, Ca2+ imaging, intrinsic optical imaging, patch-clamp
Proton Transfer and Protein Conformation Dynamics in Photosensitive Proteins by Time-resolved Step-scan Fourier-transform Infrared Spectroscopy
Institutions: Freie Universität Berlin.
Monitoring the dynamics of protonation and protein backbone conformation changes during the function of a protein is an essential step towards understanding its mechanism. Protonation and conformational changes affect the vibration pattern of amino acid side chains and of the peptide bond, respectively, both of which can be probed by infrared (IR) difference spectroscopy. For proteins whose function can be repetitively and reproducibly triggered by light, it is possible to obtain infrared difference spectra with (sub)microsecond resolution over a broad spectral range using the step-scan Fourier transform infrared technique. With ~10² repetitions of the photoreaction, the minimum number to complete a scan at reasonable spectral resolution and bandwidth, the noise level in the absorption difference spectra can be as low as ~10⁻⁴, sufficient to follow the kinetics of protonation changes from a single amino acid. Lower noise levels can be accomplished by more data averaging and/or mathematical processing. The amount of protein required for optimal results is between 5 and 100 µg, depending on the sampling technique used. Regarding additional requirements, the protein needs to be first concentrated in a low ionic strength buffer and then dried to form a film. The protein film is hydrated prior to the experiment, either with small droplets of water or under controlled atmospheric humidity. The attained hydration level (g of water / g of protein) is gauged from an IR absorption spectrum. To showcase the technique, we studied the photocycle of the light-driven proton pump bacteriorhodopsin in its native purple membrane environment, and of the light-gated ion channel channelrhodopsin-2 solubilized in detergent.
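The relationship between the ~10² repetitions and the ~10⁻⁴ noise floor quoted above follows from simple signal averaging: noise falls as 1/√N. A synthetic illustration, assuming a single-scan noise of 10⁻³ absorbance units (the numbers are illustrative, not measured values from this work):

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_repeats = 500, 100                 # ~10^2 photoreaction repetitions
true_diff = 1e-3 * np.sin(np.linspace(0.0, 6.0, n_points))  # model difference spectrum
single_shot_noise = 1e-3                       # assumed single-scan noise (AU)

# average N noisy difference spectra; noise should drop by ~1/sqrt(N)
shots = true_diff + rng.normal(0.0, single_shot_noise, (n_repeats, n_points))
averaged = shots.mean(axis=0)
residual = (averaged - true_diff).std()        # ~1e-3 / sqrt(100) = ~1e-4
```

The singular value decomposition listed in the keywords is one of the "mathematical processing" options for pushing below this averaging-limited floor.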
Biophysics, Issue 88, bacteriorhodopsin, channelrhodopsin, attenuated total reflection, proton transfer, protein dynamics, infrared spectroscopy, time-resolved spectroscopy, step-scan, membrane proteins, singular value decomposition
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation, as inconsistent techniques generate unreliable results. The technique described here encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low-abundance molecules, with a resulting increase in sensitivity, is obtained, ultimately resulting in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and a strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification, since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
Characterization of Recombination Effects in a Liquid Ionization Chamber Used for the Dosimetry of a Radiosurgical Accelerator
Institutions: Centre Oscar Lambret.
Most modern radiation therapy devices allow the use of very small fields, either through beamlets in Intensity-Modulated Radiation Therapy (IMRT) or via stereotactic radiotherapy where positioning accuracy allows delivering very high doses per fraction in a small volume of the patient. Dosimetric measurements on medical accelerators are conventionally realized using air-filled ionization chambers. However, in small beams these are subject to nonnegligible perturbation effects. This study focuses on liquid ionization chambers, which offer advantages in terms of spatial resolution and low fluence perturbation. Ion recombination effects are investigated for the microLion detector (PTW) used with the Cyberknife system (Accuray). The method consists of performing a series of water tank measurements at different source-surface distances, and applying corrections to the liquid detector readings based on simultaneous gaseous detector measurements. This approach facilitates isolating the recombination effects arising from the high density of the liquid sensitive medium and obtaining correction factors to apply to the detector readings. The main difficulty resides in achieving a sufficient level of accuracy in the setup to be able to detect small changes in the chamber response.
Physics, Issue 87, Radiation therapy, dosimetry, small fields, Cyberknife, liquid ionization, recombination effects
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
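The keyword list mentions minimum-norm estimation; the core of that source reconstruction step is a regularized L2 inverse of the leadfield matrix. A bare-bones sketch with toy dimensions (this is the textbook formula, not the London Baby Lab pipeline):

```python
import numpy as np

def minimum_norm_estimate(leadfield, eeg, lam=1e-8):
    """Classic L2 minimum-norm inverse: s = L^T (L L^T + lam*I)^(-1) y,
    mapping one time sample of channel data back onto the sources."""
    L = np.asarray(leadfield)                 # channels x sources
    gram = L @ L.T
    reg = lam * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(gram + reg, eeg)

# toy example: 8 channels, 20 candidate sources, one active source
rng = np.random.default_rng(1)
L = rng.normal(size=(8, 20))
s_true = np.zeros(20)
s_true[5] = 1.0
y = L @ s_true                                # simulated channel data
s_hat = minimum_norm_estimate(L, y)           # reconstructed source amplitudes
```

With many more sources than channels the problem is underdetermined, which is exactly why the accuracy of the head model (and hence the leadfield) matters so much for pediatric data.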
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Accuracy in Dental Medicine: A New Way to Measure Trueness and Precision
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify many procedures. The main interest is in verifying impression methods, as they serve as a base for dental restorations. The current limitation of many reference scanners is a lack of accuracy when scanning large objects like full dental arches, or a limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus variation scanning technique, was evaluated with regard to its local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions. Different model materials were also verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision in the case of full-arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects like single tooth surfaces can be scanned with even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many fields of dental research. The different magnification levels, combined with high local and general accuracy, can be used to assess changes ranging from single teeth or restorations up to full-arch changes.
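"Trueness" and "precision" here are presumably used in the ISO 5725 sense: trueness is the systematic offset of the scans from the reference geometry, precision the scatter among repeated scans. A sketch of how repeated deviation measurements reduce to those two numbers (the values are illustrative, not the study's data):

```python
import statistics

def trueness_and_precision(deviations_um):
    """deviations_um: signed deviations (µm) of repeated scans from the
    reference geometry. Trueness = magnitude of the mean deviation
    (systematic error); precision = sample standard deviation (random error)."""
    trueness = abs(statistics.fmean(deviations_um))
    precision = statistics.stdev(deviations_um)
    return trueness, precision

# illustrative repeated full-arch scan deviations in µm
t, p = trueness_and_precision([4.2, 5.9, 5.3, 6.1, 4.8])
```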
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Fabrication And Characterization Of Photonic Crystal Slow Light Waveguides And Cavities
Institutions: University of St Andrews.
Slow light has been one of the hot topics in the photonics community in the past decade, generating great interest both from a fundamental point of view and for its considerable potential for practical applications. Slow light photonic crystal waveguides, in particular, have played a major part and have been successfully employed for delaying optical signals [1-4] and for the enhancement of both linear [5-7] and nonlinear devices [8-11]. Photonic crystal cavities achieve effects similar to those of slow light waveguides, but over a reduced bandwidth. These cavities offer a high Q-factor/volume ratio for the realization of optically pumped ultra-low threshold lasers [12] and the enhancement of nonlinear effects [14-16]. Furthermore, passive filters [17] have been demonstrated, exhibiting ultra-narrow linewidth, high free spectral range, and record values of low energy consumption.
To attain these exciting results, a robust repeatable fabrication protocol must be developed. In this paper we take an in-depth look at our fabrication protocol which employs electron-beam lithography for the definition of photonic crystal patterns and uses wet and dry etching techniques. Our optimised fabrication recipe results in photonic crystals that do not suffer from vertical asymmetry and exhibit very good edge-wall roughness. We discuss the results of varying the etching parameters and the detrimental effects that they can have on a device, leading to a diagnostic route that can be taken to identify and eliminate similar issues.
The key to evaluating slow light waveguides is the passive characterization of transmission and group index spectra. Various methods have been reported, most notably resolving the Fabry-Perot fringes of the transmission spectrum [20-21] and interferometric techniques [22-25]. Here, we describe a direct, broadband measurement technique combining spectral interferometry with Fourier transform analysis [26]. Our method stands out for its simplicity and power, as we can characterise a bare photonic crystal with access waveguides, without the need for on-chip interference components, and the setup consists only of a Mach-Zehnder interferometer, with no need for moving parts and delay scans.
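The principle behind the Fourier-transform analysis can be sketched in a few lines: the spectral interferogram oscillates with a period set by the group delay through the sample arm, so the Fourier transform of the spectrum peaks at that delay, from which n_g = c·τ/L. All numbers below (sample length, group index) are assumptions for illustration, not values from this work:

```python
import numpy as np

c = 3.0e8            # speed of light (m/s)
length = 100e-6      # assumed photonic crystal waveguide length (m)
n_g_true = 30.0      # assumed group index to recover
tau = n_g_true * length / c                   # group delay through the sample arm

# simulated Mach-Zehnder spectral interferogram over optical frequency
freq = np.linspace(190e12, 200e12, 4096)      # Hz
spectrum = 1.0 + np.cos(2.0 * np.pi * freq * tau)

# the Fourier transform of the spectrum peaks at the group delay tau
df = freq[1] - freq[0]
ft = np.abs(np.fft.rfft(spectrum - spectrum.mean()))
delays = np.fft.rfftfreq(freq.size, d=df)     # conjugate axis is delay (s)
tau_est = delays[np.argmax(ft)]
n_g_est = c * tau_est / length                # recovered group index
```

Because the whole delay axis is obtained from one captured spectrum, no moving parts or delay scans are needed, which is the practical advantage claimed above.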
When characterising photonic crystal cavities, techniques involving internal sources [21] or external waveguides directly coupled to the cavity [27] impact the performance of the cavity itself, thereby distorting the measurement. Here, we describe a novel and non-intrusive technique known as resonant scattering (RS), which makes use of a cross-polarised probe beam coupled out-of-plane into the cavity through an objective. The technique was first demonstrated by McCutcheon et al. [28] and further developed by Galli et al. [29].
Physics, Issue 69, Optics and Photonics, Astronomy, light scattering, light transmission, optical waveguides, photonics, photonic crystals, Slow-light, Cavities, Waveguides, Silicon, SOI, Fabrication, Characterization
Osteopathic Manipulative Treatment as a Useful Adjunctive Tool for Pneumonia
Institutions: New York Institute of Technology College of Osteopathic Medicine.
Pneumonia, the inflammatory state of lung tissue primarily due to microbial infection, claimed 52,306 lives in the United States in 2007 [1] and resulted in the hospitalization of 1.1 million patients [2]. With an average in-patient hospital stay of five days [2], pneumonia and influenza comprise a significant financial burden, costing the United States $40.2 billion in 2005 [3]. Under the current Infectious Disease Society of America/American Thoracic Society guidelines, standard-of-care recommendations include the rapid administration of an appropriate antibiotic regimen, fluid replacement, and ventilation (if necessary). Non-standard therapies include the use of corticosteroids and statins; however, these therapies lack conclusive supporting evidence [4] (Figure 1).
Osteopathic Manipulative Treatment (OMT) is a cost-effective adjunctive treatment of pneumonia that has been shown to reduce patients' length of hospital stay, duration of intravenous antibiotics, and incidence of respiratory failure or death when compared to subjects who received conventional care alone [5]. The use of manual manipulation techniques for pneumonia was first recorded as early as the Spanish influenza pandemic of 1918, when patients treated with standard medical care had an estimated mortality rate of 33%, compared to a 10% mortality rate in patients treated by osteopathic physicians [6]. When applied to the management of pneumonia, manual manipulation techniques bolster lymphatic flow, respiratory function, and immunological defense by targeting anatomical structures involved in these systems [7-10].
The objective of this review video-article is three-fold: a) summarize the findings of randomized controlled studies on the efficacy of OMT in adult patients with diagnosed pneumonia, b) demonstrate established protocols utilized by osteopathic physicians treating pneumonia, and c) elucidate the physiological mechanisms behind manual manipulation of the respiratory and lymphatic systems. Specifically, we will discuss and demonstrate four routine techniques that address autonomics, lymph drainage, and rib cage mobility: 1) Rib Raising, 2) Thoracic Pump, 3) Doming of the Thoracic Diaphragm, and 4) Muscle Energy for Rib 1 [5,11].
Medicine, Issue 87, Pneumonia, osteopathic manipulative medicine (OMM) and techniques (OMT), lymphatic, rib raising, thoracic pump, muscle energy, doming diaphragm, alternative treatment
Characterizing the Composition of Molecular Motors on Moving Axonal Cargo Using "Cargo Mapping" Analysis
Institutions: The Scripps Research Institute, University of California San Diego, University of California San Diego, University of California San Diego School of Medicine.
Understanding the mechanisms by which molecular motors coordinate their activities to transport vesicular cargoes within neurons requires the quantitative analysis of motor/cargo associations at the single vesicle level. The goal of this protocol is to use quantitative fluorescence microscopy to correlate ("map") the position and directionality of movement of live cargo to the composition and relative amounts of motors associated with the same cargo. "Cargo mapping" consists of live imaging of fluorescently labeled cargoes moving in axons cultured on microfluidic devices, followed by chemical fixation during recording of live movement, and subsequent immunofluorescence (IF) staining of the exact same axonal regions with antibodies against motors. Colocalization between cargoes and their associated motors is assessed by assigning sub-pixel position coordinates to motor and cargo channels, by fitting Gaussian functions to the diffraction-limited point spread functions representing individual fluorescent point sources. Fixed cargo and motor images are subsequently superimposed onto plots of cargo movement, to "map" them to their tracked trajectories. The strength of this protocol is the combination of live and IF data to both record the transport of vesicular cargoes in live cells and determine the motors associated with these exact same vesicles. This technique overcomes previous challenges that use biochemical methods to determine the average motor composition of purified heterogeneous bulk vesicle populations, as these methods do not reveal compositions on single moving cargoes. Furthermore, this protocol can be adapted for the analysis of other transport and/or trafficking pathways in other cell types to correlate the movement of individual intracellular structures with their protein composition.
Limitations of this protocol are the relatively low throughput due to low transfection efficiencies of cultured primary neurons and a limited field of view available for high-resolution imaging. Future applications could include methods to increase the number of neurons expressing fluorescently labeled cargoes.
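The sub-pixel localization step described above can be illustrated with a toy version: for a Gaussian PSF, log-intensity is quadratic, so a parabola fit through the brightest pixel and its two neighbors recovers the center to sub-pixel precision along each axis. This is a simplification of the full Gaussian fitting used in the protocol, and all values are synthetic:

```python
import numpy as np

def subpixel_peak(img):
    """Estimate the sub-pixel (y, x) center of a diffraction-limited spot by
    fitting a parabola to log-intensity around the brightest pixel; this is
    exact for a noiseless separable Gaussian PSF."""
    y0, x0 = np.unravel_index(np.argmax(img), img.shape)
    pos = []
    for line, c in ((img[:, x0], y0), (img[y0, :], x0)):
        lm, l0, lp = np.log(line[c - 1]), np.log(line[c]), np.log(line[c + 1])
        pos.append(c + 0.5 * (lm - lp) / (lm - 2.0 * l0 + lp))
    return tuple(pos)

# synthetic Gaussian spot centered at (10.3, 14.7), sigma = 1.5 px
yy, xx = np.mgrid[0:24, 0:28]
spot = np.exp(-(((yy - 10.3) ** 2) + ((xx - 14.7) ** 2)) / (2.0 * 1.5 ** 2))
yc, xc = subpixel_peak(spot)
```

Real images add noise and background, which is why the protocol fits full Gaussian functions rather than relying on three pixels per axis.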
Neuroscience, Issue 92, kinesin, dynein, single vesicle, axonal transport, microfluidic devices, primary hippocampal neurons, quantitative fluorescence microscopy
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Institutions: Trier University, Trier University.
Selection tasks in which simple stimuli (e.g., letters) are presented and a target stimulus has to be selected against one or more distractor stimuli are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation between the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. The repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters for response retrieval effects on human action.
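The orthogonal 2×2 design described above reduces to two numbers: a retrieval effect (the response relation × distractor relation interaction) and an inhibition effect (the main effect of distractor repetition). A sketch with hypothetical mean reaction times, not data from the article:

```python
def binding_effects(rt):
    """rt: mean reaction times (ms) keyed by (response_relation,
    distractor_relation), each 'rep' or 'chg'. Distractor repetition should
    help when the response repeats and hurt when it changes (retrieval),
    on top of a general benefit of distractor repetition (inhibition)."""
    benefit_resp_rep = rt[("rep", "chg")] - rt[("rep", "rep")]
    benefit_resp_chg = rt[("chg", "chg")] - rt[("chg", "rep")]
    retrieval = benefit_resp_rep - benefit_resp_chg         # interaction term
    inhibition = (benefit_resp_rep + benefit_resp_chg) / 2  # main effect
    return retrieval, inhibition

# hypothetical cell means (ms)
rts = {("rep", "rep"): 480, ("rep", "chg"): 510,
       ("chg", "rep"): 540, ("chg", "chg"): 525}
retrieval, inhibition = binding_effects(rts)
```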
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Medical College of Wisconsin, Hong Kong University, Johns Hopkins University School of Medicine, Medical College of Wisconsin.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
DNA-affinity-purified Chip (DAP-chip) Method to Determine Gene Targets for Bacterial Two component Regulatory Systems
Institutions: Lawrence Berkeley National Laboratory.
In vivo methods such as ChIP-chip are well-established techniques used to determine global gene targets for transcription factors. However, they are of limited use in exploring bacterial two-component regulatory systems with uncharacterized activation conditions. Such systems regulate transcription only when activated in the presence of unique signals. Since these signals are often unknown, the in vitro
microarray-based method described in this video article can be used to determine gene targets and binding sites for response regulators. This DNA-affinity-purified-chip (DAP-chip) method may be used for any purified regulator in any organism with a sequenced genome. The protocol involves allowing the purified tagged protein to bind to sheared genomic DNA and then affinity purifying the protein-bound DNA, followed by fluorescent labeling of the DNA and hybridization to a custom tiling array. Preceding steps that may be used to optimize the assay for specific regulators are also described. The peaks generated by the array data analysis are used to predict binding site motifs, which are then experimentally validated. The motif predictions can be further used to determine gene targets of orthologous response regulators in closely related species. We demonstrate the applicability of this method by determining the gene targets and binding site motifs, and thus predicting the function, for the sigma54-dependent response regulator DVU3023 in the environmental bacterium Desulfovibrio vulgaris.
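The peak-generation step can be illustrated with a minimal sketch: each tiling-array probe is scored by the log2 ratio of pull-down to input signal, and runs of consecutive enriched probes are reported as candidate binding regions. The threshold, probe values, and function name are illustrative assumptions, not the analysis pipeline used in the article.

```python
import math

def call_peaks(pulldown, input_sig, threshold=1.0, min_probes=2):
    """Call peaks as runs of >= min_probes consecutive tiling-array probes
    whose log2(pull-down / input) ratio exceeds the threshold."""
    ratios = [math.log2(p / i) for p, i in zip(pulldown, input_sig)]
    peaks, start = [], None
    for idx, r in enumerate(ratios):
        if r >= threshold and start is None:
            start = idx
        elif r < threshold and start is not None:
            if idx - start >= min_probes:
                peaks.append((start, idx - 1))
            start = None
    if start is not None and len(ratios) - start >= min_probes:
        peaks.append((start, len(ratios) - 1))
    return peaks

# Hypothetical probe signals along the genome; probes 2-4 are enriched.
pull = [100, 110, 400, 420, 390, 105, 95, 300, 100]
inp = [100, 100, 100, 100, 100, 100, 100, 100, 100]
print(call_peaks(pull, inp))  # [(2, 4)]
```

The sequences under such peaks are what would then feed the motif-prediction step described above.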
Genetics, Issue 89, DNA-Affinity-Purified-chip, response regulator, transcription factor binding site, two component system, signal transduction, Desulfovibrio, lactate utilization regulator, ChIP-chip
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
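The quoted ~10-30 nm precision follows from the standard Thompson-Larson-Webb estimate for localizing a single emitter fitted with a 2D Gaussian. The sketch below, with hypothetical photon counts and camera parameters, shows how a figure in that range arises.

```python
import math

def localization_precision(s_nm, pixel_nm, photons, bg_noise):
    """Thompson-Larson-Webb estimate of 2D localization precision (nm):
    sigma^2 = (s^2 + a^2/12)/N + 8*pi*s^4*b^2 / (a^2 * N^2)
    where s = PSF standard deviation, a = pixel size, N = detected
    photons, b = background noise per pixel."""
    var = (s_nm ** 2 + pixel_nm ** 2 / 12.0) / photons \
        + 8.0 * math.pi * s_nm ** 4 * bg_noise ** 2 / (pixel_nm ** 2 * photons ** 2)
    return math.sqrt(var)

# Hypothetical values: PSF st. dev. 125 nm, 100 nm pixels, 500 photons,
# background noise of 5 photons per pixel.
print(round(localization_precision(125, 100, 500, 5), 1))  # ~10 nm
```

Raising the detected photon count or lowering background tightens the precision, which is why bright, well-controlled PAFP/PSFP labels matter for this technique.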
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Measuring Cation Transport by Na,K- and H,K-ATPase in Xenopus Oocytes by Atomic Absorption Spectrophotometry: An Alternative to Radioisotope Assays
Institutions: Technical University of Berlin, Oregon Health & Science University.
Whereas cation transport by the electrogenic membrane transporter Na+,K+-ATPase can be measured by electrophysiology, the electroneutrally operating gastric H+,K+-ATPase is more difficult to investigate. Many transport assays utilize radioisotopes to achieve a sufficient signal-to-noise ratio; however, the necessary security measures impose severe restrictions regarding human exposure or assay design. Furthermore, ion transport across cell membranes is critically influenced by the membrane potential, which is not straightforwardly controlled in cell culture or in proteoliposome preparations. Here, we make use of the outstanding sensitivity of atomic absorption spectrophotometry (AAS) towards trace amounts of chemical elements to measure Rb+ transport by Na+,K+- or gastric H+,K+-ATPase in single cells. Using Xenopus oocytes as expression system, we determine the amount of Rb+ transported into the cells by measuring samples of single-oocyte homogenates in an AAS device equipped with a transversely heated graphite atomizer (THGA) furnace, which is loaded from an autosampler. Since the background of unspecific Rb+ uptake into control oocytes or during application of ATPase-specific inhibitors is very small, it is possible to implement complex kinetic assay schemes involving a large number of experimental conditions simultaneously, or to compare the transport capacity and kinetics of site-specifically mutated transporters with high precision. Furthermore, since cation uptake is determined on single cells, the flux experiments can be carried out in combination with two-electrode voltage-clamping (TEVC) to achieve accurate control of the membrane potential and current. This allowed us, for example, to quantitatively determine the 3Na+/2K+ transport stoichiometry of the Na+,K+-ATPase, and enabled us for the first time to investigate the voltage dependence of cation transport by the electroneutrally operating gastric H+,K+-ATPase. In principle, the assay is not limited to K+-transporting membrane proteins, but may work equally well to address the activity of heavy or transition metal transporters, or uptake of chemical elements by endocytotic processes.
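Quantifying Rb+ from AAS readings typically runs through a linear calibration against Rb+ standards. The sketch below, with invented standard concentrations and absorbances, shows the arithmetic; in practice the instrument software performs this step.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical Rb+ standards (ng/ml) and their AAS absorbances.
std_conc = [0.0, 5.0, 10.0, 20.0, 40.0]
std_abs = [0.002, 0.051, 0.101, 0.198, 0.402]
slope, intercept = fit_line(std_conc, std_abs)

# Invert the calibration for a single-oocyte homogenate reading.
sample_abs = 0.150
rb_conc = (sample_abs - intercept) / slope
print(round(rb_conc, 1))  # Rb+ concentration (ng/ml) in the diluted homogenate
```

Multiplying the recovered concentration by the homogenate volume then yields the total Rb+ taken up by the oocyte.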
Biochemistry, Issue 72, Chemistry, Biophysics, Bioengineering, Physiology, Molecular Biology, electrochemical processes, physical chemistry, spectrophotometry (application), spectroscopic chemical analysis (application), life sciences, temperature effects (biological, animal and plant), Life Sciences (General), Na+,K+-ATPase, H+,K+-ATPase, Cation Uptake, P-type ATPases, Atomic Absorption Spectrophotometry (AAS), Two-Electrode Voltage-Clamp, Xenopus Oocytes, Rb+ Flux, Transversely Heated Graphite Atomizer (THGA) Furnace, electrophysiology, animal model
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo
protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo
protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
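Although the article's analysis code is MATLAB-based, the core operation on time-stamped event records can be sketched in a few lines of Python: tally event types and derive simple timing measures such as inter-entry intervals. The event names and times below are invented for illustration.

```python
from collections import Counter

# Hypothetical time-stamped event record: (time_s, event_code).
events = [(1.0, "hopper1_entry"), (1.8, "hopper1_entry"),
          (3.5, "pellet"), (4.1, "hopper2_entry"),
          (9.0, "hopper1_entry"), (9.4, "pellet")]

# Count events of each type, as a daily-summary routine might.
counts = Counter(code for _, code in events)

# Inter-entry intervals for one hopper (a simple timing measure).
t1 = [t for t, code in events if code == "hopper1_entry"]
intervals = [b - a for a, b in zip(t1, t1[1:])]
print(counts["pellet"], [round(i, 1) for i in intervals])
```

Preserving the raw event record alongside such derived measures is what allows the full data trail, from raw data to published statistics, to live in a single structure.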
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Combining Magnetic Sorting of Mother Cells and Fluctuation Tests to Analyze Genome Instability During Mitotic Cell Aging in Saccharomyces cerevisiae
Institutions: Rensselaer Polytechnic Institute.
The budding yeast Saccharomyces cerevisiae has been an excellent model system for examining mechanisms and consequences of genome instability. Information gained from this yeast model is relevant to many organisms, including humans, since DNA repair and DNA damage response factors are well conserved across diverse species. However, S. cerevisiae
has not yet been used to fully address whether the rate of accumulating mutations changes with increasing replicative (mitotic) age due to technical constraints. For instance, measurements of yeast replicative lifespan through micromanipulation involve very small populations of cells, which prohibit detection of rare mutations. Genetic methods to enrich for mother cells in populations by inducing death of daughter cells have been developed, but population sizes are still limited by the frequency with which random mutations that compromise the selection systems occur. The current protocol takes advantage of magnetic sorting of surface-labeled yeast mother cells to obtain large enough populations of aging mother cells to quantify rare mutations through phenotypic selections. Mutation rates, measured through fluctuation tests, and mutation frequencies are first established for young cells and used to predict the frequency of mutations in mother cells of various replicative ages. Mutation frequencies are then determined for sorted mother cells, and the age of the mother cells is determined using flow cytometry by staining with a fluorescent reagent that detects bud scars formed on their cell surfaces during cell division. Comparison of predicted mutation frequencies based on the number of cell divisions to the frequencies experimentally observed for mother cells of a given replicative age can then identify whether there are age-related changes in the rate of accumulating mutations. Variations of this basic protocol provide the means to investigate the influence of alterations in specific gene functions or specific environmental conditions on mutation accumulation to address mechanisms underlying genome instability during replicative aging.
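The fluctuation-test arithmetic can be sketched with the classical Luria-Delbrück p0 estimator (one of several estimators in use; the culture counts and population size below are invented for illustration):

```python
import math

def mutation_rate_p0(cultures_without_mutants, total_cultures, cells_per_culture):
    """Luria-Delbrück p0 method: m = -ln(p0) mutations per culture,
    rate ~= m / N mutations per cell per division."""
    p0 = cultures_without_mutants / total_cultures
    m = -math.log(p0)
    return m / cells_per_culture

# Hypothetical fluctuation test: 22 of 40 parallel cultures yielded no
# resistant colonies on the selective plates; ~2e7 cells per culture.
rate = mutation_rate_p0(22, 40, 2e7)
print(f"{rate:.2e}")  # mutations per cell per division
```

A rate measured this way in young cells is what lets one predict the expected mutation frequency in sorted mother cells of a given replicative age, for comparison with the observed frequency.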
Microbiology, Issue 92, Aging, mutations, genome instability, Saccharomyces cerevisiae, fluctuation test, magnetic sorting, mother cell, replicative aging
Measuring Cardiac Autonomic Nervous System (ANS) Activity in Children
Institutions: Academic Medical Center - University of Amsterdam, Public Health Service of Amsterdam (GGD), VU University, VU University Medical Center, VU University, VU University Medical Center.
The autonomic nervous system (ANS) controls mainly automatic bodily functions that are engaged in homeostasis, like heart rate, digestion, respiratory rate, salivation, perspiration and renal function. The ANS has two main branches: the sympathetic nervous system, preparing the human body for action in times of danger and stress, and the parasympathetic nervous system, which regulates the resting state of the body.
ANS activity can be measured invasively, for instance by radiotracer techniques or microelectrode recording from superficial nerves, or it can be measured non-invasively by using changes in an organ's response as a proxy for changes in ANS activity, for instance of the sweat glands or the heart. Invasive measurements have the highest validity but are very poorly feasible in large scale samples where non-invasive measures are the preferred approach. Autonomic effects on the heart can be reliably quantified by the recording of the electrocardiogram (ECG) in combination with the impedance cardiogram (ICG), which reflects the changes in thorax impedance in response to respiration and the ejection of blood from the ventricle into the aorta. From the respiration and ECG signals, respiratory sinus arrhythmia can be extracted as a measure of cardiac parasympathetic control. From the ECG and the left ventricular ejection signals, the preejection period can be extracted as a measure of cardiac sympathetic control. ECG and ICG recording is mostly done in laboratory settings. However, having the subjects report to a laboratory greatly reduces ecological validity, is not always doable in large scale epidemiological studies, and can be intimidating for young children. An ambulatory device for ECG and ICG simultaneously resolves these three problems.
Here, we present a study design for a minimally invasive and rapid assessment of cardiac autonomic control in children, using a validated ambulatory device1-5, the VU University Ambulatory Monitoring System (VU-AMS, Amsterdam, the Netherlands, www.vu-ams.nl).
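As an illustration of the kind of measure derived from such recordings, a simple time-domain heart-rate-variability statistic can be computed from the R-R interval series. The VU-AMS workflow quantifies respiratory sinus arrhythmia from combined respiration and ECG signals; the RMSSD below is a common stand-in computed on invented data.

```python
import math

def hr_and_rmssd(rr_ms):
    """Mean heart rate (bpm) and RMSSD (ms), a time-domain proxy for
    vagally mediated heart-rate variability, from R-R intervals."""
    mean_rr = sum(rr_ms) / len(rr_ms)
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return 60000.0 / mean_rr, rmssd

# Hypothetical child R-R series (ms) showing respiratory modulation.
rr = [600, 640, 680, 650, 610, 590, 630, 660]
hr, rmssd = hr_and_rmssd(rr)
print(round(hr, 1), round(rmssd, 1))
```

Larger beat-to-beat differences (higher RMSSD) indicate stronger parasympathetic modulation of heart rate.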
Medicine, Issue 74, Neurobiology, Neuroscience, Anatomy, Physiology, Pediatrics, Cardiology, Heart, Central Nervous System, stress (psychological effects, human), effects of stress (psychological, human), sympathetic nervous system, parasympathetic nervous system, autonomic nervous system, ANS, childhood, ambulatory monitoring system, electrocardiogram, ECG, clinical techniques
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
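A DoE study starts from an experiment plan. As a minimal illustration (DoE software generates fractional and augmented designs, not just the full factorial enumerated here), the candidate runs for a few hypothetical factors can be listed as:

```python
from itertools import product

# Hypothetical factors for a transient-expression screen.
factors = {
    "promoter": ["35S", "nos"],
    "temp_C": [22, 25],
    "harvest_d": [3, 5],
}

# Full factorial design: every combination of factor levels.
names = list(factors)
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]
for run in runs:
    print(run)
print(len(runs))  # 2 x 2 x 2 = 8 runs
```

The point of fractional and step-wise augmented designs is precisely to avoid running all combinations when factor counts grow, while still estimating the main effects and interactions of interest.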
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Linearization of the Bradford Protein Assay
Institutions: Tel Aviv University.
Determination of microgram quantities of protein in the Bradford Coomassie brilliant blue assay is accomplished by measurement of absorbance at 590 nm. This most common assay enables rapid and simple protein quantification in cell lysates, cellular fractions, or recombinant protein samples, for the purpose of normalization of biochemical measurements. However, an intrinsic nonlinearity compromises the sensitivity and accuracy of this method. It is shown that under standard assay conditions, the ratio of the absorbance measurements at 590 nm and 450 nm is strictly linear with protein concentration. This simple procedure increases the accuracy and improves the sensitivity of the assay about 10-fold, permitting quantification down to 50 ng of bovine serum albumin. Furthermore, the interference commonly introduced by detergents that are used to create the cell lysates is greatly reduced by the new protocol. A linear equation developed on the basis of mass action and Beer's law perfectly fits the experimental data.
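The linearization can be sketched numerically: fit the A590/A450 ratio against the protein standards, then invert the line for an unknown. The absorbance values below are invented for illustration; they are not the article's data.

```python
def linear_fit(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return b, my - b * mx

# Hypothetical BSA standards (ug) with A590 (saturating, nonlinear on its
# own) and A450 readings; the A590/A450 ratio restores linearity.
protein = [0.0, 0.5, 1.0, 2.0, 4.0]
a590 = [0.30, 0.42, 0.52, 0.68, 0.90]
a450 = [0.60, 0.56, 0.52, 0.44, 0.36]
ratio = [u / v for u, v in zip(a590, a450)]

slope, intercept = linear_fit(protein, ratio)
unknown_ratio = 0.52 / 0.52  # a sample's A590/A450
print(round((unknown_ratio - intercept) / slope, 2))  # estimated ug protein
```

Because the ratio, rather than A590 alone, is fitted, the calibration stays linear over the whole working range and the detergent-sensitive baseline largely cancels.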
Cellular Biology, Issue 38, Bradford, protein assay, protein quantification, Coomassie brilliant blue
A Protocol for Detecting and Scavenging Gas-phase Free Radicals in Mainstream Cigarette Smoke
Institutions: CDCF-AOX Lab, Cornell University.
Cigarette smoking is associated with human cancers. It has been reported that most lung cancer deaths are caused by cigarette smoking5,6,7,12. Although tobacco tars and related products in the particle phase of cigarette smoke are major causes of carcinogenic and mutagenic related diseases, cigarette smoke contains significant amounts of free radicals that are also considered an important group of carcinogens9,10. Free radicals attack cell constituents by damaging protein structure, lipids and DNA sequences, and increase the risks of developing various types of cancers. Inhaled radicals produce adducts that contribute to many of the negative health effects of tobacco smoke in the lung3. Studies have been conducted to reduce free radicals in cigarette smoke to decrease the risks of smoking-induced damage. It has been reported that haemoglobin and heme-containing compounds could partially scavenge nitric oxide, reactive oxidants and carcinogenic volatile nitroso compounds of cigarette smoke4. A 'bio-filter' consisting of haemoglobin and activated carbon was used to scavenge the free radicals and removed up to 90% of the free radicals from cigarette smoke14. However, due to cost-ineffectiveness, it has not been successfully commercialized. Another study showed good scavenging efficiency of shikonin, a component of a Chinese herbal medicine8. In the present study, we report a protocol for introducing common natural antioxidant extracts into the cigarette filter for scavenging gas-phase free radicals in cigarette smoke, and for measuring the scavenging effect on gas-phase free radicals in mainstream cigarette smoke (MCS) using spin-trapping electron spin resonance (ESR) spectroscopy1,2,14. We showed high scavenging capacity of lycopene and grape seed extract, which could point to their future application in cigarette filters. An important advantage of these prospective scavengers is that they can be obtained in large quantities from byproducts of the tomato and wine industries, respectively11,13.
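The scavenging effect reported from ESR measurements reduces to a percent signal reduction relative to a control cigarette. A sketch, with invented signal intensities:

```python
def scavenging_efficiency(control_intensity, filter_intensity):
    """Percent reduction in the spin-adduct ESR signal relative to an
    unmodified-filter control cigarette."""
    return 100.0 * (control_intensity - filter_intensity) / control_intensity

# Hypothetical double-integrated ESR signal intensities (arbitrary units).
control = 1.00e6
with_lycopene = 3.8e5
print(round(scavenging_efficiency(control, with_lycopene), 1))  # % scavenged
```

Comparing this percentage across candidate extracts, under matched smoking-machine conditions, is what ranks their scavenging capacity.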
Bioengineering, Issue 59, Cigarette smoke, free radical, spin-trap, ESR
Methods for ECG Evaluation of Indicators of Cardiac Risk, and Susceptibility to Aconitine-induced Arrhythmias in Rats Following Status Epilepticus
Institutions: University of Utah.
Lethal cardiac arrhythmias contribute to mortality in a number of pathological conditions. Several parameters obtained from a non-invasive, easily obtained electrocardiogram (ECG) are established, well-validated prognostic indicators of cardiac risk in patients suffering from a number of cardiomyopathies. Increased heart rate, decreased heart rate variability (HRV), and increased duration and variability of cardiac ventricular electrical activity (QT interval) are all indicative of enhanced cardiac risk1-4. In animal models, it is valuable to compare these ECG-derived variables and susceptibility to experimentally induced arrhythmias. Intravenous infusion of the arrhythmogenic agent aconitine has been widely used to evaluate susceptibility to arrhythmias in a range of experimental conditions, including animal models of depression5, following exercise7 and exposure to air pollutants8, as well as determination of the antiarrhythmic efficacy of pharmacological agents. It should be noted that QT dispersion in humans is a measure of QT interval variation across the full set of leads from a standard 12-lead ECG. Consequently, the measure of QT dispersion from the 2-lead ECG in the rat described in this protocol is different than that calculated from human ECG records. This represents a limitation in the translation of the data obtained from rodents to human clinical medicine. Status epilepticus (SE) is a single seizure or series of continuously recurring seizures lasting more than 30 min, and results in mortality in 20% of cases13. Many individuals survive the SE, but die within 30 days14,15. The mechanism(s) of this delayed mortality is not fully understood. It has been suggested that lethal ventricular arrhythmias contribute to many of these deaths. In addition to SE, patients experiencing spontaneously recurring seizures, i.e. epilepsy, are at risk of premature sudden and unexpected death associated with epilepsy (SUDEP)18. As with SE, the precise mechanisms mediating SUDEP are not known. It has been proposed that ventricular abnormalities and resulting arrhythmias make a significant contribution18-22. To investigate the mechanisms of seizure-related cardiac death, and the efficacy of cardioprotective therapies, it is necessary to obtain both ECG-derived indicators of risk and evaluate susceptibility to cardiac arrhythmias in animal models of seizure disorders. Here we describe methods for implanting ECG electrodes in the Sprague-Dawley laboratory rat (Rattus norvegicus) following SE, collection and analysis of ECG recordings, and induction of arrhythmias during intravenous infusion of aconitine. These procedures can be used to directly determine the relationships between ECG-derived measures of cardiac electrical activity and susceptibility to ventricular arrhythmias in rat models of seizure disorders, or any pathology associated with increased risk of sudden cardiac death.
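Two of the ECG-derived indices discussed above can be sketched directly. Bazett's formula is one of several rate-correction formulas for QT; the interval values below are invented for illustration.

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett-corrected QT interval: QTc = QT / sqrt(RR in seconds)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def qt_dispersion(qt_by_lead_ms):
    """QT dispersion: max - min QT across the available leads
    (only 2 leads in this rat preparation, vs 12 clinically)."""
    return max(qt_by_lead_ms) - min(qt_by_lead_ms)

# Hypothetical rat values: QT 70 ms at an RR interval of 160 ms.
print(round(qtc_bazett(70, 160), 1))  # rate-corrected QT (ms)
print(qt_dispersion([70, 64]))        # dispersion (ms) across 2 leads
```

Because only two leads are available in this preparation, the dispersion figure is not directly comparable to the 12-lead clinical measure, as noted above.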
Medicine, Issue 50, cardiac, seizure disorders, QTc, QTd, cardiac arrhythmias, rat
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP, such as texture information and blood amount, are extracted from the CT scans, and other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians make decisions, such as recommending for or against invasive ICP monitoring.
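To illustrate the classification step, a minimal stand-in for the SVM (a nearest-centroid rule on two invented features) is sketched below; the actual model, feature set, and training data in the article differ.

```python
import math

def nearest_centroid(train, labels, sample):
    """Classify by nearest class centroid - a simple stand-in here for
    the SVM used in the actual pipeline."""
    cents = {}
    for lab in set(labels):
        rows = [x for x, l in zip(train, labels) if l == lab]
        cents[lab] = [sum(col) / len(rows) for col in zip(*rows)]
    return min(cents, key=lambda lab: math.dist(cents[lab], sample))

# Hypothetical training data: [midline shift (mm), age (yr)] per patient.
X = [[0.5, 30], [1.0, 45], [7.0, 25], [9.0, 50]]
y = ["normal_ICP", "normal_ICP", "elevated_ICP", "elevated_ICP"]
print(nearest_centroid(X, y, [6.5, 40]))  # -> elevated_ICP
```

Any such classifier only pre-screens: a predicted "elevated_ICP" label flags a patient for closer clinical attention, not for automatic intervention.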
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques