Pubmed Article
Lessons from history for designing and validating epidemiological surveillance in uncounted populations.
PUBLISHED: 01-22-2011
Due to scanty individual health data in low- and middle-income countries (LMICs), health planners often use imperfect data sources. Frequent national-level data are considered essential, even if their depth and quality are questionable. However, quality in-depth data from local sentinel populations may be better than scanty national data, if such local data can be considered nationally representative. The difficulty is the lack of any theoretical or empirical basis for demonstrating that local data are representative where data on the wider population are unavailable. Thus, these issues can only be explored empirically in a complete individual dataset at national and local levels, relating to an LMIC population profile.
Related JoVE Video
Authors: Madeleine E. Hackney, Kathleen McKee.
Published: 12-09-2014
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program is maximized through targeted instructor and volunteer training and a structured detailed syllabus outlining class practices and progression.
26 Related JoVE Articles!
Electrochemotherapy of Tumours
Authors: Gregor Sersa, Damijan Miklavcic.
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of pulse application. Drug uptake is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is otherwise impeded. Among the many drugs tested so far, bleomycin and cisplatin have found their way from preclinical testing to clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated several times every few weeks with equal effectiveness each time. The treatment results in effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions
Intra-Operative Behavioral Tasks in Awake Humans Undergoing Deep Brain Stimulation Surgery
Authors: John T. Gale, Clarissa Martinez-Rubio, Sameer A. Sheth, Emad N. Eskandar.
Institutions: Harvard Medical School, Massachusetts General Hospital.
Deep brain stimulation (DBS) is a surgical procedure that directs chronic, high frequency electrical stimulation to specific targets in the brain through implanted electrodes. Deep brain stimulation was first implemented as a therapeutic modality by Benabid et al. in the late 1980s, when they used this technique to stimulate the ventral intermediate nucleus of the thalamus for the treatment of tremor 1. Currently, the procedure is used to treat patients who fail to respond adequately to medical management for diseases such as Parkinson's disease, dystonia, and essential tremor. The efficacy of this procedure for the treatment of Parkinson's disease has been demonstrated in well-powered, randomized controlled trials 2. Presently, the U.S. Food and Drug Administration has approved DBS as a treatment for patients with medically refractory essential tremor, Parkinson's disease, and dystonia. Additionally, DBS is currently being evaluated for the treatment of other psychiatric and neurological disorders, such as obsessive compulsive disorder, major depressive disorder, and epilepsy. DBS has not only been shown to help people by improving their quality of life; it also provides researchers with the unique opportunity to study and understand the human brain. Microelectrode recordings are routinely performed during DBS surgery in order to enhance the precision of anatomical targeting. Firing patterns of individual neurons can therefore be recorded while the subject performs a behavioral task. Early studies using these data focused on descriptive aspects, including firing and burst rates, and frequency modulation 3. More recent studies have focused on cognitive aspects of behavior in relation to neuronal activity 4,5. This article provides a description of the intra-operative methods used to perform behavioral tasks and record neuronal data from awake patients during DBS cases. Our exposition of the process of acquiring electrophysiological data will illuminate the current scope and limitations of intra-operative human experiments.
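As a minimal illustration of the descriptive analyses mentioned above (firing rates of single units aligned to behavioral events), the Python sketch below computes a peri-event time histogram from spike timestamps. The variable names, bin width, and synthetic data are illustrative assumptions, not parameters from the published protocol.

```python
import numpy as np

def peri_event_histogram(spike_times, event_times, window=(-1.0, 1.0), bin_size=0.05):
    """Average firing rate (spikes/s) around each behavioral event.

    spike_times : 1D array of spike timestamps (s)
    event_times : 1D array of event (e.g., cue onset) timestamps (s)
    """
    edges = np.arange(window[0], window[1] + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for t0 in event_times:
        rel = spike_times - t0                      # spike times relative to this event
        counts += np.histogram(rel, bins=edges)[0]  # accumulate spikes per bin
    rate = counts / (len(event_times) * bin_size)   # normalize to spikes per second
    return edges[:-1] + bin_size / 2, rate

# Illustrative synthetic data: ~20 Hz background firing, 20 behavioral events
rng = np.random.default_rng(0)
events = np.arange(5.0, 105.0, 5.0)
spikes = np.sort(rng.uniform(0.0, 110.0, 2200))
centers, rate = peri_event_histogram(spikes, events)
print(round(rate.mean(), 1))   # close to the 20 spikes/s background rate
```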
Medicine, Issue 47, Intra-Operative Physiology, Cognitive Neuroscience, Behavioral Neuroscience, Subthalamic Nucleus, Single-Unit Activity, Parkinson Disease, Deep Brain Stimulation
Mapping Inhibitory Neuronal Circuits by Laser Scanning Photostimulation
Authors: Taruna Ikrar, Nicholas D. Olivas, Yulin Shi, Xiangmin Xu.
Institutions: University of California, Irvine, University of California, Irvine.
Inhibitory neurons are crucial to cortical function. They comprise about 20% of the entire cortical neuronal population and can be further subdivided into diverse subtypes based on their immunochemical, morphological, and physiological properties1-4. Although previous research has revealed much about intrinsic properties of individual types of inhibitory neurons, knowledge about their local circuit connections is still relatively limited3,5,6. Given that each individual neuron's function is shaped by its excitatory and inhibitory synaptic input within cortical circuits, we have been using laser scanning photostimulation (LSPS) to map local circuit connections to specific inhibitory cell types. Compared to conventional electrical stimulation or glutamate puff stimulation, LSPS has unique advantages allowing for extensive mapping and quantitative analysis of local functional inputs to individually recorded neurons3,7-9. Laser photostimulation via glutamate uncaging selectively activates neurons perisomatically, without activating axons of passage or distal dendrites, which ensures a sub-laminar mapping resolution. The sensitivity and efficiency of LSPS for mapping inputs from many stimulation sites over a large region are well suited for cortical circuit analysis. Here we introduce the technique of LSPS combined with whole-cell patch clamping for local inhibitory circuit mapping. Targeted recordings of specific inhibitory cell types are facilitated by use of transgenic mice expressing green fluorescent proteins (GFP) in limited inhibitory neuron populations in the cortex3,10, which enables consistent sampling of the targeted cell types and unambiguous identification of the cell types recorded. As for LSPS mapping, we outline the system instrumentation, describe the experimental procedure and data acquisition, and present examples of circuit mapping in mouse primary somatosensory cortex. As illustrated in our experiments, caged glutamate is activated in a spatially restricted region of the brain slice by UV laser photolysis; simultaneous voltage-clamp recordings allow detection of photostimulation-evoked synaptic responses. Maps of either excitatory or inhibitory synaptic input to the targeted neuron are generated by scanning the laser beam to stimulate hundreds of potential presynaptic sites. Thus, LSPS enables the construction of detailed maps of synaptic inputs impinging onto specific types of inhibitory neurons through repeated experiments. Taken together, the photostimulation-based technique offers neuroscientists a powerful tool for determining the functional organization of local cortical circuits.
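As a schematic illustration of how an input map can be assembled from photostimulation-evoked responses, the sketch below averages the baseline-subtracted synaptic charge in a fixed post-stimulus window for each stimulation site and arranges the values on the stimulation grid. The window length, grid size, and synthetic traces are assumptions for illustration, not the authors' analysis parameters.

```python
import numpy as np

def input_map(traces, dt, stim_index, window=0.15, baseline=0.05):
    """Evoked synaptic charge per photostimulation site.

    traces     : array (n_sites, n_samples) of voltage-clamp current (pA)
    dt         : sample interval (s)
    stim_index : sample index of the laser flash
    Returns the baseline-subtracted charge (pA*s) for each stimulation site.
    """
    n_base = int(baseline / dt)
    n_win = int(window / dt)
    base = traces[:, stim_index - n_base:stim_index].mean(axis=1, keepdims=True)
    evoked = traces[:, stim_index:stim_index + n_win] - base   # baseline-subtracted response
    return evoked.sum(axis=1) * dt                             # integrate current -> charge

# Illustrative use on a 16 x 16 stimulation grid (256 sites)
rng = np.random.default_rng(1)
traces = rng.normal(0.0, 2.0, size=(256, 2000))   # noise-only synthetic traces
charge = input_map(traces, dt=1e-3, stim_index=500)
grid = charge.reshape(16, 16)                      # 2D input map over the slice
print(grid.shape)
```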
Neuroscience, Issue 56, glutamate uncaging, whole cell recording, GFP, transgenic, interneurons
How to Measure Cortical Folding from MR Images: a Step-by-Step Tutorial to Compute Local Gyrification Index
Authors: Marie Schaer, Meritxell Bach Cuadra, Nick Schmansky, Bruce Fischl, Jean-Philippe Thiran, Stephan Eliez.
Institutions: University of Geneva School of Medicine, École Polytechnique Fédérale de Lausanne, University Hospital Center and University of Lausanne, Massachusetts General Hospital.
Cortical folding (gyrification) is determined during the first months of life, so that adverse events occurring during this period leave traces that will be identifiable at any age. As recently reviewed by Mangin and colleagues2, several methods exist to quantify different characteristics of gyrification. For instance, sulcal morphometry can be used to measure shape descriptors such as the depth, length or indices of inter-hemispheric asymmetry3. These geometrical properties have the advantage of being easy to interpret. However, sulcal morphometry tightly relies on the accurate identification of a given set of sulci and hence provides a fragmented description of gyrification. A more fine-grained quantification of gyrification can be achieved with curvature-based measurements, where smoothed absolute mean curvature is typically computed at thousands of points over the cortical surface4. The curvature is, however, not straightforward to comprehend, as it remains unclear if there is any direct relationship between the curvedness and a biologically meaningful correlate such as cortical volume or surface. To address the diverse issues raised by the measurement of cortical folding, we previously developed an algorithm to quantify local gyrification with exquisite spatial resolution and a simple interpretation. Our method is inspired by the Gyrification Index5, a method originally used in comparative neuroanatomy to evaluate the cortical folding differences across species. In our implementation, which we name local Gyrification Index (lGI1), we measure the amount of cortex buried within the sulcal folds as compared with the amount of visible cortex in circular regions of interest. Given that the cortex grows primarily through radial expansion6, our method was specifically designed to identify early defects of cortical development. In this article, we detail the computation of local Gyrification Index, which is now freely distributed as a part of the FreeSurfer Software (Martinos Center for Biomedical Imaging, Massachusetts General Hospital). FreeSurfer provides a set of automated reconstruction tools of the brain's cortical surface from structural MRI data. The cortical surface extracted in the native space of the images with sub-millimeter accuracy is then further used for the creation of an outer surface, which will serve as a basis for the lGI calculation. A circular region of interest is then delineated on the outer surface, and its corresponding region of interest on the cortical surface is identified using a matching algorithm as described in our validation study1. This process is iterated with largely overlapping regions of interest, resulting in cortical maps of gyrification for subsequent statistical comparisons (Fig. 1). Of note, another measurement of local gyrification with a similar inspiration was proposed by Toro and colleagues7, where the folding index at each point is computed as the cortical area contained in a sphere divided by the area of a disc with the same radius. The two implementations differ in that the one by Toro et al. is based on Euclidean distances and thus considers discontinuous patches of cortical area, whereas ours uses a strict geodesic algorithm and includes only the continuous patch of cortical area opening at the brain surface in a circular region of interest.
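The core quantity described above is a ratio of the cortical surface (buried plus visible) to the visible outer surface within matched regions of interest. A minimal sketch of that ratio is shown below, assuming the per-triangle areas of the pial-surface ROI and of the matched outer-surface ROI have already been extracted (for example by FreeSurfer); the function and variable names are illustrative, not part of the distributed tool.

```python
import numpy as np

def local_gyrification_index(pial_roi_areas, outer_roi_areas):
    """Ratio of cortical (pial) surface area to outer-hull surface area
    for a matched pair of circular regions of interest.

    pial_roi_areas  : per-triangle areas of the cortical-surface ROI, mm^2
    outer_roi_areas : per-triangle areas of the corresponding outer-surface ROI, mm^2
    """
    return np.sum(pial_roi_areas) / np.sum(outer_roi_areas)

# A heavily folded ROI buries roughly 3x more cortex than is visible from outside:
print(local_gyrification_index(np.full(1000, 0.3), np.full(400, 0.25)))  # ~3.0
```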
Medicine, Issue 59, neuroimaging, brain, cortical complexity, cortical development
Examining Local Network Processing using Multi-contact Laminar Electrode Recording
Authors: Bryan J. Hansen, Sarah Eagleman, Valentin Dragoi.
Institutions: University of Texas, University of Texas.
Cortical layers are ubiquitous structures throughout neocortex1-4 that consist of highly recurrent local networks. In recent years, significant progress has been made in our understanding of differences in response properties of neurons in different cortical layers5-8, yet there is still a great deal left to learn about whether and how neuronal populations encode information in a laminar-specific manner. Existing multi-electrode array techniques, although informative for measuring responses across many millimeters of cortical space along the cortical surface, are unsuitable to approach the issue of laminar cortical circuits. Here, we present our method for setting up and recording individual neurons and local field potentials (LFPs) across cortical layers of primary visual cortex (V1) utilizing multi-contact laminar electrodes (Figure 1; Plextrode U-Probe, Plexon Inc). The methods included are recording device construction, identification of cortical layers, and identification of receptive fields of individual neurons. To identify cortical layers, we measure the evoked response potentials (ERPs) of the LFP time-series using full-field flashed stimuli. We then perform current-source density (CSD) analysis to identify the polarity inversion accompanied by the sink-source configuration at the base of layer 4 (the sink is inside layer 4, subsequently referred to as granular layer9-12). Current-source density is useful because it provides an index of the location, direction, and density of transmembrane current flow, allowing us to accurately position electrodes to record from all layers in a single penetration6, 11, 12.
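Current-source density is conventionally estimated as the negative second spatial derivative of the LFP across equally spaced contacts. The sketch below implements that standard second-difference estimator in Python; the contact spacing, conductivity value, and array shapes are generic assumptions rather than the exact pipeline used by the authors.

```python
import numpy as np

def csd_second_difference(lfp, spacing_um=100.0, conductivity=0.3):
    """One-dimensional CSD estimate from a laminar LFP recording.

    lfp          : array (n_contacts, n_samples) of LFP, ordered along the probe
    spacing_um   : inter-contact spacing in micrometers
    conductivity : extracellular conductivity in S/m (scales the result)

    CSD(z, t) ~ -sigma * [V(z+h, t) - 2 V(z, t) + V(z-h, t)] / h^2
    The two outermost contacts are lost to the finite difference.
    """
    h = spacing_um * 1e-6                                      # spacing in meters
    second_diff = lfp[2:, :] - 2.0 * lfp[1:-1, :] + lfp[:-2, :]
    return -conductivity * second_diff / h**2

# Illustrative use: 16-contact probe, 1 s of synthetic LFP sampled at 1 kHz
lfp = np.random.randn(16, 1000) * 1e-4
csd = csd_second_difference(lfp)
print(csd.shape)   # (14, 1000)
```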
Neuroscience, Issue 55, laminar probes, cortical layers, local-field potentials, population coding
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability and improve clinical symptoms remain poorly understood 33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner 41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration 34. This article describes the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices 27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Fundus Photography as a Convenient Tool to Study Microvascular Responses to Cardiovascular Disease Risk Factors in Epidemiological Studies
Authors: Patrick De Boever, Tijs Louwies, Eline Provost, Luc Int Panis, Tim S. Nawrot.
Institutions: Flemish Institute for Technological Research (VITO), Hasselt University, Hasselt University, Leuven University.
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina is a tissue that lines the interior of the eye, and it is the only tissue that allows non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive for epidemiological studies and for monitoring cardiovascular health from early age up to old age. Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the vessel calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. The vessel widths are derived using image analysis software, and the widths of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). These features have been shown to be useful for studying the impact of modifiable lifestyle and environmental cardiovascular disease risk factors. The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2%, and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient and instrumental tool for epidemiological studies of microvascular responses to cardiovascular disease risk factors.
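CRAE and CRVE are summary calibers computed from the six largest arterioles and venules. As an assumption (the abstract does not state which formula the software applies), the sketch below follows the widely used revised Knudtson pairing scheme, in which the widest and narrowest remaining vessels are combined iteratively with a branching coefficient of 0.88 for arterioles and 0.95 for venules; the example widths are invented.

```python
import math

def summary_caliber(widths, k):
    """Iteratively combine vessel widths (µm) into a single equivalent caliber.

    widths : calibers of the six largest arterioles (k=0.88 -> CRAE)
             or venules (k=0.95 -> CRVE)
    Pairs the widest with the narrowest remaining vessel each round,
    carrying the leftover middle value forward when the count is odd.
    """
    vals = sorted(widths)
    while len(vals) > 1:
        nxt = []
        while len(vals) > 1:
            w_small = vals.pop(0)
            w_big = vals.pop(-1)
            nxt.append(k * math.sqrt(w_small**2 + w_big**2))  # combined "parent" caliber
        nxt.extend(vals)          # odd count: carry the remaining value
        vals = sorted(nxt)
    return vals[0]

arterioles = [155, 148, 140, 132, 120, 115]   # illustrative widths in µm
venules = [230, 221, 210, 200, 190, 182]
print(round(summary_caliber(arterioles, 0.88), 1))   # CRAE
print(round(summary_caliber(venules, 0.95), 1))      # CRVE
```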
Medicine, Issue 92, retina, microvasculature, image analysis, Central Retinal Arteriolar Equivalent, Central Retinal Venular Equivalent, air pollution, particulate matter, black carbon
Collection, Isolation, and Flow Cytometric Analysis of Human Endocervical Samples
Authors: Jennifer A. Juno, Genevieve Boily-Larouche, Julie Lajoie, Keith R. Fowke.
Institutions: University of Manitoba, University of Manitoba.
Despite the public health importance of mucosal pathogens (including HIV), relatively little is known about mucosal immunity, particularly at the female genital tract (FGT). Because heterosexual transmission now represents the dominant mechanism of HIV transmission, and given the continual spread of sexually transmitted infections (STIs), it is critical to understand the interplay between host and pathogen at the genital mucosa. The substantial gaps in knowledge around FGT immunity are partially due to the difficulty in successfully collecting and processing mucosal samples. In order to facilitate studies with sufficient sample size, collection techniques must be minimally invasive and efficient. To this end, a protocol for the collection of cervical cytobrush samples and subsequent isolation of cervical mononuclear cells (CMC) has been optimized. Using ex vivo flow cytometry-based immunophenotyping, it is possible to accurately and reliably quantify CMC lymphocyte/monocyte population frequencies and phenotypes. This technique can be coupled with the collection of cervical-vaginal lavage (CVL), which contains soluble immune mediators including cytokines, chemokines and anti-proteases, all of which can be used to determine the anti- or pro-inflammatory environment in the vagina.
Medicine, Issue 89, mucosal, immunology, FGT, lavage, cervical, CMC
Murine Ileocolic Bowel Resection with Primary Anastomosis
Authors: Troy Perry, Anna Borowiec, Bryan Dicken, Richard Fedorak, Karen Madsen.
Institutions: University of Alberta, University of Alberta.
Intestinal resections are frequently required for treatment of diseases involving the gastrointestinal tract, with Crohn’s disease and colon cancer being two common examples. Despite the frequency of these procedures, a significant knowledge gap remains in describing the inherent effects of intestinal resection on host physiology and disease pathophysiology. This article provides detailed instructions for an ileocolic resection with primary end-to-end anastomosis in mice, as well as essential aspects of peri-operative care to maximize post-operative success. When followed closely, this procedure yields a 95% long-term survival rate with no failure to thrive, and minimizes the post-operative complications of bowel obstruction and anastomotic leak. The technical challenges of performing the procedure in mice are a barrier to its widespread use in research. The skills described in this article can be acquired without previous surgical experience. Once mastered, the murine ileocolic resection procedure will provide a reproducible tool for studying the effects of intestinal resection in models of human disease.
Medicine, Issue 92, Ileocolic resection, anastomosis, Crohn's disease, mouse models, intestinal adaptation, short bowel syndrome
Isolation of Fidelity Variants of RNA Viruses and Characterization of Virus Mutation Frequency
Authors: Stéphanie Beaucourt, Antonio V. Bordería, Lark L. Coffey, Nina F. Gnädig, Marta Sanz-Ramos, Yasnee Beeharry, Marco Vignuzzi.
Institutions: Institut Pasteur.
RNA viruses use RNA dependent RNA polymerases to replicate their genomes. The intrinsically high error rate of these enzymes is a large contributor to the generation of extreme population diversity that facilitates virus adaptation and evolution. Increasing evidence shows that the intrinsic error rates, and the resulting mutation frequencies, of RNA viruses can be modulated by subtle amino acid changes to the viral polymerase. Although biochemical assays exist for some viral RNA polymerases that permit quantitative measure of incorporation fidelity, here we describe a simple method of measuring mutation frequencies of RNA viruses that has proven to be as accurate as biochemical approaches in identifying fidelity altering mutations. The approach uses conventional virological and sequencing techniques that can be performed in most biology laboratories. Based on our experience with a number of different viruses, we have identified the key steps that must be optimized to increase the likelihood of isolating fidelity variants and generating data of statistical significance. The isolation and characterization of fidelity altering mutations can provide new insights into polymerase structure and function1-3. Furthermore, these fidelity variants can be useful tools in characterizing mechanisms of virus adaptation and evolution4-7.
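In assays of this kind, mutation frequency is typically expressed as mutations per 10,000 nucleotides sequenced across a set of clones. The sketch below shows that simple calculation; the input format, normalization constant, and example numbers are assumptions for illustration and do not reproduce the authors' statistical treatment.

```python
def mutation_frequency(mutations_per_clone, sequenced_length, per=1e4):
    """Mutations per 10,000 nucleotides sequenced.

    mutations_per_clone : number of mutations found in each sequenced clone
    sequenced_length    : length (nt) of the genome region sequenced per clone
    """
    total_mutations = sum(mutations_per_clone)
    total_nt = sequenced_length * len(mutations_per_clone)
    return per * total_mutations / total_nt

# 96 clones of a 1,200 nt region, 31 mutations found in total (illustrative numbers)
clones = [0] * 70 + [1] * 21 + [2] * 5
print(round(mutation_frequency(clones, 1200), 2))  # ~2.69 mutations per 10^4 nt
```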
Immunology, Issue 52, Polymerase fidelity, RNA virus, mutation frequency, mutagen, RNA polymerase, viral evolution
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction and produce spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
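Since primer design and melting temperature (Tm) are central to setting up and troubleshooting a PCR, the sketch below estimates Tm with two common rules of thumb: the Wallace rule for short primers and a GC-content approximation for longer ones. These are textbook approximations offered purely as an illustration; they are not the optimization procedure described in the article, and nearest-neighbor methods are more accurate.

```python
def primer_tm(seq):
    """Rough melting-temperature estimate for a DNA primer (degrees C).

    Wallace rule for primers under 14 nt: Tm = 2*(A+T) + 4*(G+C)
    GC-content approximation otherwise:   Tm = 64.9 + 41*(G+C-16.4)/N
    """
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    n = len(s)
    if n < 14:
        return 2 * at + 4 * gc
    return 64.9 + 41.0 * (gc - 16.4) / n

print(primer_tm("ACGTACGTACGT"))               # short primer, Wallace rule
print(round(primer_tm("AGCGGATAACAATTTCACACAGGA"), 1))  # longer primer, GC approximation
```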
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Authors: Angela J. Brandt, Gaston A. del Pino, Jean H. Burns.
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as being independent of the community composition; however biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance, and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly necessitates understanding of the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of plants in the field from conspecific and heterospecific plant species. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community and additional empirical work is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
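As a minimal, generic illustration of the "binarize then skeletonize" step described above, the sketch below uses scikit-image; this is one possible open-source route, not the specific workflow or software distributed with the article, and the synthetic image stands in for a real TATS staining.

```python
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.morphology import skeletonize

def binarize_and_skeletonize(image, sigma=1.0):
    """Denoise, threshold (Otsu), and skeletonize a 2D fluorescence image.

    image : 2D array of membrane-staining intensities
    Returns (binary_mask, skeleton), both boolean arrays.
    """
    smoothed = gaussian(image, sigma=sigma)       # suppress pixel noise before thresholding
    binary = smoothed > threshold_otsu(smoothed)  # global Otsu threshold -> membrane mask
    skeleton = skeletonize(binary)                # 1-pixel-wide network for quantification
    return binary, skeleton

# Illustrative use on a synthetic image containing one bright "tubule"
img = np.random.rand(128, 128) * 0.2
img[60:64, :] += 1.0
mask, skel = binarize_and_skeletonize(img)
print(mask.sum(), skel.sum())
```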
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
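The keyword list mentions minimum-norm estimation, a standard way of reconstructing distributed cortical sources from EEG once a head model (lead field) is available. The sketch below shows the basic regularized L2 minimum-norm solution in NumPy; the matrix sizes, noise handling, and regularization choice are illustrative assumptions, not the settings used at the London Baby Lab.

```python
import numpy as np

def minimum_norm_estimate(leadfield, data, lam=0.1):
    """L2 minimum-norm source estimate.

    leadfield : array (n_channels, n_sources) from the (age-appropriate) head model
    data      : array (n_channels, n_times) of preprocessed EEG
    lam       : Tikhonov regularization parameter
    Returns source time courses, array (n_sources, n_times).
    """
    L = leadfield
    gram = L @ L.T + lam**2 * np.eye(L.shape[0])    # regularized channel-space Gram matrix
    return L.T @ np.linalg.solve(gram, data)         # s_hat = L^T (L L^T + lam^2 I)^-1 d

# Illustrative dimensions: 128 channels, 5,000 sources, 200 time samples
rng = np.random.default_rng(2)
L = rng.normal(size=(128, 5000))
eeg = rng.normal(size=(128, 200))
sources = minimum_norm_estimate(L, eeg)
print(sources.shape)   # (5000, 200)
```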
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. In this article, a number of applications of white light interferometry for investigation of surface modifications in mass spectrometry, and wear phenomena in tribology and lubrication are demonstrated. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering), and laser irradiation (ablation), as well as ex situ measurements of wear of tribological test specimens. Specifically, we will discuss: Aspects of traditional ion sputtering-based mass spectrometry such as sputtering rates/yields measurements on Si and Cu and subsequent time-to-depth conversion. Results of quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface. These results are important for applications such as ablation mass spectrometry, where the quantities of evaporated material can be studied and controlled via pulse duration and energy per pulse. Thus, by determining the crater geometry one can define depth and lateral resolution versus experimental setup conditions. Measurements of surface roughness parameters in two dimensions, and quantitative measurements of the surface wear that occur as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach will be discussed and explained.
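One of the uses listed above is converting sputtering time to depth once the crater depth has been measured by optical profilometry. A minimal sketch of that conversion, assuming a constant sputter rate within a single homogeneous material, is given below; the numbers are placeholders, not measured values from the article.

```python
def sputter_rate(crater_depth_nm, sputter_time_s):
    """Average erosion rate in nm/s from a profilometry-measured crater."""
    return crater_depth_nm / sputter_time_s

def time_to_depth(times_s, rate_nm_per_s):
    """Convert sputtering times of a depth profile into depths (nm),
    assuming a constant erosion rate through a homogeneous layer."""
    return [rate_nm_per_s * t for t in times_s]

# Illustrative: an 850 nm deep crater produced by 1,700 s of sputtering -> 0.5 nm/s
rate = sputter_rate(850.0, 1700.0)
print(time_to_depth([0, 300, 600, 900], rate))   # depths at selected profile times
```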
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation whereby inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low abundance molecules with a resulting increase in sensitivity is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl.  Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
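The ~10-30 nm localization precision quoted above is usually estimated from the photon count and background of each fitted single-molecule image. As an illustration, the sketch below evaluates the commonly cited Thompson-Larson-Webb approximation; treat the formula choice and the example numbers as assumptions rather than the calibration used in this protocol.

```python
import math

def localization_precision(sigma_psf_nm, pixel_size_nm, photons, background_rms):
    """Approximate lateral localization precision (nm) of a single-molecule fit.

    Thompson, Larson & Webb (2002):
    var = s^2/N + a^2/(12 N) + 8*pi*s^4*b^2 / (a^2 * N^2)
    s = PSF standard deviation, a = pixel size, N = detected photons,
    b = rms background (photons per pixel).
    """
    s, a, N, b = sigma_psf_nm, pixel_size_nm, photons, background_rms
    var = s**2 / N + a**2 / (12.0 * N) + 8.0 * math.pi * s**4 * b**2 / (a**2 * N**2)
    return math.sqrt(var)

# Illustrative values: 130 nm PSF sigma, 100 nm pixels, 250 photons, background rms of 5
print(round(localization_precision(130.0, 100.0, 250, 5.0), 1))  # ~19 nm, within 10-30 nm
```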
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
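For the semi-automated category mentioned above, a common first step is intensity thresholding followed by connected-component labeling to pull candidate structures out of the 3D volume. The short sketch below shows that generic step with NumPy and SciPy; it is an illustrative starting point, not the custom algorithms applied to the six data sets.

```python
import numpy as np
from scipy import ndimage

def threshold_and_label(volume, threshold, min_voxels=50):
    """Binarize a 3D EM volume and keep connected components above a size cutoff.

    volume    : 3D array of voxel intensities (stain density)
    threshold : intensity above which a voxel is considered part of a feature
    Returns (labels, kept_ids) where labels is an integer array of component IDs.
    """
    binary = volume > threshold
    labels, n = ndimage.label(binary)                        # connected components
    sizes = ndimage.sum(binary, labels, range(1, n + 1))     # voxel count per component
    kept_ids = [i + 1 for i, s in enumerate(sizes) if s >= min_voxels]
    return labels, kept_ids

# Illustrative use on a synthetic volume containing one bright blob
vol = np.random.rand(64, 64, 64)
vol[20:30, 20:30, 20:30] += 2.0
labels, kept = threshold_and_label(vol, threshold=1.5)
print(len(kept))   # 1 large component survives the size filter
```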
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Authors: Kathleen Ongena, Chandreyee Das, Janet L. Smith, Sónia Gil, Grace Johnston.
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses. Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed1 count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of error in hemocytometry include uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation of hemocytometer filling rate2. To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, or flow cytometry1. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments1. The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection3 in a miniaturized format using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated, cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics, graphically displayed as a histogram.
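For comparison with the automated count, the arithmetic behind a manual hemocytometer count is simple: each large 1 mm x 1 mm square under the coverslip corresponds to 0.1 µl (10^-4 ml), so the concentration is the mean count per square times the dilution factor times 10^4. The sketch below, with illustrative numbers, shows that calculation.

```python
def hemocytometer_concentration(counts_per_square, dilution_factor=1.0):
    """Cell concentration (cells/ml) from hemocytometer counts.

    counts_per_square : cells counted in each 1 mm x 1 mm large square
                        (each square holds 0.1 µl = 1e-4 ml at 0.1 mm chamber depth)
    dilution_factor   : e.g., 2.0 for a 1:1 dilution in trypan blue
    """
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * dilution_factor * 1e4

# Four squares counted after a 1:1 trypan blue dilution (illustrative numbers)
print(hemocytometer_concentration([52, 48, 55, 45], dilution_factor=2.0))  # 1.0e6 cells/ml
```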
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Authors: David C. Shih, Kevin C. Ho, Kyle M. Melnick, Ronald A. Rensink, Tobias R. Kollmann, Edgardo S. Fortuno III.
Institutions: University of British Columbia, University of British Columbia, University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. In contrast to the traditional information visualization approach, VA restores analysis power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and analysis between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data, and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Population Replacement Strategies for Controlling Vector Populations and the Use of Wolbachia pipientis for Genetic Drive
Authors: Jason Rasgon.
Institutions: Johns Hopkins University.
In this video, Jason Rasgon discusses population replacement strategies to control vector-borne diseases such as malaria and dengue. "Population replacement" is the replacement of wild vector populations (that are competent to transmit pathogens) with those that are not competent to transmit pathogens. There are several theoretical strategies to accomplish this. One is to exploit the maternally inherited symbiotic bacterium Wolbachia pipientis. Wolbachia is a widespread reproductive parasite that spreads in a selfish manner at the expense of its host's fitness. Jason Rasgon discusses, in detail, the basic biology of this bacterial symbiont and various ways to use it for control of vector-borne diseases.
Cellular Biology, Issue 5, mosquito, malaria, genetics, infectious disease, Wolbachia
Collecting And Measuring Wound Exudate Biochemical Mediators In Surgical Wounds
Authors: Brendan Carvalho, David J Clark, David Yeomans, Martin S Angst.
Institutions: Stanford University School of Medicine.
We describe a methodology by which we are able to collect and measure biochemical inflammatory and nociceptive mediators at the surgical wound site. Collecting site-specific biochemical markers is important for understanding the relationship between levels in serum and in the surgical wound, determining any associations between mediator release, pain, analgesic use and other outcomes of interest, and evaluating the effect of systemic and peripheral drug administration on surgical wound biochemistry. This methodology has been applied to healthy women undergoing elective cesarean delivery with spinal anesthesia. We have measured wound exudate and serum mediators at the same time intervals as patients' pain scores and analgesic consumption for up to 48 hours post-cesarean delivery. Using this methodology, we have been able to detect various biochemical mediators including nerve growth factor (NGF), prostaglandin E2 (PG-E2), substance P, IL-1β, IL-2, IL-4, IL-6, IL-7, IL-8, IL-10, IL-12, IL-13, IL-17, TNFα, IFNγ, G-CSF, GM-CSF, MCP-1 and MIP-1β. Studies applying this human surgical wound bioassay have found no correlations between wound and serum cytokine concentrations or their time-release profile (J Pain. 2008; 9(7):650-7).1 We also documented the utility of the technique to identify drug-mediated changes in wound cytokine content (Anesth Analg 2010; 111:1452-9).2
Medicine, Issue 68, Biochemistry, Anatomy, Physiology, Cytokines, Cesarean Section, Wound Healing, Wounds and Injuries, Surgical Procedures, Operative, Surgical wound, Exudate, cytokines, Substance P, Interleukin 10, Interleukin 6, Nerve growth factor, Prostaglandin E2, Cesarean, Analgesia
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. Set against these advantages is the high barrier to entry for multivariate approaches, which prevents more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction to multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
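As a bare-bones illustration of the covariance-based approach described above, the sketch below derives principal-component covariance patterns and their subject expression scores from a subjects-by-voxels data matrix with NumPy; the dimensions and centering choices are generic assumptions, not the specific pipeline applied to the ADNI data.

```python
import numpy as np

def covariance_patterns(data, n_components=3):
    """Principal-component analysis of a group neuroimaging data matrix.

    data : array (n_subjects, n_voxels) of activation/metabolism values
    Returns (patterns, scores): voxel patterns (n_components, n_voxels) and
    each subject's expression score for each pattern (n_subjects, n_components).
    """
    centered = data - data.mean(axis=0)              # remove the group-mean image
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    patterns = vt[:n_components]                     # covariance patterns over voxels
    scores = centered @ patterns.T                   # subject expression of each pattern
    return patterns, scores

# Illustrative dimensions: 40 subjects, 10,000 voxels
rng = np.random.default_rng(3)
data = rng.normal(size=(40, 10000))
patterns, scores = covariance_patterns(data)
print(patterns.shape, scores.shape)   # (3, 10000) (40, 3)
```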
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In both situations, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos that are only loosely related.
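Purely as an illustration of the general idea of matching abstracts to video descriptions (this is not JoVE's actual algorithm), the sketch below ranks a small library of method descriptions against an abstract by TF-IDF cosine similarity using scikit-learn.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_videos(abstract, video_descriptions, top_k=3):
    """Return indices of the video descriptions most similar to the abstract."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([abstract] + video_descriptions)
    sims = cosine_similarity(matrix[0:1], matrix[1:]).ravel()   # abstract vs. each video
    return sims.argsort()[::-1][:top_k]

videos = [
    "Deep brain stimulation surgery with intraoperative single-unit recordings.",
    "Adapted tango dance classes to improve balance and mobility in older adults.",
    "PCR setup, optimization and troubleshooting for DNA amplification.",
]
abstract = "Balance and gait training through partnered dance for people with Parkinson's disease."
print(rank_videos(abstract, videos))   # the tango video should rank first
```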