JoVE Visualize
 
PubMed Article
Child mortality estimation 2013: an overview of updates in estimation methods by the United Nations Inter-agency Group for Child Mortality Estimation.
PLoS ONE
PUBLISHED: 01-01-2014
In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published updated estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared with the UN IGME estimates published in 2012, the update used revised data inputs and a new method for estimating the U5MR.
ABSTRACT
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
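As a minimal, hypothetical illustration of the behavioral measure just described, the sketch below computes a switch cost as the difference in mean response time (and in error rate) between switch and repeat trials; the trial table and column names are placeholders, not data from the study.

```python
import pandas as pd

# Hypothetical trial-level data: one row per card sort, with trial type and outcome.
trials = pd.DataFrame({
    "trial_type": ["repeat", "switch", "repeat", "switch", "repeat", "switch"],
    "rt_ms":      [650,      820,      640,      790,      700,      860],
    "correct":    [1,        1,        1,        0,        1,        1],
})

means = trials.groupby("trial_type")[["rt_ms", "correct"]].mean()
rt_switch_cost = means.loc["switch", "rt_ms"] - means.loc["repeat", "rt_ms"]
error_switch_cost = (1 - means.loc["switch", "correct"]) - (1 - means.loc["repeat", "correct"])

print(f"RT switch cost: {rt_switch_cost:.0f} ms")
print(f"Error-rate switch cost: {error_switch_cost:.2f}")
```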
21 Related JoVE Articles!
Long-term Lethal Toxicity Test with the Crustacean Artemia franciscana
Authors: Loredana Manfra, Federica Savorelli, Marco Pisapia, Erika Magaletti, Anna Maria Cicero.
Institutions: Institute for Environmental Protection and Research, Regional Agency for Environmental Protection in Emilia-Romagna.
Our research activities target the use of biological methods for the evaluation of environmental quality, with particular reference to saltwater/brackish water and sediment. The choice of biological indicators must be based on reliable scientific knowledge and, ideally, on the availability of standardized procedures. In this article, we present a standardized protocol that uses the marine crustacean Artemia to evaluate the toxicity of chemicals and/or marine environmental matrices. Scientists have proposed the brine shrimp (Artemia) as a suitable candidate for the development of a standard bioassay for worldwide use, and a number of papers have been published on the toxic effects of various chemicals and toxicants on this species. The major advantage of this crustacean for toxicity studies is the ready availability of dry cysts, which can be used immediately in testing without the need for difficult cultivation1,2. Cyst-based toxicity assays are cheap, continuously available, simple and reliable, and thus meet routine toxicity-screening needs, whether for industrial monitoring or for regulatory purposes3. The proposed method uses mortality as the endpoint: the number of survivors is counted and the percentage of deaths is calculated. Larvae are considered dead if they do not exhibit any internal or external movement during several seconds of observation4. The procedure was standardized by testing a reference substance (sodium dodecyl sulfate), and some of those results are reported in this work. This article accompanies a video that demonstrates the toxicity test, showing all the steps of the protocol.
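A minimal sketch of the mortality endpoint described above: counting survivors per test concentration and converting to the percentage of deaths. The counts and SDS concentrations below are hypothetical.

```python
# Hypothetical counts: larvae exposed and survivors per SDS concentration (mg/L).
exposed = {0.0: 30, 5.0: 30, 10.0: 30, 20.0: 30}
survivors = {0.0: 29, 5.0: 25, 10.0: 16, 20.0: 4}

for conc in sorted(exposed):
    dead = exposed[conc] - survivors[conc]
    pct_mortality = 100.0 * dead / exposed[conc]
    print(f"{conc:5.1f} mg/L: {pct_mortality:5.1f}% mortality")
```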
Chemistry, Issue 62, Artemia franciscana, bioassays, chemical substances, crustaceans, marine environment
3790
Bioassays for Monitoring Insecticide Resistance
Authors: Audra L.E. Miller, Kelly Tindall, B. Rogers Leonard.
Institutions: University of Missouri, Delta Research Center, Louisiana State University Agricultural Center.
Pest resistance to pesticides is a growing problem because pesticides are an integral part of high-yielding production agriculture. When few products are labeled for an individual pest within a particular crop system, chemical control options are limited; the same product(s) are therefore used repeatedly, placing continual selection pressure on the target pest. There are both financial and environmental costs associated with the development of resistant populations. The cost of pesticide resistance has been estimated at approximately $1.5 billion annually in the United States. This paper describes protocols currently used to monitor arthropod (specifically insect) populations for the development of resistance. The adult vial test is used to measure the toxicity of contact insecticides, and a modification of this test is used for plant-systemic insecticides. In these bioassays, insects are exposed to technical-grade insecticide and responses (mortality) are recorded at a specific post-exposure interval. The mortality data are subjected to log-dose probit analysis to estimate the concentration that is lethal to 50% of the target population (LC50), together with confidence limits (CLs) as estimates of data variability. When these data are collected for a range of insecticide-susceptible populations, the LC50 values serve as a baseline for future monitoring. After populations have been exposed to products, the results can be compared to the previously determined LC50 obtained with the same methodology.
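The log-dose probit analysis mentioned above can be reproduced with standard statistical tooling; the sketch below fits a probit model to hypothetical dose-mortality counts and back-calculates the LC50 (confidence limits would normally be added as well, e.g., by Fieller's method). It is an illustration, not the authors' analysis script.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical bioassay data: technical-grade insecticide doses and mortality counts.
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0])      # concentration units are arbitrary here
n_tested = np.array([30, 30, 30, 30, 30])
n_dead = np.array([2, 7, 15, 24, 29])

log_dose = sm.add_constant(np.log10(dose))
response = np.column_stack([n_dead, n_tested - n_dead])

# Probit regression of mortality on log10(dose).
fit = sm.GLM(response, log_dose,
             family=sm.families.Binomial(link=sm.families.links.Probit())).fit()
intercept, slope = fit.params

# LC50 is the dose at which the probit of mortality is zero (50% response).
lc50 = 10 ** (-intercept / slope)
print(fit.summary())
print(f"Estimated LC50: {lc50:.2f}")
```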
Microbiology, Issue 46, Resistance monitoring, Insecticide Resistance, Pesticide Resistance, glass-vial bioassay
2129
A Low Mortality Rat Model to Assess Delayed Cerebral Vasospasm After Experimental Subarachnoid Hemorrhage
Authors: Rahul V. Dudhani, Michele Kyle, Christina Dedeo, Margaret Riordan, Eric M. Deshaies.
Institutions: SUNY Upstate Medical University.
Objective: To characterize and establish a reproducible model of delayed cerebral vasospasm after aneurysmal subarachnoid hemorrhage (SAH) in rats, in order to identify the initiating events, pathophysiological changes, and potential targets for treatment. Methods: Twenty-eight male Sprague-Dawley rats (250 - 300 g) were arbitrarily assigned to one of two groups: SAH or saline control. Subarachnoid hemorrhage in the SAH group (n=15) was induced by double injection of autologous blood, 48 hr apart, into the cisterna magna. Similarly, normal saline (n=13) was injected into the cisterna magna of the saline control group. Rats were sacrificed on day five after the second injection and the brains were preserved for histological analysis. The degree of vasospasm was quantified on sections of the basilar artery by measuring the internal luminal cross-sectional area with NIH ImageJ software. Significance was tested using the Tukey-Kramer test. Results: On histological analysis, basilar artery luminal cross-sectional areas were smaller in the SAH group than in the saline group, consistent with cerebral vasospasm in the former. In the SAH group, the basilar artery internal area (0.056 μm² ± 3) was significantly smaller, owing to vasospasm, five days after the second blood injection (seven days after the initial blood injection) than in the saline control group (0.069 μm² ± 3; p=0.004). There were no mortalities from cerebral vasospasm. Conclusion: The rat double-SAH model induces a mild, survivable basilar artery vasospasm that can be used to study the pathophysiological mechanisms of cerebral vasospasm in a small animal model. A low and acceptable mortality rate is an important criterion for an ideal SAH animal model, so that the mechanisms of vasospasm can be elucidated7,8. The model can be further modified to increase the severity of vasospasm and to incorporate neurological examinations.
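For the group comparison of luminal cross-sectional areas, a Tukey-Kramer style test can be run as in the minimal sketch below; the area values are hypothetical placeholders, not the study data.

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical basilar artery luminal cross-sectional areas (arbitrary units).
saline = np.array([0.070, 0.068, 0.071, 0.066, 0.069])
sah    = np.array([0.057, 0.055, 0.058, 0.054, 0.056])

areas  = np.concatenate([saline, sah])
groups = ["saline"] * len(saline) + ["SAH"] * len(sah)

# Tukey-Kramer (Tukey HSD allowing unequal group sizes) pairwise comparison.
print(pairwise_tukeyhsd(endog=areas, groups=groups, alpha=0.05).summary())
```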
Medicine, Issue 71, Anatomy, Physiology, Neurobiology, Neuroscience, Immunology, Surgery, Aneurysm, cerebral, hemorrhage, model, mortality, rat, rodent, subarachnoid, vasospasm, animal model
4157
Capsular Serotyping of Streptococcus pneumoniae by Latex Agglutination
Authors: Barbara D. Porter, Belinda D. Ortika, Catherine Satzke.
Institutions: Murdoch Childrens Research Institute, The University of Melbourne.
Latex agglutination reagents are widely used in microbial diagnosis, identification and serotyping. Streptococcus pneumoniae (the pneumococcus) is a major cause of morbidity and mortality world-wide. Current vaccines target the pneumococcal capsule, and there are over 90 capsular serotypes. Serotyping pneumococcal isolates is therefore important for assessing the impact of vaccination programs and for epidemiological purposes. The World Health Organization has recommended latex agglutination as an alternative method to the ‘gold standard’ Quellung test for serotyping pneumococci. Latex agglutination is a relatively simple, quick and inexpensive method; and is therefore suitable for resource-poor settings as well as laboratories with high-volume workloads. Latex agglutination reagents can be prepared in-house utilizing commercially-sourced antibodies that are passively attached to latex particles. This manuscript describes a method of production and quality control of latex agglutination reagents, and details a sequential testing approach which is time- and cost-effective. This method of production and quality control may also be suitable for other testing purposes.
Immunology, Issue 91, Antisera, pneumococci, polysaccharide capsule, slide agglutination
51747
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Authors: Feng Qi, Fei Du.
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has a direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapping activity spaces. Therefore, daily mobility-activity information can be used as an indicator of exposure to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, is the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and their cooperation greatly affects the quality of the data4. Modern technologies such as GPS and mobile communications have made automatic collection of trajectory data possible. The data collected, however, are not ideal for modeling human space-time activities, being limited by the accuracy of existing devices, and there is no readily available tool for efficiently processing the data for human behavior studies. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analysis of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, in ways that could be useful in public health studies such as infectious disease transmission modeling. The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data; we introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5 involves the identification of indoor and outdoor parts from pre-processed space-time tracks; again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius. Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions; we demonstrate both density surface mapping6 and density volume rendering7. We also include a few other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical and visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
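As an illustration of the activity-space characterization step, the sketch below derives a simple activity radius (root-mean-square haversine distance from the centroid of a GPS track). The coordinates are hypothetical, and the real workflow described in the article runs inside ArcGIS rather than in a standalone script.

```python
import numpy as np

EARTH_RADIUS_M = 6371000.0

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between points given in decimal degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi = np.radians(lat2 - lat1)
    dlmb = np.radians(lon2 - lon1)
    a = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * np.arcsin(np.sqrt(a))

# Hypothetical pedestrian GPS fixes (latitude, longitude in decimal degrees).
track = np.array([
    [40.7359, -74.1724],
    [40.7362, -74.1720],
    [40.7366, -74.1715],
    [40.7371, -74.1711],
])

centroid = track.mean(axis=0)
dists = haversine(track[:, 0], track[:, 1], centroid[0], centroid[1])

print(f"activity radius (RMS distance from centroid): {np.sqrt(np.mean(dists**2)):.1f} m")
print(f"maximum excursion from centroid: {dists.max():.1f} m")
```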
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
50130
Patient-specific Modeling of the Heart: Estimation of Ventricular Fiber Orientations
Authors: Fijoy Vadakkumpadan, Hermenegild Arevalo, Natalia A. Trayanova.
Institutions: Johns Hopkins University.
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of the patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4°. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
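A sketch of the error metric quoted above, under the simplifying assumption that fiber inclination is taken as the elevation of the fiber unit vector out of the x-y plane; the article defines inclination angles with respect to the local cardiac coordinate system, and the fiber vectors below are synthetic placeholders.

```python
import numpy as np

def inclination_deg(vectors):
    """Elevation angle (degrees) of unit fiber vectors out of the x-y plane.
    This is one simple convention, assumed here for illustration only."""
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    return np.degrees(np.arcsin(np.clip(v[:, 2], -1.0, 1.0)))

# Synthetic acquired (DTMR) and estimated fiber orientations at matched voxels.
rng = np.random.default_rng(1)
acquired = rng.normal(size=(1000, 3))
estimated = acquired + 0.3 * rng.normal(size=(1000, 3))   # perturbed copy

diff = np.abs(inclination_deg(acquired) - inclination_deg(estimated))
print(f"mean absolute inclination-angle difference: {diff.mean():.1f} degrees")
```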
Bioengineering, Issue 71, Biomedical Engineering, Medicine, Anatomy, Physiology, Cardiology, Myocytes, Cardiac, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, MRI, Diffusion Magnetic Resonance Imaging, Cardiac Electrophysiology, computerized simulation (general), mathematical modeling (systems analysis), Cardiomyocyte, biomedical image processing, patient-specific modeling, Electrophysiology, simulation
50125
An Inverse Analysis Approach to the Characterization of Chemical Transport in Paints
Authors: Matthew P. Willis, Shawn M. Stevenson, Thomas P. Pearl, Brent A. Mantooth.
Institutions: U.S. Army Edgewood Chemical Biological Center, OptiMetrics, Inc., a DCS Company.
The ability to directly characterize chemical transport and interactions that occur within a material (i.e., subsurface dynamics) is a vital component in understanding contaminant mass transport and the ability to decontaminate materials. If a material is contaminated, over time, the transport of highly toxic chemicals (such as chemical warfare agent species) out of the material can result in vapor exposure or transfer to the skin, which can result in percutaneous exposure to personnel who interact with the material. Due to the high toxicity of chemical warfare agents, the release of trace chemical quantities is of significant concern. Mapping subsurface concentration distribution and transport characteristics of absorbed agents enables exposure hazards to be assessed in untested conditions. Furthermore, these tools can be used to characterize subsurface reaction dynamics to ultimately design improved decontaminants or decontamination procedures. To achieve this goal, an inverse analysis mass transport modeling approach was developed that utilizes time-resolved mass spectrometry measurements of vapor emission from contaminated paint coatings as the input parameter for calculation of subsurface concentration profiles. Details are provided on sample preparation, including contaminant and material handling, the application of mass spectrometry for the measurement of emitted contaminant vapor, and the implementation of inverse analysis using a physics-based diffusion model to determine transport properties of live chemical warfare agents including distilled mustard (HD) and the nerve agent VX.
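The inverse-analysis idea can be prototyped with a generic one-dimensional Fickian diffusion model: a forward simulation predicts the surface vapor-emission flux for a trial diffusivity, and an optimizer adjusts the diffusivity until the prediction matches the measured emission series. Everything below (geometry, boundary conditions, and the synthetic "measured" data) is an illustrative assumption, not the authors' physics-based model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

L_COAT = 100e-6                        # assumed coating thickness (m)
N = 50                                 # spatial nodes through the coating
dx = L_COAT / (N - 1)
t_meas = np.arange(0.0, 3600.0, 60.0)  # emission sampled each minute for 1 h

def surface_flux(D, c0=1.0):
    """Explicit finite-difference solution of 1D diffusion with an impermeable
    substrate at x=0 and a perfect sink (vapor phase) at x=L; returns the
    predicted emission flux at the measurement times."""
    c = np.full(N, c0)
    c[-1] = 0.0
    dt = 0.4 * dx * dx / D             # keeps the explicit scheme stable
    r = D * dt / dx ** 2
    flux = np.empty_like(t_meas)
    k, t = 0, 0.0
    while k < len(t_meas):
        if t >= t_meas[k]:
            flux[k] = D * (c[-2] - c[-1]) / dx   # Fick's first law at the surface
            k += 1
        c_new = c.copy()
        c_new[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c_new[0] = c_new[1]            # zero-flux boundary at the substrate
        c_new[-1] = 0.0                # sink boundary at the free surface
        c, t = c_new, t + dt
    return flux

# Synthetic "measured" emission stands in for the mass-spectrometry time series.
D_true = 1e-12
rng = np.random.default_rng(0)
measured = surface_flux(D_true) * (1 + 0.05 * rng.standard_normal(t_meas.size))

def misfit(log10_D):
    return np.sum((surface_flux(10.0 ** log10_D) - measured) ** 2)

# Inverse analysis: find the diffusivity that best reproduces the emission data.
res = minimize_scalar(misfit, bounds=(-13.5, -10.5), method="bounded")
print(f"estimated diffusivity: {10.0 ** res.x:.2e} m^2/s (true {D_true:.2e})")
```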
Chemistry, Issue 90, Vacuum, vapor emission, chemical warfare agent, contamination, mass transport, inverse analysis, volatile organic compound, paint, coating
51825
Measuring Attentional Biases for Threat in Children and Adults
Authors: Vanessa LoBue.
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
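A minimal sketch of the dependent measure: latency to touch the target, averaged by stimulus category, with the attentional bias expressed as the neutral-minus-threat difference. The trial data below are hypothetical.

```python
import pandas as pd

# Hypothetical touch latencies (ms) from the visual search task.
trials = pd.DataFrame({
    "target":     ["snake", "flower", "spider", "mushroom", "snake", "flower"],
    "category":   ["threat", "neutral", "threat", "neutral", "threat", "neutral"],
    "latency_ms": [1150, 1420, 1200, 1510, 1180, 1390],
})

mean_latency = trials.groupby("category")["latency_ms"].mean()
bias = mean_latency["neutral"] - mean_latency["threat"]
print(mean_latency)
print(f"threat-detection advantage: {bias:.0f} ms")
```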
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
52190
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
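Because each assay is validated by its linear relationship with the number of cells plated, a quick linearity check can be scripted as below; the signal values are hypothetical.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical standard curve: cells plated per well vs. assay signal (e.g., DRAQ5/Sapphire).
cells_plated = np.array([0, 5000, 10000, 20000, 40000, 80000])
signal = np.array([120, 980, 1850, 3700, 7300, 14600])

fit = linregress(cells_plated, signal)
print(f"slope = {fit.slope:.3f} signal units per cell")
print(f"r^2 = {fit.rvalue**2:.4f}")   # near 1 indicates the assay is linear over this range
```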
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
50645
Eye Tracking Young Children with Autism
Authors: Noah J. Sasson, Jed T. Elison.
Institutions: University of Texas at Dallas, University of North Carolina at Chapel Hill.
The rise of accessible commercial eye-tracking systems has fueled a rapid increase in their use in psychological and psychiatric research. By providing a direct, detailed and objective measure of gaze behavior, eye-tracking has become a valuable tool for examining abnormal perceptual strategies in clinical populations and has been used to identify disorder-specific characteristics1, promote early identification2, and inform treatment3. In particular, investigators of autism spectrum disorders (ASD) have benefited from integrating eye-tracking into their research paradigms4-7. Eye-tracking has largely been used in these studies to reveal mechanisms underlying impaired task performance8 and abnormal brain functioning9, particularly during the processing of social information1,10-11. While older children and adults with ASD comprise the preponderance of research in this area, eye-tracking may be especially useful for studying young children with the disorder as it offers a non-invasive tool for assessing and quantifying early-emerging developmental abnormalities2,12-13. Implementing eye-tracking with young children with ASD, however, is associated with a number of unique challenges, including issues with compliant behavior resulting from specific task demands and disorder-related psychosocial considerations. In this protocol, we detail methodological considerations for optimizing research design, data acquisition and psychometric analysis while eye-tracking young children with ASD. The provided recommendations are also designed to be more broadly applicable for eye-tracking children with other developmental disabilities. By offering guidelines for best practices in these areas based upon lessons derived from our own work, we hope to help other investigators make sound research design and analysis choices while avoiding common pitfalls that can compromise data acquisition while eye-tracking young children with ASD or other developmental difficulties.
Medicine, Issue 61, eye tracking, autism, neurodevelopmental disorders, toddlers, perception, attention, social cognition
3675
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
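Under the standard compound-pendulum relations, the inertial parameters can be computed from the reaction-board center-of-mass location and the measured oscillation period, as in the sketch below; the numbers are hypothetical, not values from the study.

```python
import math

# Hypothetical measurements for a below-knee prosthesis.
m = 1.35   # prosthesis mass (kg), from a scale
d = 0.21   # distance from the oscillation axis to the center of mass (m), from the reaction board
T = 1.06   # mean period of small-amplitude oscillation (s)
g = 9.81   # gravitational acceleration (m/s^2)

# Compound pendulum: T = 2*pi*sqrt(I_axis / (m*g*d))  ->  I_axis = m*g*d*T^2 / (4*pi^2)
I_axis = m * g * d * T ** 2 / (4 * math.pi ** 2)

# Parallel-axis theorem gives the moment of inertia about the center of mass.
I_cm = I_axis - m * d ** 2

print(f"I about oscillation axis: {I_axis:.4f} kg m^2")
print(f"I about center of mass:   {I_cm:.4f} kg m^2")
```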
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
50977
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
51242
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory measurement of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time, and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads and can be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated with this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations: assumptions must be made with respect to extrapolating results from sediment cores to the entire lake, deciding over what time periods to measure nutrient release, and addressing possible core-tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
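The release rate from an incubated core is typically computed from the change in phosphorus mass in the overlying water, normalized to the sediment surface area. The sketch below regresses hypothetical soluble reactive P concentrations against time to obtain a daily areal flux; all numbers are placeholders.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical core-incubation data.
days = np.array([0, 2, 4, 6, 8, 10])
srp_mg_per_L = np.array([0.010, 0.018, 0.027, 0.034, 0.044, 0.051])  # P in overlying water

water_volume_L = 1.1      # volume of overlying water in the core tube (assumed)
core_area_m2 = 0.0045     # sediment surface area of the core tube (assumed)

slope = linregress(days, srp_mg_per_L).slope          # mg P / L / day
flux_mg_m2_day = slope * water_volume_L / core_area_m2

print(f"internal P release rate: {flux_mg_m2_day:.1f} mg P m^-2 day^-1")
```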
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
51617
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way from a conventional image, by building it up molecule by molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, in which the sample moves during image acquisition, and density-related problems that give rise to the 'mislocalization' phenomenon.
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
50579
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
51047
Estimating Virus Production Rates in Aquatic Systems
Authors: Audrey R. Matteson, Charles R. Budinoff, Claire E. Campbell, Alison Buchan, Steven W. Wilhelm.
Institutions: University of Tennessee.
Viruses are pervasive components of marine and freshwater systems, and are known to be significant agents of microbial mortality. Developing quantitative estimates of this process is critical as we can then develop better models of microbial community structure and function as well as advance our understanding of how viruses work to alter aquatic biogeochemical cycles. The virus reduction technique allows researchers to estimate the rate at which virus particles are released from the endemic microbial community. In brief, the abundance of free (extracellular) viruses is reduced in a sample while the microbial community is maintained at near ambient concentration. The microbial community is then incubated in the absence of free viruses and the rate at which viruses reoccur in the sample (through the lysis of already infected members of the community) can be quantified by epifluorescence microscopy or, in the case of specific viruses, quantitative PCR. These rates can then be used to estimate the rate of microbial mortality due to virus-mediated cell lysis.
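In the virus reduction approach, the production rate is usually taken as the slope of viral abundance versus time during the incubation, corrected for the fraction of the ambient microbial community retained; dividing by an assumed burst size then yields a lysis (mortality) rate. The sketch below illustrates this arithmetic with hypothetical counts and an assumed burst size, not values from the article.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical epifluorescence counts of viruses (per ml) during the incubation.
hours = np.array([0, 3, 6, 9, 12])
viruses_per_ml = np.array([1.0e6, 1.6e6, 2.3e6, 2.9e6, 3.6e6])

production = linregress(hours, viruses_per_ml).slope        # viruses ml^-1 h^-1

# Correct for bacteria lost during the virus-reduction step (assumed recovery).
bacterial_recovery = 0.8
production_corrected = production / bacterial_recovery

# An assumed burst size converts viral production into lysed cells.
burst_size = 25
cell_lysis_rate = production_corrected / burst_size         # cells ml^-1 h^-1

print(f"virus production: {production_corrected:.2e} viruses ml^-1 h^-1")
print(f"virus-mediated mortality: {cell_lysis_rate:.2e} cells lysed ml^-1 h^-1")
```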
Infectious Diseases, Issue 43, Viruses, seawater, lakes, viral lysis, marine microbiology, freshwater microbiology, epifluorescence microscopy
2196
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
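The cortical source reconstruction relies on a linear inverse operator; a stripped-down minimum-norm estimate (identity source covariance, Tikhonov regularization) looks like the sketch below, with a random lead field standing in for the one derived from the individual or age-matched head model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_sources = 128, 5000              # high-density EEG, cortical dipole grid
L = rng.normal(size=(n_channels, n_sources))   # placeholder lead-field (gain) matrix
y = rng.normal(size=n_channels)                # one time sample of channel data

# Minimum-norm inverse operator: W = L^T (L L^T + lambda^2 I)^-1
G = L @ L.T
lam2 = np.trace(G) / n_channels / 9.0          # simple heuristic: regularize for SNR ~ 3
W = L.T @ np.linalg.inv(G + lam2 * np.eye(n_channels))

source_estimate = W @ y                        # estimated dipole amplitudes
print(source_estimate.shape)                   # (5000,)
```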
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
51705
A Novel Application of Musculoskeletal Ultrasound Imaging
Authors: Avinash Eranki, Nelson Cortes, Zrinka Gregurić Ferenček, Siddhartha Sikdar.
Institutions: George Mason University.
Ultrasound is an attractive modality for imaging muscle and tendon motion during dynamic tasks and can provide a complementary methodological approach for biomechanical studies in a clinical or laboratory setting. Towards this goal, methods for quantification of muscle kinematics from ultrasound imagery are being developed based on image processing. The temporal resolution of these methods is typically not sufficient for highly dynamic tasks, such as drop-landing. We propose a new approach that utilizes a Doppler method for quantifying muscle kinematics. We have developed a novel vector tissue Doppler imaging (vTDI) technique that can be used to measure musculoskeletal contraction velocity, strain and strain rate with sub-millisecond temporal resolution during dynamic activities using ultrasound. The goal of this preliminary study was to investigate the repeatability and potential applicability of the vTDI technique in measuring musculoskeletal velocities during a drop-landing task, in healthy subjects. The vTDI measurements can be performed concurrently with other biomechanical techniques, such as 3D motion capture for joint kinematics and kinetics, electromyography for timing of muscle activation and force plates for ground reaction force. Integration of these complementary techniques could lead to a better understanding of dynamic muscle function and dysfunction underlying the pathogenesis and pathophysiology of musculoskeletal disorders.
Medicine, Issue 79, Anatomy, Physiology, Joint Diseases, Diagnostic Imaging, Muscle Contraction, ultrasonic applications, Doppler effect (acoustics), Musculoskeletal System, biomechanics, musculoskeletal kinematics, dynamic function, ultrasound imaging, vector Doppler, strain, strain rate
50595
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
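Tank-level γ can be expressed as the net amount of a congener retained by the lake trout divided by the amount ingested in the bloater ration; the sketch below shows that bookkeeping with hypothetical numbers for one congener in one tank.

```python
# Hypothetical tank-level data for a single PCB congener.
n_trout_end = 10
mean_trout_mass_end_g = 720.0
conc_trout_end_ng_g = 85.0          # congener concentration in trout at the end (ng/g wet wt)

n_trout_start = 10
mean_trout_mass_start_g = 610.0
conc_trout_start_ng_g = 60.0        # congener concentration in trout at the start

food_eaten_g = 5200.0               # total bloater fed to the tank over the experiment
conc_bloater_ng_g = 70.0            # congener concentration in the bloater ration

burden_end_ng = n_trout_end * mean_trout_mass_end_g * conc_trout_end_ng_g
burden_start_ng = n_trout_start * mean_trout_mass_start_g * conc_trout_start_ng_g
ingested_ng = food_eaten_g * conc_bloater_ng_g

gamma = (burden_end_ng - burden_start_ng) / ingested_ng
print(f"net trophic transfer efficiency (gamma): {gamma:.2f}")
```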
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
51496
Quantifying Agonist Activity at G Protein-coupled Receptors
Authors: Frederick J. Ehlert, Hinako Suga, Michael T. Griffin.
Institutions: University of California, Irvine, University of California, Chapman University.
When an agonist activates a population of G protein-coupled receptors (GPCRs), it elicits a signaling pathway that culminates in the response of the cell or tissue. This process can be analyzed at the level of a single receptor, a population of receptors, or a downstream response. Here we describe how to analyze the downstream response to obtain an estimate of the agonist affinity constant for the active state of single receptors. Receptors behave as quantal switches that alternate between active and inactive states (Figure 1). The active state interacts with specific G proteins or other signaling partners. In the absence of ligands, the inactive state predominates. The binding of agonist increases the probability that the receptor will switch into the active state because its affinity constant for the active state (Kb) is much greater than that for the inactive state (Ka). The summation of the random outputs of all of the receptors in the population yields a constant level of receptor activation in time. The reciprocal of the concentration of agonist eliciting half-maximal receptor activation is equivalent to the observed affinity constant (Kobs), and the fraction of agonist-receptor complexes in the active state is defined as efficacy (ε) (Figure 2). Methods for analyzing the downstream responses of GPCRs have been developed that enable the estimation of the Kobs and relative efficacy of an agonist1,2. In this report, we show how to modify this analysis to estimate the agonist Kb value relative to that of another agonist. For assays that exhibit constitutive activity, we show how to estimate Kb in absolute units of M⁻¹. Our method of analyzing agonist concentration-response curves3,4 consists of global nonlinear regression using the operational model5. We describe a procedure using the software application Prism (GraphPad Software, Inc., San Diego, CA). The analysis yields an estimate of the product of Kobs and a parameter proportional to efficacy (τ). The estimate of τKobs of one agonist, divided by that of another, is a relative measure of Kb (RAi)6. For any receptor exhibiting constitutive activity, it is possible to estimate a parameter proportional to the efficacy of the free receptor complex (τsys). In this case, the Kb value of an agonist is equivalent to τKobs/τsys3. Our method is useful for determining the selectivity of an agonist for receptor subtypes and for quantifying agonist-receptor signaling through different G proteins.
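A minimal sketch of the curve-fitting step, using the one-site operational model with the transducer slope fixed at 1 rather than the full Prism template described in the article; the concentration-response data are synthetic, and the RAi is formed from the fitted τ/KA (i.e., τ·Kobs) ratios of the two agonists.

```python
import numpy as np
from scipy.optimize import curve_fit

def operational(A, Em, logKA, logtau):
    """One-site operational model (transducer slope = 1)."""
    KA, tau = 10.0 ** logKA, 10.0 ** logtau
    return Em * tau * A / (KA + (1.0 + tau) * A)

A = np.logspace(-9, -4, 12)                      # agonist concentration (M)

rng = np.random.default_rng(2)
# Synthetic responses for a reference and a test agonist (arbitrary response units).
ref = operational(A, 100.0, -6.0, 1.0) + rng.normal(0, 2, A.size)
test = operational(A, 100.0, -5.5, 0.3) + rng.normal(0, 2, A.size)

p_ref, _ = curve_fit(operational, A, ref, p0=[100.0, -6.0, 0.0])
p_test, _ = curve_fit(operational, A, test, p0=[100.0, -6.0, 0.0])

def tau_times_kobs(params):
    _, logKA, logtau = params
    return 10.0 ** logtau / 10.0 ** logKA        # tau/KA, i.e. tau * Kobs

RAi = tau_times_kobs(p_test) / tau_times_kobs(p_ref)
print(f"intrinsic relative activity (RAi) of test vs. reference agonist: {RAi:.3f}")
```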
Molecular Biology, Issue 58, agonist activity, active state, ligand bias, constitutive activity, G protein-coupled receptor
3179
Functional Mapping with Simultaneous MEG and EEG
Authors: Hesheng Liu, Naoaki Tanaka, Steven Stufflebeam, Seppo Ahlfors, Matti Hämäläinen.
Institutions: MGH - Massachusetts General Hospital.
We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate brain areas involved in the processing of simple sensory stimuli and to determine the temporal evolution of their activity. We use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, and visual stimuli in the four quadrants of the visual field to locate the early visual areas. These types of experiments are used for functional mapping in epileptic and brain tumor patients to locate eloquent cortices. In basic neuroscience, similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization, for deriving tissue boundaries for forward modeling, and for providing cortical location and orientation constraints for the minimum-norm estimates.
JoVE neuroscience, Issue 40, neuroscience, brain, MEG, EEG, functional imaging
1668

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there happens to be no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.