Pubmed Article
Ménage à quoi? Optimal number of peer reviewers.
PUBLISHED: 04-02-2015
Peer review represents the primary mechanism used by funding agencies to allocate financial support and by journals to select manuscripts for publication, yet recent Cochrane reviews determined that the literature on peer review best practice is sparse. Key to improving the process are reducing its inherent vulnerability to a high degree of randomness and, from an economic perspective, limiting both the substantial indirect costs related to reviewer time invested and the direct administrative costs to funding agencies, publishers and research institutions. Use of additional reviewers per application may increase reliability and decision consistency, but adds to overall cost and burden. The optimal number of reviewers per application, while not known, is thought to vary with the accuracy of judges or evaluation methods. Here I use bootstrapping of replicated peer review data from a Post-doctoral Fellowships competition to show that five reviewers per application represents a practical optimum, avoiding the large random effects evident when fewer reviewers are used; beyond this point, additional reviewers at increasing cost provide only diminishing incremental gains in chance-corrected consistency of decision outcomes. Random effects were most evident in the relative mid-range of competitiveness. Results support aggressive high- and low-end stratification or triaging of applications for subsequent stages of review, with the proportion and set of mid-range submissions to be retained for further consideration being dependent on overall success rate.
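To make the bootstrapping idea concrete, the minimal sketch below (not the author's actual analysis) simulates a reviewer-by-application score matrix, repeatedly resamples panels of a given size, and reports how often two independently resampled panels reach the same funding decision; the score distribution, funding quota, and use of raw (rather than chance-corrected) agreement are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): bootstrap reviewer panels of
# increasing size and measure how consistent the funding decisions are.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(loc=3.0, scale=0.7, size=(12, 200))   # 12 reviewers x 200 applications (simulated)
n_funded = 40                                              # hypothetical success quota

def decisions(panel_rows):
    """Top-n_funded applications by mean score of the sampled panel."""
    mean_scores = scores[panel_rows].mean(axis=0)
    funded = np.zeros(scores.shape[1], dtype=bool)
    funded[np.argsort(mean_scores)[-n_funded:]] = True
    return funded

for k in range(1, 9):                                      # panels of 1..8 reviewers
    agreement = []
    for _ in range(500):                                    # bootstrap replicates
        a = decisions(rng.integers(0, scores.shape[0], size=k))
        b = decisions(rng.integers(0, scores.shape[0], size=k))
        agreement.append(np.mean(a == b))                  # raw decision consistency
    print(f"{k} reviewers: mean consistency {np.mean(agreement):.3f}")
```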
Related JoVE Video
Authors: Alan Daugherty, Debra Rateri, Lu Hong, Anju Balakrishnan.
Published: 05-15-2009
The CODA 8-Channel High Throughput Non-Invasive Blood Pressure system measures blood pressure in up to 8 mice or rats simultaneously. The CODA tail-cuff system uses Volume Pressure Recording (VPR) to measure blood pressure by determining the tail blood volume. A specially designed differential pressure transducer and an occlusion tail-cuff measure the total blood volume in the tail without the need to obtain the individual pulse signal. Special attention is afforded to the length of the occlusion cuff in order to derive the most accurate blood pressure readings. VPR can easily obtain readings on dark-skinned rodents, such as C57BL6 mice, and is MRI compatible. The CODA system provides measurements of six blood pressure parameters: systolic and diastolic blood pressure, heart rate, mean blood pressure, tail blood flow, and tail blood volume. Measurements can be made on either awake or anesthetized mice or rats. The CODA system includes a controller, laptop computer, software, cuffs, animal holders, infrared warming pads, and an infrared thermometer. There are seven different holder sizes for mice as small as 8 grams to rats as large as 900 grams.
27 Related JoVE Articles!
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Authors: Lori E. Lowes, Benjamin D. Hedley, Michael Keeney, Alison L. Allan.
Institutions: London Health Sciences Centre, Western University, London Health Sciences Centre, Lawson Health Research Institute, Western University.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis
Authors: Benjamin N. Doblack, Tim Allis, Lilian P. Dávila.
Institutions: University of California Merced.
Advances in computing hardware and software over recent decades have impacted scientific research in many fields, including materials science, biology, chemistry and physics. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
Physics, Issue 94, Computational systems, visualization and immersive environments, interactive learning, graphical processing unit accelerated simulations, molecular dynamics simulations, nanostructures.
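The production simulations described above run in LAMMPS with GPU acceleration; purely as an illustration of the core loop any MD engine iterates, the sketch below advances a tiny Lennard-Jones cluster with velocity-Verlet integration in plain NumPy (reduced units, invented starting coordinates; not the LAMMPS or CUDA code).

```python
# Minimal velocity-Verlet MD step for a Lennard-Jones cluster (illustration only;
# the system described above uses LAMMPS with GPU acceleration).
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a small set of particles."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            sr6 = (sigma ** 2 / d2) ** 3
            fij = 24 * eps * (2 * sr6 ** 2 - sr6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return f

def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel

# Eight particles on a small cubic lattice (spacing 1.2 in reduced units).
positions = 1.2 * np.array([[i, j, k] for i in range(2)
                            for j in range(2) for k in range(2)], dtype=float)
velocities = np.zeros_like(positions)
positions, velocities = velocity_verlet(positions, velocities)
print(positions[:3])
```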
The Utilization of Oropharyngeal Intratracheal PAMP Administration and Bronchoalveolar Lavage to Evaluate the Host Immune Response in Mice
Authors: Irving C. Allen.
Institutions: Virginia Polytechnic Institute and State University.
The host immune response to pathogens is a complex biological process. The majority of in vivo studies classically employed to characterize host-pathogen interactions take advantage of intraperitoneal injections of select bacteria or pathogen associated molecular patterns (PAMPs) in mice. While these techniques have yielded tremendous data associated with infectious disease pathobiology, intraperitoneal injection models are not always appropriate for host-pathogen interaction studies in the lung. Utilizing an acute lung inflammation model in mice, it is possible to conduct a high resolution analysis of the host innate immune response utilizing lipopolysaccharide (LPS). Here, we describe the methods to administer LPS using nonsurgical oropharyngeal intratracheal administration, monitor clinical parameters associated with disease pathogenesis, and utilize bronchoalveolar lavage fluid to evaluate the host immune response. The techniques that are described are widely applicable for studying the host innate immune response to a diverse range of PAMPs and pathogens. Likewise, with minor modifications, these techniques can also be applied in studies evaluating allergic airway inflammation and in pharmacological applications.
Infection, Issue 86, LPS, Lipopolysaccharide, mouse, pneumonia, gram negative bacteria, inflammation, acute lung inflammation, innate immunity, host pathogen interaction, lung, respiratory disease
Inducing Plasticity of Astrocytic Receptors by Manipulation of Neuronal Firing Rates
Authors: Alison X. Xie, Kelli Lauderdale, Thomas Murphy, Timothy L. Myers, Todd A. Fiacco.
Institutions: University of California Riverside, University of California Riverside, University of California Riverside.
Close to two decades of research has established that astrocytes in situ and in vivo express numerous G protein-coupled receptors (GPCRs) that can be stimulated by neuronally-released transmitter. However, the ability of astrocytic receptors to exhibit plasticity in response to changes in neuronal activity has received little attention. Here we describe a model system that can be used to globally scale up or down astrocytic group I metabotropic glutamate receptors (mGluRs) in acute brain slices. Included are methods on how to prepare parasagittal hippocampal slices, construct chambers suitable for long-term slice incubation, bidirectionally manipulate neuronal action potential frequency, load astrocytes and astrocyte processes with fluorescent Ca2+ indicator, and measure changes in astrocytic Gq GPCR activity by recording spontaneous and evoked astrocyte Ca2+ events using confocal microscopy. In essence, a “calcium roadmap” is provided for how to measure plasticity of astrocytic Gq GPCRs. Applications of the technique for study of astrocytes are discussed. Having an understanding of how astrocytic receptor signaling is affected by changes in neuronal activity has important implications for both normal synaptic function as well as processes underlying neurological disorders and neurodegenerative disease.
Neuroscience, Issue 85, astrocyte, plasticity, mGluRs, neuronal Firing, electrophysiology, Gq GPCRs, Bolus-loading, calcium, microdomains, acute slices, Hippocampus, mouse
Combined DNA-RNA Fluorescent In situ Hybridization (FISH) to Study X Chromosome Inactivation in Differentiated Female Mouse Embryonic Stem Cells
Authors: Tahsin Stefan Barakat, Joost Gribnau.
Institutions: Erasmus MC - University Medical Center.
Fluorescent in situ hybridization (FISH) is a molecular technique which enables the detection of nucleic acids in cells. DNA FISH is often used in cytogenetics and cancer diagnostics, and can detect aberrations of the genome, which often has important clinical implications. RNA FISH can be used to detect RNA molecules in cells and has provided important insights in regulation of gene expression. Combining DNA and RNA FISH within the same cell is technically challenging, as conditions suitable for DNA FISH might be too harsh for fragile, single stranded RNA molecules. We here present an easily applicable protocol which enables the combined, simultaneous detection of Xist RNA and DNA encoded by the X chromosomes. This combined DNA-RNA FISH protocol can likely be applied to other systems where both RNA and DNA need to be detected.
Biochemistry, Issue 88, Fluorescent in situ hybridization (FISH), combined DNA-RNA FISH, ES cell, cytogenetics, single cell analysis, X chromosome inactivation (XCI), Xist, Bacterial artificial chromosome (BAC), DNA-probe, Rnf12
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
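As a toy counterpart to the thresholding and automated approaches mentioned above (the real data sets often demand manual or custom methods), the sketch below smooths a synthetic volume, applies a global threshold, and labels connected components with SciPy; the volume, filter width, and threshold are invented for illustration.

```python
# Toy volumetric segmentation: smooth, threshold, label connected components.
# Illustrative only; the data sets described above require tailored approaches.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
volume = rng.normal(size=(64, 64, 64))                 # stand-in for a 3D EM volume
volume[20:40, 20:40, 20:40] += 3.0                     # synthetic bright "organelle"

smoothed = ndimage.gaussian_filter(volume, sigma=2)    # suppress noise before thresholding
threshold = smoothed.mean() + 2 * smoothed.std()       # simple global threshold
mask = smoothed > threshold

labels, n_features = ndimage.label(mask)               # connected-component labelling
sizes = ndimage.sum(mask, labels, index=list(range(1, n_features + 1)))
print(f"{n_features} segmented features, voxel counts: {sizes}")
```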
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues change dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
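Minimum-norm estimation, named in the keywords as the source-analysis method, solves an underdetermined inverse problem y = Lx with an L2 penalty. A bare-bones NumPy sketch of that single step is shown below; the lead field, noise level, and regularization are illustrative assumptions, and real pipelines use individual or age-appropriate head models and dedicated software.

```python
# Bare-bones L2 minimum-norm inverse: estimate source amplitudes x from sensor
# data y, given a lead field L from a head model. Illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 128, 500
L = rng.normal(size=(n_sensors, n_sources))        # lead field (forward model), assumed given
x_true = np.zeros(n_sources)
x_true[42] = 1.0                                   # one active source
y = L @ x_true + 0.05 * rng.normal(size=n_sensors) # simulated sensor measurement

lam = 0.1                                          # regularization (depends on SNR)
# Minimum-norm estimate: x = L^T (L L^T + lam^2 I)^(-1) y
gram = L @ L.T + lam ** 2 * np.eye(n_sensors)
x_hat = L.T @ np.linalg.solve(gram, y)
print("strongest estimated source:", int(np.argmax(np.abs(x_hat))))
```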
Rapid Genotyping of Animals Followed by Establishing Primary Cultures of Brain Neurons
Authors: Jin-Young Koh, Sadahiro Iwabuchi, Zhengmin Huang, N. Charles Harata.
Institutions: University of Iowa Carver College of Medicine, University of Iowa Carver College of Medicine, EZ BioResearch LLC.
High-resolution analysis of the morphology and function of mammalian neurons often requires the genotyping of individual animals followed by the analysis of primary cultures of neurons. We describe a set of procedures for: labeling newborn mice to be genotyped, rapid genotyping, and establishing low-density cultures of brain neurons from these mice. Individual mice are labeled by tattooing, which allows for long-term identification lasting into adulthood. Genotyping by the described protocol is fast and efficient, and allows for automated extraction of nucleic acid with good reliability. This is useful under circumstances where sufficient time for conventional genotyping is not available, e.g., in mice that suffer from neonatal lethality. Primary neuronal cultures are generated at low density, which enables imaging experiments at high spatial resolution. This culture method requires the preparation of glial feeder layers prior to neuronal plating. The protocol is applied in its entirety to a mouse model of the movement disorder DYT1 dystonia (ΔE-torsinA knock-in mice), and neuronal cultures are prepared from the hippocampus, cerebral cortex and striatum of these mice. This protocol can be applied to mice with other genetic mutations, as well as to animals of other species. Furthermore, individual components of the protocol can be used for isolated sub-projects. Thus this protocol will have wide applications, not only in neuroscience but also in other fields of biological and medical sciences.
Neuroscience, Issue 95, AP2, genotyping, glial feeder layer, mouse tail, neuronal culture, nucleic-acid extraction, PCR, tattoo, torsinA
The Neuromuscular Junction: Measuring Synapse Size, Fragmentation and Changes in Synaptic Protein Density Using Confocal Fluorescence Microscopy
Authors: Nigel Tse, Marco Morsch, Nazanin Ghazanfari, Louise Cole, Archunan Visvanathan, Catherine Leamey, William D. Phillips.
Institutions: University of Sydney, Macquarie University, University of Sydney.
The neuromuscular junction (NMJ) is the large, cholinergic relay synapse through which mammalian motor neurons control voluntary muscle contraction. Structural changes at the NMJ can result in neurotransmission failure, resulting in weakness, atrophy and even death of the muscle fiber. Many studies have investigated how genetic modifications or disease can alter the structure of the mouse NMJ. Unfortunately, it can be difficult to directly compare findings from these studies because they often employed different parameters and analytical methods. Three protocols are described here. The first uses maximum intensity projection confocal images to measure the area of acetylcholine receptor (AChR)-rich postsynaptic membrane domains at the endplate and the area of synaptic vesicle staining in the overlying presynaptic nerve terminal. The second protocol compares the relative intensities of immunostaining for synaptic proteins in the postsynaptic membrane. The third protocol uses Fluorescence Resonance Energy Transfer (FRET) to detect changes in the packing of postsynaptic AChRs at the endplate. The protocols have been developed and refined over a series of studies. Factors that influence the quality and consistency of results are discussed and normative data are provided for NMJs in healthy young adult mice.
Neuroscience, Issue 94, neuromuscular, motor endplate, motor control, sarcopenia, myasthenia gravis, amyotrophic lateral sclerosis, morphometry, confocal, immunofluorescence
Adapting Human Videofluoroscopic Swallow Study Methods to Detect and Characterize Dysphagia in Murine Disease Models
Authors: Teresa E. Lever, Sabrina M. Braun, Ryan T. Brooks, Rebecca A. Harris, Loren L. Littrell, Ryan M. Neff, Cameron J. Hinkel, Mitchell J. Allen, Mollie A. Ulsas.
Institutions: University of Missouri, University of Missouri, University of Missouri.
This study adapted human videofluoroscopic swallowing study (VFSS) methods for use with murine disease models for the purpose of facilitating translational dysphagia research. Successful outcomes are dependent upon three critical components: test chambers that permit self-feeding while standing unrestrained in a confined space, recipes that mask the aversive taste/odor of commercially-available oral contrast agents, and a step-by-step test protocol that permits quantification of swallow physiology. Elimination of one or more of these components will have a detrimental impact on the study results. Moreover, the energy level capability of the fluoroscopy system will determine which swallow parameters can be investigated. Most research centers have high energy fluoroscopes designed for use with people and larger animals, which results in exceptionally poor image quality when testing mice and other small rodents. Despite this limitation, we have identified seven VFSS parameters that are consistently quantifiable in mice when using a high energy fluoroscope in combination with the new murine VFSS protocol. We recently obtained a low energy fluoroscopy system with exceptionally high imaging resolution and magnification capabilities that was designed for use with mice and other small rodents. Preliminary work using this new system, in combination with the new murine VFSS protocol, has identified 13 swallow parameters that are consistently quantifiable in mice, which is nearly double the number obtained using conventional (i.e., high energy) fluoroscopes. Identification of additional swallow parameters is expected as we optimize the capabilities of this new system. Results thus far demonstrate the utility of using a low energy fluoroscopy system to detect and quantify subtle changes in swallow physiology that may otherwise be overlooked when using high energy fluoroscopes to investigate murine disease models.
Medicine, Issue 97, mouse, murine, rodent, swallowing, deglutition, dysphagia, videofluoroscopy, radiation, iohexol, barium, palatability, taste, translational, disease models
Mindfulness in Motion (MIM): An Onsite Mindfulness Based Intervention (MBI) for Chronically High Stress Work Environments to Increase Resiliency and Work Engagement
Authors: Maryanna Klatt, Beth Steinberg, Anne-Marie Duchemin.
Institutions: The Ohio State University College of Medicine, Wexner Medical Center, The Ohio State University College of Medicine.
A pragmatic mindfulness intervention to benefit personnel working in chronically high-stress environments, delivered onsite during the workday, is timely and valuable to employee and employer alike. Mindfulness in Motion (MIM) is a Mindfulness Based Intervention (MBI) offered as a modified, less time-intensive method (compared to Mindfulness-Based Stress Reduction), delivered onsite, during work, and is intended to enable busy working adults to experience the benefits of mindfulness. It teaches mindful awareness principles, rehearses mindfulness as a group, emphasizes the use of gentle yoga stretches, and utilizes relaxing music in the background of both the group sessions and individual mindfulness practice. MIM is delivered in a group format, for 1 hr/week for 8 weeks. CDs and a DVD are provided to facilitate individual practice. The yoga movement is emphasized in the protocol to facilitate a quieting of the mind. The music is included for participants to associate the relaxed state experienced in the group session with their individual practice. To determine the intervention's feasibility/efficacy, we conducted a randomized wait-list controlled study in Intensive Care Units (ICUs). ICUs represent a high-stress work environment where personnel experience chronic exposure to catastrophic situations as they care for seriously injured/ill patients. Despite high levels of work-related stress, few interventions have been developed and delivered onsite for such environments. The intervention is delivered on site in the ICU, during work hours, with participants receiving time release to attend sessions. The intervention is well received, with a 97% retention rate. Work engagement and resiliency increase significantly in the intervention group, compared to the wait-list control group, while participant respiration rates decrease significantly pre-post in 6/8 of the weekly sessions. Participants value institutional support, relaxing music, and the instructor as pivotal to program success. This provides evidence that MIM is feasible, well accepted, and can be effectively implemented in a chronically high-stress work environment.
Behavior, Issue 101, Mindfulness, resiliency, work-engagement, stress-reduction, workplace, non-reactivity, Intensive-care, chronic stress, work environment
A Rat Model of Ventricular Fibrillation and Resuscitation by Conventional Closed-chest Technique
Authors: Lorissa Lamoureux, Jeejabai Radhakrishnan, Raúl J. Gazmuri.
Institutions: Rosalind Franklin University of Medicine and Science.
A rat model of electrically-induced ventricular fibrillation followed by cardiac resuscitation using a closed chest technique that incorporates the basic components of cardiopulmonary resuscitation in humans is herein described. The model was developed in 1988 and has been used in approximately 70 peer-reviewed publications examining a myriad of resuscitation aspects including its physiology and pathophysiology, determinants of resuscitability, pharmacologic interventions, and even the effects of cell therapies. The model featured in this presentation includes: (1) vascular catheterization to measure aortic and right atrial pressures, to measure cardiac output by thermodilution, and to electrically induce ventricular fibrillation; and (2) tracheal intubation for positive pressure ventilation with oxygen enriched gas and assessment of the end-tidal CO2. A typical sequence of intervention entails: (1) electrical induction of ventricular fibrillation, (2) chest compression using a mechanical piston device concomitantly with positive pressure ventilation delivering oxygen-enriched gas, (3) electrical shocks to terminate ventricular fibrillation and reestablish cardiac activity, (4) assessment of post-resuscitation hemodynamic and metabolic function, and (5) assessment of survival and recovery of organ function. A robust inventory of measurements is available that includes – but is not limited to – hemodynamic, metabolic, and tissue measurements. The model has been highly effective in developing new resuscitation concepts and examining novel therapeutic interventions before their testing in larger and translationally more relevant animal models of cardiac arrest and resuscitation.
Medicine, Issue 98, Cardiopulmonary resuscitation, Hemodynamics, Myocardial ischemia, Rats, Reperfusion, Ventilation, Ventricular fibrillation, Ventricular function, Translational medical research
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
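As a compact, hypothetical illustration of the DoE workflow (choose factor combinations systematically, then fit a predictive model), the sketch below constructs a two-level full factorial design for three invented factors and estimates main and two-way interaction effects by least squares; it is not the study's actual design, factors, or data.

```python
# Toy design-of-experiments: two-level full factorial for three hypothetical factors,
# then an ordinary least-squares fit with two-way interactions. Factor names and the
# simulated response are illustrative, not the study's actual data.
import itertools
import numpy as np

levels = [-1, 1]                                                # coded low/high levels
design = np.array(list(itertools.product(levels, repeat=3)))    # 8 runs x 3 factors
temperature, humidity, incubation_time = design.T

# Simulated response (e.g., relative expression level) with an interaction effect.
rng = np.random.default_rng(4)
response = (10 + 2 * temperature - 1.5 * humidity + 0.8 * temperature * humidity
            + 0.5 * incubation_time + rng.normal(scale=0.3, size=len(design)))

# Model matrix: intercept, main effects, two-way interactions.
X = np.column_stack([
    np.ones(len(design)), temperature, humidity, incubation_time,
    temperature * humidity, temperature * incubation_time, humidity * incubation_time,
])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
print("estimated effects:", np.round(coef, 2))
```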
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Authors: Bianca DeBenedictis, J. Bruce Morton.
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
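One common way to quantify functional connectivity between regions, sketched below under simplifying assumptions (simulated ROI time courses, a single connection, and a plain two-sample t-test rather than the article's multivariate analysis), is to correlate ROI time series within each participant and compare connection strength across age groups.

```python
# Illustrative functional connectivity: correlate ROI time series within each
# participant, then compare one connection between two age groups. Assumes ROI
# time courses have already been extracted from preprocessed fMRI data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def connectivity_matrix(ts):
    """Pearson correlation between ROI time courses (time x ROI)."""
    return np.corrcoef(ts, rowvar=False)

n_timepoints, n_rois = 200, 6
children = [rng.normal(size=(n_timepoints, n_rois)) for _ in range(15)]
adults = [rng.normal(size=(n_timepoints, n_rois)) for _ in range(15)]

# Strength of one hypothetical connection (ROI 0 - ROI 3), Fisher z-transformed.
conn_child = [np.arctanh(connectivity_matrix(ts)[0, 3]) for ts in children]
conn_adult = [np.arctanh(connectivity_matrix(ts)[0, 3]) for ts in adults]
t, p = stats.ttest_ind(conn_adult, conn_child)
print(f"group difference: t = {t:.2f}, p = {p:.3f}")
```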
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitative and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators in any level of a learning program. Nursing provides a unique challenge and opportunity as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
One Dimensional Turing-Like Handshake Test for Motor Intelligence
Authors: Amir Karniel, Guy Avraham, Bat-Chen Peles, Shelly Levy-Tzedek, Ilana Nisky.
Institutions: Ben-Gurion University.
In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test, for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG. (i) By calculating the proportion of subjects' answers that the model is more human-like than the human; (ii) By comparing two weighted sums of human and model handshakes we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) By comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake.
Neuroscience, Issue 46, Turing test, Human Machine Interface, Haptics, Teleoperation, Motor Control, Motor Behavior, Diagnostics, Perception, handshake, telepresence
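Methods (ii) and (iii) above fit a psychometric curve to the interrogator's forced-choice answers and read off the point of subjective equality (PSE). The sketch below shows such a fit on simulated data using a logistic function; the stimulus weights, trial counts, and parameters are invented, so it illustrates only the curve-fitting step, not the full MHLG protocol.

```python
# Fit a logistic psychometric function to simulated two-alternative forced-choice
# data and extract the point of subjective equality (PSE). Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(w, pse, slope):
    """Probability of answering 'more human-like' as a function of human weight w."""
    return 1.0 / (1.0 + np.exp(-(w - pse) / slope))

rng = np.random.default_rng(6)
human_weight = np.linspace(0, 1, 9)                 # weight of the human in the weighted sum
true_pse, true_slope, n_trials = 0.55, 0.12, 40
p_true = psychometric(human_weight, true_pse, true_slope)
prop_humanlike = rng.binomial(n_trials, p_true) / n_trials   # simulated interrogator answers

(pse, slope), _ = curve_fit(psychometric, human_weight, prop_humanlike, p0=[0.5, 0.1])
print(f"estimated PSE = {pse:.2f}, slope = {slope:.2f}")
```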
Genomic MRI - a Public Resource for Studying Sequence Patterns within Genomic DNA
Authors: Ashwin Prakash, Jason Bechtel, Alexei Fedorov.
Institutions: University of Toledo Health Science Campus.
Non-coding genomic regions in complex eukaryotes, including intergenic areas, introns, and untranslated segments of exons, are profoundly non-random in their nucleotide composition and consist of a complex mosaic of sequence patterns. These patterns include so-called Mid-Range Inhomogeneity (MRI) regions -- sequences 30-10000 nucleotides in length that are enriched by a particular base or combination of bases (e.g. (G+T)-rich, purine-rich, etc.). MRI regions are associated with unusual (non-B-form) DNA structures that are often involved in regulation of gene expression, recombination, and other genetic processes (Fedorova & Fedorov 2010). The existence of a strong fixation bias within MRI regions against mutations that tend to reduce their sequence inhomogeneity additionally supports the functionality and importance of these genomic sequences (Prakash et al. 2009). Here we demonstrate a freely available Internet resource -- the Genomic MRI program package -- designed for computational analysis of genomic sequences in order to find and characterize various MRI patterns within them (Bechtel et al. 2008). This package also allows generation of randomized sequences with various properties and level of correspondence to the natural input DNA sequences. The main goal of this resource is to facilitate examination of vast regions of non-coding DNA that are still scarcely investigated and await thorough exploration and recognition.
Genetics, Issue 51, bioinformatics, computational biology, genomics, non-randomness, signals, gene regulation, DNA conformation
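Genomic MRI itself is a web-based package; as a rough, assumption-laden illustration of what a mid-range inhomogeneity scan does, the sketch below slides a fixed-size window along a random sequence containing a planted (G+T)-rich stretch and flags windows exceeding an arbitrary enrichment threshold. Window size and threshold are invented and are not the program's defaults.

```python
# Illustrative mid-range inhomogeneity (MRI) scan: slide a window along a DNA
# sequence and flag windows enriched for a chosen base set (here G+T).
import random

random.seed(7)
sequence = "".join(random.choice("ACGT") for _ in range(5000))
sequence = sequence[:2000] + "GT" * 150 + sequence[2000:]     # plant a (G+T)-rich region

window, threshold, bases = 100, 0.70, set("GT")

def content(s, base_set):
    """Fraction of characters in s that belong to base_set."""
    return sum(c in base_set for c in s) / len(s)

enriched = [i for i in range(0, len(sequence) - window)
            if content(sequence[i:i + window], bases) >= threshold]
if enriched:
    print(f"(G+T)-rich windows span positions {enriched[0]} to {enriched[-1] + window}")
else:
    print("no enriched windows found")
```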
Characterizing Herbivore Resistance Mechanisms: Spittlebugs on Brachiaria spp. as an Example
Authors: Soroush Parsa, Guillermo Sotelo, Cesar Cardona.
Institutions: CIAT.
Plants can resist herbivore damage through three broad mechanisms: antixenosis, antibiosis and tolerance [1]. Antixenosis is the degree to which the plant is avoided when the herbivore is able to select other plants [2]. Antibiosis is the degree to which the plant affects the fitness of the herbivore feeding on it [1]. Tolerance is the degree to which the plant can withstand or repair damage caused by the herbivore, without compromising the herbivore's growth and reproduction [1]. The durability of herbivore resistance in an agricultural setting depends to a great extent on the resistance mechanism favored during crop breeding efforts [3]. We demonstrate a no-choice experiment designed to estimate the relative contributions of antibiosis and tolerance to spittlebug resistance in Brachiaria spp. Several species of African grasses of the genus Brachiaria are valuable forage and pasture plants in the Neotropics, but they can be severely challenged by several native species of spittlebugs (Hemiptera: Cercopidae) [4]. To assess their resistance to spittlebugs, plants are vegetatively propagated by stem cuttings and allowed to grow for approximately one month, allowing the growth of superficial roots on which spittlebugs can feed. At that point, each test plant is individually challenged with six spittlebug eggs near hatching. Infestations are allowed to progress for one month before evaluating plant damage and insect survival. Scoring plant damage provides an estimate of tolerance while scoring insect survival provides an estimate of antibiosis. This protocol has facilitated our plant breeding objective to enhance spittlebug resistance in commercial brachiariagrasses [5].
Plant Biology, Issue 52, host plant resistance, antibiosis, antixenosis, tolerance, Brachiaria, spittlebugs
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes [1-4]. The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes" [5]. It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control [6]. Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others [7,8], Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC shows that in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from using a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels using the paper protocol to those of the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
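The outcome measures reported above (time in the target range, prevalence of hypo- and hyperglycemia) reduce to simple proportions over the recorded blood glucose values; the sketch below computes them from simulated readings using the thresholds quoted in the text. The readings and target range are invented for illustration.

```python
# Compute the glycemic-control metrics described above from a series of
# blood glucose (BG) readings in mg/dL. Readings here are simulated.
import numpy as np

rng = np.random.default_rng(8)
bg = rng.normal(loc=140, scale=40, size=1000)        # simulated BG readings (mg/dL)

target_low, target_high = 70, 180                    # example target range
metrics = {
    "in target range": np.mean((bg >= target_low) & (bg <= target_high)),
    "hyperglycemia (>180)": np.mean(bg > 180),
    "clinical hypoglycemia (<70)": np.mean(bg < 70),
    "severe hypoglycemia (<40)": np.mean(bg < 40),
}
for name, fraction in metrics.items():
    print(f"{name}: {fraction:.1%}")
```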
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g., [5,12,20]) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding set of research is attempting to use similar methods and integrates water quality models with broadly defined evolutionary optimization methods [3,4,9,10,13-15,17-19,22,23,25]. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model [7] with the multiobjective evolutionary algorithm SPEA2 [26], and a user-specified set of conservation practices and their costs, to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
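A heavily simplified sketch of the simulation-optimization idea is shown below: each candidate solution is a binary vector of practice placements, the two objectives are total cost and a stand-in pollution score (in place of a SWAT run), and a toy evolutionary loop with crossover, mutation, and non-dominated selection approximates the cost/pollution tradeoff frontier. It is illustrative only; it is neither SPEA2 nor coupled to SWAT, and the costs, loads, and practice effectiveness are invented.

```python
# Toy multiobjective evolutionary search over conservation-practice placements.
# A candidate is a binary vector (practice installed or not on each field); the
# two objectives are total cost and a stand-in pollution score that substitutes
# for a SWAT model run. Illustrative only: not SPEA2 and not coupled to SWAT.
import numpy as np

rng = np.random.default_rng(9)
n_fields = 30
cost_per_field = rng.uniform(1, 10, n_fields)          # invented practice costs
load_per_field = rng.uniform(1, 10, n_fields)          # invented untreated pollutant loads

def objectives(x):
    cost = np.sum(cost_per_field * x)
    pollution = np.sum(load_per_field * np.where(x, 0.4, 1.0))  # assume practices cut loads 60%
    return cost, pollution

def non_dominated(pop):
    """Keep candidates not (weakly) dominated on both objectives."""
    objs = [objectives(x) for x in pop]
    keep = []
    for i, oi in enumerate(objs):
        if not any(oj[0] <= oi[0] and oj[1] <= oi[1] and oj != oi for oj in objs):
            keep.append(pop[i])
    return keep

population = [rng.integers(0, 2, n_fields) for _ in range(60)]
for _ in range(100):                                    # evolutionary loop
    parents = non_dominated(population) or population
    children = []
    for _ in range(60):
        a, b = rng.choice(len(parents), 2)
        cut = rng.integers(1, n_fields)
        child = np.concatenate([parents[a][:cut], parents[b][cut:]])   # crossover
        flip = rng.random(n_fields) < 0.05                             # mutation
        children.append(np.where(flip, 1 - child, child))
    population = parents + children

frontier = sorted(objectives(x) for x in non_dominated(population))
print("approximate cost/pollution tradeoff frontier (first points):", frontier[:5])
```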
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
The Dig Task: A Simple Scent Discrimination Reveals Deficits Following Frontal Brain Damage
Authors: Kris M. Martens, Cole Vonder Haar, Blake A. Hutsell, Michael R. Hoane.
Institutions: Southern Illinois University at Carbondale.
Cognitive impairment is the most frequent cause of disability in humans following brain damage, yet the behavioral tasks used to assess cognition in rodent models of brain injury are lacking. Borrowing from the operant literature, our laboratory utilized a basic scent discrimination paradigm [1-4] in order to assess deficits in frontally-injured rats. Previously we have briefly described the Dig task and demonstrated that rats with frontal brain damage show severe deficits across multiple tests within the task [5]. Here we present a more detailed protocol for this task. Rats are placed into a chamber and allowed to discriminate between two scented sands, one of which contains a reinforcer. The trial ends after the rat either correctly discriminates (defined as digging in the correct scented sand), incorrectly discriminates, or 30 sec elapses. Rats that correctly discriminate are allowed to recover and consume the reinforcer. Rats that discriminate incorrectly are immediately removed from the chamber. This can continue through a variety of reversals and novel scents. The primary analysis is the accuracy for each scent pairing (cumulative proportion correct for each scent). The general findings from the Dig task suggest that it is a simple experimental preparation that can assess deficits in rats with bilateral frontal cortical damage compared to rats with unilateral parietal damage. The Dig task can also be easily incorporated into an existing cognitive test battery. The use of more tasks such as this one can lead to more accurate testing of frontal function following injury, which may lead to therapeutic options for treatment. All animal use was conducted in accordance with protocols approved by the Institutional Animal Care and Use Committee.
Neuroscience, Issue 71, Medicine, Neurobiology, Anatomy, Physiology, Psychology, Behavior, cognitive assessment, dig task, scent discrimination, olfactory, brain injury, traumatic brain injury, TBI, brain damage, rats, animal model
Movement Retraining using Real-time Feedback of Performance
Authors: Michael Anthony Hunt.
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch, are used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes. Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements has traditionally occurred offline after completion of a given testing session. This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
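The real-time feedback described above rests on the same kinematic calculations used in offline motion analysis. As a minimal example of one such calculation (marker coordinates invented, a single frame only), the sketch below computes a knee flexion angle from hip, knee, and ankle marker positions; a real-time system would simply repeat this every captured frame and display the result to the patient.

```python
# Compute a joint (knee) angle from three motion-capture marker positions using
# the angle between the thigh and shank segment vectors. Marker coordinates are
# made up for illustration; a real-time system would update these every frame.
import numpy as np

hip = np.array([0.00, 0.0, 1.00])
knee = np.array([0.05, 0.0, 0.55])
ankle = np.array([0.02, 0.0, 0.10])

def segment_angle(proximal, joint, distal):
    """Included angle (degrees) at 'joint' between the two body segments."""
    u = proximal - joint
    v = distal - joint
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

included = segment_angle(hip, knee, ankle)
knee_flexion = 180.0 - included           # 0 degrees = fully extended
print(f"knee flexion: {knee_flexion:.1f} degrees")
```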
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, as the image is acquired in a very different way than normal, by building up an image molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of between 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition and density related problems resulting in the 'mislocalization' phenomenon.
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
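At the heart of STORM reconstruction, including packages such as rainSTORM, is localizing each single-molecule spot with sub-pixel precision, most commonly by fitting a small Gaussian to its image. The sketch below performs that single localization step on one simulated spot; it is illustrative only, with invented spot parameters and noise, and is not the rainSTORM implementation.

```python
# Localize a simulated single-molecule spot by least-squares fitting a 2D Gaussian.
# Illustrative of the per-molecule localization step; not the rainSTORM code.
import numpy as np
from scipy.optimize import curve_fit

def gaussian2d(coords, amplitude, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian on a pixel grid, returned as a flat array."""
    x, y = coords
    return (offset + amplitude *
            np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))).ravel()

rng = np.random.default_rng(10)
size = 11                                             # pixels in the fitting window
y, x = np.mgrid[0:size, 0:size]
true = dict(amplitude=200, x0=5.3, y0=4.7, sigma=1.4, offset=10)
spot = gaussian2d((x, y), **true).reshape(size, size) + rng.normal(0, 5, (size, size))

p0 = [spot.max(), size / 2, size / 2, 1.5, spot.min()]   # rough initial guess
params, _ = curve_fit(gaussian2d, (x, y), spot.ravel(), p0=p0)
print(f"fitted centre: x = {params[1]:.2f}, y = {params[2]:.2f} pixels "
      f"(true: {true['x0']}, {true['y0']})")
```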
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Preparation and Testing of Plant Seed Meal-based Wood Adhesives
Authors: Zhongqi He, Dorselyn C. Chapital.
Institutions: United States Department of Agriculture.
Recently, the interest in plant seed meal-based products as wood adhesives has steadily increased, as these plant raw materials are considered renewable and environment-friendly. These natural products may serve as alternatives to petroleum-based adhesives to ease environmental and sustainability concerns. This work demonstrates the preparation and testing of the plant seed-based wood adhesives using cottonseed and soy meal as raw materials. In addition to untreated meals, water washed meals and protein isolates are prepared and tested. Adhesive slurries are prepared by mixing a freeze-dried meal product with deionized water (3:25 w/w) for 2 hr. Each adhesive preparation is applied to one end of 2 wood veneer strips using a brush. The tacky adhesive coated areas of the wood veneer strips are lapped and glued by hot-pressing. Adhesive strength is reported as the shear strength of the bonded wood specimen at break. Water resistance of the adhesives is measured by the change in shear strength of the bonded wood specimens at break after water soaking. This protocol allows one to assess plant seed-based agricultural products as suitable candidates for substitution of synthetic-based wood adhesives. Adjustments to the adhesive formulation with or without additives and bonding conditions could optimize their adhesive properties for various practical applications.
Environmental Sciences, Issue 97, Cottonseed meal, soy meal, oilseed, protein isolate, wood adhesive, water resistance, shear strength

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.