JoVE Visualize
Pubmed Article
Inter-Observer Agreement in Measuring Respiratory Rate.
PUBLISHED: 06-20-2015
Respiratory rate (RR) is an important vital sign that is strongly correlated with in-hospital mortality. At the same time, RR is the vital sign most likely to be omitted when assessing a patient. We believe that one reason for this could be the difficulty of measuring RR, since it is not read off a monitor but counted manually. There is also the possibility of assessment bias, which makes inter-observer reliability important. We therefore set out to investigate how well nursing staff counting the actual number of respirations per minute would agree with nursing staff using a predefined ordinal scale.
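The abstract above does not state which agreement statistic was used; purely as a hedged illustration of one common choice for ordinal ratings, the sketch below computes a linearly weighted Cohen's kappa between two raters. The category labels and ratings are hypothetical.

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_categories):
    """Linearly weighted Cohen's kappa for two raters using ordinal category indices."""
    observed = np.zeros((n_categories, n_categories))
    for a, b in zip(rater_a, rater_b):
        observed[a, b] += 1
    observed /= observed.sum()
    # Expected agreement under independence (outer product of the marginals).
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    # Linear disagreement weights: 0 on the diagonal, growing with category distance.
    idx = np.arange(n_categories)
    weights = np.abs(idx[:, None] - idx[None, :]) / (n_categories - 1)
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical ordinal RR categories (e.g. 0 = <12, 1 = 12-20, 2 = 21-25, 3 = >25 breaths/min).
rater_a = [1, 1, 2, 0, 3, 2, 1, 2]
rater_b = [1, 2, 2, 0, 3, 1, 1, 2]
print(f"Weighted kappa: {weighted_kappa(rater_a, rater_b, 4):.2f}")
```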
Related JoVE Video
Authors: Richard Ricardo, Katy Phelan.
Published: 06-23-2008
Determining the number of cells in culture is important in standardization of culture conditions and in performing accurate quantitation experiments. A hemacytometer is a thick glass slide with a central area designed as a counting chamber. Cell suspension is applied to a defined area and counted so cell density can be calculated.
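As a generic illustration of the calculation described above (the counts and dilution factor below are hypothetical), the standard hemocytometer conversion multiplies the mean count per large square by 10^4, because each 1 mm x 1 mm square with a 0.1 mm chamber depth holds 10^-4 ml, and then corrects for any dilution.

```python
def hemocytometer_density(counts_per_square, dilution_factor=1.0):
    """Cells/ml from hemocytometer counts.

    Each large corner square covers 1 mm x 1 mm with a 0.1 mm chamber depth,
    i.e. 1e-4 ml, so the mean count per square x 1e4 gives cells per ml.
    """
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * 1e4 * dilution_factor

# Hypothetical counts from four corner squares of a 1:2 diluted suspension.
print(f"{hemocytometer_density([52, 47, 55, 50], dilution_factor=2):.2e} cells/ml")
```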
26 Related JoVE Articles!
Coordinate Mapping of Hyolaryngeal Mechanics in Swallowing
Authors: Thomas Z. Thompson, Farres Obeidin, Alisa A. Davidoff, Cody L. Hightower, Christohper Z. Johnson, Sonya L. Rice, Rebecca-Lyn Sokolove, Brandon K. Taylor, John M. Tuck, William G. Pearson, Jr..
Institutions: Georgia Regents University, New York University, Georgia Regents University, Georgia Regents University.
Characterizing hyolaryngeal movement is important to dysphagia research. Prior methods require multiple measurements to obtain one kinematic measurement, whereas coordinate mapping of hyolaryngeal mechanics using Modified Barium Swallow (MBS) uses one set of coordinates to calculate multiple variables of interest. For demonstration purposes, ten kinematic measurements were generated from one set of coordinates to determine differences in swallowing two different bolus types. Calculations of hyoid excursion against the vertebrae and mandible are correlated to determine the importance of axes of reference. To demonstrate coordinate mapping methodology, 40 MBS studies were randomly selected from a dataset of healthy normal subjects with no known swallowing impairment. A 5 ml thin-liquid bolus swallow and a 5 ml pudding swallow were measured for each subject. Nine coordinates, mapping the cranial base, mandible, vertebrae and elements of the hyolaryngeal complex, were recorded at the frames of minimum and maximum hyolaryngeal excursion. Coordinates were mathematically converted into ten variables of hyolaryngeal mechanics. Inter-rater reliability was evaluated with intraclass correlation coefficients (ICC). Two-tailed t-tests were used to evaluate differences in kinematics by bolus viscosity. Hyoid excursion measurements against different axes of reference were correlated. Inter-rater reliability among six raters for the 18 coordinates ranged from ICC = 0.90 - 0.97. A slate of ten kinematic measurements was compared by subject between the six raters. One outlier was rejected, and the mean of the remaining reliability scores was ICC = 0.91 (95% CI: 0.84 - 0.96). Two-tailed t-tests with Bonferroni corrections comparing ten kinematic variables (5 ml thin-liquid vs. 5 ml pudding swallows) showed statistically significant differences in hyoid excursion, superior laryngeal movement, and pharyngeal shortening (p < 0.005). Pearson correlations of hyoid excursion measurements from two different axes of reference were r = 0.62, r2 = 0.38 (thin-liquid) and r = 0.52, r2 = 0.27 (pudding). Obtaining landmark coordinates is a reliable method to generate multiple kinematic variables from videofluoroscopic images useful in dysphagia research.
Medicine, Issue 87, videofluoroscopy, modified barium swallow studies, hyolaryngeal kinematics, deglutition, dysphagia, dysphagia research, hyolaryngeal complex
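The article converts landmark coordinates into kinematic variables, but the exact formulas are not reproduced above; the following is only a plausible sketch of one such measurement, hyoid excursion expressed against a vertebral reference axis, with hypothetical landmark coordinates.

```python
import numpy as np

def excursion_in_reference_frame(hyoid_min, hyoid_max, axis_origin, axis_point):
    """Hyoid excursion components in a frame whose y-axis runs from axis_origin
    to axis_point (e.g. a vertebral line)."""
    y_axis = np.asarray(axis_point, dtype=float) - np.asarray(axis_origin, dtype=float)
    y_axis /= np.linalg.norm(y_axis)
    # Perpendicular (anterior) direction; the sign depends on the image coordinate convention.
    x_axis = np.array([-y_axis[1], y_axis[0]])
    displacement = np.asarray(hyoid_max, dtype=float) - np.asarray(hyoid_min, dtype=float)
    return displacement @ x_axis, displacement @ y_axis

# Hypothetical pixel coordinates at minimum and maximum hyolaryngeal excursion.
anterior, superior = excursion_in_reference_frame(
    hyoid_min=(120.0, 240.0), hyoid_max=(138.0, 215.0),
    axis_origin=(200.0, 300.0), axis_point=(205.0, 180.0))
print(f"anterior = {anterior:.1f}, superior = {superior:.1f} (pixels)")
print(f"total excursion = {np.hypot(anterior, superior):.1f} pixels")
```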
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Authors: Birte Moeller, Hartmut Schächinger, Christian Frings.
Institutions: Trier University, Trier University.
Selection tasks in which simple stimuli (e.g. letters) are presented and a target stimulus has to be selected against one or more distractor stimuli are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation in the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. The repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters of response retrieval effects on human action.
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
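The binding logic described above maps onto a 2 x 2 contrast on reaction times: response retrieval shows up as the interaction of response relation and distractor relation, and distractor inhibition as a general benefit of distractor repetition. The sketch below computes both from four hypothetical cell means; it illustrates the design logic, not the authors' analysis code.

```python
# Mean reaction times (ms) for the four prime-probe conditions (hypothetical values).
rt = {
    ("response_repeat", "distractor_repeat"): 480.0,
    ("response_repeat", "distractor_change"): 505.0,
    ("response_change", "distractor_repeat"): 540.0,
    ("response_change", "distractor_change"): 520.0,
}

# Effect of distractor repetition within each response-relation condition.
benefit_resp_repeat = rt[("response_repeat", "distractor_change")] - rt[("response_repeat", "distractor_repeat")]
benefit_resp_change = rt[("response_change", "distractor_change")] - rt[("response_change", "distractor_repeat")]

# Retrieval (binding) effect: the interaction, i.e. how much the distractor-repetition
# benefit differs between response repetitions and response changes.
binding_effect = benefit_resp_repeat - benefit_resp_change
# Inhibition-like effect: the average benefit of repeating the distractor.
mean_repetition_benefit = (benefit_resp_repeat + benefit_resp_change) / 2

print(f"distractor-response binding (interaction): {binding_effect:.1f} ms")
print(f"general distractor-repetition benefit: {mean_repetition_benefit:.1f} ms")
```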
DNA-affinity-purified Chip (DAP-chip) Method to Determine Gene Targets for Bacterial Two component Regulatory Systems
Authors: Lara Rajeev, Eric G. Luning, Aindrila Mukhopadhyay.
Institutions: Lawrence Berkeley National Laboratory.
In vivo methods such as ChIP-chip are well-established techniques used to determine global gene targets for transcription factors. However, they are of limited use in exploring bacterial two component regulatory systems with uncharacterized activation conditions. Such systems regulate transcription only when activated in the presence of unique signals. Since these signals are often unknown, the in vitro microarray based method described in this video article can be used to determine gene targets and binding sites for response regulators. This DNA-affinity-purified-chip method may be used for any purified regulator in any organism with a sequenced genome. The protocol involves allowing the purified tagged protein to bind to sheared genomic DNA and then affinity purifying the protein-bound DNA, followed by fluorescent labeling of the DNA and hybridization to a custom tiling array. Preceding steps that may be used to optimize the assay for specific regulators are also described. The peaks generated by the array data analysis are used to predict binding site motifs, which are then experimentally validated. The motif predictions can be further used to determine gene targets of orthologous response regulators in closely related species. We demonstrate the applicability of this method by determining the gene targets and binding site motifs and thus predicting the function for a sigma54-dependent response regulator DVU3023 in the environmental bacterium Desulfovibrio vulgaris Hildenborough.
Genetics, Issue 89, DNA-Affinity-Purified-chip, response regulator, transcription factor binding site, two component system, signal transduction, Desulfovibrio, lactate utilization regulator, ChIP-chip
Measuring Respiratory Function in Mice Using Unrestrained Whole-body Plethysmography
Authors: Rebecca Lim, Marcus J. Zavou, Phillipa-Louise Milton, Siow Teng Chan, Jean L. Tan, Hayley Dickinson, Sean V. Murphy, Graham Jenkin, Euan M. Wallace.
Institutions: Monash Institute of Medical Research, Monash Medical Centre, Animal Resource Centre, Perth, Australia, Wake Forest Institute for Regenerative Medicine.
Respiratory dysfunction is one of the leading causes of morbidity and mortality in the world, and the rates of mortality continue to rise. Quantitative assessment of lung function in rodent models is an important tool in the development of future therapies. Commonly used techniques for assessing respiratory function include invasive plethysmography and forced oscillation. While these techniques provide valuable information, data collection can be fraught with artefacts and experimental variability due to the need for anesthesia and/or invasive instrumentation of the animal. In contrast, unrestrained whole-body plethysmography (UWBP) offers a precise, non-invasive, quantitative way to analyze respiratory parameters. This technique avoids the anesthesia and restraints that are common to traditional plethysmography techniques. This video will demonstrate the UWBP procedure, including the equipment set up, calibration and lung function recording. It will explain how to analyze the collected data, as well as how to identify experimental outliers and artefacts that result from animal movement. The respiratory parameters obtained using this technique include tidal volume, minute volume, inspiratory duty cycle, inspiratory flow rate and the ratio of inspiration time to expiration time. UWBP does not rely on specialized skills and is inexpensive to perform. A key feature of UWBP, and most appealing to potential users, is the ability to perform repeated measures of lung function on the same animal.
Physiology, Issue 90, Unrestrained Whole Body Plethysmography, Lung function, Respiratory Disease, Rodents
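The respiratory parameters listed above follow directly from per-breath volume and timing; the sketch below derives them from hypothetical values and is not the plethysmograph vendor's analysis software.

```python
def respiratory_parameters(tidal_volume_ml, t_insp_s, t_exp_s):
    """Derive the UWBP read-outs listed above from one breath's timing and volume."""
    breath_duration = t_insp_s + t_exp_s
    respiratory_rate = 60.0 / breath_duration              # breaths/min
    return {
        "tidal volume (ml)": tidal_volume_ml,
        "minute volume (ml/min)": tidal_volume_ml * respiratory_rate,
        "inspiratory duty cycle": t_insp_s / breath_duration,
        "inspiratory flow rate (ml/s)": tidal_volume_ml / t_insp_s,
        "Ti/Te ratio": t_insp_s / t_exp_s,
    }

# Hypothetical mouse breath: 0.15 ml tidal volume, 0.10 s inspiration, 0.15 s expiration.
for name, value in respiratory_parameters(0.15, 0.10, 0.15).items():
    print(f"{name}: {value:.3g}")
```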
Monitoring of Systemic and Hepatic Hemodynamic Parameters in Mice
Authors: Chichi Xie, Weiwei Wei, Tao Zhang, Olaf Dirsch, Uta Dahmen.
Institutions: Jena University Hospital, Jena University Hospital, The First Affiliated Hospital of Wenzhou Medical University.
The use of mouse models in experimental research is of enormous importance for the study of hepatic physiology and pathophysiological disturbances. However, due to the small size of the mouse, technical details of an intraoperative monitoring procedure suitable for the mouse have rarely been described. Previously we reported a monitoring procedure to obtain hemodynamic parameters for rats. Now, we have adapted the procedure to acquire systemic and hepatic hemodynamic parameters in mice, a species ten-fold smaller than rats. This film demonstrates the instrumentation of the animals as well as the data acquisition process needed to assess systemic and hepatic hemodynamics in mice. Vital parameters, including body temperature, respiratory rate and heart rate, were recorded throughout the whole procedure. Systemic hemodynamic parameters consist of carotid artery pressure (CAP) and central venous pressure (CVP). Hepatic perfusion parameters include portal vein pressure (PVP), portal flow rate, and the flow rate of the common hepatic artery (table 1). Instrumentation and data acquisition to record the normal values were completed within 1.5 h. Systemic and hepatic hemodynamic parameters remained within normal ranges during this procedure. This procedure is challenging but feasible. We have already applied this procedure to assess hepatic hemodynamics in normal mice as well as during 70% partial hepatectomy and in liver lobe clamping experiments. Mean PVP after resection (n = 20) was 11.41±2.94 cmH2O, which was significantly higher (P<0.05) than before resection (6.87±2.39 cmH2O). The results of the liver lobe clamping experiment indicated that this monitoring procedure is sensitive and suitable for detecting small changes in portal pressure and portal flow rate. In conclusion, this procedure is reliable in the hands of an experienced micro-surgeon but should be limited to experiments where mice are absolutely needed.
Medicine, Issue 92, mice, hemodynamics, hepatic perfusion, CAP, CVP, surgery, intraoperative monitoring, portal vein pressure, blood flow
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. It is composed of very simple step elements and involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed and 30 sec chair stand), as well as the safety and fidelity of the program, are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Automated Measurement of Pulmonary Emphysema and Small Airway Remodeling in Cigarette Smoke-exposed Mice
Authors: Maria E. Laucho-Contreras, Katherine L. Taylor, Ravi Mahadeva, Steve S. Boukedes, Caroline A. Owen.
Institutions: Brigham and Women's Hospital - Harvard Medical School, University of Cambridge - Addenbrooke's Hospital, Brigham and Women's Hospital - Harvard Medical School, Lovelace Respiratory Research Institute.
COPD is projected to be the third most common cause of mortality world-wide by 2020(1). Animal models of COPD are used to identify molecules that contribute to the disease process and to test the efficacy of novel therapies for COPD. Researchers use a number of models of COPD employing different species including rodents, guinea-pigs, rabbits, and dogs(2). However, the most widely-used model is that in which mice are exposed to cigarette smoke. Mice are an especially useful species in which to model COPD because their genome can readily be manipulated to generate animals that are either deficient in, or over-express individual proteins. Studies of gene-targeted mice that have been exposed to cigarette smoke have provided valuable information about the contributions of individual molecules to different lung pathologies in COPD(3-5). Most studies have focused on pathways involved in emphysema development which contributes to the airflow obstruction that is characteristic of COPD. However, small airway fibrosis also contributes significantly to airflow obstruction in human COPD patients(6), but much less is known about the pathogenesis of this lesion in smoke-exposed animals. To address this knowledge gap, this protocol quantifies both emphysema development and small airway fibrosis in smoke-exposed mice. This protocol exposes mice to CS using a whole-body exposure technique, then measures respiratory mechanics in the mice, inflates the lungs of mice to a standard pressure, and fixes the lungs in formalin. The researcher then stains the lung sections with either Gill’s stain to measure the mean alveolar chord length (as a readout of emphysema severity) or Masson’s trichrome stain to measure deposition of extracellular matrix (ECM) proteins around small airways (as a readout of small airway fibrosis). Studies of the effects of molecular pathways on both of these lung pathologies will lead to a better understanding of the pathogenesis of COPD.
Medicine, Issue 95, COPD, mice, small airway remodeling, emphysema, pulmonary function test
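Mean alveolar chord length is a stereological readout: test lines are laid across the section and the airspace segments they intercept are averaged. The sketch below illustrates that idea on a small synthetic binary mask (1 = tissue, 0 = airspace); it is a simplified stand-in for the morphometry software used in the protocol.

```python
import numpy as np

def mean_chord_length(tissue_mask, pixel_size_um, line_spacing=10):
    """Average airspace chord length along horizontal test lines of a binary mask."""
    chords = []
    for row in tissue_mask[::line_spacing]:
        run = 0
        for pixel_is_tissue in row:
            if pixel_is_tissue:
                if run:                     # close the current airspace run
                    chords.append(run)
                run = 0
            else:
                run += 1
        # An airspace run still open at the right border is discarded (length unknown).
    return np.mean(chords) * pixel_size_um if chords else float("nan")

# Tiny synthetic example: airspaces of width 3 and 5 pixels bounded by tissue.
mask = np.array([[1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1]] * 20)
print(f"mean chord length: {mean_chord_length(mask, pixel_size_um=2.0, line_spacing=5):.1f} um")
```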
Mindfulness in Motion (MIM): An Onsite Mindfulness Based Intervention (MBI) for Chronically High Stress Work Environments to Increase Resiliency and Work Engagement
Authors: Maryanna Klatt, Beth Steinberg, Anne-Marie Duchemin.
Institutions: The Ohio State University College of Medicine, Wexner Medical Center, The Ohio State University College of Medicine.
A pragmatic mindfulness intervention to benefit personnel working in chronically high-stress environments, delivered onsite during the workday, is timely and valuable to employee and employer alike. Mindfulness in Motion (MIM) is a Mindfulness Based Intervention (MBI) offered as a modified, less time-intensive method (compared to Mindfulness-Based Stress Reduction), delivered onsite during work, and intended to enable busy working adults to experience the benefits of mindfulness. It teaches mindful awareness principles, rehearses mindfulness as a group, emphasizes the use of gentle yoga stretches, and utilizes relaxing music in the background of both the group sessions and individual mindfulness practice. MIM is delivered in a group format, for 1 hr/week over 8 weeks. CDs and a DVD are provided to facilitate individual practice. The yoga movement is emphasized in the protocol to facilitate a quieting of the mind. The music is included so that participants associate the relaxed state experienced in the group session with their individual practice. To determine the intervention's feasibility/efficacy, we conducted a randomized wait-list controlled study in Intensive Care Units (ICUs). ICUs represent a high-stress work environment where personnel experience chronic exposure to catastrophic situations as they care for seriously injured/ill patients. Despite high levels of work-related stress, few interventions have been developed and delivered onsite for such environments. The intervention is delivered on site in the ICU, during work hours, with participants receiving time release to attend sessions. The intervention is well received, with a 97% retention rate. Work engagement and resiliency increase significantly in the intervention group, compared to the wait-list control group, while participant respiration rates decrease significantly pre-post in 6 of the 8 weekly sessions. Participants value institutional support, relaxing music, and the instructor as pivotal to program success. This provides evidence that MIM is feasible, well accepted, and can be effectively implemented in a chronically high-stress work environment.
Behavior, Issue 101, Mindfulness, resiliency, work-engagement, stress-reduction, workplace, non-reactivity, Intensive-care, chronic stress, work environment
Ultrasound Based Assessment of Coronary Artery Flow and Coronary Flow Reserve Using the Pressure Overload Model in Mice
Authors: Wei-Ting Chang, Sudeshna Fisch, Michael Chen, Yiling Qiu, Susan Cheng, Ronglih Liao.
Institutions: Brigham and Women's Hospital, Harvard Medical School, Chi-Mei Medical Center, Tainan.
Transthoracic Doppler echocardiography (TTDE) is a clinically useful, noninvasive tool for studying coronary artery flow velocity and coronary flow reserve (CFR) in humans. Reduced CFR is accompanied by marked intramyocardial and pericoronary fibrosis and is used as an indication of the severity of dysfunction. This study explores, step-by-step, the real-time changes measured in coronary flow velocity, CFR and the systolic to diastolic peak velocity (S/D) ratio in the setting of an aortic banding model in mice. Using a Doppler transthoracic imaging technique that yields reproducible and reliable data, the method assesses changes in flow in the septal coronary artery (SCA), for a period of over two weeks, in mice that previously underwent either aortic banding or thoracotomy. During imaging, hyperemia in all mice was induced by isoflurane, an anesthetic that increased coronary flow velocity when compared with resting flow. All images were acquired by a single imager. Two ratios, (1) CFR, the ratio between hyperemic and baseline flow velocities, and (2) systolic (S) to diastolic (D) flow, were determined using proprietary software and by two independent observers. Importantly, the observed changes in coronary flow preceded LV dysfunction, as evidenced by normal LV mass and fractional shortening (FS). The method was benchmarked against the current gold standard of coronary assessment, histopathology. The latter technique showed clear pathologic changes in the coronary artery in the form of peri-coronary fibrosis that correlated with the flow changes assessed by echocardiography. The study underscores the value of using a non-invasive technique to monitor coronary circulation in mouse hearts. The method minimizes redundant use of research animals and demonstrates that advanced ultrasound-based indices, such as CFR and S/D ratios, can serve as viable diagnostic tools in a variety of investigational protocols including drug studies and the study of genetically modified strains.
Medicine, Issue 98, Coronary flow reserve, Doppler echocardiography, non-invasive methodology, use of animals in research, pressure overload, aortic banding
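The two indices reported in this protocol are simple quotients of the measured velocities, CFR being defined above as hyperemic over baseline flow velocity; the sketch below shows the arithmetic with hypothetical Doppler values.

```python
def coronary_indices(baseline_velocity, hyperemic_velocity, systolic_peak, diastolic_peak):
    """Coronary flow reserve and systolic-to-diastolic peak velocity ratio."""
    return {
        "CFR": hyperemic_velocity / baseline_velocity,
        "S/D ratio": systolic_peak / diastolic_peak,
    }

# Hypothetical septal coronary artery velocities (mm/s).
indices = coronary_indices(baseline_velocity=210.0, hyperemic_velocity=540.0,
                           systolic_peak=160.0, diastolic_peak=430.0)
print(indices)   # e.g. CFR ~2.6, S/D ~0.37
```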
Dynamic Contrast Enhanced Magnetic Resonance Imaging of an Orthotopic Pancreatic Cancer Mouse Model
Authors: Hyunki Kim, Sharon Samuel, John W. Totenhagen, Marie Warren, Jeffrey C. Sellers, Donald J. Buchsbaum.
Institutions: University of Alabama at Birmingham, University of Alabama at Birmingham, University of Alabama at Birmingham.
Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has seen limited use for orthotopic pancreatic tumor xenografts due to severe respiratory motion artifact in the abdominal area. Orthotopic tumor models offer advantages over subcutaneous ones, because they can reflect the primary tumor microenvironment affecting blood supply, neovascularization, and tumor cell invasion. We have recently established a protocol for DCE-MRI of orthotopic pancreatic tumor xenografts in mouse models by securing tumors with an orthogonally bent plastic board to prevent motion transfer from the chest region during imaging. The pressure from this board was localized to the abdominal area and did not result in respiratory difficulty for the animals. This article demonstrates the detailed procedure of orthotopic pancreatic tumor modeling using small animals and DCE-MRI of the tumor xenografts. A method for quantifying pharmacokinetic parameters in DCE-MRI is also introduced. The procedure described in this article will assist investigators in applying DCE-MRI to orthotopic gastrointestinal cancer mouse models.
Medicine, Issue 98, Imaging, Cancer, Pancreas, Mouse, Xenograft, DCE-MRI
A Multicenter MRI Protocol for the Evaluation and Quantification of Deep Vein Thrombosis
Authors: Venkatesh Mani, Nadia Alie, Sarayu Ramachandran, Philip M. Robson, Cecilia Besa, Gregory Piazza, Michele Mercuri, Michael Grosso, Bachir Taouli, Samuel Z. Goldhaber, Zahi A. Fayad.
Institutions: Icahn School of Medicine at Mount Sinai, Brigham and Women's Hospital, Harvard Medical School, Daiichi Sankyo Pharma Development.
We evaluated a magnetic resonance venography (MRV) approach with gadofosveset to quantify total thrombus volume changes as the principal criterion for treatment efficacy in a multicenter randomized study comparing edoxaban monotherapy with a heparin/warfarin regimen for acute, symptomatic lower extremity deep vein thrombosis (DVT) treatment. We also used a direct thrombus imaging approach (DTHI, without the use of a contrast agent) to quantify fresh thrombus. We then sought to evaluate the reproducibility of the analysis methodology and the applicability of using 3D magnetic resonance venography and direct thrombus imaging for the quantification of DVT in a multicenter trial setting. From 10 randomly selected subjects participating in the edoxaban Thrombus Reduction Imaging Study (eTRIS), total thrombus volume in the entire lower extremity deep venous system was quantified bilaterally. Subjects were imaged using 3D-T1W gradient echo sequences before (direct thrombus imaging, DTHI) and 5 min after injection of 0.03 mmol/kg of gadofosveset trisodium (magnetic resonance venography, MRV). The margins of the DVT on corresponding axial, curved multi-planar reformatted images were manually delineated by two observers to obtain volumetric measurements of the venous thrombi. MRV was used to compute total DVT volume, whereas DTHI was used to compute the volume of fresh thrombus. Intraclass correlation (ICC) and Bland-Altman analysis were performed to compare inter- and intra-observer variability of the analysis. The ICC for inter- and intra-observer variability was excellent (0.99 and 0.98, p <0.001, respectively) with no bias on Bland-Altman analysis for MRV images. For DTHI images, the results were slightly lower (ICC = 0.88 and 0.95 respectively, p <0.001), with bias for inter-observer results on Bland-Altman plots. This study showed the feasibility of thrombus volume estimation in DVT using MRV with gadofosveset trisodium, with good intra- and inter-observer reproducibility in a multicenter setting.
Medicine, Issue 100, venous thrombosis, magnetic resonance imaging, magnetic resonance contrast enhanced venography, factor Xa inhibitor, gadofosveset, image analysis
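Bland-Altman agreement, mentioned above, reduces to the mean inter-observer difference (bias) and its 95% limits of agreement; the sketch below implements that calculation on hypothetical paired thrombus-volume measurements.

```python
import numpy as np

def bland_altman(observer1, observer2):
    """Bias and 95% limits of agreement for paired measurements."""
    diff = np.asarray(observer1, dtype=float) - np.asarray(observer2, dtype=float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Hypothetical thrombus volumes (ml) measured by two observers on the same 6 limbs.
obs1 = [12.1, 8.4, 15.0, 3.2, 22.7, 9.8]
obs2 = [11.8, 8.9, 14.2, 3.5, 23.1, 9.5]
bias, lower, upper = bland_altman(obs1, obs2)
print(f"bias = {bias:.2f} ml, limits of agreement = [{lower:.2f}, {upper:.2f}] ml")
```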
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no to more continuous variables like latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests1,2. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitative and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators in any level of a learning program. Nursing provides a unique challenge and opportunity as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Authors: Kathleen Ongena, Chandreyee Das, Janet L. Smith, Sónia Gil, Grace Johnston.
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses. Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed1 count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of error in hemocytometry include: uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation in hemocytometer filling rate2. To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, and flow cytometry1. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments1. The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection3 in a miniaturized format, using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated, cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics graphically displayed as a histogram.
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
Manual Muscle Testing: A Method of Measuring Extremity Muscle Strength Applied to Critically Ill Patients
Authors: Nancy Ciesla, Victor Dinglas, Eddy Fan, Michelle Kho, Jill Kuramoto, Dale Needham.
Institutions: Johns Hopkins University, Johns Hopkins Hospital , Johns Hopkins University, University of Maryland Medical System.
Survivors of acute respiratory distress syndrome (ARDS) and other causes of critical illness often have generalized weakness, reduced exercise tolerance, and persistent nerve and muscle impairments after hospital discharge.1-6 When performed using an explicit protocol with a structured approach to training and quality assurance of research staff, manual muscle testing (MMT) is a highly reliable method for assessing strength in patients following ARDS, using a standardized clinical examination, and can be completed with mechanically ventilated patients who can tolerate sitting upright in bed and are able to follow two-step commands.7,8 This video demonstrates a protocol for MMT, which has been taught to ≥43 research staff who have performed >800 assessments on >280 ARDS survivors. Modifications for the bedridden patient are included. Each muscle is tested with specific techniques for positioning, stabilization, resistance, and palpation for each score of the 6-point ordinal Medical Research Council scale.7,9-11 Three upper and three lower extremity muscles are graded in this protocol: shoulder abduction, elbow flexion, wrist extension, hip flexion, knee extension, and ankle dorsiflexion. These muscles were chosen based on the standard approach for evaluating patients for ICU-acquired weakness used in prior publications.1,2
Medicine, Issue 50, Muscle Strength, Critical illness, Intensive Care Units, Reproducibility of Results, Clinical Protocols.
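The six muscle groups above are graded bilaterally on the 0-5 MRC scale, and in the ICU-acquired weakness literature the grades are commonly combined into a sum score out of 60 with a cutoff below 48; that threshold and the example grades below are assumptions for illustration, not part of this specific protocol.

```python
# MRC grades (0-5) for the six muscle groups tested in this protocol, each side (hypothetical).
grades = {
    "shoulder abduction": (4, 4), "elbow flexion": (5, 4), "wrist extension": (4, 4),
    "hip flexion": (3, 4), "knee extension": (4, 4), "ankle dorsiflexion": (3, 3),
}

sum_score = sum(left + right for left, right in grades.values())   # maximum possible = 60
print(f"MRC sum score: {sum_score}/60")
# A cutoff of < 48 is commonly used in the ICU-acquired weakness literature (assumed here).
print("screens positive for ICU-acquired weakness" if sum_score < 48 else "above common cutoff")
```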
Methods for ECG Evaluation of Indicators of Cardiac Risk, and Susceptibility to Aconitine-induced Arrhythmias in Rats Following Status Epilepticus
Authors: Steven L. Bealer, Cameron S. Metcalf, Jason G. Little.
Institutions: University of Utah.
Lethal cardiac arrhythmias contribute to mortality in a number of pathological conditions. Several parameters obtained from a non-invasive, easily obtained electrocardiogram (ECG) are established, well-validated prognostic indicators of cardiac risk in patients suffering from a number of cardiomyopathies. Increased heart rate, decreased heart rate variability (HRV), and increased duration and variability of cardiac ventricular electrical activity (QT interval) are all indicative of enhanced cardiac risk 1-4. In animal models, it is valuable to compare these ECG-derived variables and susceptibility to experimentally induced arrhythmias. Intravenous infusion of the arrhythmogenic agent aconitine has been widely used to evaluate susceptibility to arrhythmias in a range of experimental conditions, including animal models of depression 5 and hypertension 6, following exercise 7 and exposure to air pollutants 8, as well as determination of the antiarrhythmic efficacy of pharmacological agents 9,10. It should be noted that QT dispersion in humans is a measure of QT interval variation across the full set of leads from a standard 12-lead ECG. Consequently, the measure of QT dispersion from the 2-lead ECG in the rat described in this protocol is different from that calculated from human ECG records. This represents a limitation in the translation of the data obtained from rodents to human clinical medicine. Status epilepticus (SE) is a single seizure or series of continuously recurring seizures lasting more than 30 min 11,12, and results in mortality in 20% of cases 13. Many individuals survive the SE, but die within 30 days 14,15. The mechanism(s) of this delayed mortality is not fully understood. It has been suggested that lethal ventricular arrhythmias contribute to many of these deaths 14-17. In addition to SE, patients experiencing spontaneously recurring seizures, i.e. epilepsy, are at risk of premature sudden and unexpected death associated with epilepsy (SUDEP) 18. As with SE, the precise mechanisms mediating SUDEP are not known. It has been proposed that ventricular abnormalities and resulting arrhythmias make a significant contribution 18-22. To investigate the mechanisms of seizure-related cardiac death, and the efficacy of cardioprotective therapies, it is necessary to obtain both ECG-derived indicators of risk and evaluate susceptibility to cardiac arrhythmias in animal models of seizure disorders 23-25. Here we describe methods for implanting ECG electrodes in the Sprague-Dawley laboratory rat (Rattus norvegicus) following SE, collection and analysis of ECG recordings, and induction of arrhythmias during IV infusion of aconitine. These procedures can be used to directly determine the relationships between ECG-derived measures of cardiac electrical activity and susceptibility to ventricular arrhythmias in rat models of seizure disorders, or any pathology associated with increased risk of sudden cardiac death.
Medicine, Issue 50, cardiac, seizure disorders, QTc, QTd, cardiac arrhythmias, rat
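The ECG-derived indicators named above (heart rate, HRV, QT duration and variability) can be computed from beat-by-beat R-R and QT interval series; the sketch below uses two standard HRV statistics and Bazett's rate correction on hypothetical rat intervals, as a generic illustration rather than the analysis software used in the protocol.

```python
import numpy as np

def ecg_indicators(rr_intervals_s, qt_intervals_s):
    """Heart rate, simple HRV statistics, and rate-corrected QT from interval series."""
    rr = np.asarray(rr_intervals_s, dtype=float)
    qt = np.asarray(qt_intervals_s, dtype=float)
    return {
        "heart rate (bpm)": 60.0 / rr.mean(),
        "SDNN (ms)": rr.std(ddof=1) * 1000,                      # overall HRV
        "RMSSD (ms)": np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000, # beat-to-beat HRV
        # Bazett's formula (one common correction; rodent work often uses species-specific variants).
        "QTc (ms)": (qt / np.sqrt(rr)).mean() * 1000,
        "QT variability (ms)": qt.std(ddof=1) * 1000,
    }

# Hypothetical rat intervals (seconds): ~400 bpm heart rate, ~60 ms QT.
rr = [0.150, 0.152, 0.148, 0.151, 0.149, 0.153]
qt = [0.061, 0.060, 0.062, 0.060, 0.061, 0.059]
for name, value in ecg_indicators(rr, qt).items():
    print(f"{name}: {value:.1f}")
```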
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
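The outcome categories quoted above (BG < 40, < 70, and > 180 mg/dL, with the target range in between) can be tallied directly from the recorded readings; the sketch below does so for a hypothetical series of readings.

```python
def glucose_summary(readings_mg_dl):
    """Fraction of readings in each category used in the paragraph above."""
    n = len(readings_mg_dl)
    return {
        "severe hypoglycemia (<40)": sum(bg < 40 for bg in readings_mg_dl) / n,
        "clinical hypoglycemia (<70)": sum(bg < 70 for bg in readings_mg_dl) / n,
        "target range (70-180)": sum(70 <= bg <= 180 for bg in readings_mg_dl) / n,
        "hyperglycemia (>180)": sum(bg > 180 for bg in readings_mg_dl) / n,
    }

# Hypothetical hourly blood glucose readings (mg/dL) from one ICU patient-day.
readings = [215, 190, 172, 150, 141, 128, 133, 119, 102, 97, 110, 88, 76, 69, 95, 112]
for category, fraction in glucose_summary(readings).items():
    print(f"{category}: {fraction:.0%}")
```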
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
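Categorical perception along a morph continuum is often quantified by fitting a sigmoid to the proportion of 'human' categorizations at each step of the DHL and locating the 50% boundary; the sketch below does this with a logistic function and hypothetical response proportions, as one common analysis choice rather than necessarily the authors'.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    """Psychometric function: probability of a 'human' response along the DHL."""
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

# Morph level (0 = fully artificial avatar, 1 = fully human) and hypothetical
# proportion of 'human' categorizations at each level.
morph_level = np.linspace(0, 1, 11)
p_human = np.array([0.02, 0.03, 0.05, 0.10, 0.22, 0.55, 0.83, 0.93, 0.97, 0.99, 1.00])

(boundary, slope), _ = curve_fit(logistic, morph_level, p_human, p0=[0.5, 10.0])
print(f"category boundary at morph level {boundary:.2f}, slope {slope:.1f}")
```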
A Research Method For Detecting Transient Myocardial Ischemia In Patients With Suspected Acute Coronary Syndrome Using Continuous ST-segment Analysis
Authors: Michele M. Pelter, Teri M. Kozik, Denise L. Loranger, Mary G. Carey.
Institutions: University of Nevada, Reno, St. Joseph's Medical Center, University of Rochester Medical Center .
Each year, an estimated 785,000 Americans will have a new coronary attack, or acute coronary syndrome (ACS). The pathophysiology of ACS involves rupture of an atherosclerotic plaque; hence, treatment is aimed at plaque stabilization in order to prevent cellular death. However, there is considerable debate among clinicians about which treatment pathway is best: an early invasive approach using percutaneous coronary intervention (PCI/stent) when indicated, or a conservative approach (i.e., medication only with PCI/stent if recurrent symptoms occur). There are three types of ACS: ST elevation myocardial infarction (STEMI), non-ST elevation MI (NSTEMI), and unstable angina (UA). Among the three types, NSTEMI/UA is nearly four times as common as STEMI. Treatment decisions for NSTEMI/UA are based largely on symptoms and resting or exercise electrocardiograms (ECG). However, because of the dynamic and unpredictable nature of the atherosclerotic plaque, these methods often fail to detect myocardial ischemia, because symptoms are unreliable and/or continuous ECG monitoring is not utilized. Continuous 12-lead ECG monitoring, which is both inexpensive and non-invasive, can identify transient episodes of myocardial ischemia, a precursor to MI, even when asymptomatic. However, continuous 12-lead ECG monitoring is not usual hospital practice; rather, only two leads are typically monitored. Information obtained with 12-lead ECG monitoring might provide useful information for deciding the best ACS treatment. Purpose. Therefore, using 12-lead ECG monitoring, the COMPARE Study (electroCardiographic evaluatiOn of ischeMia comParing invAsive to phaRmacological trEatment) was designed to assess the frequency and clinical consequences of transient myocardial ischemia in patients with NSTEMI/UA treated with either early invasive PCI/stent or managed conservatively (medications or PCI/stent following recurrent symptoms). The purpose of this manuscript is to describe the methodology used in the COMPARE Study. Method. Permission to proceed with this study was obtained from the Institutional Review Board of the hospital and the university. Research nurses identify hospitalized patients from the emergency department and telemetry unit with suspected ACS. Once consented, a 12-lead ECG Holter monitor is applied and remains in place during the patient's entire hospital stay. Patients are also maintained on the routine bedside ECG monitoring system per hospital protocol. Off-line ECG analysis is done using sophisticated software and careful human oversight.
Medicine, Issue 70, Anatomy, Physiology, Cardiology, Myocardial Ischemia, Cardiovascular Diseases, Health Occupations, Health Care, transient myocardial ischemia, Acute Coronary Syndrome, electrocardiogram, ST-segment monitoring, Holter monitoring, research methodology
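Research criteria for transient myocardial ischemia on continuous ECG commonly require ST-segment deviation of at least 0.1 mV lasting at least one minute; the exact COMPARE Study criteria are not given above, so the sketch below only illustrates that generic episode-detection logic on a hypothetical ST trend.

```python
def ischemic_episodes(st_deviation_mv, sample_interval_s, threshold_mv=0.1, min_duration_s=60):
    """Return (start_index, end_index) pairs where |ST deviation| exceeds the
    threshold for at least the minimum duration (generic criteria, assumed)."""
    episodes, start = [], None
    for i, st in enumerate(st_deviation_mv + [0.0]):          # sentinel closes a trailing run
        if abs(st) >= threshold_mv and start is None:
            start = i
        elif abs(st) < threshold_mv and start is not None:
            if (i - start) * sample_interval_s >= min_duration_s:
                episodes.append((start, i))
            start = None
    return episodes

# Hypothetical ST trend sampled every 15 s: one 75 s episode of 0.15 mV depression.
st_trend = [0.02] * 10 + [-0.15] * 5 + [0.03] * 10
print(ischemic_episodes(st_trend, sample_interval_s=15))
```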
Whole-Body Nanoparticle Aerosol Inhalation Exposures
Authors: Jinghai Yi, Bean T. Chen, Diane Schwegler-Berry, Dave Frazer, Vince Castranova, Carroll McBride, Travis L. Knuckles, Phoebe A. Stapleton, Valerie C. Minarchick, Timothy R. Nurkiewicz.
Institutions: West Virginia University , West Virginia University , National Institute for Occupational Safety and Health.
Inhalation is the most likely exposure route for individuals working with aerosolizable engineered nanomaterials (ENM). To properly perform nanoparticle inhalation toxicology studies, the aerosols in a chamber housing the experimental animals must have: 1) a steady concentration maintained at a desired level for the entire exposure period; 2) a homogenous composition free of contaminants; and 3) a stable size distribution with a geometric mean diameter < 200 nm and a geometric standard deviation σg < 2.5 5. The generation of aerosols containing nanoparticles is quite challenging because nanoparticles easily agglomerate. This is largely due to very strong inter-particle forces and the formation of large fractal structures tens or hundreds of microns in size 6, which are difficult to break up. Several common aerosol generators, including nebulizers, fluidized beds, Venturi aspirators and the Wright dust feed, were tested; however, none were able to produce nanoparticle aerosols which satisfy all criteria 5. A whole-body nanoparticle aerosol inhalation exposure system was fabricated, validated and utilized for nano-TiO2 inhalation toxicology studies. Its critical components are: 1) a novel nano-TiO2 aerosol generator; 2) a 0.5 m3 whole-body inhalation exposure chamber; and 3) a monitoring and control system. Nano-TiO2 aerosols generated from bulk dry nano-TiO2 powders (primary diameter of 21 nm, bulk density of 3.8 g/cm3) were delivered into the exposure chamber at a flow rate of 90 LPM (10.8 air changes/hr). Particle size distribution and mass concentration profiles were measured continuously with a scanning mobility particle sizer (SMPS) and an electrical low pressure impactor (ELPI). The aerosol mass concentration (C) was verified gravimetrically (mg/m3). The mass (M) of the collected particles was determined as M = Mpost - Mpre, where Mpre and Mpost are the masses of the filter before and after sampling (mg). The mass concentration was calculated as C = M/(Q*t), where Q is the sampling flow rate (m3/min) and t is the sampling time (min). The chamber pressure, temperature, relative humidity (RH), O2 and CO2 concentrations were monitored and controlled continuously. Nano-TiO2 aerosols collected on Nuclepore filters were analyzed with a scanning electron microscope (SEM) and energy dispersive X-ray (EDX) analysis. In summary, we report that the nanoparticle aerosols generated and delivered to our exposure chamber have: 1) steady mass concentration; 2) homogenous composition free of contaminants; and 3) stable particle size distributions with a count-median aerodynamic diameter of 157 nm during aerosol generation. This system reliably and repeatedly creates test atmospheres that simulate occupational, environmental or domestic ENM aerosol exposures.
Medicine, Issue 75, Physiology, Anatomy, Chemistry, Biomedical Engineering, Pharmacology, Titanium dioxide, engineered nanomaterials, nanoparticle, toxicology, inhalation exposure, aerosols, dry powder, animal model
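The gravimetric relations quoted above (M = Mpost - Mpre and C = M/(Q*t)) and the chamber air-change rate are simple arithmetic; the sketch below reproduces them, returning the stated 10.8 air changes/hr for 90 LPM into a 0.5 m3 chamber. The filter masses and sampling flow are hypothetical.

```python
def mass_concentration_mg_m3(m_pre_mg, m_post_mg, flow_lpm, minutes):
    """Gravimetric aerosol concentration C = M / (Q * t), with Q converted from L/min to m^3/min."""
    collected_mass_mg = m_post_mg - m_pre_mg          # M = Mpost - Mpre
    sampled_volume_m3 = (flow_lpm / 1000.0) * minutes # Q * t
    return collected_mass_mg / sampled_volume_m3

def air_changes_per_hour(flow_lpm, chamber_volume_m3):
    return (flow_lpm / 1000.0) * 60.0 / chamber_volume_m3

# Hypothetical filter weights for a 60 min sample drawn at 2 LPM from the chamber.
print(f"C = {mass_concentration_mg_m3(101.20, 102.04, flow_lpm=2.0, minutes=60):.1f} mg/m^3")
print(f"air changes = {air_changes_per_hour(90.0, 0.5):.1f} /hr")   # matches the 10.8/hr above
```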
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono, University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
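The UV-Vis confirmation step relies on the Beer-Lambert law, c = A / (ε · l); with the ε280 of 4,200 L/M/cm cited above and an assumed 1 cm path length, a measured absorbance converts to molar concentration as sketched below (the absorbance value is hypothetical).

```python
def triclosan_concentration_uM(absorbance_280, epsilon=4200.0, path_length_cm=1.0):
    """Beer-Lambert law: c = A / (epsilon * l), reported in micromolar."""
    molar = absorbance_280 / (epsilon * path_length_cm)
    return molar * 1e6

# Hypothetical reading from the dissolved triclosan stock.
print(f"{triclosan_concentration_uM(0.084):.1f} uM")   # ~20 uM
```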
Contextual and Cued Fear Conditioning Test Using a Video Analyzing System in Mice
Authors: Hirotaka Shoji, Keizo Takao, Satoko Hattori, Tsuyoshi Miyakawa.
Institutions: Fujita Health University, Core Research for Evolutionary Science and Technology (CREST), National Institutes of Natural Sciences.
The contextual and cued fear conditioning test is one of the behavioral tests that assesses the ability of mice to learn and remember an association between environmental cues and aversive experiences. In this test, mice are placed into a conditioning chamber and are given pairings of a conditioned stimulus (an auditory cue) and an aversive unconditioned stimulus (an electric footshock). After a delay, the mice are exposed to the same conditioning chamber and to a differently shaped chamber with presentation of the auditory cue. Freezing behavior during the test is measured as an index of fear memory. To analyze the behavior automatically, we have developed a video analyzing system using the ImageFZ application software program, which is available as a free download. Here, to show the details of our protocol, we demonstrate our procedure for the contextual and cued fear conditioning test in C57BL/6J mice using the ImageFZ system. In addition, we validated our protocol and the video analyzing system's performance by comparing freezing time measured by the ImageFZ system or a photobeam-based computer measurement system with that scored by a human observer. As shown in our representative results, the data obtained by ImageFZ were similar to those analyzed by a human observer, indicating that behavioral analysis using the ImageFZ system is highly reliable. The present movie article provides detailed information regarding the test procedures and will promote understanding of the experimental situation.
Behavior, Issue 85, Fear, Learning, Memory, ImageFZ program, Mouse, contextual fear, cued fear
Measuring Sensitivity to Viewpoint Change with and without Stereoscopic Cues
Authors: Jason Bell, Edwin Dickinson, David R. Badcock, Frederick A. A. Kingdom.
Institutions: Australian National University, University of Western Australia, McGill University.
The speed and accuracy of object recognition is compromised by a change in viewpoint; demonstrating that human observers are sensitive to this transformation. Here we discuss a novel method for simulating the appearance of an object that has undergone a rotation-in-depth, and include an exposition of the differences between perspective and orthographic projections. Next we describe a method by which human sensitivity to rotation-in-depth can be measured. Finally we discuss an apparatus for creating a vivid percept of a 3-dimensional rotation-in-depth; the Wheatstone Eight Mirror Stereoscope. By doing so, we reveal a means by which to evaluate the role of stereoscopic cues in the discrimination of viewpoint rotated shapes and objects.
Behavior, Issue 82, stereo, curvature, shape, viewpoint, 3D, object recognition, rotation-in-depth (RID)
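The difference between the two projections discussed above comes down to whether depth scales the image coordinates: orthographic projection discards the depth coordinate, whereas perspective projection divides by viewing distance plus depth. The sketch below shows both for a point undergoing a rotation-in-depth; the angle and viewing distance are arbitrary example values.

```python
import numpy as np

def rotate_in_depth(point_xyz, angle_deg):
    """Rotate a 3D point about the vertical (y) axis, i.e. a rotation-in-depth."""
    a = np.radians(angle_deg)
    x, y, z = point_xyz
    return np.array([x * np.cos(a) + z * np.sin(a), y, -x * np.sin(a) + z * np.cos(a)])

def orthographic(point_xyz):
    return point_xyz[:2]                      # depth is simply discarded

def perspective(point_xyz, viewing_distance=60.0):
    x, y, z = point_xyz
    scale = viewing_distance / (viewing_distance + z)
    return np.array([x * scale, y * scale])   # nearer points project larger

p = rotate_in_depth(np.array([10.0, 5.0, 0.0]), angle_deg=30)
print("rotated point:", np.round(p, 2))
print("orthographic projection:", np.round(orthographic(p), 2))
print("perspective projection: ", np.round(perspective(p), 2))
```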
Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall
Authors: Jack C. Bridge, Jonathan W. Aylott, Christopher E. Brightling, Amir M. Ghaemmaghami, Alan J. Knox, Mark P. Lewis, Felicity R.A.J. Rose, Gavin E. Morris.
Institutions: University of Nottingham, University of Nottingham, University of Nottingham, University of Nottingham, University of Leicester, Loughborough University.
Electrospinning is a highly adaptable method for producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics, including fiber diameter, alignment and porosity. By developing scaffolds with dimensions and topographies similar to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative of those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer polyethylene terephthalate (PET) by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell culture. Using a commercially available bioreactor system, we stably co-cultured the three cell types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods to be employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments.
Bioengineering, Issue 101, Electrospinning, 3D Cell Culture, Bioreactor, Airway, Tissue Engineering, In Vitro Model

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library contains no content relevant to the topic of a given abstract. In these cases, our algorithm still displays the most relevant videos it can find, which can sometimes result in matched videos with only a slight relation.