Pubmed Article
Resilience amongst Australian aboriginal youth: an ecological analysis of factors associated with psychosocial functioning in high and low family risk contexts.
PUBLISHED: 01-01-2014
We investigate whether the profile of factors protecting psychosocial functioning of high-risk-exposed Australian Aboriginal youth is the same as that promoting psychosocial functioning in low-risk-exposed youth. Data on 1,021 youth aged 12-17 years were drawn from the Western Australian Aboriginal Child Health Survey (WAACHS 2000-2002), a population-representative survey of the health and well-being of Aboriginal children, their families and community contexts. A person-centered approach was used to define four groups of youth cross-classified according to level of risk exposure (high/low) and psychosocial functioning (good/poor). Multivariate logistic regression was used to model the influence of individual, family, cultural and community factors on psychosocial outcomes separately for youth in high and low family-risk contexts. Results showed that in high family-risk contexts, prosocial friendship and low area-level socioeconomic status uniquely protected psychosocial functioning. However, in low family-risk contexts the perception of racism increased the likelihood of poor psychosocial functioning. For youth in both high- and low-risk contexts, higher self-esteem and self-regulation were associated with good psychosocial functioning, although the relationship was non-linear. These findings demonstrate that an empirical resilience framework of analysis can identify potent protective processes operating uniquely in contexts of high risk; this study is the first to describe distinct profiles of risk, protective and promotive factors among high- and low-risk-exposed Australian Aboriginal youth.
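The analysis above cross-classifies youth into risk strata and then fits separate multivariate logistic regressions within each stratum. As a rough illustration of that modeling step only (a plain-numpy sketch with hypothetical data, not the authors' survey-weighted WAACHS analysis):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Logistic regression by gradient ascent on the log-likelihood.
    A stand-in for the study's multivariate models; a real analysis of
    survey data would use sampling weights and a statistics package."""
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted P(good functioning)
        w += lr * X.T @ (y - p) / len(y)        # average gradient step
    return w                                     # [intercept, log-odds coefficients]

# Hypothetical example: one standardized protective factor (e.g. a
# self-esteem score) modeled within a single risk stratum.
rng = np.random.default_rng(1)
x = rng.normal(size=(2000, 1))                   # standardized predictor
logit = -0.5 + 1.5 * x[:, 0]                     # true log-odds
y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
coef = fit_logistic(x, y)                        # recovers roughly [-0.5, 1.5]
```

Exponentiating `coef[1]` gives the odds ratio for good psychosocial functioning per standard-deviation increase in the predictor.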
Authors: Derek J. Dean, Hans-Leo Teulings, Michael Caligiuri, Vijay A. Mittal.
Published: 11-21-2013
Growing evidence suggests that movement abnormalities are a core feature of psychosis. One marker of movement abnormality, dyskinesia, is a result of impaired neuromodulation of dopamine in fronto-striatal pathways. The traditional methods for identifying movement abnormalities include observer-based reports and force stability gauges. The drawbacks of these methods are long training times for raters, experimenter bias, large site differences in instrumental apparatus, and suboptimal reliability. Taking these drawbacks into account has guided the development of better-standardized and more efficient procedures to examine movement abnormalities using a digitizing tablet and handwriting analysis software. Individuals at risk for psychosis showed significantly more dysfluent pen movements (a proximal measure for dyskinesia) in a handwriting task. Handwriting kinematics offers a great advance over previous methods of assessing dyskinesia, which could clearly be beneficial for understanding the etiology of psychosis.
17 Related JoVE Articles!
A Multi-Modal Approach to Assessing Recovery in Youth Athletes Following Concussion
Authors: Nick Reed, James Murphy, Talia Dick, Katie Mah, Melissa Paniccia, Lee Verweel, Danielle Dobney, Michelle Keightley.
Institutions: Holland Bloorview Kids Rehabilitation Hospital, University of Toronto.
Concussion is one of the most commonly reported injuries amongst children and youth involved in sport participation. Following a concussion, youth can experience a range of short and long term neurobehavioral symptoms (somatic, cognitive and emotional/behavioral) that can have a significant impact on one’s participation in daily activities and pursuits of interest (e.g., school, sports, work, family/social life). Despite this, there remains a paucity of clinically driven research aimed specifically at exploring concussion within the youth sport population, and more specifically, multi-modal approaches to measuring recovery. This article provides an overview of a novel and multi-modal approach to measuring recovery amongst youth athletes following concussion. The presented approach involves the use of both pre-injury/baseline testing and post-injury/follow-up testing to assess performance across a wide variety of domains (post-concussion symptoms, cognition, balance, strength, agility/motor skills and resting state heart rate variability). The goal of this research is to gain a more objective and accurate understanding of recovery following concussion in youth athletes (ages 10-18 years). Findings from this research can help to inform the development and use of improved approaches to concussion management and rehabilitation specific to the youth sport community.
Medicine, Issue 91, concussion, children, youth, athletes, assessment, management, rehabilitation
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program is maximized through targeted instructor and volunteer training and a structured detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Authors: Barry M. Lester, Lynne Andreozzi-Fontaine, Edward Tronick, Rosemarie Bigsby.
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant. These are infants who are at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or factors such as poverty, poor nutrition or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor, not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video demonstrates the NNNS procedures, with examples of normal and abnormal performance, and illustrates the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
An Investigation of the Effects of Sports-related Concussion in Youth Using Functional Magnetic Resonance Imaging and the Head Impact Telemetry System
Authors: Michelle Keightley, Stephanie Green, Nick Reed, Sabrina Agnihotri, Amy Wilkinson, Nancy Lobaugh.
Institutions: University of Toronto, Bloorview Kids Rehab, Toronto Rehab, Sunnybrook Health Sciences Centre.
One of the most commonly reported injuries in children who participate in sports is concussion or mild traumatic brain injury (mTBI)1. Children and youth involved in organized sports such as competitive hockey are nearly six times more likely to suffer a severe concussion compared to children involved in other leisure physical activities2. While the most common cognitive sequelae of mTBI appear similar for children and adults, the recovery profile and breadth of consequences in children remain largely unknown2, as does the influence of pre-injury characteristics (e.g. gender) and injury details (e.g. magnitude and direction of impact) on long-term outcomes. Competitive sports, such as hockey, allow the rare opportunity to utilize a pre-post design to obtain pre-injury data on youth characteristics and functioning before concussion occurs, and to relate these to outcome following injury. Our primary goals are to refine pediatric concussion diagnosis and management based on research evidence that is specific to children and youth. To do this we use new, multi-modal and integrative approaches that will: (1) evaluate the immediate effects of head trauma in youth; (2) monitor the resolution of post-concussion symptoms (PCS) and cognitive performance during recovery; and (3) utilize new methods to verify brain injury and recovery. To achieve our goals, we have implemented the Head Impact Telemetry (HIT) System (Simbex; Lebanon, NH, USA). This system equips commercially available Easton S9 hockey helmets (Easton-Bell Sports; Van Nuys, CA, USA) with single-axis accelerometers designed to measure real-time head accelerations during contact sport participation3-5. By using telemetric technology, the magnitude of acceleration and location of all head impacts during sport participation can be objectively detected and recorded.
We also use functional magnetic resonance imaging (fMRI) to localize and assess changes in neural activity specifically in the medial temporal and frontal lobes during the performance of cognitive tasks, since those are the cerebral regions most sensitive to concussive head injury 6. Finally, we are acquiring structural imaging data sensitive to damage in brain white matter.
Medicine, Issue 47, Mild traumatic brain injury, concussion, fMRI, youth, Head Impact Telemetry System
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases the probabilities cannot be estimated, a condition described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective value that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject.
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
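The parametric estimation described above typically rests on a subjective-value model in which a power function captures risk attitude and a linear discount captures ambiguity attitude. A minimal sketch of one common parameterization (an assumption here; the article's exact functional form may differ):

```python
def subjective_value(v, p, A=0.0, alpha=1.0, beta=0.0):
    """Subjective value of a lottery paying amount v with winning
    probability p and ambiguity level A (the fraction of the probability
    range that is unknown). alpha < 1 indicates risk aversion; beta > 0
    indicates ambiguity aversion. One common parameterization, not
    necessarily the one used in this article."""
    return (p - beta * A / 2.0) * (v ** alpha)

# A risk- and ambiguity-averse subject (alpha=0.8, beta=0.6) values an
# ambiguous 50/50 lottery less than the matched purely risky lottery:
risky = subjective_value(v=20.0, p=0.5, A=0.0, alpha=0.8, beta=0.6)
ambiguous = subjective_value(v=20.0, p=0.5, A=0.5, alpha=0.8, beta=0.6)
```

Fitting alpha and beta per subject from observed choices would then proceed by maximum likelihood under a logistic choice rule, yielding the individual subjective values used as fMRI regressors.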
Eye Tracking Young Children with Autism
Authors: Noah J. Sasson, Jed T. Elison.
Institutions: University of Texas at Dallas, University of North Carolina at Chapel Hill.
The rise of accessible commercial eye-tracking systems has fueled a rapid increase in their use in psychological and psychiatric research. By providing a direct, detailed and objective measure of gaze behavior, eye-tracking has become a valuable tool for examining abnormal perceptual strategies in clinical populations and has been used to identify disorder-specific characteristics1, promote early identification2, and inform treatment3. In particular, investigators of autism spectrum disorders (ASD) have benefited from integrating eye-tracking into their research paradigms4-7. Eye-tracking has largely been used in these studies to reveal mechanisms underlying impaired task performance8 and abnormal brain functioning9, particularly during the processing of social information1,10-11. While older children and adults with ASD comprise the preponderance of research in this area, eye-tracking may be especially useful for studying young children with the disorder as it offers a non-invasive tool for assessing and quantifying early-emerging developmental abnormalities2,12-13. Implementing eye-tracking with young children with ASD, however, is associated with a number of unique challenges, including issues with compliant behavior resulting from specific task demands and disorder-related psychosocial considerations. In this protocol, we detail methodological considerations for optimizing research design, data acquisition and psychometric analysis while eye-tracking young children with ASD. The provided recommendations are also designed to be more broadly applicable for eye-tracking children with other developmental disabilities. By offering guidelines for best practices in these areas based upon lessons derived from our own work, we hope to help other investigators make sound research design and analysis choices while avoiding common pitfalls that can compromise data acquisition while eye-tracking young children with ASD or other developmental difficulties.
Medicine, Issue 61, eye tracking, autism, neurodevelopmental disorders, toddlers, perception, attention, social cognition
Combining Computer Game-Based Behavioural Experiments With High-Density EEG and Infrared Gaze Tracking
Authors: Keith J. Yoder, Matthew K. Belmonte.
Institutions: Cornell University, University of Chicago, Manesar, India.
Experimental paradigms are valuable insofar as the timing and other parameters of their stimuli are well specified and controlled, and insofar as they yield data relevant to the cognitive processing that occurs under ecologically valid conditions. These two goals often are at odds, since well controlled stimuli often are too repetitive to sustain subjects' motivation. Studies employing electroencephalography (EEG) are often especially sensitive to this dilemma between ecological validity and experimental control: attaining sufficient signal-to-noise in physiological averages demands large numbers of repeated trials within lengthy recording sessions, limiting the subject pool to individuals with the ability and patience to perform a set task over and over again. This constraint severely limits researchers' ability to investigate younger populations as well as clinical populations associated with heightened anxiety or attentional abnormalities. Even adult, non-clinical subjects may not be able to achieve their typical levels of performance or cognitive engagement: an unmotivated subject for whom an experimental task is little more than a chore is not the same, behaviourally, cognitively, or neurally, as a subject who is intrinsically motivated and engaged with the task. A growing body of literature demonstrates that embedding experiments within video games may provide a way between the horns of this dilemma between experimental control and ecological validity. The narrative of a game provides a more realistic context in which tasks occur, enhancing their ecological validity (Chaytor & Schmitter-Edgecombe, 2003). Moreover, this context provides motivation to complete tasks. In our game, subjects perform various missions to collect resources, fend off pirates, intercept communications or facilitate diplomatic relations. 
In so doing, they also perform an array of cognitive tasks, including a Posner attention-shifting paradigm (Posner, 1980), a go/no-go test of motor inhibition, a psychophysical motion coherence threshold task, the Embedded Figures Test (Witkin, 1950, 1954) and a theory-of-mind (Wimmer & Perner, 1983) task. The game software automatically registers game stimuli and subjects' actions and responses in a log file, and sends event codes to synchronise with physiological data recorders. Thus the game can be combined with physiological measures such as EEG or fMRI, and with moment-to-moment tracking of gaze. Gaze tracking can verify subjects' compliance with behavioural tasks (e.g. fixation) and overt attention to experimental stimuli, and can also index physiological arousal as reflected in pupil dilation (Bradley et al., 2008). At great enough sampling frequencies, gaze tracking may also help assess covert attention as reflected in microsaccades - eye movements that are too small to foveate a new object, but are as rapid in onset and have the same relationship between angular distance and peak velocity as do saccades that traverse greater distances. The distribution of directions of microsaccades correlates with the (otherwise) covert direction of attention (Hafed & Clark, 2002).
Neuroscience, Issue 46, High-density EEG, ERP, ICA, gaze tracking, computer game, ecological validity
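Microsaccade detection of the kind described above is typically done with a velocity-threshold algorithm. A minimal sketch following the widely used median-based elliptic threshold of Engbert & Kliegl (2003) (an assumed choice; the authors' exact pipeline is not specified):

```python
import numpy as np

def detect_microsaccade_samples(x, y, fs, lam=6.0):
    """Flag gaze samples whose 2-D velocity exceeds an elliptic,
    median-based threshold (Engbert & Kliegl, 2003). x, y are gaze
    positions in degrees; fs is the sampling rate in Hz. A sketch:
    production pipelines also enforce a minimum event duration and
    binocular agreement."""
    vx = np.gradient(x) * fs                  # horizontal velocity, deg/s
    vy = np.gradient(y) * fs                  # vertical velocity, deg/s
    # robust (median-based) estimate of velocity spread per axis
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    # a sample is flagged when it exceeds the ellipse at lam robust SDs
    return (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
```

The median-based spread estimate keeps the threshold insensitive to the saccades themselves, so the same lam works across subjects with different fixation noise.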
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically 50 mM sodium acetate or 50 mM Tris) is chosen for its acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions, thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting.
Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
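The assay described above ultimately converts net fluorescence into a rate per unit dry soil. A sketch of the standard arithmetic (variable names and example values are illustrative, not taken from the article):

```python
def enzyme_activity(assay_fluor, homogenate_ctrl, substrate_ctrl,
                    quench_coef, emission_coef,
                    buffer_vol_ml, aliquot_vol_ml,
                    incubation_h, soil_dry_g):
    """Potential extracellular enzyme activity in nmol g^-1 dry soil h^-1
    for a fluorescently labeled (e.g. MUB-linked) substrate. Follows the
    commonly used fluorometric calculation; exact controls and
    coefficients depend on each lab's plate layout."""
    # correct sample fluorescence for quenching and substrate background
    net = (assay_fluor - homogenate_ctrl) / quench_coef - substrate_ctrl
    # scale by assay volumes, incubation time, and dry soil mass
    return (net * buffer_vol_ml) / (emission_coef * aliquot_vol_ml
                                    * incubation_h * soil_dry_g)

# Illustrative numbers only:
activity = enzyme_activity(5000.0, 500.0, 200.0, quench_coef=0.9,
                           emission_coef=250.0, buffer_vol_ml=91.0,
                           aliquot_vol_ml=0.2, incubation_h=3.0,
                           soil_dry_g=1.0)
```

The quench coefficient corrects for fluorescence absorbed by the soil slurry itself, and the emission coefficient converts fluorescence units to nmol of liberated dye, both determined from standard curves run on the same plate.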
Linking Predation Risk, Herbivore Physiological Stress and Microbial Decomposition of Plant Litter
Authors: Oswald J. Schmitz, Mark A. Bradford, Michael S. Strickland, Dror Hawlena.
Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.
The quantity and quality of detritus entering the soil determines the rate of decomposition by microbial communities as well as the rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, and so it is assumed that decomposition is only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey, which in turn alters soil microbial functioning when the herbivore carcasses are decomposed6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9, because stress from predation risk increases herbivore basal energy demands, which in nutrient-limited systems forces herbivores to shift their consumption from N-rich resources that support growth and reproduction to C-rich carbohydrate resources that support heightened metabolism6. Herbivores have limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer quality resources for the soil microbial pool, likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11. We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to: induce stress in herbivores from predation risk; measure those stress responses; and measure the consequences on microbial decomposition.
We use insights from a model grassland ecosystem comprising a hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.
Environmental Sciences, Issue 73, Microbiology, Plant Biology, Entomology, Organisms, Investigative Techniques, Biological Phenomena, Chemical Phenomena, Metabolic Phenomena, Microbiological Phenomena, Earth Resources and Remote Sensing, Life Sciences (General), Litter Decomposition, Ecological Stoichiometry, Physiological Stress and Ecosystem Function, Predation Risk, Soil Respiration, Carbon Sequestration, Soil Science, respiration, spider, grasshopper, model system
An Experimental Paradigm for the Prediction of Post-Operative Pain (PPOP)
Authors: Ruth Landau, John C. Kraft, Lisa Y. Flint, Brendan Carvalho, Philippe Richebé, Monica Cardoso, Patricia Lavand'homme, Michal Granot, David Yarnitsky, Alex Cahana.
Institutions: University of Washington School of Medicine.
Many women undergo cesarean delivery without problems; however, some experience significant pain after cesarean section. Pain is associated with negative short-term and long-term effects on the mother. Prior to women undergoing surgery, can we predict who is at risk for developing significant postoperative pain, and potentially prevent or minimize its negative consequences? These are the fundamental questions that a team from the University of Washington, Stanford University, the Catholic University in Brussels, Belgium, Santa Joana Women's Hospital in São Paulo, Brazil, and Rambam Medical Center in Israel is currently evaluating in an international research collaboration. The ultimate goal of this project is to provide optimal pain relief during and after cesarean section by offering individualized anesthetic care to women who appear to be more 'susceptible' to pain after surgery. A significant number of women experience moderate or severe acute post-partum pain after vaginal and cesarean deliveries.1 Furthermore, 10-15% of women suffer chronic persistent pain after cesarean section.2 With the constant increase in cesarean rates in the US3 and the already high rate in Brazil, this is bound to create a significant public health problem. When questioning women's fears and expectations of cesarean section, pain during and after it is their greatest concern.4 Individual variability in severity of pain after vaginal or operative delivery is influenced by multiple factors including sensitivity to pain, psychological factors, age, and genetics. The unique birth experience leads to unpredictable requirements for analgesics, from 'none at all' to 'very high' doses of pain medication. Pain after cesarean section is an excellent model to study post-operative pain because the surgery is performed on otherwise young and healthy women. It is therefore recommended to attenuate pain during the acute phase, because poorly controlled acute pain may lead to chronic pain disorders.
The impact of developing persistent pain is immense, since it may impair not only the ability of women to care for their child in the immediate postpartum period, but also their own well-being for a long period of time. In a series of projects, an international research network is currently investigating the effect of pregnancy on pain modulation and ways to predict who will suffer acute severe pain and potentially chronic pain, using simple pain tests and questionnaires in combination with genetic analysis. A relatively recent approach to investigating pain modulation is the psychophysical measure of Diffuse Noxious Inhibitory Control (DNIC). This pain-modulating process is the neurophysiological basis for the well-known phenomenon of 'pain inhibits pain' from remote areas of the body. The DNIC paradigm has recently evolved into a simple clinical test and has been shown to be a predictor of post-operative pain.5 Since pregnancy is associated with decreased pain sensitivity and/or enhanced processes of pain modulation, using tests that investigate pain modulation should provide a better understanding of the pathways involved in pregnancy-induced analgesia and may help predict pain outcomes during labor and delivery. For women delivering by cesarean section, a DNIC test performed prior to surgery, along with psychosocial questionnaires and genetic tests, should enable one to identify women prone to suffer severe post-cesarean pain and persistent pain. These clinical tests should allow anesthesiologists to offer not only personalized medicine to women, with the promise of improved well-being and satisfaction, but also a reduction in the overall cost of perioperative and long-term care due to pain and suffering. On a larger scale, these tests that explore pain modulation may become bedside screening tests to predict the development of pain disorders following surgery.
JoVE Medicine, Issue 35, diffuse noxious inhibitory control, DNIC, temporal summation, TS, psychophysical testing, endogenous analgesia, pain modulation, pregnancy-induced analgesia, cesarean section, post-operative pain, prediction
Dried Blood Spot Collection of Health Biomarkers to Maximize Participation in Population Studies
Authors: Michael W. Ostler, James H. Porter, Orfeu M. Buxton.
Institutions: Harvard School of Public Health, Brigham and Women's Hospital, Harvard Medical School, Pennsylvania State University.
Biomarkers are directly measured biological indicators of disease, health, exposures, or other biological information. In population and social sciences, biomarkers need to be easy to obtain, transport, and analyze. Dried Blood Spots meet this need, and can be collected in the field with high response rates. These elements are particularly important in longitudinal study designs, including interventions, where avoiding attrition is critical and high response rates improve the interpretation of results. Dried Blood Spot sample collection is simple, quick, relatively painless, less invasive than venipuncture, and has minimal field storage requirements (i.e. samples do not need to be immediately frozen and can be stored for a long period of time in a stable freezer environment before assay). The samples can be analyzed for a variety of different analytes, including cholesterol, C-reactive protein, glycosylated hemoglobin, and numerous cytokines, as well as provide genetic material. DBS collection is depicted as employed in several recent studies.
Medicine, Issue 83, dried blood spots (DBS), Biomarkers, cardiometabolic risk, Inflammation, standard precautions, blood collection
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
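As an illustration of working with the kind of time-stamped event records described above, the sketch below summarizes head entries and pellet deliveries from a toy record. The event codes and record format here are hypothetical stand-ins, not the authors' actual MATLAB data structure.

```python
# Minimal sketch of analyzing a time-stamped behavioral event record.
# Event codes and the (timestamp, code) record format are invented for
# illustration; the published system stores events in a MATLAB structure.

from statistics import mean

HEAD_ENTRY = "head_entry"        # IR beam break at a hopper
PELLET = "pellet_delivered"      # feeder dispensed a pellet

def summarize(events):
    """Count head entries and pellets; compute mean inter-entry interval (s)."""
    entries = [t for t, code in events if code == HEAD_ENTRY]
    intervals = [b - a for a, b in zip(entries, entries[1:])]
    return {
        "head_entries": len(entries),
        "mean_inter_entry_interval": mean(intervals) if intervals else None,
        "pellets": sum(1 for _, code in events if code == PELLET),
    }

# A toy record: timestamps in seconds from session start.
events = [
    (0.0, HEAD_ENTRY),
    (2.5, PELLET),
    (3.0, HEAD_ENTRY),
    (9.0, HEAD_ENTRY),
]
print(summarize(events))
# → {'head_entries': 3, 'mean_inter_entry_interval': 4.5, 'pellets': 1}
```

Summaries like this, computed each time the data are harvested, are what make daily visualization of each animal's progress possible.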
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings.3-6 One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP).7 Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Development of a Virtual Reality Assessment of Everyday Living Skills
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia, and these impairments predict poor long-term psychosocial outcomes. Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real-world functioning. Current data do not support the recommendation of any single instrument for measurement of functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive, gaming-based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Authors: Noah S. Philip, S. Louisa Carpenter, Lawrence H. Sweet.
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD). Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices and medial temporal regions. Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
The Trier Social Stress Test Protocol for Inducing Psychological Stress
Authors: Melissa A. Birkett.
Institutions: Northern Arizona University.
This article demonstrates a psychological stress protocol for use in a laboratory setting. Protocols that allow researchers to study the biological pathways of the stress response in health and disease are fundamental to the progress of research in stress and anxiety.1 Although numerous protocols exist for inducing a stress response in the laboratory, many neglect to provide a naturalistic context or to incorporate aspects of social and psychological stress. Of psychological stress protocols, meta-analysis suggests that the Trier Social Stress Test (TSST) is the most useful and appropriate standardized protocol for studies of stress hormone reactivity.2 In the original description of the TSST, researchers sought to design and evaluate a procedure capable of inducing a reliable stress response in the majority of healthy volunteers.3 These researchers found elevations in heart rate, blood pressure and several endocrine stress markers in response to the TSST (a psychological stressor) compared to a saline injection (a physical stressor).3 Although the TSST has been modified to meet the needs of various research groups, it generally consists of a waiting period upon arrival, anticipatory speech preparation, speech performance, and verbal arithmetic performance periods, followed by one or more recovery periods. The TSST requires participants to prepare and deliver a speech, and verbally respond to a challenging arithmetic problem, in the presence of a socially evaluative audience.3 Social evaluation and uncontrollability have been identified as key components of stress induction by the TSST.4 In use for over a decade, the TSST is designed to systematically induce a stress response in order to measure differences in reactivity, anxiety and activation of the hypothalamic-pituitary-adrenal (HPA) or sympathetic-adrenal-medullary (SAM) axes during the task.1 Researchers generally assess changes in self-reported anxiety, physiological measures (e.g. heart rate), and/or neuroendocrine indices (e.g. the stress hormone cortisol) in response to the TSST. Many investigators have adopted salivary sampling for stress markers such as cortisol and alpha-amylase (a marker of autonomic nervous system activation) as an alternative to blood sampling, to reduce the confounding stress of blood-collection techniques. In addition to changes experienced by an individual completing the TSST, researchers can compare changes between different treatment groups (e.g. clinical versus healthy control samples) or the effectiveness of stress-reducing interventions.1
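Serial salivary cortisol samples of the kind described above are commonly summarized as an area under the curve. The sketch below computes an area-under-the-curve-with-respect-to-increase via the trapezoidal rule; the sample times and cortisol values are invented for illustration and are not data from this protocol.

```python
# Illustrative summary of cortisol reactivity across a stress session.
# Computes trapezoidal area under the curve above the baseline (first)
# sample; all numbers below are invented for illustration.

def auc_increase(times_min, cortisol):
    """Trapezoidal AUC of cortisol above the baseline sample."""
    baseline = cortisol[0]
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times_min, cortisol),
                                  zip(times_min[1:], cortisol[1:])):
        # Trapezoid area for this sampling interval, baseline-corrected.
        auc += (t1 - t0) * ((c0 - baseline) + (c1 - baseline)) / 2.0
    return auc

# Samples (nmol/L) at baseline, post-stressor, and two recovery points.
times = [0, 20, 35, 50]       # minutes from arrival
values = [10.0, 18.0, 14.0, 11.0]
print(auc_increase(times, values))
# → 207.5
```

A summary statistic like this lets reactivity be compared between treatment groups, or before and after a stress-reducing intervention.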
Medicine, Issue 56, Stress, anxiety, laboratory stressor, cortisol, physiological response, psychological stressor
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Authors: Laura S. Lorenz, Jon A. Chilingerian.
Institutions: Brandeis University, Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patient's values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step by step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time. 
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library contains no content relevant to the topic of a given abstract. In these cases, our algorithms display the most closely related videos available, which can sometimes result in matches with only a slight relation to the abstract.
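One common way to match abstract text against a library of video descriptions is TF-IDF weighting with cosine similarity. The sketch below is a generic illustration of that technique, not necessarily the algorithm JoVE Visualize actually uses; the documents and the simple whitespace tokenizer are invented for the example.

```python
# Illustrative TF-IDF / cosine-similarity matching of an abstract to
# video descriptions. Not JoVE's actual matching algorithm.

import math
from collections import Counter

def tfidf_vectors(docs):
    """Build a sparse TF-IDF vector (dict of term -> weight) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    norm = (math.sqrt(sum(w * w for w in a.values()))
            * math.sqrt(sum(w * w for w in b.values())))
    return dot / norm if norm else 0.0

videos = ["handwriting kinematics movement analysis",
          "salivary cortisol stress response"]
abstract = "movement abnormalities measured by handwriting kinematics"

vecs = tfidf_vectors(videos + [abstract])
scores = [cosine(vecs[-1], v) for v in vecs[:2]]
print(scores.index(max(scores)))  # index of the best-matching video
# → 0
```

Ranking every video by such a score and keeping the top 10 to 30 would yield a related-videos list of the kind described above.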