Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD).
Although specific DMN functions remain under investigation, the network is generally thought to be involved in introspection and self-referential processing. In healthy individuals it exhibits its greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks (e.g., working memory). The network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices, and medial temporal regions.
Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
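The resting-state connectivity evaluation described above is commonly quantified as the pairwise correlation between regional BOLD time series. A minimal sketch, assuming pre-extracted and denoised ROI time courses (all names and the toy data are illustrative, not part of the protocol):

```python
import numpy as np

def connectivity_matrix(roi_timeseries):
    """Pairwise Pearson correlation between ROI time courses.

    roi_timeseries: 2-D array of shape (n_rois, n_timepoints), assumed
    already motion-corrected and nuisance-regressed.
    """
    return np.corrcoef(roi_timeseries)

# Toy example: two correlated "DMN" nodes and one unrelated region.
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)
ts = np.vstack([
    shared + 0.1 * rng.standard_normal(200),   # e.g., mPFC node
    shared + 0.1 * rng.standard_normal(200),   # e.g., PCC/precuneus node
    rng.standard_normal(200),                  # unrelated control region
])
cm = connectivity_matrix(ts)
```

The two coupled nodes show a high off-diagonal correlation, while the control region does not; in practice such matrices are computed per subject and compared across groups or related to symptom severity.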
Assessment of Age-related Changes in Cognitive Functions Using EmoCogMeter, a Novel Tablet-computer Based Approach
Institutions: Freie Universität Berlin, Charité Berlin, Freie Universität Berlin, Psychiatric University Hospital Zurich.
The main goal of this study was to assess the usability of a tablet-computer-based application (EmoCogMeter) for investigating the effects of age on cognitive functions across the lifespan in a sample of 378 healthy subjects (age range 18-89 years). Consistent with previous findings, we found an age-related cognitive decline across a wide range of neuropsychological domains (memory, attention, executive functions), thereby demonstrating the usability of our tablet-based application. Regardless of prior computer experience, subjects of all age groups were able to perform the tasks without instruction or feedback from an experimenter. Increased motivation and compliance proved beneficial for task performance, thereby potentially increasing the validity of the results. These promising findings underline the clinical and practical potential of a tablet-based application for the detection and monitoring of cognitive dysfunction.
Behavior, Issue 84, Neuropsychological Testing, cognitive decline, age, tablet-computer, memory, attention, executive functions
Combining Computer Game-Based Behavioural Experiments With High-Density EEG and Infrared Gaze Tracking
Institutions: Cornell University, University of Chicago, Manesar, India.
Experimental paradigms are valuable insofar as the timing and other parameters of their stimuli are well specified and controlled, and insofar as they yield data relevant to the cognitive processing that occurs under ecologically valid conditions. These two goals are often at odds, since well-controlled stimuli are often too repetitive to sustain subjects' motivation. Studies employing electroencephalography (EEG) are especially sensitive to this dilemma between ecological validity and experimental control: attaining a sufficient signal-to-noise ratio in physiological averages demands large numbers of repeated trials within lengthy recording sessions, limiting the subject pool to individuals with the ability and patience to perform a set task over and over again. This constraint severely limits researchers' ability to investigate younger populations as well as clinical populations associated with heightened anxiety or attentional abnormalities. Even adult, non-clinical subjects may not achieve their typical levels of performance or cognitive engagement: an unmotivated subject for whom an experimental task is little more than a chore is not the same, behaviourally, cognitively, or neurally, as a subject who is intrinsically motivated and engaged with the task. A growing body of literature demonstrates that embedding experiments within video games may provide a way between the horns of this dilemma. The narrative of a game provides a more realistic context in which tasks occur, enhancing their ecological validity (Chaytor & Schmitter-Edgecombe, 2003). Moreover, this context provides motivation to complete tasks. In our game, subjects perform various missions to collect resources, fend off pirates, intercept communications or facilitate diplomatic relations.
In so doing, they also perform an array of cognitive tasks, including a Posner attention-shifting paradigm (Posner, 1980), a go/no-go test of motor inhibition, a psychophysical motion coherence threshold task, the Embedded Figures Test (Witkin, 1950, 1954) and a theory-of-mind task (Wimmer & Perner, 1983). The game software automatically registers game stimuli and subjects' actions and responses in a log file, and sends event codes to synchronise with physiological data recorders. Thus the game can be combined with physiological measures such as EEG or fMRI, and with moment-to-moment tracking of gaze. Gaze tracking can verify subjects' compliance with behavioural tasks (e.g. fixation) and overt attention to experimental stimuli, and can also index physiological arousal as reflected in pupil dilation (Bradley et al., 2008). At great enough sampling frequencies, gaze tracking may also help assess covert attention as reflected in microsaccades - eye movements that are too small to foveate a new object, but are as rapid in onset and have the same relationship between angular distance and peak velocity as saccades that traverse greater distances. The distribution of directions of microsaccades correlates with the (otherwise) covert direction of attention (Hafed & Clark, 2002).
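Saccadic events of the kind discussed above are often flagged in a gaze record with a simple velocity threshold. A minimal sketch, loosely in the spirit of standard velocity-based detection; the threshold, sampling rate, and names are illustrative assumptions, not part of the system described here:

```python
import numpy as np

def detect_saccade_samples(x_deg, fs_hz, vel_thresh_deg_s=30.0):
    """Flag samples whose instantaneous velocity exceeds a threshold.

    x_deg: 1-D horizontal gaze position in degrees; fs_hz: sampling rate.
    Returns (velocity in deg/s, boolean mask of above-threshold samples).
    """
    vel = np.gradient(x_deg) * fs_hz          # finite-difference velocity
    return vel, np.abs(vel) > vel_thresh_deg_s

# Toy trace sampled at 1 kHz: fixation, a ~0.3 deg microsaccade, fixation.
x = np.concatenate([np.zeros(100), np.linspace(0.0, 0.3, 5), np.full(100, 0.3)])
vel, mask = detect_saccade_samples(x, 1000.0)
```

Plotting peak velocity against amplitude for many detected events would reproduce the main-sequence relationship the text refers to.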
Neuroscience, Issue 46, High-density EEG, ERP, ICA, gaze tracking, computer game, ecological validity
Assessment of Social Interaction Behaviors
Institutions: Mount Sinai Hospital, Mount Sinai Hospital, University of Toronto, University of Toronto, University of Toronto.
Social interactions are a fundamental and adaptive component of the biology of numerous species. Social recognition is critical for the structure and stability of the networks and relationships that define societies. For animals, such as mice, recognition of conspecifics may be important for maintaining social hierarchy and for mate choice [1].
A variety of neuropsychiatric disorders are characterized by disruptions in social behavior and social recognition, including depression, autism spectrum disorders, bipolar disorders, obsessive-compulsive disorders, and schizophrenia. Studies of humans as well as animal models (e.g., Drosophila melanogaster, Caenorhabditis elegans, Mus musculus, Rattus norvegicus) have identified genes involved in the regulation of social behavior [2]. To assess sociability in animal models, several behavioral tests have been developed (reviewed in [3]). Integrative research using animal models and appropriate tests for social behavior may lead to the development of improved treatments for social psychopathologies.
The three-chamber paradigm test, known as Crawley's sociability and preference for social novelty protocol, has been successfully employed to study social affiliation and social memory in several inbred and mutant mouse lines (e.g., [4-7]). The main principle of this test is based on the free choice of a subject mouse to spend time in any of the three compartments of the box during two experimental sessions, including indirect contact with one or two mice with which it is unfamiliar. To quantitate the social tendencies of the experimental mouse, the main tasks are to measure a) the time spent with a novel conspecific and b) preference for a novel vs. a familiar conspecific. Thus, the experimental design of this test allows evaluation of two critical but distinguishable aspects of social behavior: social affiliation/motivation, and social memory and novelty. "Sociability" in this case is defined as the propensity to spend time with another mouse, as compared to time spent alone in an identical but empty chamber [7]. "Preference for social novelty" is defined as the propensity to spend time with a previously unencountered mouse rather than with a familiar mouse [7]. This test provides robust results, which must then be carefully analyzed, interpreted, and supported/confirmed by alternative sociability tests. In addition to specific applications, Crawley's sociability test can be included as an important component of a general behavioral screen of mutant mice.
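The two session scores described above reduce to simple time-in-chamber comparisons. A minimal sketch of the derived indices (function and variable names are hypothetical, not from the protocol):

```python
def preference_index(t_target, t_other):
    """Normalized preference: +1 = all time with target, -1 = none.

    Session 1 (sociability): target = chamber with the novel mouse,
    other = identical empty chamber. Session 2 (preference for social
    novelty): target = novel mouse, other = familiar mouse.
    """
    total = t_target + t_other
    return (t_target - t_other) / total if total else 0.0

# Example: of a 300 s session, 180 s with the novel mouse vs 60 s in
# the empty chamber (remainder spent in the center compartment).
sociability = preference_index(180.0, 60.0)
```

An index near zero indicates no preference; strongly positive values in both sessions are the expected wild-type pattern.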
Neuroscience, Issue 48, Mice, behavioral test, phenotyping, social interaction
Quantification of Orofacial Phenotypes in Xenopus
Institutions: Virginia Commonwealth University.
Xenopus has become an important tool for dissecting the mechanisms governing craniofacial development and defects. A method to quantify orofacial development will allow for more rigorous analysis of orofacial phenotypes upon perturbation with reagents that genetically or molecularly manipulate gene expression or protein function. Using two-dimensional images of the embryonic heads, traditional size dimensions, such as orofacial width, height, and area, are measured. In addition, a roundness measure of the embryonic mouth opening is used to describe the shape of the mouth. Geometric morphometrics of these two-dimensional images is also performed to provide a more sophisticated view of changes in the shape of the orofacial region. Landmarks are assigned to specific points in the orofacial region and coordinates are created. A principal component analysis is used to reduce landmark coordinates to principal components that then discriminate the treatment groups. These results are displayed as a scatter plot in which individuals with similar orofacial shapes cluster together. It is also useful to perform a discriminant function analysis, which statistically compares the positions of the landmarks between two treatment groups. This analysis is displayed on a transformation grid where changes in landmark position are viewed as vectors. A grid is superimposed on these vectors so that a warping pattern is displayed to show where significant landmark positions have changed. Shape changes in the discriminant function analysis are based on a statistical measure, and therefore can be evaluated by a p-value. This analysis is simple and accessible, requiring only a stereoscope and freeware software, and thus will be a valuable research and teaching resource.
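The principal component step described above can be sketched in a few lines of NumPy, assuming landmark coordinates have already been digitized and Procrustes-aligned. This is an illustrative sketch only, not the freeware pipeline the protocol uses, and all names are hypothetical:

```python
import numpy as np

def landmark_pca(landmarks):
    """PCA of 2-D landmark configurations.

    landmarks: array of shape (n_specimens, n_landmarks, 2), assumed
    Procrustes-aligned. Returns (specimen scores, explained-variance
    ratios per principal component).
    """
    X = landmarks.reshape(len(landmarks), -1)     # flatten (x, y) pairs
    Xc = X - X.mean(axis=0)                       # center on mean shape
    u, s, vt = np.linalg.svd(Xc, full_matrices=False)
    scores = u * s                                # PC scores per specimen
    var_ratio = s**2 / np.sum(s**2)
    return scores, var_ratio

rng = np.random.default_rng(1)
lm = rng.standard_normal((12, 5, 2))              # 12 embryos, 5 landmarks
scores, var_ratio = landmark_pca(lm)
```

Plotting the first two score columns gives the scatter plot described in the text, with similarly shaped specimens clustering together.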
Developmental Biology, Issue 93, Orofacial quantification, geometric morphometrics, Xenopus, orofacial development, orofacial defects, shape changes, facial dimensions
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, including, but not limited to, nephrotoxicant exposure regimens or genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish, kidney, nephron, nephrology, renal, regeneration, proximal tubule, distal tubule, segment, mesonephros, physiology, acute kidney injury (AKI)
Profiling of Estrogen-regulated MicroRNAs in Breast Cancer Cells
Institutions: University of Houston.
Estrogen plays vital roles in mammary gland development and breast cancer progression. It mediates its function by binding to and activating the estrogen receptors (ERs), ERα and ERβ. ERα is frequently upregulated in breast cancer and drives the proliferation of breast cancer cells. The ERs function as transcription factors and regulate gene expression. Whereas ERα's regulation of protein-coding genes is well established, its regulation of noncoding microRNAs (miRNAs) is less explored. miRNAs play a major role in the post-transcriptional regulation of genes, inhibiting their translation or degrading their mRNA. miRNAs can function as oncogenes or tumor suppressors and are also promising biomarkers. Among the miRNA assays available, microarray and quantitative real-time polymerase chain reaction (qPCR) have been extensively used to detect and quantify miRNA levels. To identify miRNAs regulated by estrogen signaling in breast cancer, their expression in ERα-positive breast cancer cell lines was compared before and after estrogen activation using both µParaflo-microfluidic microarrays and Dual Labeled Probes low-density arrays. Results were validated using specific qPCR assays, applying both Cyanine dye-based and Dual Labeled Probes-based chemistry. Furthermore, a time-point assay was used to identify regulation over time. An advantage of the miRNA assay approach used in this study is that it enables fast screening of mature miRNA regulation in numerous samples, even with limited sample amounts. The layout, including the specific conditions for cell culture and estrogen treatment, biological and technical replicates, and large-scale screening followed by in-depth confirmation using separate techniques, ensures robust detection of miRNA regulation and eliminates false positives and other artifacts. However, mutated or unknown miRNAs, or regulation at the primary and precursor transcript level, will not be detected.
The method presented here represents a thorough investigation of estrogen-mediated miRNA regulation.
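qPCR confirmations of the kind described above are commonly summarized as fold changes via the 2^-ΔΔCt method. A minimal sketch as a generic illustration, not the exact analysis used in the study (names and Ct values are hypothetical):

```python
def fold_change_ddct(ct_mirna_treated, ct_ref_treated,
                     ct_mirna_control, ct_ref_control):
    """Relative miRNA expression by the 2^-ΔΔCt method.

    Ct values for the miRNA of interest and a reference small RNA,
    measured in estrogen-treated vs vehicle-control samples.
    """
    dct_treated = ct_mirna_treated - ct_ref_treated      # normalize to reference
    dct_control = ct_mirna_control - ct_ref_control
    return 2.0 ** -(dct_treated - dct_control)           # ΔΔCt to fold change

# The miRNA reaches threshold one cycle earlier (relative to the
# reference) after treatment, i.e. roughly 2-fold upregulation.
fc = fold_change_ddct(24.0, 18.0, 25.0, 18.0)
```

Values above 1 indicate estrogen-induced upregulation, values below 1 downregulation; in practice each value would be averaged over biological and technical replicates.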
Medicine, Issue 84, breast cancer, microRNA, estrogen, estrogen receptor, microarray, qPCR
Analysis of RNA Processing Reactions Using Cell Free Systems: 3' End Cleavage of Pre-mRNA Substrates in vitro
Institutions: The Scripps Research Institute, City College of New York.
The 3’ end of mammalian mRNAs is not formed by abrupt termination of transcription by RNA polymerase II (RNAPII). Instead, RNAPII synthesizes precursor mRNA beyond the end of the mature RNA, and endonucleolytic cleavage is then required at a specific site. Cleavage of the precursor RNA normally occurs 10-30 nt downstream of the consensus polyadenylation signal (AAUAAA), after a CA dinucleotide. Proteins of the cleavage complex, a multifactorial protein complex of approximately 800 kDa, accomplish this specific nuclease activity. Specific RNA sequences upstream and downstream of the polyA site control the recruitment of the cleavage complex. Immediately after cleavage, pre-mRNAs are polyadenylated by polyA polymerase (PAP) to produce mature, stable RNA messages.
Processing of the 3’ end of an RNA transcript may be studied using cellular nuclear extracts with specific radiolabeled RNA substrates. In brief, a long ³²P-labeled uncleaved precursor RNA is incubated with nuclear extracts in vitro, and cleavage is assessed by gel electrophoresis and autoradiography. When proper cleavage occurs, a shorter 5’ cleaved product is detected and quantified. Here, we describe the cleavage assay in detail using, as an example, the 3’ end processing of HIV-1 mRNAs.
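The sequence logic described above (AAUAAA hexamer, cleavage 10-30 nt downstream after a CA dinucleotide) can be expressed as a simple scan. This is a hedged illustration only; real site choice also depends on the surrounding cis elements and the cleavage complex, and the distances here are measured from the end of the hexamer as a simplifying assumption:

```python
def candidate_cleavage_sites(rna):
    """Return (signal_pos, cleavage_pos) pairs for an RNA sequence.

    Looks for the AAUAAA hexamer, then CA dinucleotides beginning
    10-30 nt downstream of the end of the signal; cleavage is taken
    to occur immediately after the CA.
    """
    sites = []
    for i in range(len(rna) - 5):
        if rna[i:i + 6] == "AAUAAA":
            end = i + 6
            for j in range(end + 10, min(end + 30, len(rna) - 1)):
                if rna[j:j + 2] == "CA":
                    sites.append((i, j + 2))
    return sites

# Toy substrate: signal, a 12 nt spacer, then the CA cleavage site.
rna = "GGG" + "AAUAAA" + "U" * 12 + "CA" + "GGGG"
sites = candidate_cleavage_sites(rna)
```

On the toy substrate the scan finds the hexamer at position 3 and a cleavage point just after the CA, mirroring the geometry of the radiolabeled substrates used in the assay.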
Infectious Diseases, Issue 87, Cleavage, Polyadenylation, mRNA processing, Nuclear extracts, 3' Processing Complex
Quantitative Measurement of the Immune Response and Sleep in Drosophila
Institutions: University of Pennsylvania Perelman School of Medicine.
A complex interaction between the immune response and host behavior has been described in a wide range of species. Excess sleep, in particular, is known to occur as a response to infection in mammals [1] and has also recently been described in Drosophila melanogaster [2]. It is generally accepted that sleep is beneficial to the host during an infection and that it is important for the maintenance of a robust immune system [3,4]. However, experimental evidence that supports this hypothesis is limited [4], and the function of excess sleep during an immune response remains unclear. We have used a multidisciplinary approach to address this complex problem, and have conducted studies in a simple genetic model system, the fruit fly Drosophila melanogaster. We use a standard assay for measuring locomotor behavior and sleep in flies, and demonstrate how this assay is used to measure behavior in flies infected with a pathogenic strain of bacteria. This assay is also useful for monitoring the duration of survival in individual flies during an infection. Additional measures of immune function include the ability of flies to clear an infection and the activation of NFκB, a key transcription factor that is central to the innate immune response in Drosophila. Both survival outcome and bacterial clearance during infection together are indicators of resistance and tolerance to infection. Resistance refers to the ability of flies to clear an infection, while tolerance is defined as the ability of the host to limit damage from an infection and thereby survive despite high levels of pathogen within the system [5]. Real-time monitoring of NFκB activity during infection provides insight into a molecular mechanism of survival during infection. The use of Drosophila in these straightforward assays facilitates the genetic and molecular analyses of sleep and the immune response, and of how these two complex systems are reciprocally influenced.
Immunology, Issue 70, Neuroscience, Medicine, Physiology, Pathology, Microbiology, immune response, sleep, Drosophila, infection, bacteria, luciferase reporter assay, animal model
Modeling The Lifecycle Of Ebola Virus Under Biosafety Level 2 Conditions With Virus-like Particles Containing Tetracistronic Minigenomes
Institutions: National Institute of Allergy and Infectious Diseases, National Institutes of Health, National Institute of Allergy and Infectious Diseases, National Institutes of Health.
Ebola viruses cause severe hemorrhagic fevers in humans and non-human primates, with case fatality rates as high as 90%. There are no approved vaccines or specific treatments for the disease caused by these viruses, and work with infectious Ebola viruses is restricted to biosafety level 4 laboratories, significantly limiting research on these viruses. Lifecycle modeling systems model the virus lifecycle under biosafety level 2 conditions; however, until recently such systems have been limited to either individual aspects of the virus lifecycle or a single infectious cycle. Tetracistronic minigenomes, which consist of Ebola virus non-coding regions, a reporter gene, and three Ebola virus genes involved in morphogenesis, budding, and entry (VP40, GP1,2, and VP24), can be used to produce replication- and transcription-competent virus-like particles (trVLPs) containing these minigenomes. These trVLPs can continuously infect cells expressing the Ebola virus proteins responsible for genome replication and transcription, allowing us to safely model multiple infectious cycles under biosafety level 2 conditions. Importantly, the viral components of this system are derived solely from Ebola virus and not from other viruses (as is, for example, the case in systems using pseudotyped viruses), and VP40, GP1,2, and VP24 are not overexpressed in this system, making it ideally suited for studying morphogenesis, budding, and entry, although other aspects of the virus lifecycle, such as genome replication and transcription, can also be modeled with this system. Therefore, the tetracistronic trVLP assay represents the most comprehensive lifecycle modeling system available for Ebola viruses and has tremendous potential for use in investigating the biology of Ebola viruses in the future. Here, we provide detailed information on the use of this system, as well as on expected results.
Infectious Diseases, Issue 91, hemorrhagic Fevers, Viral, Mononegavirales Infections, Ebola virus, filovirus, lifecycle modeling system, minigenome, reverse genetics, virus-like particles, replication, transcription, budding, morphogenesis, entry
Investigating Protein-protein Interactions in Live Cells Using Bioluminescence Resonance Energy Transfer
Institutions: Max Planck Institute for Psycholinguistics, Donders Institute for Brain, Cognition and Behaviour.
Assays based on Bioluminescence Resonance Energy Transfer (BRET) provide a sensitive and reliable means to monitor protein-protein interactions in live cells. BRET is the non-radiative transfer of energy from a 'donor' luciferase enzyme to an 'acceptor' fluorescent protein. In the most common configuration of this assay, the donor is Renilla reniformis luciferase and the acceptor is Yellow Fluorescent Protein (YFP). Because the efficiency of energy transfer is strongly distance-dependent, observation of the BRET phenomenon requires that the donor and acceptor be in close proximity. To test for an interaction between two proteins of interest in cultured mammalian cells, one protein is expressed as a fusion with luciferase and the second as a fusion with YFP. An interaction between the two proteins of interest may bring the donor and acceptor sufficiently close for energy transfer to occur. Compared to other techniques for investigating protein-protein interactions, the BRET assay is sensitive, requires little hands-on time and few reagents, and is able to detect interactions which are weak, transient, or dependent on the biochemical environment found within a live cell. It is therefore an ideal approach for confirming putative interactions suggested by yeast two-hybrid or mass spectrometry proteomics studies, and in addition it is well-suited for mapping interacting regions, assessing the effect of post-translational modifications on protein-protein interactions, and evaluating the impact of mutations identified in patient DNA.
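BRET signals of the kind described above are typically reported as the ratio of acceptor-channel to donor-channel emission, corrected by the same ratio from a donor-only control. A minimal sketch with hypothetical names and counts, shown as a generic illustration rather than the exact calculation in any given protocol:

```python
def bret_ratio(acceptor_em, donor_em,
               acceptor_em_donor_only, donor_em_donor_only):
    """Background-corrected BRET ratio.

    Raw ratio = YFP-channel emission / luciferase-channel emission;
    the same ratio from cells expressing the donor fusion alone is
    subtracted to correct for luciferase emission bleeding into the
    acceptor channel.
    """
    return acceptor_em / donor_em - acceptor_em_donor_only / donor_em_donor_only

# Interacting pair: net signal well above the donor-only background.
net = bret_ratio(5200.0, 10000.0, 800.0, 10000.0)
```

A net ratio near zero indicates no energy transfer (no interaction, or donor and acceptor too far apart); a clearly positive net ratio indicates proximity consistent with an interaction.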
Cellular Biology, Issue 87, Protein-protein interactions, Bioluminescence Resonance Energy Transfer, Live cell, Transfection, Luciferase, Yellow Fluorescent Protein, Mutations
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
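The behavioral switch cost described above is simply a difference of trial-type means. A minimal sketch (field names and the example reaction times are hypothetical):

```python
def switch_cost(trials):
    """Mean RT on switch trials minus mean RT on repeat trials.

    trials: list of (trial_type, rt_ms) with trial_type in
    {"switch", "repeat"}; only correct trials are assumed to be
    included, as is typical for RT analyses.
    """
    def mean_rt(kind):
        rts = [rt for t, rt in trials if t == kind]
        return sum(rts) / len(rts)
    return mean_rt("switch") - mean_rt("repeat")

trials = [("repeat", 600), ("repeat", 640), ("switch", 720), ("switch", 700)]
cost = switch_cost(trials)
```

Developmental comparisons then amount to computing this cost per participant and testing whether it shrinks with age, alongside the connectivity analyses described in the video.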
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
Development of a Virtual Reality Assessment of Everyday Living Skills
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia and these impairments predict poor long term psychosocial outcomes. Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real world functioning. Current data do not support the recommendation of any single instrument for measurement of functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Institutions: Duke University Medical Center, Duke University, Duke University Medical Center.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated [1]. These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease and, in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain-of-function alleles, or through the use of morpholino (MO) antisense oligonucleotides to suppress genes to mimic loss-of-function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
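Interpreting a complementation experiment like the one above comes down to comparing phenotype rates across injection groups. A minimal scoring sketch; the group names, embryo counts, and the decision cutoff are all hypothetical illustrations, and real analyses would add statistics and replicate clutches:

```python
def phenotype_rate(n_affected, n_total):
    """Fraction of embryos scored as phenotypic in an injection group."""
    return n_affected / n_total

# Hypothetical counts at phenotyping: MO alone, MO + wild-type human
# mRNA, and MO + patient-variant human mRNA.
mo_only   = phenotype_rate(80, 100)
mo_wt     = phenotype_rate(15, 100)   # wild-type mRNA rescues the phenotype
mo_mutant = phenotype_rate(75, 100)   # variant mRNA fails to rescue

# A variant whose mRNA rescues no better than MO alone is read as
# deleterious; one that rescues like wild type is read as benign.
variant_is_deleterious = (mo_mutant - mo_wt) > 0.3    # illustrative cutoff
```

Under these toy numbers the variant fails to rescue and would be scored as functionally deleterious, which is the logic of the complementation readout described in the text.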
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and other populations with balance impairments. It is composed of very simple step elements and involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Its focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlies adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango in the community, through those trainees, for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed, and 30-second chair stand), along with the safety and fidelity of the program, is maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing is briefly illustrated.
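Categorical perception along a morph continuum is typically quantified by fitting a sigmoid to categorization responses and locating the category boundary at its midpoint. The following Python sketch illustrates that analysis step; the response proportions, morph step values, and fitting ranges are all invented for illustration and are not drawn from the protocol.

```python
import math

# Hypothetical categorization data: proportion of "human" responses at 11
# morph steps from fully artificial (0.0) to fully human (1.0).
morph_steps = [i / 10 for i in range(11)]
p_human = [0.02, 0.03, 0.05, 0.08, 0.15, 0.40, 0.75, 0.90, 0.95, 0.97, 0.98]

def logistic(x, x0, k):
    """Two-parameter logistic: x0 = category boundary, k = slope."""
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

def fit_logistic(xs, ys):
    """Coarse grid-search least-squares fit (dependency-free)."""
    best = None
    for x0 in [i / 100 for i in range(101)]:      # boundary in [0, 1]
        for k in [j / 2 for j in range(2, 81)]:   # slope in [1, 40]
            err = sum((logistic(x, x0, k) - y) ** 2 for x, y in zip(xs, ys))
            if best is None or err < best[0]:
                best = (err, x0, k)
    return best[1], best[2]

boundary, slope = fit_logistic(morph_steps, p_human)
print(f"estimated category boundary at morph step {boundary:.2f}, slope {slope:.1f}")
```

Under the CP account, discrimination performance should peak for stimulus pairs straddling the fitted boundary, which is what pairing this fit with a discrimination task can test.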
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
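As a toy illustration of the kind of analysis the time-stamped event records described above support (using made-up event codes and timestamps, in Python rather than the lab's MATLAB-based language), the following sketch tallies hopper entries and entry-to-pellet latencies:

```python
from collections import Counter

# Hypothetical time-stamped event record (seconds, event code); real
# records from a live-in test environment are far longer.
events = [
    (0.0, "session_start"),
    (12.3, "hopper1_entry"), (12.9, "pellet1"),
    (45.1, "hopper2_entry"), (45.8, "pellet2"),
    (61.0, "hopper1_entry"), (61.5, "pellet1"),
    (90.2, "hopper1_entry"),
]

# Tally events by type, e.g. to track matching behavior by comparing the
# proportion of entries at each hopper.
counts = Counter(code for _, code in events)
entries1 = counts["hopper1_entry"]
entries2 = counts["hopper2_entry"]
entry_ratio = entries1 / (entries1 + entries2)

def latency(entry_code, pellet_code):
    """Latency from each head entry to the next pellet at the same hopper."""
    lats = []
    pending = None
    for t, code in events:
        if code == entry_code:
            pending = t
        elif code == pellet_code and pending is not None:
            lats.append(t - pending)
            pending = None
    return lats

print("hopper-1 entry ratio:", round(entry_ratio, 2))
print("hopper-1 pellet latencies:", latency("hopper1_entry", "pellet1"))
```

Because every intermediate quantity here is derived directly from the raw event list, this style of analysis preserves the full data trail from raw records to summary statistics, in the spirit of the system described above.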
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Brain Imaging Investigation of the Impairing Effect of Emotion on Cognition
Institutions: University of Alberta, University of Illinois, Duke University, VA Medical Center, Yale University.
Emotions can impact cognition by exerting both enhancing (e.g., better memory for emotional events) and impairing (e.g., increased emotional distractibility) effects (reviewed in 1). Complementing our recent protocol2 describing a method that allows investigation of the neural correlates of the memory-enhancing effect of emotion (see also 1,3-5), here we present a protocol that allows investigation of the neural correlates of the detrimental impact of emotion on cognition. The main feature of this method is that it allows identification of reciprocal modulations between activity in a ventral neural system, involved in 'hot' emotion processing (HotEmo system), and a dorsal system, involved in higher-level 'cold' cognitive/executive processing (ColdEx system), which are linked to cognitive performance and to individual variations in behavior (reviewed in 1). Since its initial introduction6, this design has proven particularly versatile and influential in elucidating various aspects of the neural correlates of the detrimental impact of emotional distraction on cognition, with a focus on working memory (WM), and of coping with such distraction7,11, in both healthy8-11 and clinical participants12-14.
Neuroscience, Issue 60, Emotion-Cognition Interaction, Cognitive/Emotional Interference, Task-Irrelevant Distraction, Neuroimaging, fMRI, MRI
Coherence between Brain Cortical Function and Neurocognitive Performance during Changed Gravity Conditions
Institutions: German Sport University Cologne, University of Toronto, Queensland University of Technology, Gilching, Germany.
Previous studies of cognitive, mental and/or motor processes during short-, medium- and long-term weightlessness have been only descriptive in nature and focused on psychological aspects. Objective observation of neurophysiological parameters has not yet been carried out, undoubtedly because the technical and methodological means have not been available, and investigations into the neurophysiological effects of weightlessness are therefore in their infancy (Schneider et al.).
While imaging techniques such as positron emission tomography (PET) and magnetic resonance imaging (MRI) would hardly be applicable in space, the non-invasive near-infrared spectroscopy (NIRS) technique represents a method of mapping hemodynamic processes in the brain in real time that is relatively inexpensive and can be employed even under extreme conditions. The combination with electroencephalography (EEG) opens up the possibility of following electrocortical processes under changing gravity conditions with finer temporal resolution as well as with deeper localization, for instance with electrotomography (LORETA).
Previous studies showed an increase of beta-frequency activity under normal gravity conditions and a decrease under weightlessness during parabolic flight (Schneider et al. 2008a+b). Tilt studies revealed different changes in brain function, suggesting that the changes seen in parabolic flight might reflect emotional processes rather than hemodynamic changes. However, it is still unclear whether these are effects of changed gravity or of hemodynamic changes within the brain. Combining EEG/LORETA and NIRS should, for the first time, make it possible to map the effect of weightlessness and reduced gravity on both hemodynamic and electrophysiological processes in the brain. Initially, this is to be done as part of a feasibility study during a parabolic flight. Afterwards, it is also planned to use both techniques during medium- and long-term space flight.
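The beta-frequency activity referred to above is conventionally quantified as spectral power in the 13-30 Hz band. A minimal, dependency-free Python sketch of that computation on a synthetic signal (the sampling rate, signal composition, and band edges are illustrative choices, not the study's parameters):

```python
import math

# Synthetic EEG trace: alpha (10 Hz) plus a stronger beta (20 Hz) component.
fs = 128                 # sampling rate in Hz
n = 256                  # 2 s of data
signal = [math.sin(2 * math.pi * 10 * t / fs) +
          2.0 * math.sin(2 * math.pi * 20 * t / fs) for t in range(n)]

def band_power(x, fs, f_lo, f_hi):
    """Sum of DFT power over bins whose frequency falls in [f_lo, f_hi]."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

alpha = band_power(signal, fs, 8, 12)    # alpha band, 8-12 Hz
beta = band_power(signal, fs, 13, 30)    # beta band, 13-30 Hz
print(f"alpha power {alpha:.1f}, beta power {beta:.1f}")
```

A plain DFT is used here only to keep the sketch self-contained; in practice, windowed spectral estimators are preferred for noisy recordings.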
It can be assumed that the long-term redistribution of blood volume and the associated increase in the supply of oxygen to the brain will lead to changes in the central nervous system that are also responsible for anaemic processes, which can in turn reduce performance (De Santo et al. 2005) and could therefore be crucial for the success and safety of a mission (Genik et al. 2005, Ellis 2000).
Depending on these results, it will be necessary to develop and employ extensive countermeasures. Initial results from the MARS500 study suggest that, in addition to their significance for the cardiovascular and locomotor systems, sport and physical activity can play a part in improving neurocognitive parameters. Before this can be fully established, however, it seems necessary to learn more about the influence of changing gravity conditions on neurophysiological processes and associated neurocognitive impairment.
Neuroscience, Issue 51, EEG, NIRS, electrotomography, parabolic flight, weightlessness, imaging, cognitive performance
High Density Event-related Potential Data Acquisition in Cognitive Neuroscience
Institutions: Boston College.
Functional magnetic resonance imaging (fMRI) is currently the standard method of evaluating brain function in the field of Cognitive Neuroscience, in part because fMRI data acquisition and analysis techniques are readily available. Because fMRI has excellent spatial resolution but poor temporal resolution, this method can only be used to identify the spatial location of brain activity associated with a given cognitive process (and reveals virtually nothing about the time course of brain activity). By contrast, event-related potential (ERP) recording, a method that is used much less frequently than fMRI, has excellent temporal resolution and thus can track rapid temporal modulations in neural activity. Unfortunately, ERPs are underutilized in Cognitive Neuroscience because data acquisition techniques are not readily available and low density ERP recording has poor spatial resolution. In an effort to foster the increased use of ERPs in Cognitive Neuroscience, the present article details key techniques involved in high density ERP data acquisition. Critically, high density ERPs offer the promise of excellent temporal resolution and good spatial resolution (or excellent spatial resolution if coupled with fMRI), which is necessary to capture the spatial-temporal dynamics of human brain function.
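The core of ERP analysis, whatever the electrode density, is averaging many stimulus-locked epochs so that activity not time-locked to the event cancels out, leaving the event-related potential. A small synthetic Python demonstration (the waveform, trial count, and noise level are invented for illustration):

```python
import math, random

random.seed(1)
n_trials, n_samples = 200, 50

def true_erp(i):
    """A toy ERP waveform: a single positive deflection mid-epoch."""
    return math.exp(-((i - 25) ** 2) / 20.0)

# Each single trial = ERP + zero-mean "background EEG" noise that swamps
# the waveform on any individual trial.
trials = [[true_erp(i) + random.gauss(0, 1.0) for i in range(n_samples)]
          for _ in range(n_trials)]

# Time-locked average across trials recovers the waveform.
erp = [sum(trial[i] for trial in trials) / n_trials for i in range(n_samples)]

peak_index = max(range(n_samples), key=lambda i: erp[i])
print("ERP peak at sample", peak_index)
```

With 200 trials, the residual noise in the average shrinks by a factor of roughly the square root of the trial count, which is why ERP studies require many repetitions per condition.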
Neuroscience, Issue 38, ERP, electrodes, methods, setup
Probing the Brain in Autism Using fMRI and Diffusion Tensor Imaging
Institutions: University of Alabama at Birmingham.
Newly emerging theories suggest that the brain does not function as a cohesive unit in autism, and this discordance is reflected in the behavioral symptoms displayed by individuals with autism. While structural neuroimaging findings have provided some insights into brain abnormalities in autism, the consistency of such findings is questionable. Functional neuroimaging, on the other hand, has been more fruitful in this regard: because autism is a disorder of dynamic processing, functional imaging allows examination of communication between cortical networks, which appears to be where the underlying problem occurs in autism. Functional connectivity is defined as the temporal correlation of spatially separate neurological events1. Findings from a number of recent fMRI studies have supported the idea that there is weaker coordination between different parts of the brain that should work together to accomplish complex social or language tasks2,3,4,5,6. One of the mysteries of autism is the coexistence of deficits in several domains with relatively intact, sometimes enhanced, abilities. Such a complex manifestation calls for a global and comprehensive examination of the disorder at the neural level. A compelling recent account of brain functioning in autism, the cortical underconnectivity theory2,7, provides an integrating framework for the neurobiological bases of autism. The theory suggests that any language, social, or psychological function that depends on the integration of multiple brain regions is susceptible to disruption as processing demand increases. In autism, underfunctioning of integrative circuitry in the brain may cause widespread underconnectivity; in other words, people with autism may interpret information in a piecemeal fashion at the expense of the whole. Since cortical underconnectivity among brain regions, especially between the frontal cortex and more posterior areas3,6, has now been relatively well established, we can begin to further understand brain connectivity as a critical component of autism symptomatology.
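The definition of functional connectivity given above, the temporal correlation of spatially separate neurological events, reduces in its simplest form to a Pearson correlation between two regions' time courses. A minimal Python sketch with invented time series (not real fMRI data):

```python
import math

# Toy BOLD time courses from two regions of interest; roi_b loosely
# tracks roi_a, so their connectivity should be high.
roi_a = [0.2, 0.5, 0.1, -0.3, 0.4, 0.6, -0.1, 0.0, 0.3, -0.2]
roi_b = [0.1, 0.4, 0.2, -0.2, 0.5, 0.5, -0.2, 0.1, 0.2, -0.1]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(roi_a, roi_b)
print(f"functional connectivity (r) = {r:.2f}")
```

Underconnectivity findings in autism amount to group differences in such correlations, typically between frontal and posterior regions, during task performance or rest.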
A logical next step in this direction is to examine the anatomical connections that may mediate the functional connections mentioned above. Diffusion Tensor Imaging (DTI) is a relatively novel neuroimaging technique that helps probe the diffusion of water in the brain to infer the integrity of white matter fibers. In this technique, water diffusion in the brain is examined in several directions using diffusion gradients. While functional connectivity provides information about the synchronization of brain activation across different brain areas during a task or during rest, DTI helps in understanding the underlying axonal organization which may facilitate the cross-talk among brain areas. This paper will describe these techniques as valuable tools in understanding the brain in autism and the challenges involved in this line of research.
Medicine, Issue 55, Functional magnetic resonance imaging (fMRI), MRI, Diffusion tensor imaging (DTI), Functional Connectivity, Neuroscience, Developmental disorders, Autism, Fractional Anisotropy
Investigating Social Cognition in Infants and Adults Using Dense Array Electroencephalography (dEEG)
Institutions: University of Toronto Scarborough.
Dense array electroencephalography (dEEG), which provides a non-invasive window for measuring brain activity and a temporal resolution unsurpassed by any other current brain imaging technology1,2, is being used increasingly in the study of social cognitive functioning in infants and adults. While dEEG is enabling researchers to examine brain activity patterns with unprecedented levels of sensitivity, conventional EEG recording systems continue to face certain limitations, including 1) poor spatial resolution and source localization3,4, 2) the physical discomfort for test subjects of enduring the individual application of numerous electrodes to the surface of the scalp, and 3) the complexity for researchers of learning to use multiple software packages to collect and process data. Here we present an overview of an established methodology that represents a significant improvement on conventional methodologies for studying EEG in infants and adults. Although several analytical software techniques can be used to establish indirect indices of source localization to improve the spatial resolution of dEEG, the HydroCel Geodesic Sensor Net (HCGSN) by Electrical Geodesics, Inc. (EGI), a dense sensory array that maintains equal distances among adjacent recording electrodes on all surfaces of the scalp, further enhances spatial resolution4,5,6 compared to standard dEEG systems. The sponge-based HCGSN can be applied rapidly and without scalp abrasion, making it ideal for use with adults7,8 and infants12, in both research and clinical4,5,6,13,14,15 settings. This feature allows for considerable cost and time savings by decreasing the average net application time compared to other dEEG systems. Moreover, the HCGSN includes unified, seamless software applications for all phases of data handling, greatly simplifying the collection, processing, and analysis of dEEG data.
The HCGSN features a low-profile electrode pedestal, which, when filled with electrolyte solution, creates a sealed microenvironment and an electrode-scalp interface. In all Geodesic dEEG systems, EEG sensors detect changes in voltage originating from the participant's scalp, along with a small amount of electrical noise originating from the room environment. Electrical signals from all sensors of the Geodesic sensor net are received simultaneously by the amplifier, where they are automatically processed, packaged, and sent to the data-acquisition computer (DAC). Once received by the DAC, scalp electrical activity can be isolated from artifacts for analysis using the filtering and artifact detection tools included in the EGI software. Typically, the HCGSN can be used continuously for only up to two hours, because the electrolyte solution dries out over time, gradually decreasing the quality of the scalp-electrode interface.
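Artifact detection of the kind described above is often based, at its simplest, on peak-to-peak amplitude thresholds: epochs whose voltage range exceeds a limit (e.g., from eye blinks or head movement) are flagged and excluded before analysis. The Python sketch below is a generic version of that idea; the threshold and epoch values are hypothetical, and this is not EGI's actual algorithm.

```python
THRESHOLD_UV = 100.0  # peak-to-peak rejection threshold in microvolts

# Toy single-channel epochs (microvolts); the second contains a
# blink-like deflection with a peak-to-peak range of 140 uV.
epochs = [
    [3.0, -2.5, 4.1, -3.8, 2.2],     # clean
    [5.0, 80.0, -60.0, 10.0, 2.0],   # artifact
    [-1.0, 0.5, 2.0, -2.2, 1.1],     # clean
]

def is_artifact(epoch, threshold=THRESHOLD_UV):
    """Flag an epoch whose peak-to-peak amplitude exceeds the threshold."""
    return (max(epoch) - min(epoch)) > threshold

clean_epochs = [e for e in epochs if not is_artifact(e)]
print(f"kept {len(clean_epochs)} of {len(epochs)} epochs")
```

In practice, thresholds are chosen per population; infant recordings, for example, tolerate larger movement-related deflections than adult recordings.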
In the Parent-Infant Research Lab at the University of Toronto, we are using dEEG to study social cognitive processes including memory, emotion, goals, intentionality, anticipation, and executive functioning in both adult and infant participants.
Neuroscience, Issue 52, Developmental Affective Neuroscience, high density EEG, social cognition, infancy, and parenting