Pubmed Article
Forensic analysis of Venezuelan elections during the Chávez presidency.
PLoS ONE
PUBLISHED: 01-01-2014
Hugo Chávez dominated the Venezuelan electoral landscape from his first presidential victory in 1998 until his death in 2013. Nobody doubts that he always received considerable voter support in the numerous elections held during his mandate. However, the integrity of the electoral system has come into question since the 2004 Presidential Recall Referendum. From then on, different sectors of society have systematically alleged electoral irregularities or biases in favor of the incumbent party. We have carried out a thorough forensic analysis of the national-level Venezuelan electoral processes held during the 1998-2012 period to assess these complaints. The second-digit Benford's law and two statistical models of vote distributions, recently introduced in the literature, are reviewed and used in our case study. In addition, we discuss a new method to detect irregular variations in the electoral roll. The outputs obtained from these election forensic tools are examined taking into account the substantive context of the elections and referenda under study. We reach two main conclusions. First, all the tools uncover anomalous statistical patterns, which are consistent with election fraud from 2004 onwards. Although our results are not conclusive proof of fraud, they signal the Recall Referendum as a turning point in the integrity of the Venezuelan elections. Second, our analysis calls into question the reliability of the electoral register since 2004. In particular, we found irregular variations in the electoral roll that were decisive in winning the 50% majority in the 2004 Referendum and in the 2012 Presidential Elections.
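The second-digit Benford test mentioned above compares the distribution of second digits in vote counts against the Benford expectation. The following is a minimal illustrative sketch, not the authors' implementation; the function names and the use of a Pearson chi-square screening statistic are our own choices:

```python
import math
from collections import Counter

# Expected second-digit probabilities under Benford's law:
# P(d) = sum over first digits d1 = 1..9 of log10(1 + 1/(10*d1 + d))
BENFORD_2ND = {d: sum(math.log10(1 + 1.0 / (10 * d1 + d)) for d1 in range(1, 10))
               for d in range(10)}

def second_digit(n):
    """Second significant digit of a positive integer, or None for 1-digit counts."""
    s = str(abs(int(n)))
    return int(s[1]) if len(s) >= 2 else None

def second_digit_chi2(vote_counts):
    """Pearson chi-square statistic (9 df) comparing the observed second-digit
    distribution of vote counts against the Benford expectation."""
    digits = [d for d in (second_digit(v) for v in vote_counts) if d is not None]
    n = len(digits)
    observed = Counter(digits)
    return sum((observed.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD_2ND.items())
```

A large chi-square value flags a deviation worth investigating; as the abstract stresses, such a deviation is a screening signal, not proof of fraud.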
Authors: Allison Poussard, Michael Patterson, Katherine Taylor, Alexey Seregin, Jeanon Smith, Jennifer Smith, Milagros Salazar, Slobodan Paessler.
Published: 12-02-2012
ABSTRACT
Modern advancements in imaging technology encourage further development and refinement in the way viral research is accomplished. Guided by the 3Rs (replacement, reduction, refinement) first proposed by Russell and Burch, the use of animal models in scientific research is under constant pressure to identify new methodologies that reduce animal usage while improving scientific accuracy and speed. A major challenge to these principles, however, is how to ensure that studies remain statistically accurate while reducing animal disease morbidity and overall numbers. Vaccine efficacy studies currently require a large number of animals in order to be considered statistically significant and often involve high morbidity and mortality endpoints for identification of immune protection. We utilized in vivo imaging systems (IVIS) in conjunction with a firefly bioluminescent enzyme to progressively track the invasion of the central nervous system (CNS) by an encephalitic virus in a murine model. Typically, the disease progresses relatively slowly; however, virus replication is rapid, especially within the CNS, and can lead to an often lethal outcome. Following intranasal infection of mice with TC83-Luc, an attenuated Venezuelan equine encephalitis virus strain modified to express a luciferase gene, we are able to visualize virus replication within the brain at least three days before the development of clinical disease symptoms. Using CNS invasion as a key endpoint of encephalitic disease development, we are able to quickly identify therapeutic and vaccine protection against TC83-Luc infection before clinical symptoms develop. With IVIS technology we are able to demonstrate rapid and accurate testing of drug therapeutics and vaccines while reducing animal numbers and morbidity.
27 Related JoVE Articles
Measuring Attentional Biases for Threat in Children and Adults
Authors: Vanessa LoBue.
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
A Novel Method for Assessing Proximal and Distal Forelimb Function in the Rat: the Irvine, Beatties and Bresnahan (IBB) Forelimb Scale
Authors: Karen-Amanda Irvine, Adam R. Ferguson, Kathleen D. Mitchell, Stephanie B. Beattie, Michael S. Beattie, Jacqueline C. Bresnahan.
Institutions: University of California, San Francisco.
Several experimental models of cervical spinal cord injury (SCI) have been developed recently to assess the consequences of damage to this level of the spinal cord (Pearse et al., 2005, Gensel et al., 2006, Anderson et al., 2009), as the majority of human SCI occur here (Young, 2010; www.sci-info-pages.com). Behavioral deficits include loss of forelimb function due to damage to the white matter affecting both descending motor and ascending sensory systems, and to the gray matter containing the segmental circuitry for processing sensory input and motor output for the forelimb. Additionally, a key priority for human patients with cervical SCI is restoration of hand/arm function (Anderson, 2004). Thus, outcome measures that assess both proximal and distal forelimb function are needed. Although there are several behavioral assays that are sensitive to different aspects of forelimb recovery in experimental models of cervical SCI (Girgis et al., 2007, Gensel et al., 2006, Ballerman et al., 2001, Metz and Whishaw, 2000, Bertelli and Mira, 1993, Montoya et al., 1991, Whishaw and Pellis, 1990), few techniques provide detailed information on the recovery of fine motor control and digit movement. The current measurement technique, the Irvine, Beatties and Bresnahan forelimb scale (IBB), can detect recovery of both proximal and distal forelimb function including digit movements during a naturally occurring behavior that does not require extensive training or deprivation to enhance motivation. The IBB was generated by observing recovery after a unilateral C6 SCI, and involves video recording of animals eating two differently shaped cereals (spherical and doughnut) of a consistent size. These videos were then used to assess features of forelimb use, such as joint position, object support, digit movement and grasping technique. The IBB, like other forelimb behavioral tasks, shows a consistent pattern of recovery that is sensitive to injury severity. 
Furthermore, the IBB scale could be used to assess recovery following other types of injury that impact normal forelimb function.
Neuroscience, Issue 46, spinal cord injury, recovery of function, forelimb function, neurological test, cervical injuries
Assessment of Age-related Changes in Cognitive Functions Using EmoCogMeter, a Novel Tablet-computer Based Approach
Authors: Philipp Fuge, Simone Grimm, Anne Weigand, Yan Fan, Matti Gärtner, Melanie Feeser, Malek Bajbouj.
Institutions: Freie Universität Berlin, Charité Berlin, Freie Universität Berlin, Psychiatric University Hospital Zurich.
The main goal of this study was to assess the usability of a tablet-computer-based application (EmoCogMeter) in investigating the effects of age on cognitive functions across the lifespan in a sample of 378 healthy subjects (age range 18-89 years). Consistent with previous findings we found an age-related cognitive decline across a wide range of neuropsychological domains (memory, attention, executive functions), thereby proving the usability of our tablet-based application. Regardless of prior computer experience, subjects of all age groups were able to perform the tasks without instruction or feedback from an experimenter. Increased motivation and compliance proved to be beneficial for task performance, thereby potentially increasing the validity of the results. Our promising findings underline the great clinical and practical potential of a tablet-based application for detection and monitoring of cognitive dysfunction.
Behavior, Issue 84, Neuropsychological Testing, cognitive decline, age, tablet-computer, memory, attention, executive functions
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Authors: Sarah M. Collier, Matthew D. Ruark, Lawrence G. Oates, William E. Jokela, Curtis J. Dell.
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
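The flux computation described above (fitting the change in chamber headspace concentration over time, then scaling by chamber geometry and the ideal gas law) can be sketched as follows. This is an illustrative outline, not the authors' code; the function names, the ordinary least-squares fit, and the unit choices (ppm, minutes, g m^-2 hr^-1) are our assumptions:

```python
def linear_slope(times, concs):
    """Ordinary least-squares slope of concentration (ppm) versus time (min)."""
    n = len(times)
    tbar = sum(times) / n
    cbar = sum(concs) / n
    num = sum((t - tbar) * (c - cbar) for t, c in zip(times, concs))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

def chamber_flux(times_min, concs_ppm, volume_m3, area_m2, temp_k,
                 molar_mass_g, pressure_pa=101325.0):
    """Convert the ppm/min headspace slope to a mass flux in g gas m^-2 hr^-1.

    ppm -> mole fraction (1e-6); ideal gas law gives moles per m^3 of air
    (P / RT); scaling by chamber volume per footprint area (V/A) yields
    moles per m^2 of soil; finally convert to grams and minutes to hours.
    """
    R = 8.314  # J mol^-1 K^-1
    slope_ppm_per_min = linear_slope(times_min, concs_ppm)
    mol_per_m3_per_min = slope_ppm_per_min * 1e-6 * pressure_pa / (R * temp_k)
    return mol_per_m3_per_min * (volume_m3 / area_m2) * molar_mass_g * 60.0
```

In practice, nonlinear fits are sometimes preferred when chamber deployment suppresses the concentration gradient, so the linear fit here is only the simplest option.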
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
High-throughput Synthesis of Carbohydrates and Functionalization of Polyanhydride Nanoparticles
Authors: Brenda R. Carrillo-Conde, Rajarshi Roychoudhury, Ana V. Chavez-Santoscoy, Balaji Narasimhan, Nicola L.B. Pohl.
Institutions: Iowa State University, Iowa State University.
Transdisciplinary approaches involving areas such as material design, nanotechnology, chemistry, and immunology have to be utilized to rationally design efficacious vaccine carriers. Nanoparticle-based platforms can prolong the persistence of vaccine antigens, which could improve vaccine immunogenicity1. Several biodegradable polymers have been studied as vaccine delivery vehicles1; in particular, polyanhydride particles have demonstrated the ability to provide sustained release of stable protein antigens and to activate antigen presenting cells and modulate immune responses2-12. The molecular design of these vaccine carriers needs to integrate the rational selection of polymer properties as well as the incorporation of appropriate targeting agents. High throughput automated fabrication of targeting ligands and functionalized particles is a powerful tool that will enhance the ability to study a wide range of properties and will lead to the design of reproducible vaccine delivery devices. The addition of targeting ligands capable of being recognized by specific receptors on immune cells has been shown to modulate and tailor immune responses10,11,13. C-type lectin receptors (CLRs) are pattern recognition receptors (PRRs) that recognize carbohydrates present on the surface of pathogens. The stimulation of immune cells via CLRs allows for enhanced internalization of antigen and subsequent presentation for further T cell activation14,15. Therefore, carbohydrate molecules play an important role in the study of immune responses; however, the use of these biomolecules often suffers from the lack of availability of structurally well-defined and pure carbohydrates. An automation platform based on iterative solution-phase reactions can enable rapid and controlled synthesis of these synthetically challenging molecules using significantly lower building block quantities than traditional solid-phase methods16,17.
Herein we report a protocol for the automated solution-phase synthesis of oligosaccharides such as mannose-based targeting ligands with fluorous solid-phase extraction for intermediate purification. After development of automated methods to make the carbohydrate-based targeting agent, we describe methods for their attachment on the surface of polyanhydride nanoparticles employing an automated robotic set up operated by LabVIEW as previously described10. Surface functionalization with carbohydrates has shown efficacy in targeting CLRs10,11 and increasing the throughput of the fabrication method to unearth the complexities associated with a multi-parametric system will be of great value (Figure 1a).
Bioengineering, Issue 65, Chemical Engineering, High-throughput, Automation, Carbohydrates, Synthesis, Polyanhydrides, Nanoparticles, Functionalization, Targeting, Fluorous Solid Phase Extraction
Extraction and Analysis of Cortisol from Human and Monkey Hair
Authors: Jerrold Meyer, Melinda Novak, Amanda Hamel, Kendra Rosenberg.
Institutions: University of Massachusetts, Amherst, University of Massachusetts, Amherst.
The stress hormone cortisol (CORT) is slowly incorporated into the growing hair shaft of humans, nonhuman primates, and other mammals. We developed and validated a method for CORT extraction and analysis from rhesus monkey hair and subsequently adapted this method for use with human scalp hair. In contrast to CORT "point samples" obtained from plasma or saliva, hair CORT provides an integrated measure of hypothalamic-pituitary-adrenocortical (HPA) system activity, and thus physiological stress, during the period of hormone incorporation. Because human scalp hair grows at an average rate of 1 cm/month, CORT levels obtained from hair segments several cm in length can potentially serve as a biomarker of stress experienced over a number of months. In our method, each hair sample is first washed twice in isopropanol to remove any CORT from the outside of the hair shaft that has been deposited from sweat or sebum. After drying, the sample is ground to a fine powder to break up the hair's protein matrix and increase the surface area for extraction. CORT from the interior of the hair shaft is extracted into methanol, the methanol is evaporated, and the extract is reconstituted in assay buffer. Extracted CORT, along with standards and quality controls, is then analyzed by means of a sensitive and specific commercially available enzyme immunoassay (EIA) kit. Readout from the EIA is converted to pg CORT per mg powdered hair weight. This method has been used in our laboratory to analyze hair CORT in humans, several species of macaque monkeys, marmosets, dogs, and polar bears. Many studies both from our lab and from other research groups have demonstrated the broad applicability of hair CORT for assessing chronic stress exposure in natural as well as laboratory settings.
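Two small calculations implicit in the abstract, the mapping of a hair segment to the calendar window it represents (assuming the stated average growth rate of ~1 cm/month) and the normalization of the EIA readout to powdered hair weight, can be sketched as follows. The function names are hypothetical, not from the protocol:

```python
def segment_time_window(distance_from_scalp_cm, segment_length_cm,
                        growth_cm_per_month=1.0):
    """Approximate window (months before sampling) represented by a hair segment,
    assuming the average scalp-hair growth rate of ~1 cm/month."""
    start = distance_from_scalp_cm / growth_cm_per_month
    end = (distance_from_scalp_cm + segment_length_cm) / growth_cm_per_month
    return start, end

def cort_pg_per_mg(cort_pg_total, hair_mass_mg):
    """Normalize total CORT in the extract (pg, from the EIA readout)
    to the weight of powdered hair (mg)."""
    return cort_pg_total / hair_mass_mg
```

For example, a 3 cm segment taken at the scalp would, under this assumption, integrate roughly the most recent three months of HPA activity.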
Basic Protocol, Issue 83, cortisol, hypothalamic-pituitary-adrenocortical axis, hair, stress, humans, monkeys
Isolation and Culture of Dental Epithelial Stem Cells from the Adult Mouse Incisor
Authors: Miquella G. Chavez, Jimmy Hu, Kerstin Seidel, Chunying Li, Andrew Jheon, Adrien Naveau, Orapin Horst, Ophir D. Klein.
Institutions: University of California, San Francisco, University of California, San Francisco, Zhongshan Hospital of Dalian University, Université Paris Descartes, Sorbonne Paris Cite, UMR S872, Université Pierre et Marie Curie, UMR S872, INSERM U872, University of California, San Francisco, University of California, San Francisco.
Understanding the cellular and molecular mechanisms that underlie tooth regeneration and renewal has become a topic of great interest1-4, and the mouse incisor provides a model for these processes. This remarkable organ grows continuously throughout the animal's life and generates all the necessary cell types from active pools of adult stem cells housed in the labial (toward the lip) and lingual (toward the tongue) cervical loop (CL) regions. Only the dental stem cells from the labial CL give rise to ameloblasts that generate enamel, the outer covering of teeth, on the labial surface. This asymmetric enamel formation allows abrasion at the incisor tip, and progenitors and stem cells in the proximal incisor ensure that the dental tissues are constantly replenished. The ability to isolate and grow these progenitor or stem cells in vitro allows their expansion and opens doors to numerous experiments not achievable in vivo, such as high throughput testing of potential stem cell regulatory factors. Here, we describe and demonstrate a reliable and consistent method to culture cells from the labial CL of the mouse incisor.
Stem Cell Biology, Issue 87, Epithelial Stem Cells, Adult Stem Cells, Incisor, Cervical Loop, Cell Culture
Proteomic Sample Preparation from Formalin Fixed and Paraffin Embedded Tissue
Authors: Jacek R. Wiśniewski.
Institutions: Max Planck Institute of Biochemistry.
Preserved clinical material is a unique source for proteomic investigation of human disorders. Here we describe an optimized protocol allowing large scale quantitative analysis of formalin fixed and paraffin embedded (FFPE) tissue. The procedure comprises four distinct steps. The first is the preparation of sections from the FFPE material and microdissection of cells of interest. In the second step the isolated cells are lysed and processed using the 'filter aided sample preparation' (FASP) technique. In this step, proteins are depleted of the reagents used for sample lysis and are digested in two steps using endoproteinase LysC and trypsin. After each digestion, the peptides are collected in separate fractions and their content is determined using a highly sensitive fluorescence measurement. Finally, the peptides are fractionated on 'pipette-tip' microcolumns. The LysC peptides are separated into 4 fractions whereas the tryptic peptides are separated into 2 fractions. Samples prepared in this way allow analysis of proteomes from minute amounts of material to a depth of 10,000 proteins. Thus, the described workflow is a powerful technique for studying diseases in a system-wide fashion as well as for identification of potential biomarkers and drug targets.
Chemistry, Issue 79, Clinical Chemistry Tests, Proteomics, analytical chemistry, formalin fixed and paraffin embedded (FFPE), sample preparation, filter aided sample preparation (FASP), clinical proteomics, microdissection, SAX-fractionation
Profiling of Pre-micro RNAs and microRNAs using Quantitative Real-time PCR (qPCR) Arrays
Authors: Pauline Chugh, Kristen Tamburro, Dirk P Dittmer.
Institutions: University of North Carolina at Chapel Hill.
Quantitative real-time PCR (qPCR) has emerged as an accurate and valuable tool in profiling gene expression levels. One of its many advantages is a lower detection limit compared to other methods of gene expression profiling while using smaller amounts of input for each assay. Automated qPCR setup has improved this field by allowing for greater reproducibility. Its convenient and rapid setup allows for high-throughput experiments, enabling the profiling of many different genes simultaneously in each experiment. This method, along with internal plate controls, also reduces experimental variables common to other techniques. We recently developed a qPCR assay for profiling of pre-microRNAs (pre-miRNAs) using a set of 186 primer pairs. MicroRNAs have emerged as a novel class of small, non-coding RNAs with the ability to regulate many mRNA targets at the post-transcriptional level. These small RNAs are first transcribed by RNA polymerase II as a primary miRNA (pri-miRNA) transcript, which is then cleaved into the precursor miRNA (pre-miRNA). Pre-miRNAs are exported to the cytoplasm where Dicer cleaves the hairpin loop to yield mature miRNAs. Increases in miRNA levels can be observed at both the precursor and mature miRNA stages, and profiling of both of these forms can be useful. There are several commercially available assays for mature miRNAs; however, their high cost may deter researchers from this profiling technique. Here, we discuss a cost-effective, reliable, SYBR-based qPCR method of profiling pre-miRNAs. Changes in pre-miRNA levels often reflect mature miRNA changes and can be a useful indicator of mature miRNA expression. However, simultaneous profiling of both pre-miRNAs and mature miRNAs may be optimal as they can contribute nonredundant information and provide insight into microRNA processing. Furthermore, the technique described here can be expanded to encompass the profiling of other library sets for specific pathways or pathogens.
Biochemistry, Issue 46, pre-microRNAs, qPCR, profiling, Tecan Freedom Evo, robot
Microgavage of Zebrafish Larvae
Authors: Jordan L. Cocchiaro, John F. Rawls.
Institutions: University of North Carolina at Chapel Hill.
The zebrafish has emerged as a powerful model organism for studying intestinal development1-5, physiology6-11, disease12-16, and host-microbe interactions17-25. Experimental approaches for studying intestinal biology often require the in vivo introduction of selected materials into the lumen of the intestine. In the larval zebrafish model, this is typically accomplished by immersing fish in a solution of the selected material, or by injection through the abdominal wall. Using the immersion method, it is difficult to accurately monitor or control the route or timing of material delivery to the intestine. For this reason, immersion exposure can cause unintended toxicity and other effects on extraintestinal tissues, limiting the potential range of material amounts that can be delivered into the intestine. Also, the amount of material ingested during immersion exposure can vary significantly between individual larvae26. Although these problems are not encountered during direct injection through the abdominal wall, proper injection is difficult and causes tissue damage that could influence experimental results. We introduce a method for microgavage of zebrafish larvae. The goal of this method is to provide a safe, effective, and consistent way to deliver material directly to the lumen of the anterior intestine in larval zebrafish with controlled timing. Microgavage utilizes standard embryo microinjection and stereomicroscopy equipment common to most laboratories that perform zebrafish research. Once fish are properly positioned in methylcellulose, gavage can be performed quickly at a rate of approximately 7-10 fish/min, and post-gavage survival approaches 100% depending on the gavaged material. We also show that microgavage can permit loading of the intestinal lumen with high concentrations of materials that are lethal to fish when exposed by immersion.
To demonstrate the utility of this method, we present a fluorescent dextran microgavage assay that can be used to quantify transit from the intestinal lumen to extraintestinal spaces. This test can be used to verify proper execution of the microgavage procedure, and also provides a novel zebrafish assay to examine intestinal epithelial barrier integrity under different experimental conditions (e.g. genetic manipulation, drug treatment, or exposure to environmental factors). Furthermore, we show how gavage can be used to evaluate intestinal motility by gavaging fluorescent microspheres and monitoring their subsequent transit. Microgavage can be applied to deliver diverse materials such as live microorganisms, secreted microbial factors/toxins, pharmacological agents, and physiological probes. With these capabilities, the larval zebrafish microgavage method has the potential to enhance a broad range of research fields using the zebrafish model system.
Biochemistry, Issue 72, Molecular Biology, Anatomy, Physiology, Basic Protocols, Surgery, Zebrafish, Danio rerio, intestine, lumen, larvae, gavage, microgavage, epithelium, barrier function, gut motility, microsurgery, microscopy, animal model
Transcriptome Analysis of Single Cells
Authors: Jacqueline Morris, Jennifer M. Singh, James H. Eberwine.
Institutions: University of Pennsylvania, University of Pennsylvania.
Many gene expression analysis techniques rely on material isolated from heterogeneous populations of cells from tissue homogenates or cells in culture.1,2,3 In the case of the brain, regions such as the hippocampus contain a complex arrangement of different cell types, each with distinct mRNA profiles. The ability to harvest single cells allows for a more in depth investigation into the molecular differences between and within cell populations. We describe a simple and rapid method for harvesting cells for further processing. Pipettes often used in electrophysiology are utilized to isolate (using aspiration) a cell of interest and conveniently deposit it into an Eppendorf tube for further processing with any number of molecular biology techniques. Our protocol can be modified for the harvest of dendrites from cell culture or even individual cells from acute slices. We also describe the aRNA amplification method as a major downstream application of single cell isolations. This method was developed previously by our lab as an alternative to other gene expression analysis techniques such as reverse-transcription or real-time polymerase chain reaction (PCR).4,5,6,7,8 This technique provides for linear amplification of the polyadenylated RNA beginning with only femtograms of material and resulting in microgram amounts of antisense RNA. The linearly amplified material provides a more accurate estimation than PCR exponential amplification of the relative abundance of components of the transcriptome of the isolated cell. The basic procedure consists of two rounds of amplification. Briefly, a T7 RNA polymerase promoter site is incorporated into double stranded cDNA created from the mRNA transcripts. An overnight in vitro transcription (IVT) reaction is then performed in which T7 RNA polymerase produces many antisense transcripts from the double stranded cDNA. 
The second round repeats this process but with some technical differences since the starting material is antisense RNA. It is standard to repeat the second round, resulting in three rounds of amplification. Often, the third round in vitro transcription reaction is performed using biotinylated nucleoside triphosphates so that the antisense RNA produced can be hybridized and detected on a microarray.7,8
Neuroscience, Issue 50, single-cell, transcriptome, aRNA amplification, RT-PCR, molecular biology, gene expression
Meal Duration as a Measure of Orofacial Nociceptive Responses in Rodents
Authors: Phillip R. Kramer, Larry L. Bellinger.
Institutions: Texas A&M University Baylor College of Dentistry.
A lengthening in meal duration can be used to measure an increase in orofacial mechanical hyperalgesia, which has similarities to the guarding behavior of humans with orofacial pain. To measure meal duration, unrestrained rats are continuously kept in sound-attenuated, computerized feeding modules for days to weeks to record feeding behavior. These sound-attenuated chambers are equipped with chow pellet dispensers. The dispenser has a pellet trough with a photobeam placed at the bottom of the trough; when a rodent removes a pellet from the feeder trough this beam is no longer blocked, signaling the computer to drop another pellet. The computer records the date and time when the pellets were taken from the trough, and from these data the experimenter can calculate the meal parameters. A meal was defined based on previous work: a pause in feeding of 10 min marks the end of the animal's meal, and the minimum meal size was set at 3 pellets. The meal duration, meal number, food intake, meal size and inter-meal interval can then be calculated by the software for any time period that the operator desires. Of the feeding parameters that can be calculated, meal duration has been shown to be a continuous noninvasive biological marker of orofacial nociception in male rats and mice and in female rats. Meal duration measurements are quantitative, require no training or animal manipulation, require cortical participation, and do not compete with other experimentally induced behaviors. These factors distinguish this assay from other operant or reflex methods for recording orofacial nociception.
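The meal definition described above (a 10-min pause in feeding ends a meal; bouts of fewer than 3 pellets are discarded) can be sketched as a simple grouping of pellet timestamps. This is an illustrative reconstruction, not the authors' software; the function name and output format are our own:

```python
def meals_from_pellet_times(pellet_times_min, intermeal_gap_min=10.0,
                            min_meal_pellets=3):
    """Group pellet-retrieval timestamps (in minutes) into meals.

    A gap of >= intermeal_gap_min between consecutive pellets ends a meal,
    and bouts with fewer than min_meal_pellets pellets are discarded.
    Returns a list of (start, end, n_pellets) tuples; meal duration = end - start.
    """
    meals = []
    bout = []
    for t in sorted(pellet_times_min):
        if bout and t - bout[-1] >= intermeal_gap_min:
            if len(bout) >= min_meal_pellets:
                meals.append((bout[0], bout[-1], len(bout)))
            bout = []
        bout.append(t)
    if len(bout) >= min_meal_pellets:
        meals.append((bout[0], bout[-1], len(bout)))
    return meals
```

From the returned tuples, total meal duration, meal number, and inter-meal intervals follow directly.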
Behavior, Issue 83, Pain, rat, nociception, myofacial, orofacial, tooth, temporomandibular joint (TMJ)
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
50893
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can subsequently be analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. We demonstrate a number of applications of white light interferometry for the investigation of surface modifications in mass spectrometry and of wear phenomena in tribology and lubrication. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering) and laser irradiation (ablation), and present ex situ measurements of wear on tribological test specimens. Specifically, we discuss: (1) aspects of traditional ion sputtering-based mass spectrometry, such as measurements of sputtering rates/yields on Si and Cu and subsequent time-to-depth conversion; (2) quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface, which is important for applications such as ablation mass spectrometry, where the quantity of evaporated material can be studied and controlled via pulse duration and energy per pulse, and where determining the crater geometry defines depth and lateral resolution versus experimental setup conditions; and (3) measurements of surface roughness parameters in two dimensions, and quantitative measurements of the surface wear that occurs as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach are also discussed and explained.
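The roughness parameters mentioned above are simple statistics of the measured height data. A minimal pure-Python sketch of the standard arithmetic-average (Ra) and root-mean-square (Rq) definitions, applied to a 1D profile (function and variable names are illustrative, not the authors' analysis software):

```python
# Ra and Rq roughness of a leveled height profile, computed as
# deviations from the profile's mean line.
def roughness(profile):
    n = len(profile)
    mean = sum(profile) / n            # reference (mean) line
    dev = [z - mean for z in profile]  # deviations from the mean line
    ra = sum(abs(d) for d in dev) / n  # arithmetic-average roughness
    rq = (sum(d * d for d in dev) / n) ** 0.5  # RMS roughness
    return ra, rq

# Toy height profile (e.g. micrometers), already leveled:
ra, rq = roughness([0.0, 1.0, -1.0, 2.0, -2.0])
```

The same formulas extend to 2D maps by iterating over all pixels of the topographic map.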
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
50260
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects them to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting individual subjects to advance automatically from protocol to protocol. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
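The quasi-real-time analyses described above operate on time-stamped event records. A toy sketch of one such computation, extracting head-entry latencies after feeding-period onsets (hypothetical event codes, plain Python rather than the authors' MATLAB-based language):

```python
# Compute head-entry latencies from a time-stamped event record,
# where each event is (time in seconds, event code).
def latencies(events, onset_code, entry_code):
    out, t_onset = [], None
    for t, code in events:
        if code == onset_code:
            t_onset = t
        elif code == entry_code and t_onset is not None:
            out.append(t - t_onset)  # latency to first head entry
            t_onset = None           # then wait for the next onset
    return out

# Toy record: two feeding-period onsets, each followed by a head entry.
record = [(0.0, "onset"), (1.5, "entry"), (10.0, "onset"), (12.0, "entry")]
lats = latencies(record, "onset", "entry")
```

Summaries of such per-event measures, harvested several times a day, are what the graphing routines would plot per subject.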
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
51047
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Authors: William R. Brant, Siegbert Schmid, Guodong Du, Helen E. A. Brand, Wei Kong Pang, Vanessa K. Peterson, Zaiping Guo, Neeraj Sharma.
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles.1,2 However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications.3 This research is challenging, and we outline a method to address these challenges using in situ neutron powder diffraction (NPD) to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries. We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the ‘roll-over’ cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
52284
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
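The oriented-texture analysis above relies on a bank of Gabor filters. A minimal sketch of a real-valued Gabor kernel built from the standard formula (parameter names and values are illustrative, not the authors' implementation):

```python
import math

# Real-valued Gabor kernel: a cosine grating of wavelength lam at
# orientation theta, windowed by an isotropic Gaussian of width sigma.
def gabor_kernel(size, sigma, theta, lam):
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates into the filter's orientation
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * xr / lam))
        kernel.append(row)
    return kernel

# One 7x7 kernel at orientation 0; a filter bank would sweep theta.
k = gabor_kernel(size=7, sigma=2.0, theta=0.0, lam=4.0)
```

Convolving the mammogram with such kernels over a range of orientations yields the per-pixel orientation field that the phase-portrait analysis then examines.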
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
50341
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Authors: Tomohiro Kodani, Alex Rodriguez-Palacios, Daniele Corridoni, Loris Lopetuso, Luca Di Martino, Brian Marks, James Pizarro, Theresa Pizarro, Amitabh Chak, Fabio Cominelli.
Institutions: Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis, and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
50843
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can be measured easily and reliably through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants, and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation, and more broadly, their allergic potential.
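The spectrophotometric concentration check follows the Beer-Lambert law, A = εcl, with the stated molar absorptivity ε280 = 4,200 L/M/cm. A minimal sketch of the rearranged calculation (the 1 cm path length and example absorbance are illustrative assumptions, not values from the protocol):

```python
# Concentration (mol/L) from absorbance via Beer-Lambert: c = A / (eps * l).
EPS_280 = 4200.0  # L/(mol*cm), triclosan at 280 nm (from the protocol)

def concentration(absorbance, path_cm=1.0):
    """Return molar concentration for a measured absorbance."""
    return absorbance / (EPS_280 * path_cm)

c = concentration(0.21)  # mol/L, for an example absorbance of 0.21
c_uM = c * 1e6           # the same concentration in micromolar
```

An absorbance of 0.21 in a 1 cm cuvette would thus correspond to roughly 50 µM triclosan.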
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
50671
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded, stained samples, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
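As one concrete example from the semi-automated category, a global intensity threshold can separate stained features from background before surface rendering. A compact pure-Python sketch of Otsu's classic method on a toy "image" (illustrative only, not the authors' custom-designed algorithms):

```python
# Otsu's method: choose the gray-level threshold that maximizes the
# between-class variance, then binarize the image with it.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(levels):
        w_b += hist[t]                 # background pixel count
        if w_b in (0, total):
            continue                   # one class would be empty
        sum_b += t * hist[t]
        m_b = sum_b / w_b              # background mean
        m_f = (total_sum - sum_b) / (total - w_b)  # foreground mean
        var = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal toy data: dark background (~10) and bright features (~200).
img = [10, 12, 11, 9, 200, 198, 202, 201]
t = otsu_threshold(img)
mask = [1 if p > t else 0 for p in img]
```

For real EM volumes this global approach usually needs the manual curation described above, since staining heterogeneity and noise violate the clean bimodal assumption.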
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
51673
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
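After binarization and skeletonization, TATS network components can be quantified by counting neighbors along the skeleton. A toy 2D sketch of end-point and branch-point detection (illustrative, not the published workflow, which operates on real image data with dedicated open-access tools):

```python
# Classify skeleton pixels by 4-connected neighbor count:
# 1 neighbor = end point, >2 neighbors = branch (junction) point.
def skeleton_points(skel):
    h, w = len(skel), len(skel[0])
    ends, branches = [], []
    for y in range(h):
        for x in range(w):
            if not skel[y][x]:
                continue
            n = sum(
                skel[ny][nx]
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= ny < h and 0 <= nx < w
            )
            if n == 1:
                ends.append((y, x))
            elif n > 2:
                branches.append((y, x))
    return ends, branches

# A small T-shaped skeleton: one junction and three end points.
T = [[0, 1, 0],
     [1, 1, 1],
     [0, 0, 0]]
ends, branches = skeleton_points(T)
```

Counts of junctions, end points, and segment lengths derived this way are the kind of network components the quantitative TATS analysis compares across cell types and disease states.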
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
51823
Viral Concentration Determination Through Plaque Assays: Using Traditional and Novel Overlay Systems
Authors: Alan Baer, Kylene Kehn-Hall.
Institutions: George Mason University.
Plaque assays remain one of the most accurate methods for the direct quantification of infectious virions and antiviral substances through the counting of discrete plaques (infectious units and cellular dead zones) in cell culture. Here we demonstrate how to perform a basic plaque assay, and how differing overlays and techniques can affect plaque formation and production. Typically, solid or semisolid overlay substrates, such as agarose or carboxymethyl cellulose, have been used to restrict viral spread, preventing indiscriminate infection through the liquid growth medium. Immobilized overlays restrict cellular infection to the immediately surrounding monolayer, allowing the formation of discrete countable foci and subsequent plaque formation. To overcome the difficulties inherent in using traditional overlays, a novel liquid overlay utilizing microcrystalline cellulose and carboxymethyl cellulose sodium has been increasingly used as a replacement in the standard plaque assay. Liquid overlay plaque assays can be readily performed in either standard 6- or 12-well plate formats as per traditional techniques and require no special equipment. Due to its liquid state and the resulting ease of application and removal, microculture plate formats may alternatively be utilized as a rapid, accurate, and high-throughput alternative to larger-scale viral titrations. Use of an unheated viscous liquid polymer streamlines the work, conserves reagents and incubator space, and increases operational safety when used in traditional or high-containment labs, as no reagent heating or glassware is required. Liquid overlays may also prove more sensitive than traditional overlays for certain heat-labile viruses.
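Plaque counts convert to a viral titer by the standard relation PFU/ml = plaques / (dilution × inoculum volume). A minimal sketch of that arithmetic (the well counts, dilution, and volume below are example values, not data from this protocol):

```python
# Viral titer (PFU/ml) from plaque counts at a given serial dilution.
def titer_pfu_per_ml(plaque_counts, dilution, inoculum_ml):
    mean_plaques = sum(plaque_counts) / len(plaque_counts)
    return mean_plaques / (dilution * inoculum_ml)

# e.g. duplicate wells with 48 and 52 plaques at the 10^-6 dilution,
# 0.1 ml inoculum per well:
t = titer_pfu_per_ml([48, 52], dilution=1e-6, inoculum_ml=0.1)
```

In practice, counts are taken from wells in the countable range (often roughly 20 to 200 plaques) so that the averaged estimate is reliable.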
Virology, Issue 93, Plaque Assay, Virology, Viral Quantification, Cellular Overlays, Agarose, Avicel, Crystal Violet Staining, Serial Dilutions, Rift Valley fever virus, Venezuelan Equine Encephalitis, Influenza
52065
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
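Each stage of the pipeline outputs a rank-ordered list of candidate sequences. A toy sketch of such ranking under a hypothetical per-sequence energy score (the sequences and scores below are invented for illustration; this is not Protein WISDOM's actual energy model or API):

```python
# Rank candidate sequences by a computed score, lower energy = better,
# as in an energy-minimizing sequence selection stage.
def rank_sequences(scored):
    """scored: list of (sequence, energy) pairs; returns best-first."""
    return sorted(scored, key=lambda pair: pair[1])

candidates = [("MKTAYIA", -12.1), ("MKSAYIA", -15.4), ("MKTAFIA", -9.8)]
ranked = rank_sequences(candidates)
best_seq = ranked[0][0]  # lowest-energy candidate
```

The real tool applies analogous rankings successively at the sequence selection, fold specificity, and binding affinity stages.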
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
50476
The Ladder Rung Walking Task: A Scoring System and its Practical Application.
Authors: Gerlinde A. Metz, Ian Q. Whishaw.
Institutions: University of Lethbridge.
Progress in the development of animal models for stroke, spinal cord injury, and other neurodegenerative diseases requires tests of high sensitivity to elaborate distinct aspects of motor function and to detect even subtle loss of movement capacity. To enhance efficacy and resolution of testing, tests should permit qualitative and quantitative measures of motor function and be sensitive to changes in performance during recovery periods. The present study describes a new task to assess skilled walking in the rat, measuring both forelimb and hindlimb function at the same time. Animals are required to walk along a horizontal ladder on which the spacing of the rungs is variable and is periodically changed. Changes in rung spacing prevent animals from learning the absolute and relative location of the rungs and so minimize the ability of the animals to compensate for impairments through learning. In addition, changing the spacing between the rungs allows the test to be used repeatedly in long-term studies. Methods are described for both quantitative and qualitative description of both fore- and hindlimb performance, including limb placing, stepping, and coordination. Furthermore, use of compensatory strategies is indicated by missteps or compensatory steps in response to another limb's misplacement.
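The quantitative side of such scoring typically reduces to an error rate per limb, i.e. missteps over steps taken. A deliberately simplified sketch (the published scale rates each placement on an ordinal category scale; here each step is collapsed to a binary misstep/correct code for illustration):

```python
# Fraction of missteps per limb from per-step codes, where each step
# is True for a misstep and False for a correct rung placement.
def error_rate(steps):
    return sum(steps) / len(steps)

# Toy forelimb record: 2 missteps in 5 steps along the ladder.
forelimb = [False, True, False, False, True]
rate = error_rate(forelimb)
```

Comparing such rates across limbs and across testing sessions is how impairment and recovery would be tracked quantitatively.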
Neuroscience, Issue 28, rat, animal model of walking, skilled movement, ladder test, rung test, neuroscience
1204
Brain Imaging Investigation of the Memory-Enhancing Effect of Emotion
Authors: Andrea Shafer, Alexandru Iordan, Roberto Cabeza, Florin Dolcos.
Institutions: University of Alberta, University of Illinois, Urbana-Champaign, Duke University, University of Illinois, Urbana-Champaign.
Emotional events tend to be better remembered than non-emotional events1,2. One goal of cognitive and affective neuroscientists is to understand the neural mechanisms underlying this enhancing effect of emotion on memory. A method that has proven particularly influential in the investigation of the memory-enhancing effect of emotion is the so-called subsequent memory paradigm (SMP). This method was originally used to investigate the neural correlates of non-emotional memories3, and more recently we and others also applied it successfully to studies of emotional memory (reviewed in4, 5-7). Here, we describe a protocol that allows investigation of the neural correlates of the memory-enhancing effect of emotion using the SMP in conjunction with event-related functional magnetic resonance imaging (fMRI). An important feature of the SMP is that it allows separation of brain activity specifically associated with memory from more general activity associated with perception. Moreover, in the context of investigating the impact of emotional stimuli, SMP allows identification of brain regions whose activity is susceptible to emotional modulation of both general/perceptual and memory-specific processing. This protocol can be used in healthy subjects8-15, as well as in clinical patients where there are alterations in the neural correlates of emotion perception and biases in remembering emotional events, such as those suffering from depression and post-traumatic stress disorder (PTSD)16, 17.
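The defining computation of the SMP is sorting encoding trials by later memory outcome and contrasting their brain activity (the "Dm effect"). A toy sketch of that subsequent-memory contrast (the activation values are invented for illustration; real analyses model fMRI time series per voxel or region):

```python
# Subsequent-memory (Dm) contrast: mean encoding activity for items
# later remembered minus mean activity for items later forgotten.
def dm_effect(trials):
    rem = [a for a, remembered in trials if remembered]
    forg = [a for a, remembered in trials if not remembered]
    return sum(rem) / len(rem) - sum(forg) / len(forg)

# (encoding activation, later remembered?) for four toy trials
trials = [(1.2, True), (0.8, True), (0.5, False), (0.3, False)]
dm = dm_effect(trials)
```

Computing this contrast separately for emotional and neutral items is what isolates the emotional modulation of memory-specific processing described above.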
Neuroscience, Issue 51, Affect, Recognition, Recollection, Dm Effect, Neuroimaging
2433
Cross-Modal Multivariate Pattern Analysis
Authors: Kaspar Meyer, Jonas T. Kaplan.
Institutions: University of Southern California.
Multivariate pattern analysis (MVPA) is an increasingly popular method of analyzing functional magnetic resonance imaging (fMRI) data1-4. Typically, the method is used to identify a subject's perceptual experience from neural activity in certain regions of the brain. For instance, it has been employed to predict the orientation of visual gratings a subject perceives from activity in early visual cortices5 or, analogously, the content of speech from activity in early auditory cortices6. Here, we present an extension of the classical MVPA paradigm, according to which perceptual stimuli are not predicted within, but across sensory systems. Specifically, the method we describe addresses the question of whether stimuli that evoke memory associations in modalities other than the one through which they are presented induce content-specific activity patterns in the sensory cortices of those other modalities. For instance, seeing a muted video clip of a glass vase shattering on the ground automatically triggers in most observers an auditory image of the associated sound; is the experience of this image in the "mind's ear" correlated with a specific neural activity pattern in early auditory cortices? Furthermore, is this activity pattern distinct from the pattern that could be observed if the subject were, instead, watching a video clip of a howling dog? In two previous studies7,8, we were able to predict sound- and touch-implying video clips based on neural activity in early auditory and somatosensory cortices, respectively. Our results are in line with a neuroarchitectural framework proposed by Damasio9,10, according to which the experience of mental images that are based on memories - such as hearing the shattering sound of a vase in the "mind's ear" upon seeing the corresponding video clip - is supported by the re-construction of content-specific neural activity patterns in early sensory cortices.
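At its core, MVPA trains a classifier on activity patterns from one set of conditions and tests it on another. A minimal nearest-centroid sketch of cross-modal prediction (toy two-voxel patterns and invented condition labels; the published studies used real fMRI patterns and standard machine-learning classifiers):

```python
# Train: average each class's patterns into a centroid.
# Test: assign a new pattern to the nearest centroid (Euclidean).
def centroids(train):
    cents = {}
    for label, pats in train.items():
        n = len(pats)
        cents[label] = [sum(v) / n for v in zip(*pats)]
    return cents

def classify(pattern, cents):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(cents, key=lambda lbl: dist2(pattern, cents[lbl]))

# Train on toy auditory-cortex patterns evoked by heard sounds...
train = {"shatter": [[1.0, 0.1], [0.9, 0.2]],
         "bark":    [[0.1, 1.0], [0.2, 0.8]]}
cents = centroids(train)
# ...then test on a pattern evoked by a muted, sound-implying video clip:
pred = classify([0.8, 0.3], cents)
```

Above-chance cross-modal prediction of this kind is what supports the claim that sensory cortices carry content-specific patterns for merely imagined stimuli.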
Neuroscience, Issue 57, perception, sensory, cross-modal, top-down, mental imagery, fMRI, MRI, neuroimaging, multivariate pattern analysis, MVPA
3307
Use of Rotorod as a Method for the Qualitative Analysis of Walking in Rat
Authors: Ian Q. Whishaw, Katie Li, Paul A. Whishaw, Bogdan Gorny, Gerlinde A. Metz.
Institutions: University of Lethbridge.
High-speed video analysis of the details of movement can provide a source of information about qualitative aspects of walking movements. When walking on a rotorod, animals remain in approximately the same place, making repetitive stepping movements. Thus the task provides a rich source of information on the details of foot stepping movements. Subjects were hemi-Parkinson analogue rats, produced by injection of 6-hydroxydopamine (6-OHDA) into the right nigrostriatal bundle to deplete nigrostriatal dopamine (DA). The present report provides a video-analysis illustration of animals that were previously filmed from frontal, lateral, and posterior views as they walked (15). Rating scales and frame-by-frame replay of the video records of stepping behavior indicated that the hemi-Parkinson rats were chronically impaired in posture and limb use contralateral to the DA depletion. The contralateral limbs participated less in initiating and sustaining propulsion than the ipsilateral limbs. These deficits secondary to unilateral DA depletion show that the rotorod provides a useful task for the analysis of stepping movements.
Neuroscience, Issue 22, Rat walking, gait analysis, rotorod, rat forelimb, Parkinson disease model, dopamine depletion
1030
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.