Dichotic Listening

JoVE 10101

Source: Laboratory of Jonathan Flombaum—Johns Hopkins University

The human ability to process incoming stimuli is limited, yet the world is complicated and there are always many things going on at once. Selective attention is the mechanism that allows humans and other animals to control which stimuli get processed and which are ignored. Think of a cocktail party: a person could not possibly attend to all of the conversations taking place at once, yet everyone can selectively listen to one conversation, letting the rest fade into unattended background noise. To study how people do this, researchers simulate a more controlled cocktail-party environment by presenting sounds to participants dichotically, i.e., by playing a different sound to each ear at the same time. This experiment demonstrates standard procedures for investigating selective auditory attention with this approach, known as the dichotic listening paradigm.
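As a rough illustration of the dichotic presentation itself (not the specific stimuli or software used in this protocol), the sketch below builds a stereo buffer in Python whose left and right channels carry different signals, so each ear receives its own stream; the tone frequencies and the sounddevice output call are illustrative assumptions.

# Minimal sketch of one dichotic trial: two different signals are written
# into the left and right channels of a single stereo buffer, so each ear
# hears a different stream at the same time. Carrier frequencies and the
# sounddevice playback call are illustrative choices, not the JoVE stimuli.
import numpy as np
import sounddevice as sd  # pip install sounddevice

fs = 44100                                   # sample rate (Hz)
dur = 2.0                                    # trial duration (s)
t = np.arange(int(fs * dur)) / fs

left = 0.5 * np.sin(2 * np.pi * 440 * t)     # stream for the left ear
right = 0.5 * np.sin(2 * np.pi * 660 * t)    # different stream for the right ear

stereo = np.column_stack([left, right]).astype(np.float32)
sd.play(stereo, fs)                          # column 0 -> left ear, column 1 -> right ear
sd.wait()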


 Cognitive Psychology

Language: The N400 in Semantic Incongruity

JoVE 10275

Source: Laboratories of Sarah I. Gimbel and Jonas T. Kaplan—University of Southern California

Understanding language is one of the most complex cognitive tasks humans perform. Given the enormous number of possible ways individual words can be combined to form meaning in sentences, it is crucial that the brain can identify when words form coherent combinations and when an anomaly undermines meaning. Extensive research has shown that certain scalp-recorded electrical events are sensitive to deviations from this kind of expectation. Importantly, these electrical signatures of incongruity are specific to unexpected meanings and therefore differ from the brain's general responses to other kinds of anomalies. The neurophysiological correlates of semantic incongruity have been examined experimentally with paradigms that present semantically congruent and incongruent sentence endings. Originally introduced in 1980, the semantic incongruity task presents the participant with a series of sentences that end with either a congruent or an incongruent word. To verify that the response reflects semantic incongruity rather than surprise in general, some sentences end with words presented in a different size.1
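As a hedged sketch of how such a trial list might be organized (the sentence frames and condition labels below are illustrative, not the study's actual stimuli), each frame is paired with a congruent ending, a semantically incongruent ending, and a congruent ending shown in a larger font as the non-semantic control described above.

# Toy trial list for a semantic-incongruity ERP task. Sentences, condition
# names, and font sizes are illustrative assumptions, not the published set.
import random

frames = [
    ("He spread the warm bread with", "butter", "clouds"),
    ("I take my coffee with cream and", "sugar", "socks"),
]

trials = []
for frame, congruent, incongruent in frames:
    trials.append({"sentence": frame, "ending": congruent,
                   "condition": "congruent", "font_size": 24})
    trials.append({"sentence": frame, "ending": incongruent,
                   "condition": "incongruent", "font_size": 24})
    trials.append({"sentence": frame, "ending": congruent,
                   "condition": "physical_deviant", "font_size": 48})

random.shuffle(trials)                        # randomize presentation order
for trial in trials:
    print(trial["sentence"], trial["ending"], trial["condition"])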


 Neuropsychology

Utilizing Repetitive Transcranial Magnetic Stimulation to Improve Language Function in Stroke Patients with Chronic Non-fluent Aphasia

1Department of Neurology, Perelman School of Medicine, University of Pennsylvania, 2Center for Cognitive Neuroscience, University of Pennsylvania, 3Veterans Affairs Boston Healthcare System, 4Harold Goodglass Aphasia Research Center, Boston University School of Medicine, 5Department of Neurology, Boston University School of Medicine

JoVE 50228


 Medicine

A Familiarization Protocol Facilitates the Participation of Children with ASD in Electrophysiological Research

1Department of Communication Sciences and Disorders, Southern Connecticut State University, 2Haskins Laboratories, 3Department of Psychology, Southern Connecticut State University, 4Department of Social Work, Southern Connecticut State University, 5Department of Psychology, University of Connecticut

JoVE 55941


 Neuroscience

An Introduction to Cognition

JoVE 5419

Cognition encompasses mental processes such as memory, perception, decision-making, reasoning, and language. Cognitive scientists use a combination of behavioral and neuropsychological techniques to investigate the neural substrates underlying cognition. They are interested in understanding how information is perceived and processed, and how it ultimately shapes the execution of behavior. With this knowledge, researchers hope to develop new treatments for individuals with cognitive impairments. JoVE's introduction to cognition reviews several components of this phenomenon, including perception, attention, and language comprehension. Key questions in the field of cognition are discussed, along with specific methods currently used to answer them. Finally, specific studies that investigate different aspects of cognition using tools such as functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS) are explained.


 Behavioral Science

Optogenetic Stimulation of the Auditory Nerve

1InnerEarLab, Department of Otolaryngology, University Medical Center Goettingen, 2Bernstein Focus for Neurotechnology, University of Goettingen, 3Auditory Systems Physiology Group, Department of Otolaryngology, University Medical Center Goettingen, 4Center for Nanoscale Microscopy and Molecular Physiology of the Brain, University of Goettingen, 5Department of Chemical, Electronic, and Biomedical Engineering, University of Guanajuato

JoVE 52069


 Neuroscience

Simultaneous Event-Related Brain Potential Recordings in Pairs of Partners: Assessing the Sensitivity of the Brain to the Percepts of Others

1Douglas Mental Health University Institute, 2Department of Psychiatry, McGill University, 3Department of Neurology and Neurosurgery, McGill University, 4Department of Psychology, McGill University


JoVE 56120


 JoVE In-Press

Results below contain some, but not all of your search terms.

Assessing Working Memory in Children: The Comprehensive Assessment Battery for Children – Working Memory (CABC-WM)

1Communication Sciences and Disorders, MGH Institute of Health Professions, 2Speech and Hearing Science, Arizona State University, 3Speech, Language, and Hearing Sciences, University of Arizona, 4Department of Psychological Sciences, University of Missouri-Columbia, 5Sanford School of Social and Family Dynamics, Arizona State University, 6School of Social and Behavioral Sciences, New College of Interdisciplinary Arts and Sciences, Arizona State University - West

JoVE 55121


 Behavior


Ultrasound Images of the Tongue: A Tutorial for Assessment and Remediation of Speech Sound Errors

1Department of Communication Sciences and Disorders, Syracuse University, 2Haskins Laboratories, 3Department of Communicative Sciences and Disorders, New York University, 4Department of Communication Sciences and Disorders, University of Cincinnati, 5Program in Speech-Language-Hearing Sciences, City University of New York Graduate Center, 6Department of Linguistics, Yale University

JoVE 55123


 Behavior


The McGurk Effect

JoVE 10295

Source: Laboratory of Jonathan Flombaum—Johns Hopkins University

Spoken language, a singular human achievement, relies heavily on specialized perceptual mechanisms. One important feature of language perception is that it draws simultaneously on auditory and visual information. This makes sense, because until modern times a person could expect most language to be heard in face-to-face interactions. And because producing specific speech sounds requires precise articulation, the mouth can supply good visual information about what someone is saying. In fact, with an up-close and unobstructed view of a speaker's face, the mouth can often supply a clearer visual signal than the auditory signal carried by the speech itself. As a result, the human brain favors visual input and uses it to resolve the inherent ambiguity of spoken language. This reliance on visual input to interpret sound was described by Harry McGurk and John Macdonald in a 1976 paper called Hearing lips and seeing voices.1 In that paper, they described an illusion that arises from a mismatch between a sound recording and a video recording, which has become known as the McGurk effect. This video will demonstrate how to produce and interpret the McGurk effect.
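A minimal sketch of how a McGurk-style stimulus could be assembled, assuming a video of a speaker articulating one syllable and an audio recording of a different syllable; the file names are placeholders, and ffmpeg is assumed to be installed separately.

# Pair the video stream of one articulation with the audio of a different
# syllable to create the audiovisual mismatch that drives the illusion.
import subprocess

video_in = "speaker_ga.mp4"      # placeholder: visual articulation of "ga"
audio_in = "speech_ba.wav"       # placeholder: audio recording of "ba"
stimulus = "mcgurk_stimulus.mp4"

subprocess.run([
    "ffmpeg", "-y",
    "-i", video_in,              # first input: the video clip
    "-i", audio_in,              # second input: the mismatched audio
    "-map", "0:v:0", "-map", "1:a:0",
    "-c:v", "copy",              # keep the original video encoding
    "-shortest",                 # trim to the shorter of the two streams
    stimulus,
], check=True)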


 Sensation and Perception


An Experimental Protocol for Assessing the Performance of New Ultrasound Probes Based on CMUT Technology in Application to Brain Imaging

1Department of Electrical, Computer and Biomedical Engineering, University of Pavia, 2Department of Information Engineering, University of Florence, 3Department of Engineering, Roma Tre University, 4FTMTR&D/SPA, STMicroelectronics, 5Brain Connectivity Center, BCC, Istituto Neurologico Nazionale Fondazione C. Mondino I.R.C.C.S., 6Department of Molecular Medicine - Unit of Pathology, University of Pavia, Foundation IRCCS Policlinico San Matteo

JoVE 55798


 Bioengineering


Recording Human Electrocorticographic (ECoG) Signals for Neuroscientific Research and Real-time Functional Cortical Mapping

1Wadsworth Center, New York State Department of Health, 2Department of Neurology, Albany Medical College, 3Department of Neurosurgery, Albany Medical College, 4Department of Neurosurgery, Washington University, 5Department of Biomed. Eng., Rensselaer Polytechnic Institute, 6Department of Biomed. Sci., State University of New York at Albany, 7Department of Elec. and Comp. Eng., University of Texas at El Paso

JoVE 3993


 Neuroscience


Mutual Exclusivity: How Children Learn the Meanings of Words

JoVE 10132

Source: Laboratories of Nicholaus Noles and Judith Danovitch—University of Louisville

Humans differ from other animals in many ways, but perhaps the most important differentiating factor is their ability to use language. Other animals can communicate and even understand and use language in limited ways, but teaching human language to a chimp or a dog takes a great deal of time and effort. In contrast, young humans acquire their native language easily, and they learn linguistic rules without explicit instruction, an accomplishment that even the smartest animals cannot match. One advantage young humans have over other animals is that the human brain is especially adapted to learn new words: with only a few exposures, young children can learn new words and remember them. Perhaps more impressively, children can use what they already know to guide their future learning. For example, children treat objects as if they have only one label, so if a child has learned the word hammer, they won't assume an unfamiliar tool has the same name. This is the principle of mutual exclusivity.1-2 This video demonstrates children's ability to use mutual exclusivity to match words to objects in their environment.
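The inference itself can be made concrete with a toy sketch (an illustration of the principle, not the study's procedure): given the labels a learner already knows, a novel word is mapped onto the object in view that has no known label.

# Toy model of the mutual-exclusivity inference: a novel word is assigned
# to the one unlabeled object in view, assuming each object has one name.
def infer_referent(novel_word, objects_in_view, known_labels):
    unlabeled = [obj for obj in objects_in_view
                 if obj not in known_labels.values()]
    return unlabeled[0] if len(unlabeled) == 1 else None  # ambiguous otherwise

known = {"hammer": "hammer_object"}            # the child already knows "hammer"
scene = ["hammer_object", "novel_tool"]        # a familiar and an unfamiliar tool
print(infer_referent("dax", scene, known))     # -> "novel_tool"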


 Developmental Psychology


Creating Objects and Object Categories for Studying Perception and Perceptual Learning

1Brain and Behavior Discovery Institute, Georgia Health Sciences University, 2Vision Discovery Institute, Georgia Health Sciences University, 3Department of Opthalmology, Georgia Health Sciences University, 4Intelligent Systems Laboratory, Palo Alto Research Center, 5Pattern Recognition Systems, Palo Alto Research Center, 6Department of Psychology, University of Minnesota

JoVE 3358


 Neuroscience
