Frame-by-Frame Video Analysis of Idiosyncratic Reach-to-Grasp Movements in Humans

JoVE Journal | Behavior

8,625 Views
10:51 min
January 15, 2018

DOI: 10.3791/56733-v

Jenni M. Karl (1), Jessica R. Kuntz (2), Layne A. Lenhart (2), Ian Q. Whishaw (2)

(1) Department of Psychology, Thompson Rivers University; (2) Department of Neuroscience, University of Lethbridge

Summary

This protocol describes how to use frame-by-frame video analysis to quantify idiosyncratic reach-to-grasp movements in humans. A comparative analysis of reaching in sighted versus unsighted healthy adults is used to demonstrate the technique, but the method can also be applied to the study of developmental and clinical populations.

Transcript

The overall goal of this procedure is to quantify the temporal organization, kinematic structure, and topographical features of idiosyncratic reach-to-grasp movements. This method can be used to answer key questions related to the neurobehavioral organization of hand movements in infants, brain-injured patients, and non-human primates, who can be difficult to study with automated motion tracking techniques. The main advantage of this technique is that it is highly reliable, inexpensive, and unobtrusive, yet easily modified to suit a number of research goals.

Demonstrating the procedure today will be Alexis Wilson and Marisa Bertoli, who are research assistants in my laboratory. Begin by selecting blueberries, donut balls, and orange slices to serve as reaching targets. Escort the participant into the testing room and inform them that they will complete a total of 60 reaching trials, separated into four blocks of 15 trials each.
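For bookkeeping, the session layout can be written out ahead of time. The sketch below only illustrates the 60-trial, four-block structure described above; the protocol does not say how the three target types are assigned to trials, so they are interleaved here purely for the example, and the field names are invented.

```python
# Hypothetical layout of the session: 60 trials in 4 blocks of 15.
# Target assignment is interleaved purely for illustration.
from itertools import cycle

TARGETS = ["blueberry", "donut ball", "orange slice"]
N_BLOCKS, TRIALS_PER_BLOCK = 4, 15

def build_trial_schedule():
    target_stream = cycle(TARGETS)
    return [
        {"block": block, "trial": trial, "target": next(target_stream)}
        for block in range(1, N_BLOCKS + 1)
        for trial in range(1, TRIALS_PER_BLOCK + 1)
    ]

schedule = build_trial_schedule()
print(len(schedule), "trials; first trial:", schedule[0])
```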

Next, adjust the height of a self-standing, height-adjustable pedestal to the seated participant's trunk length, so that the top of the pedestal stands midway between the top of the participant's hip and the participant's sternum. Then, position one high-speed video camera sagittal to the participant, on the same side as the participant's non-dominant hand and one meter from the pedestal, to record a reach-side view of the participant's dominant hand. Following that, place a second video camera one meter in front of the pedestal to capture a front-on view of the participant.

Instruct the participant to begin each reaching trial with their hands opened, relaxed, and resting palm down on the dorsum of their upper thighs. Tell the participant that at the beginning of each trial, the experimenter will place a target object on the pedestal and that they should wait until the experimenter provides a verbal one, two, three, go command to reach out with their dominant hand, grasp the target object, and then place the target object in their mouth as if they were going to eat it. Next, begin data collection by quickly tapping the top central surface of the pedestal with your index finger to serve as a time cue on all videos.

Finally, place the first target object on the pedestal and use a one, two, three, go cue to signal to the participant to perform the reaching trial. Begin by opening the video files in the video editing software program. Use the arrow keys on the keyboard to navigate to the video frame that depicts the moment the experimenter taps the top of the pedestal with her index finger.

Then, use the trim function in the video editing software to remove all frames prior to the current frame. Next, use the export function in the video editing software to save the trimmed version of each video record to a secure location on the computer's hard drive. Finally, select and drag all the newly trimmed video records for a single participant into separate timelines in the video editing software to perform time-synchronized frame-by-frame video analysis.
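The trimming itself is done in the video editing software, but the same step can be scripted. A minimal sketch, assuming OpenCV (cv2) is available and that the tap frame for each camera has already been identified by eye; the file names and frame number are hypothetical.

```python
# Programmatic alternative to the GUI trim step: write a copy of each video
# that starts at the experimenter's tap, so that frame 0 is the shared time cue.
import cv2

def trim_from_tap(in_path: str, out_path: str, tap_frame: int) -> None:
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    cap.set(cv2.CAP_PROP_POS_FRAMES, tap_frame)  # drop everything before the tap
    ok, frame = cap.read()
    while ok:
        writer.write(frame)
        ok, frame = cap.read()
    cap.release()
    writer.release()

# Hypothetical usage: trim_from_tap("p01_side.mp4", "p01_side_trimmed.mp4", tap_frame=212)
```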

Begin by identifying the frame number of movement start, which is defined as the first visible lifting of the palm of the hand away from the dorsum of the upper thigh. Then, identify the frame number of collection, which is the formation of a closed hand posture in which the digits maximally flex and close. Next, identify the frame number of maximum height, which is the maximum height of the most proximal knuckle of the index finger as the hand reaches towards the target object.

Following that, identify the frame number of peak aperture, which is the maximum opening of the hand that occurs after collection, but prior to first contact with the target. Identify the frame number of first contact between the hand and the target object. Finally, identify the frame number that corresponds to the moment at which all manipulation of the target object is complete and the participant has a firm hold on the target object, which is referred to as final grasp.
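Once these six frame numbers are logged for a trial, the temporal measures follow directly from the camera's frame rate. A minimal sketch of that arithmetic; the 60 fps rate and the frame numbers below are illustrative, not values from the study.

```python
# Convert logged event frames into times and durations.
FPS = 60  # assumed recording rate; substitute the camera's actual frame rate

def frames_to_seconds(n_frames: int, fps: float = FPS) -> float:
    return n_frames / fps

# Frame numbers counted from the trimmed video (frame 0 = the tap cue); illustrative only.
events = {
    "movement_start": 48,
    "collection": 66,
    "maximum_height": 81,
    "peak_aperture": 95,
    "first_contact": 104,
    "final_grasp": 126,
}

durations = {
    "reach_s": frames_to_seconds(events["first_contact"] - events["movement_start"]),
    "grasp_s": frames_to_seconds(events["final_grasp"] - events["first_contact"]),
    "total_s": frames_to_seconds(events["final_grasp"] - events["movement_start"]),
}
print(durations)  # e.g. {'reach_s': 0.933..., 'grasp_s': 0.366..., 'total_s': 1.3}
```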

Begin by opening the still frame image that depicts the key behavioral event of collection in the photo editing software. Select the ruler tool and use it to draw a straight line between the central tip of the thumb and the central tip of the index finger. Record the length of this line as the collection distance in the spreadsheet.

Next, open the still frame image that depicts maximum height in the photo editing software. Use the ruler tool to measure the vertical distance between the top of the pedestal and the top of the participant's index knuckle. Record the length of this line as the maximum height distance in the spreadsheet.

Open the peak aperture image in the photo editing software and use the ruler tool to measure the distance between the central tip of the thumb and the central tip of the index finger. Record the length of this line as the peak aperture distance in the spreadsheet. Next, open the first contact image and use the ruler tool to measure the distance between the central tip of the thumb and the central tip of the index finger.

Record the length of this line as the first contact aperture distance in the spreadsheet. Finally, open the still frame image that depicts final grasp in the photo editing software. Use the ruler tool to measure the distance between the central tip of the thumb and the central tip of the index finger.

Record the length of this line as the final grasp aperture distance in the spreadsheet. While performing the frame-by-frame video analysis, document in the spreadsheet which part of the hand is used to make first contact with the target for each trial for each participant. Determine first contact points by exporting a still-frame image of the target, opening it in the photo editing software, and using the program's paintbrush tool to mark the location on the target at which first contact between the hand and the target was made for each trial.
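The ruler-tool readings are in image units. If the same distances are instead computed from marked pixel coordinates, the calculation is a simple Euclidean distance, and an object of known size in the frame can be used to convert to centimeters. A minimal sketch; the coordinates and the 10 cm calibration object are hypothetical, not part of the protocol.

```python
# Distance between two marked points (e.g., thumb tip and index tip) in pixels,
# plus an optional conversion to cm via a calibration object of known length.
from math import hypot

def pixel_distance(p1, p2):
    return hypot(p2[0] - p1[0], p2[1] - p1[1])

def to_cm(pixels, pixels_per_cm):
    return pixels / pixels_per_cm

# Hypothetical calibration: a 10 cm marker spanning these two image points.
pixels_per_cm = pixel_distance((100, 480), (460, 480)) / 10.0

# Hypothetical thumb-tip and index-tip coordinates on the peak-aperture frame.
thumb_tip, index_tip = (512, 300), (548, 262)
peak_aperture_px = pixel_distance(thumb_tip, index_tip)
print(f"peak aperture: {to_cm(peak_aperture_px, pixels_per_cm):.2f} cm")
```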

Then, determine the grasp points by using the program's paintbrush tool to mark the location on the target at which the hand contacts the target at the time of final grasp for each trial. Next, determine the number of adjustments made on each trial by inspecting the video record and noting any instances in which the participant released and reestablished contact with the target between the frame of first contact and the frame of final grasp. Then, characterize the grip type as a pincer grip, a precision grip, or a power grip.
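Each trial's scores end up as one row of the spreadsheet. A minimal sketch of how such a row might be represented and written out; the field names are chosen for the example and are not taken from the protocol.

```python
# One row per trial; field names are illustrative.
from dataclasses import dataclass, asdict
import csv

@dataclass
class TrialScore:
    participant: str
    block: int
    trial: int
    condition: str           # e.g., "vision" or "no vision"
    first_contact_part: str  # part of the hand making first contact
    n_adjustments: int       # releases and re-contacts before final grasp
    grip_type: str           # "pincer", "precision", or "power"
    grasp_strategy: str

def write_scores(rows, path="scores.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(rows[0])))
        writer.writeheader()
        writer.writerows(asdict(r) for r in rows)
```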

Finally, determine the grasp strategy used on each trial and record it in the spreadsheet. Results indicated that when participants can use vision to preemptively identify both the location and the size of a target object, they integrate the reach and the grasp into a single, seamless prehensile act. When vision is unavailable, however, they dissociate the two movements so that tactile feedback can be used to direct the hand first in relation to the location and then to the shape of the target.

Additionally, the hand takes a more elevated approach to the target and thus achieves a greater maximum height in the no vision condition compared to the vision condition. In the no vision condition, the location of first contact and the part of the hand to make first contact with the target varies greatly. This contrasts with the vision condition, in which participants generally used the index finger and/or thumb to make first contact with opposite sides of the target.

In the no vision condition, the aperture of the hand does not pre-shape to the size of the target at either peak aperture or at first contact. Yet hand aperture at final grasp is identical in the vision and no vision conditions. Variations of this protocol have enhanced our understanding of reach-to-grasp behavior in unsighted adults, human infants, brain injured patients, and in non-human primates by providing insight into both the neural and the behavioral organization of prehension and also by providing support for the Dual Visuomotor Channel Theory of reaching.
