
A Flexible Platform for Monitoring Cerebellum-Dependent Sensory Associative Learning

Published: January 19, 2022 doi: 10.3791/63205

Summary

We have developed a single platform to track animal behavior during two climbing fiber-dependent associative learning tasks. The low-cost design allows integration with optogenetic or imaging experiments directed towards climbing fiber-associated cerebellar activity.

Abstract

Climbing fiber inputs to Purkinje cells provide instructive signals critical for cerebellum-dependent associative learning. Studying these signals in head-fixed mice facilitates the use of imaging, electrophysiological, and optogenetic methods. Here, a low-cost behavioral platform (~$1000) was developed that allows tracking of associative learning in head-fixed mice that locomote freely on a running wheel. The platform incorporates two common associative learning paradigms: eyeblink conditioning and delayed tactile startle conditioning. Behavior is tracked using a camera, and wheel movement is tracked with a rotary encoder. We describe the components and setup and provide a detailed protocol for training and data analysis. This platform allows the incorporation of optogenetic stimulation and fluorescence imaging. The design allows a single host computer to control multiple platforms for training multiple animals simultaneously.

Introduction

Pavlovian conditioning of a sub-second association between stimuli to elicit a conditioned response has long been used to probe cerebellum-dependent learning. For example, in classical delay eyeblink conditioning (DEC), animals learn to make a well-timed protective blink in response to a neutral conditional stimulus (CS; e.g., a flash of light or auditory tone) when it is paired repeatedly with an unconditional stimulus (US; e.g., a puff of air applied to the cornea) that always elicits a reflex blink and that arrives at or near the end of the CS. The learned response is referred to as a conditioned response (CR), while the reflex response is referred to as the unconditioned response (UR). In rabbits, cerebellum-specific lesions disrupt this form of learning1,2,3,4. Further, Purkinje cell complex spikes, driven by their climbing fiber inputs5, provide a necessary6,7 and sufficient8,9 signal for the acquisition of properly-timed CRs.

More recently, climbing fiber-dependent associative learning paradigms have been developed for head-fixed mice. DEC was the first associative learning paradigm to be adapted to this configuration10,11. DEC in head-fixed mice has been used to identify cerebellar regions11,12,13,14,15,16,17 and circuit elements11,12,13,14,15,18,19 that are required for task acquisition and extinction. This approach has also been used to demonstrate how the cellular-level physiological representation of task parameters evolves with learning13,15,16.

In addition to eyeblink conditioning, the delayed tactile startle conditioning (DTSC) paradigm was recently developed as a novel associative learning task for head-fixed mice20. Conceptually similar to DEC, DTSC involves the presentation of a neutral CS paired with a US, a tap to the face intense enough to engage a startle reflex21,22 as the UR. In the DTSC paradigm, both the UR and CR are read out as backward locomotion on a wheel. DTSC has now been used to uncover how associative learning alters cerebellar activity and patterns of gene expression20.

In this work, a method was developed for flexibly applying DEC or DTSC in a single platform. The stimulus and platform attributes are schematized in Figure 1. The design incorporates the capacity to track animal behavior with a camera as well as a rotary encoder to track mouse locomotion on a wheel. All aspects of data logging and trial structure are controlled by paired microcontrollers (Arduino) and a single-board computer (SBC; Raspberry Pi). These devices can be accessed through a provided graphical user interface. Here, we present a workflow for setup, experiment preparation and execution, and a customized analysis pipeline for data visualization.


Protocol

The animal protocols described here have been approved by the Animal Care and Use Committees of Princeton University.

1. Setting up the SBC

  1. Connect the camera serial interface (CSI) cable to the Raspberry NoIR V2 camera and the camera port on the SBC.
  2. Download the operating system for the SBC onto the host computer. Write the operating system image to a micro secure digital (microSD) card.
    NOTE: Detailed instructions for these procedures for a Raspberry Pi SBC can be found elsewhere23. The system has been tested using the following operating systems: Stretch, Buster, Bullseye.
  3. To enable secure shell communication, create an extensionless file called "ssh" in the boot partition of the microSD card. Once this is done, eject the microSD card from the host machine and insert it into the SBC microSD card slot. Power the SBC by plugging in its power supply.
  4. Prepare the SBC to accept a wired connection to the host.
    1. Attach a monitor with an appropriate cable to the SBC. Open a terminal, type the command ifconfig and record the ethernet IP address of the SBC.
      NOTE: Raspberry Pi model 3B+ has an HDMI display port, while model 4B has a micro-HDMI port.
    2. Go to the Interfaces tab of the Raspberry Pi configuration settings and enable the options for Camera, secure shell network protocol (SSH), and Virtual Network Computing (VNC).
  5. Establish a wired connection between the host computer and the SBC.
    1. Connect one ethernet cable to the ethernet port on the SBC and another to the host computer. Attach the other ends of these cables to an ethernet switch.
    2. Use a virtual network computing client such as VNC viewer24 and access the desktop using the SBC IP address and the default authentication (user = "pi", password = "raspberry").
  6. Download required software included in the protocol steps.
    CAUTION: Change the default username and password to prevent unauthorized access to the SBC.
    1. Enter the following command in the SBC terminal to download the rig software:
      git clone --depth=1 https://github.com/gerardjb/assocLearnRig
    2. Enter the following commands to download the necessary python libraries.
      cd assocLearnRig
      python3 setup.py
    3. To allow direct control over the microcontroller, connect to the SBC and download the microcontroller integrated development environment (IDE) following steps 1.6.4-1.6.7.
    4. Open the web browser on the SBC desktop and navigate to https://arduino.cc/en/software. Download the latest Linux ARM 32 bit version of the IDE.
    5. Open a terminal window on the SBC desktop and navigate to the downloads directory by typing cd Downloads/
    6. To install the IDE, type the following commands in the terminal:
      tar -xf arduino-<version>-linuxarm.tar.xz
      sudo mv arduino-<version> /opt
      sudo /opt/arduino-<version>/install.sh

      (here <version> is the version of the downloaded IDE)
    7. Open an instance of the microcontroller IDE on the SBC desktop. Select menu option Tools > Manage Libraries. Install the "Encoder" library from Paul Stoffregen.
  7. Expand SBC onboard memory with a USB thumb drive.
    1. Insert a thumb drive into a USB port on the SBC. Use a USB 3.0 port if available.
    2. Type in the terminal ls -l /dev/disk/by-uuid/ to find the thumb drive and its unique reference (UUID). Record the UUID.
    3. To allow the pi user to write to the USB device, type the following commands one by one into the terminal:
      sudo mkdir /media/usb
      sudo chown -R pi:pi /media/usb
      sudo mount /dev/sda1 /media/usb -o uid=pi,gid=pi
      NOTE: The thumb drive can be added as a device that will auto-mount when the SBC restarts by adding the following line to the end of the fstab file at /etc/fstab:
      ​UUID=<UUID from step 1.7.2> /media/usb vfat auto,nofail,noatime,users,rw,uid=pi,gid=pi 0 0

2. Wiring stimulus hardware and assembling stage

  1. Connect and prepare microcontrollers.
    1. Connect the SBC to the programming port of the microcontroller (Arduino Due) with a USB2 type A to USB2 micro cable.
      ​NOTE: Use a high-quality cable such as the product in the Table of Materials to ensure proper operation.
    2. Locate "dueAssocLearn.ino" in the downloaded project repository. Open the sketch with the microcontroller IDE and upload it to the microcontroller connected to the SBC.
    3. Download and install the appropriate version of the Arduino IDE on the host computer.
    4. Connect the host computer to the microcontroller (Arduino Uno) with a USB2 type B to USB2 type A cable.
    5. Go to the GitHub repository (https://github.com/gerardjb/assocLearnRig) and download the "DTSC_US.ino" sketch to the host computer.
    6. On the host computer, run the microcontroller IDE and open the "DTSC_US.ino" sketch, then upload it to the microcontroller.
  2. Attach wires to the microcontrollers, breadboard, LEDs, rotary encoder, stepper motor with driver, and solenoid valve with driver as indicated in the Fritzing diagram in Figure 2.
  3. Power the stepper motor and solenoid valve.
    1. Properly wire one channel of a power supply to the +V and GND pins of the stepper motor driver.
    2. Turn on the power supply and set the attached channel voltage to 25 V.
      ​NOTE: If the connections between the stepper motor, driver, and power supply are correctly configured, a green indicator LED on the stepper motor driver will turn on.
    3. Properly wire the positive lead of one power supply channel to the solenoid valve driver hold voltage pin and the positive lead of a second channel to the spike voltage pin.
    4. Attach the negative leads to a ground shared with the control signal.
    5. Turn on the power supply and set the channel connected to the hold voltage to about 2.5 V and the channel connected to spike voltage to about 12 V.
  4. Connect an air source regulated to a pressure of ~20 PSI to the solenoid valve using the luer adapter.
  5. Test that all stimulus components and camera are functioning properly.
    1. Open a terminal on the SBC and type cd ~/assocLearnRig to navigate to the cloned GitHub repository.
    2. In the terminal, type python3 assocLearnRig_app.py to start the control graphical user interface.
    3. Start the camera stream by hitting the Stream button.
    4. Select the DEC Radio button, upload to the microcontroller, and start a session with default parameters by hitting the Start Session button.
      ​NOTE: After this step, a printout of the data log should appear in the terminal, the message on the camera stream should disappear, and the LED CS and solenoid valve US should turn on and off at appropriate times during each trial.
    5. After the session ends, repeat the previous steps with the DTSC Radio button selected.
      NOTE: Sketches in the GitHub repository ("testStepper.ino", "testRotary.ino", and "testSolenoid.ino") can be used to test individual components if the above steps do not provide satisfactory results. The microcontroller's serial data log can also be monitored directly from the SBC, as shown in the sketch below.
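      A minimal Python sketch of such a serial monitor is given below, assuming the pyserial library is installed on the SBC and that the microcontroller enumerates as /dev/ttyACM0 at 115200 baud; both values are assumptions and should be checked against the output of ls /dev/ttyACM* and the baud rate set in the uploaded sketch.
        import serial  # pyserial; install with: pip3 install pyserial

        PORT = "/dev/ttyACM0"  # assumed device name; verify with `ls /dev/ttyACM*`
        BAUD = 115200          # assumed; must match Serial.begin() in the uploaded sketch

        # Echo whatever the microcontroller logs (stimulus timestamps, encoder counts)
        with serial.Serial(PORT, BAUD, timeout=1) as ser:
            while True:
                line = ser.readline().decode("utf-8", errors="replace").strip()
                if line:
                    print(line)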
  6. Make the running wheel.
    1. Cut a 3" wheel from a foam roller. Drill a 1/4" hole in the exact wheel center so that the wheel will not wobble when turned by the mouse's locomotion.
    2. Insert a 1/4" shaft into the wheel and fix it in place using clamping hubs placed on each side of the wheel.
  7. Affix the rotary encoder to a 4.5" aluminum channel using an M3 bolt. Stabilize the aluminum channel on the aluminum breadboard using a right angle bracket with a 1/4" bolt, nut, and washer as shown.
  8. Attach the wheel and rotary encoder using a shaft-coupling sleeve.
  9. Stabilize the free side of the wheel shaft with a bearing inserted in a right-angle end clamp installed on a breadboard-mounted optical post.
    ​NOTE: Ensure the wheel spins freely without wobbling when rotated by hand.
  10. Position the stimulus hardware, head restraint, infrared light array, and picamera around the assembled wheel.
    1. Position the head restraints using optical posts and right angle post clamps so that the head posts are 1.5 cm in front of the wheel axle and 2 cm above the wheel surface. (Values are for a 20 g mouse).
    2. Position the CS LED and solenoid valve outlet used for the DEC US less than 1 cm from the eye used for DEC.
    3. Mount the stepper motor used for the DTSC US.
    4. Mount the picamera on an optical post ~10 cm from where the animal will be.
      ​NOTE: The design for the picamera mount can be made on a 3D printer from the file in "RaspPiCamMount1_1.stl" in the GitHub repository.
    5. Place the infrared light array slightly above and directly facing the position of the face on the same side as the picamera.
    6. Make a tactile stimulus for DTSC by taping foam to the edge of a piece of acrylic mounted to a 1/4" shaft using a clamping hub. Attach the tactile stimulus to the stepper motor shaft.
      ​NOTE: The design for the acrylic piece can be laser cut following the pattern in "TactileStimDesign.pdf" in the GitHub repository.

3. Preparing and running behavior experiments

  1. Implanting mouse headplate.
    1. Anesthetize a mouse using 2% isoflurane and head-fix it in a stereotactic frame.
    2. Apply an ophthalmic ointment to the eyes.
    3. Shave the scalp using soapy water and a sterile scalpel. Inject lidocaine directly underneath the skin of the incision site and clean the surgical site with povidone.
    4. Make an incision with a scalpel along the midline of the scalp from the back edge of the eyes to the back edge of the skull, being careful not to press too hard on the skull.
    5. Spread the incision open and clamp both sides with sterile hemostats to hold it open. Gently remove the periosteum using a cotton swab dipped in ethanol and allow the surface of the exposed skull to dry.
    6. Position the headplate level on the skull, making sure to position the front of the headplate posterior to the eyes. Use cyanoacrylate glue to attach the headplate to the skull and allow the glue to dry fully.
    7. Mix the dental cement powder (1 scoop), solvent (2 drops), and catalyst (1 drop) in a mixing dish and apply to all areas of exposed bone. Add layers until the surface is flush with the top edge of the headplate, making sure the headplate is securely attached to the skull.
    8. Suture the skin closed behind and in front of the headplate if necessary.
    9. Inject post-operative analgesia, such as carprofen, per institutional guidelines and allow the animal to recover for at least 5 days.
  2. Preparing for behavior sessions.
    1. Allow the test animals to habituate to the platform by mounting them in the head restraint for 30-min sessions for 5 days preceding experiments.
      NOTE: By the end of the habituation sessions, animals should run comfortably on the wheel.
    2. (DEC only) Prior to sessions, ensure that the solenoid valve outlet is centered on the target eye and positioned <1 cm away.
    3. (DEC only) Manually actuate an air puff using the push button. Ensure that the mouse promptly produces a blink without showing overt signs of stress such as adopting a hunched posture or grabbing the affected periocular region with the ipsilateral forepaw.
    4. (DTSC only) Prior to sessions, ensure that the tactile stimulus is centered on the animal's nose positioned ~1.5 cm away.
      NOTE: When a DTSC behavioral session is not running, the stepper motor is automatically inactivated to allow manual repositioning.
    5. (DTSC only) In the SBC terminal, type python3 assocLearnRig_app.py to start the GUI.
    6. (DTSC only) Run a test session of three trials with the default parameters by hitting the Start Session button in the GUI.
    7. (DTSC only) Ensure that the logged data that prints to the terminal show a deflection of greater than 20 but less than 100 steps logged on the rotary encoder following the US on each trial.
      ​CAUTION: To avoid harm and reduce stress to the animal, start the stimulus farther from the animal and move it closer until the required conditions are met.
  3. Running behavioral sessions with data logging.
    1. Mount a mouse in the head restraint.
    2. In the terminal of the SBC, type python3 assocLearnRig_app.py to start the GUI.
    3. To allow camera recordings during the behavioral trials, hit the Stream button.
      NOTE: Sessions can be run without a camera. In this case, only data from the rotary encoder and stimulus presentation timestamps are logged.
    4. Input identifying information for the animal into the Animal ID field and hit the Set button.
    5. Select either DEC or DTSC with the radio buttons under the Session Type heading, depending on which behavioral paradigm is desired.
    6. Input the desired experiment parameters to the fields below the Animal ID field and hit the Upload to Arduino button.
      ​NOTE: Details of the experiment parameters can be found in the GitHub repository README section.
    7. Hit the Start Session button to begin the session.
    8. When a session is initialized, data will begin logging in a new directory created in "/media/usb", the SBC thumb drive mount point.

4. Exporting and analyzing data

  1. To export all the recorded sessions to the host computer, open a command prompt and input the command pscp -r pi@Pi_IP_address:/media/usb* host_computer_destination, then authenticate with the SBC password.
    NOTE: The above command is for a Windows machine. On Mac and Linux machines, use terminal and replace "pscp" with "scp".
  2. Install Anaconda25 or another python package manager (PPM) on the host computer.
  3. Go to the GitHub repository and download "analyzeSession.py", "summarizeSessions.py", "session2mp4s.py", and "requirementsHost.txt".
  4. Open a PPM prompt and type conda install --file directory_containing_requirementsHost/requirementsHost.txt to ensure that the Python installation has the required python libraries.
  5. In the prompt, type cd directory_containing_analyzeSession to navigate to the directory containing "analyzeSession.py" and "session2mp4s.py". Run the analysis program by typing python analyzeSession.py
    NOTE: An error message will be generated if using a Python 2 version as python. To check the version, type python -V in the prompt.
  6. Select the directory containing the data when prompted. Directories with multiple subdirectories will be analyzed sequentially.
  7. For DEC sessions, for each session directory analyzed, select a region of interest (ROI) containing the mouse's eye from a trial average image.
    NOTE: Final analysis data files and summary graphs will populate to a subdirectory of each analyzed session directory.
  8. Type python summarizeSessions.py to generate summary data across multiple sessions.
  9. In the prompt, type python session2mp4s.py to convert imaging data files into viewable .mp4 files.
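    NOTE: Once analysis has run, the .npy containers can be inspected directly with a few lines of Python on the host. The sketch below is illustrative only; the file name and array layout (one row per trial) are assumptions, so substitute the container that analyzeSession.py actually wrote into the analyzed session's subdirectory.
      import numpy as np
      import matplotlib.pyplot as plt

      # Placeholder path; point this at the .npy container produced by analyzeSession.py
      trials = np.load("session_directory/analysis/trials.npy")  # assumed shape: (n_trials, n_samples)

      # Heatmap with one row per trial, similar in spirit to Figure 3B
      plt.imshow(trials, aspect="auto", cmap="viridis")
      plt.xlabel("Sample within trial")
      plt.ylabel("Trial number")
      plt.show()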


Representative Results

Workflow for DEC experiments and analysis
Proper experimental parameter selection is important for successful delay eyeblink conditioning (DEC) training. For the data presented here, the GUI was used to choose a CS duration of 350 ms and a US duration of 50 ms. This pairing results in an inter-stimulus interval of 300 ms: long enough to prevent low-amplitude CR production10 and short enough to avoid the regime of poor learning or trace conditioning, a process that engages additional brain regions11. The time between trials was set, using the ITI low and high fields, to be drawn uniformly at random from a range of 5-15 s. Randomizing the inter-trial interval prevents animal subjects from using timing cues other than the CS and US themselves for task performance.

Including trials that omit either the CS or US allows assessment of the CR and UR kinematics even in trained animals. The user can define the proportion of trials in which CS and US are paired or presented in isolation. In the data presented here, we ran all sessions at 10% CS-only trials with paired trials constituting the rest and no US-only trials. Note that including excessive numbers of unpaired trials can negatively impact training. For example, sessions with greater than 50% of trials unpaired are commonly used to drive the extinction of CRs in trained animals19,26.
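For concreteness, the sketch below shows one way a trial schedule with these properties could be generated. It is illustrative only; the function and parameter names are not taken from the rig software.

    import random

    def make_schedule(n_trials=100, frac_cs_only=0.10, iti_low=5.0, iti_high=15.0):
        # Return a list of (trial_type, iti_seconds) pairs. Trial types are
        # drawn independently so CS-only trials remain unpredictable; ITIs
        # are sampled uniformly from [iti_low, iti_high].
        schedule = []
        for _ in range(n_trials):
            trial_type = "CS_only" if random.random() < frac_cs_only else "paired"
            schedule.append((trial_type, random.uniform(iti_low, iti_high)))
        return schedule

    print(make_schedule(n_trials=5))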

Camera preparation and lighting conditions are also critical for acquiring high-quality data. The frame rate of acquisition can be adjusted in the Picamera acquisition software. In the data presented here, we set a frame rate of 120 Hz for DEC experiments. The Picamera module itself allows frame rates of up to ~200 Hz, but we find that lower rates prevent frame loss and give adequate temporal resolution for eyelid tracking. The infrared light must be placed to illuminate the periocular fur evenly without creating excessive reflection from the cornea when the eye is open. Figure 3A shows a sample image from a recording session with acceptable lighting. The picamera acquisition software (picameraStream.py) is designed to provide consistent settings across a session by setting and holding the camera's white balance and gain based on illumination conditions when the camera is initialized.
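The initialize-then-lock behavior of picameraStream.py can be illustrated with the standard picamera recipe for consistent captures, shown below. This is a simplified sketch, not the rig code itself, and the resolution and ISO values are placeholders to be tuned to the actual lighting conditions.

    import time
    from picamera import PiCamera  # available on Raspberry Pi OS

    camera = PiCamera(resolution=(640, 480), framerate=120)  # placeholder resolution
    camera.iso = 800                              # placeholder; tune for the IR illumination
    time.sleep(2)                                 # let auto-exposure and white balance settle
    camera.shutter_speed = camera.exposure_speed  # freeze exposure at its settled value
    camera.exposure_mode = 'off'                  # lock analog/digital gains
    gains = camera.awb_gains
    camera.awb_mode = 'off'                       # lock white balance at its settled value
    camera.awb_gains = gains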

Once a behavioral session is initialized, data from the camera and other platform hardware components will be automatically logged. Data logs are created in a directory named by the date and value input to the animal ID field in the GUI. Camera frames and time stamps for each trial are stored in individual files which are named using the animal ID, experiment date, and trial number. Platform events for each session, including wheel speed, trial starts, trial stops, and CS and US timing, are saved as a single .txt file.

Data transferred to the host machine can then be analyzed as described in section 4 of the protocol. Running analyzeSession.py on a target directory will create a .npy container for eyelid position versus time for all trials in an array based on analysis of the camera files. This container file is created in the directory that is analyzed. Once all sessions have been analyzed for a given animal, all sessions can be aligned and concatenated using summarizeSessions.py. Results from an animal trained for 8 sessions of DEC are shown in Figure 3B. In addition, individual trials can be rendered as viewable .mp4 files using the session2mp4s.py utility. This utility imprints a square in the upper left-hand corner of the movie to indicate when the CS and US are applied. Sample DEC trials prepared in this way are presented side by side as Supplementary Video 1. The left panel shows a trial in which the animal successfully closes its eye in response to the LED CS. In the right panel, the animal does not blink until the US starts.
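Conceptually, the eyelid trace reduces to the mean pixel intensity within the user-selected ROI on each frame: under infrared illumination, the open eye appears dark while the fur of the closed lid appears bright, so intensity rises as the eye closes. The sketch below illustrates this idea on an in-memory array of frames; it is a simplified stand-in for the logic in analyzeSession.py, and the normalization convention is assumed for illustration.

    import numpy as np

    def eyelid_trace(frames, roi):
        # frames: array of shape (n_frames, height, width)
        # roi: (row_min, row_max, col_min, col_max) from the user's selection
        r0, r1, c0, c1 = roi
        trace = frames[:, r0:r1, c0:c1].mean(axis=(1, 2))  # mean ROI intensity per frame
        trace -= trace.min()                               # assumed normalization to [0, 1]
        if trace.max() > 0:
            trace /= trace.max()
        return trace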

Animals trained on DEC following the protocols in section 3 and recorded with the preceding considerations should show clear evidence of well-timed CRs acquired gradually over multiple training days. Examples of behavioral traces with no CRs in an untrained animal and traces containing robust CRs from a trained animal are presented in Figure 3B. As these traces show, naïve animals should show no response to the CS but a robust response to the US. CRs should increase progressively in both size and frequency through behavioral sessions performed across days (Figure 3B-D). In contrast, suboptimal lighting conditions severely limit the quality of data acquired. When the contrast between the eye and surrounding fur is low (Figure 3E), slight changes in the image can significantly alter the recorded shape of the UR over a single session and decrease the signal-to-noise ratio for detecting eyelid position (Figure 3F-G).

To ensure high fidelity eyelid recordings, optimal light source placement is critical. The illumination LED should be trained directly on the recorded eye. If placement results in excessive glare on the corneal surface, a diffuser can be placed over the LED array to reduce this effect.
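One further consideration when interpreting the CR percentages in Figure 3D is the criterion for calling a trial a detectable CR. The criterion is implemented in the analysis scripts; the sketch below shows one common convention (eyelid closure in the CS-US interval exceeding a fixed fraction of full closure), with window and threshold values that are illustrative assumptions rather than the platform's actual settings.

    import numpy as np

    def has_cr(trace, fps=120, cs_onset_s=0.0, us_onset_s=0.3, threshold=0.1):
        # trace: normalized eyelid closure (0 = open, 1 = closed), time zero at CS onset
        # threshold and window bounds are illustrative placeholders
        i0 = int(cs_onset_s * fps)
        i1 = int(us_onset_s * fps)
        return bool(np.max(trace[i0:i1]) > threshold)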

Workflow for DTSC experiments and analysis
Many of the considerations for experimental parameter selection are similar between delay tactile startle conditioning (DTSC) and DEC. Here, we will point out those that differ. In the example data, DTSC CS duration was set to 250 ms with a US duration of 50 ms. This shorter inter-stimulus interval was chosen to closely align with the shorter duration described as optimal for DTSC learning20. Other platform parameters set through the GUI were identical to those used for DEC.

Proper placement of the tactile stimulus is critical for learning in DTSC. We mount the tactile stimulus such that the foam end is centered slightly above the animal's nose at a distance of approximately 1.5 cm when in the neutral position. Once mounted, the stimulus can be turned by hand when a session is not running. During sessions, the stepper motor holds the stimulus at a precise location until a US is triggered. To ensure that the positioning is correct, we run a preparatory session of around three trials. Events logged on the rotary encoder are printed to the terminal screen, and this printout can be used to monitor the amplitude of the animal's URs in real time. While the maximum amplitude will vary from trial to trial, animals with an average maximum of ~40 counts on the encoder across the short session should perform well in the DTSC task. Based on the rotary encoder control settings, this value corresponds to ~24 cm/s, with a negative value indicating that the animal is moving backward on the wheel.
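The conversion from encoder counts to wheel surface speed follows from the encoder resolution, the wheel circumference, and the interval over which counts are accumulated. The arithmetic is sketched below; the counts-per-revolution and sampling-interval values are assumptions chosen to reproduce the ~40 counts to ~24 cm/s correspondence above (the materials list specifies a 2000 PPR encoder and a 3" diameter wheel), so verify them against the actual rig configuration before relying on the numbers.

    import math

    WHEEL_DIAMETER_CM = 3 * 2.54   # 3" foam wheel (materials list)
    COUNTS_PER_REV = 2000          # encoder PPR (materials list); quadrature decoding may multiply this
    SAMPLE_INTERVAL_S = 0.02       # assumed count accumulation interval

    def counts_to_speed(counts):
        # Convert a count change per sample into wheel surface speed (cm/s);
        # negative counts indicate backward locomotion.
        circumference = math.pi * WHEEL_DIAMETER_CM  # ~23.9 cm
        return counts / COUNTS_PER_REV * circumference / SAMPLE_INTERVAL_S

    print(counts_to_speed(40))  # ~24 cm/s under these assumed settings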

The organization and naming of files produced in the course of DTSC sessions are the same as those produced in DEC. Running analyzeSession.py will create a .npy container for wheel speed versus time for all trials in an array from analysis of the data logged in the .txt file. Once all sessions have been analyzed for a given animal, all sessions can be aligned and concatenated using summarizeSessions.py. Results from an animal trained for 5 sessions of DTSC are presented in Figure 4A. As for DEC, the camera captures from DTSC can be converted to viewable .mp4 files. Sample DTSC trials are shown side by side in Supplementary Video 2. The left panel shows a trial in which the animal successfully moves the wheel backward in response to the LED CS. In the right panel, the animal fails to move the wheel until the tactile stimulus US is applied.

The time course of responses in animals trained on the DTSC paradigm, and their amplitude relative to the UR, are qualitatively similar to those of animals trained on DEC. Naïve animals should show no response to the CS, and learn to move the wheel backward in response to the CS only after repeated exposures to the paired CS and US. The frequency and amplitude of CRs increase as training proceeds (Figure 4A,B). In the case of DTSC, we have found that UR amplitude early in training is a good predictor of the success of learning. In a cohort of animals trained with a US that produced low amplitude URs (<20 cm/s), no animal learned to consistently produce CRs after 4 days of training (Figure 4D-F).

Differences between DEC and DTSC training
DEC and DTSC differ in important ways. First, DTSC learning on this platform occurs more rapidly, with most animals achieving a high degree of task proficiency by the third day of training and asymptotic performance by day five. DEC learning proceeds more slowly, requiring at least three additional days of training for most animals. Second, the DTSC system incorporates automatic detection of successful CRs, which serves as a feedback signal to the apparatus to decrease the amplitude of the tactile stimulus. This training procedure mimics eyeblink conditioning, in which improved CR performance provides partial protection from an aversive corneal air puff. In contrast, head-fixed animals in the DTSC paradigm cannot protect themselves from the tactile stimulus by their motor response alone. By basing US amplitude on the presence of a CR, the platform gives animals a comparable opportunity to shield themselves from the aversive stimulus.
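The CR-gated feedback described above amounts to a simple per-trial rule: attenuate the upcoming US if the animal produced a CR, restore it otherwise. The sketch below is illustrative pseudologic only; the function name and attenuation factor are assumptions, not the platform's firmware.

    def next_us_amplitude(cr_detected, attenuation=0.5, full_amplitude=1.0):
        # Toy illustration of CR-gated US feedback for DTSC. The 0.5
        # attenuation factor is an assumed placeholder value.
        return full_amplitude * attenuation if cr_detected else full_amplitude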

Figure 1
Figure 1: Platform attributes and design. (A) Platform elements for recording animal behavior under head-fixed conditions. The mouse was adapted from a Biorender image. (B) Timing and stimuli for DEC and DTSC conditioning. A user-defined inter-stimulus interval (ISI) determines how long the CS-only epoch lasts. CS and US epochs are designed to co-terminate. (C) Picture demonstrating placement of key platform elements. 1) Stepper motor for control of the DTSC US. 2) Running wheel for the animal. 3) Rotary encoder for tracking wheel movement. 4) Foam taped over an acrylic arm that serves as the DTSC tactile stimulus. 5) LED CS. 6) Solenoid valve and outlet that provides the DEC US. 7) Picamera for recording animal behavior. 8) Infrared LED for stage illumination. Please click here to view a larger version of this figure.

Figure 2
Figure 2: Wiring of platform hardware elements. (A) Fritzing wiring diagram of platform hardware when fully assembled. Wires are colored by modules with orange = Camera module; yellow = DEC US module; blue = LED CS module; purple = DTSC US module; green = Rotary encoder module. The picamera is excluded but attaches to the camera serial interface located on the surface of the Raspberry Pi. Batteries indicate direct current power supplies at the specified voltage. (B-F) Equivalent wiring scheme for isolated modules. Wires have been recolored, so that red and black always indicate positive supply rail and ground, respectively, while other wires are colored to allow easy following of the circuit. Please click here to view a larger version of this figure.

Figure 3
Figure 3: Representative results of DEC training. (A) Example camera frame from a session with acceptable illumination conditions. Note the high contrast between the eye and periocular fur. (B) Performance of a single animal during sessions performed across days in the DEC paradigm. Horizontal lines indicate performance on each trial, with warm colors indicating more eyelid closure. The leftmost black vertical line indicates the onset of the CS, while the dotted line indicates the initiation of the US. The second solid line indicates cessation of the CS and US. Note that the number of trials with successful responses during the CS increases across training sessions. (C) Animal performance from (B) with individual traces derived from the trial average for the session each day. The hue saturation indicates session number, with higher saturation for later sessions. (D) Performance for all animals in the DEC group (n = 7). The thin lines indicate the percent of trials with a detectable CR from each session for each animal. The thick lines indicate the session means across all animals. (E) Example camera frame from a session with sub-optimal illumination conditions. (F) Quantification of single trials recorded with poor illumination. The UR is detectable but with lower contrast and higher variability than under optimal light conditions. (G) Session average traces from trials presented in (F). Please click here to view a larger version of this figure.

Figure 4
Figure 4: Representative results of DTSC training. (A) Performance of a single animal during sessions performed across days in the DTSC paradigm. Horizontal lines indicate performance on each trial, with warm colors indicating backward wheel movement. The leftmost black vertical line indicates the onset of the CS, while the dotted line indicates the initiation of the US. The second solid line indicates cessation of the CS and US. (B) Animal performance from (A) with individual traces derived from the trial average for the session each day. The hue saturation indicates session number with higher saturation for later sessions. (C) Performance for all animals in the DTSC group (n = 6). The thin lines indicate the percent of trials with a detectable CR from each session for each animal. The thick lines indicate the session means across all animals. (D) Single trials as in (A) from a cohort where the US intensity elicited low amplitude URs. (E) Session average traces presented as in (B) for the animals subjected to the weak US. (F) Performance for all animals in DTSC with weak US (n = 6). Please click here to view a larger version of this figure.

Supplementary Video 1: Sample DEC hit and miss trials. DEC trials are compared in video 1. Each video shows trials in which the subject makes (Left) or fails to make (Right) the target CR synchronized and played side by side for comparison. The LED CS comes on when the blue square appears in the upper left corner of each video. The US control signal is active when a white square replaces the blue square. CS and US control signals co-terminate when the square disappears. Please click here to download this Video.

Supplementary Video 2: Sample DTSC hit and miss trials. Video 2 shows DTSC trial comparison. Each video shows trials in which the subject makes (Left) or fails to make (Right) the target CR synchronized and played side by side for comparison. The LED CS comes on when the blue square appears in the upper left corner of each video. The US control signal is active when a white square replaces the blue square. CS and US control signals co-terminate when the square disappears. Please click here to download this Video.


Discussion

The platform and associated protocols outlined here can be used to reliably track animal behavior in two sensory associative learning tasks. Each task depends on intact communication through the climbing fiber pathway. In the design described here, we incorporate elements to facilitate learning and recording/perturbation of cerebellar responses. These include a wheel to allow for free locomotion11,18 as well as head fixation. The wheel allows mouse subjects to locomote freely, which has been observed to be critical for DEC acquisition18. Head fixation in mice allows researchers to take advantage of genetic, electrophysiological, imaging, and optogenetic approaches that are more difficult to use in other model species or under freely moving conditions12. We have used our design for each of these applications. The software run on the microcontrollers can easily be adapted to control timing signals for multiphoton acquisition or synchronization with optogenetic stimulation, both with sub-millisecond precision. Care must be taken to minimize the animals' perception of optogenetic and imaging equipment when these are combined with behavioral experiments. For example, many multiphoton systems emit an audible sound from their galvanometric scanners or shutters when imaging acquisitions start. If acquisitions are triggered by trial starts, such sounds can serve as an inadvertent cue to animal subjects that a stimulus is forthcoming.

Control of the behavioral apparatus is built around an SBC, which is used to generate a graphical user interface for managing the experiment, the camera, and data export. The SBC also sends commands to two microcontrollers that handle the timing of trials and directly control hardware components such as stimulus presentation and the rotary encoder. The protocols detailed here were tested using either a Raspberry Pi 3B+ or 4B attached to an Arduino Due to control experiment timing and an Arduino Uno to control the presentation of the DTSC US. Other hardware design implementations are possible but have not been tested with the provided software.

To facilitate using multiple rigs in parallel, we recommend operating the SBC in "headless" mode. In this configuration, a host computer is used to interact with the SBC. An ethernet switch provides simultaneous internet connectivity to both the host computer and the SBC while also allowing direct, fast communication between them, which facilitates data transfer and SBC package maintenance.

For running multiple rigs in parallel, each rig should be placed in its own specialized enclosure. These enclosures must include soundproofing if placed in close proximity to one another. Suppressing sound between adjacent rigs can help to avoid unintentional auditory cues from stimuli produced in neighboring enclosures.

Use of a single platform for DEC and DTSC enables investigators to flexibly navigate each paradigm's strengths and weaknesses. DEC benefits from decades of research into which brain regions and specific cerebellar circuit elements are involved in task learning and execution1,4,11,13,14,15,19. However, in mice, the region of the cerebellar cortex most often associated with eyeblink conditioning11,12 is located deep within the primary cerebellar fissure (though see15,17,27, which demonstrate a DEC-associated region of superficial lobule VI). A deep locus for learning complicates access for optical experiments, particularly multiphoton imaging of cell activity and optogenetic perturbation experiments. In contrast, the cerebellar substrates of DTSC are located partially in the superficial aspect of lobules IV/V20. DTSC therefore presents optical access comparable to that of the dorsal neocortex, a popular site for systems neuroscience investigations.

In our design, animal behavior is tracked using a rotary encoder attached to the wheel and a camera. We selected these methods for their low cost and ease of implementation. In some instances, other tracking methods may provide greater spatial and temporal accuracy. For example, eyelid position in DEC has commonly been tracked using Hall effect sensors28,29 or electromyogram recordings of the periorbital region of the musculus orbicularis oculi30,31. Similarly, tracking locomotion by detecting wheel motion gives a less detailed picture of animal behavior than image-based pose tracking algorithms such as SLEAP32 and DeepLabCut33. Camera-based recordings allow the addition of such approaches.

Here, we have presented a platform for tracking animal behavior during two climbing fiber-dependent associative learning paradigms. Our platform is intended to increase the accessibility of these methods both in terms of cost as well as ease of implementation.


Disclosures

The authors have no conflicts of interest to disclose.

Acknowledgments

This work is supported by grants from the National Institute of Mental Health NRSA F32 MH120887-03 (to G.J.B.) and R01 NS045193 and R01 MH115750 (to S.S-H.W.). We thank Drs. Bas Koekkoek and Henk-Jan Boele for helpful discussions on optimizing the DEC setup and Drs. Yue Wang and Xiaoying Chen for helpful discussions on optimizing the DTSC setup.

Materials

Name Company Catalog Number Comments
"B" Quick Base For C&B METABOND - 10 mL bottle Parkell S398 Dental cement solvent
"C" Universal TBB Catalyst - 0.7 mL Parkell S371 Catalyst
#8 Washers Thorlabs W8S038 Washers
0.250" (1/4") x 8.00" Stainless Steel Precision Shafting Servocity 634172 1/4" shaft
0.250” (0.770") Clamping Hub Servocity 545588 Clamping hub
1/4" to 6 mm Set Screw Shaft Coupler- 5 pack Actobotics 625106 Shaft-coupling sleeve
1/4"-20 Cap Screws, 3/4" Long Thorlabs SH25S075 1/4" bolt
100 pcs 5 mm 395–400 nm UV Ultraviolet LED Light Emitting Diode Clear Round Lens 29 mm Long Lead (DC 3V) LEDs Lights +100 pcs Resistors EDGELEC ‎ED_YT05_U_100Pcs CS LEDs
2 m Micro HDMI to DVI-D Cable - M/M - 2 m Micro HDMI to DVI Cable - 19 pin HDMI (D) Male to DVI-D Male - 1920 x 1200 Video Star-tech ‎HDDDVIMM2M Raspberry Pi4B to monitor cable
256 GB Ultra Fit USB 3.1 Flash Drive SanDisk ‎SDCZ430-256G-G46 USB thumb drive
3.3 V–5 V 4 Channels Logic Level Converter Bi-Directional Shifter Module Amazon B00ZC6B8VM Logic level shifter
32 GB 95 MB/s (U1) microSDHC EVO Select Memory Card Samsung ‎MB-ME32GA/AM microSD card
4.50" Aluminum Channel Servocity 585444 4.5" aluminum channel
48-LED CCTV Ir Infrared Night Vision Illuminator Towallmark SODIAL Infrared light array
4PCS Breadboards Kit Include 2PCS 830 Point 2PCS 400 Point Solderless Breadboards for Proto Shield Distribution Connecting Blocks REXQualis B07DL13RZH Breadboard
5 Port Gigabit Unmanaged Ethernet Network Switch TP-Link ‎TL-SG105 Ethernet switch
5 V 2.5 A Raspberry Pi 3 B+ Power Supply/Adapter Canakit ‎DCAR-RSP-2A5 Power supply for Raspberry Pi 3B+
5-0 ETHILON BLACK 1 x 18" C-3 Ethicon 668G Sutures
6 mm Shaft Encoder 2000 PPR Pushpull Line Driver Universal Output Line Driver Output 5-26 V dc Supply Calt  B01EWER68I Rotary encoder
Ø1/2" Optical Post, SS, 8-32 Setscrew, 1/4"-20 Tap, L = 1", 5 Pack Thorlabs TR1-P5 Optical posts
Ø1/2" Optical Post, SS, 8-32 Setscrew, 1/4"-20 Tap, L = 2", 5 Pack Thorlabs TR2-P5 Optical posts
Ø1/2" Optical Post, SS, 8-32 Setscrew, 1/4"-20 Tap, L = 4", 5 Pack Thorlabs TR4-P5 Optical posts
Ø1/2" Optical Post, SS, 8-32 Setscrew, 1/4"-20 Tap, L = 6", 5 Pack Thorlabs TR6-P5 Optical posts
Ø1/2" Post Holder, Spring-Loaded Hex-Locking Thumbscrew, L = 2" Thorlabs PH2 Optical post holder
Adapter-062-M X LUER LOCK-F The Lee Co. TMRA3201950Z Solenoid valve luer adapter
Aeromat Foam Roller Size: 36" Length Aeromat B002H3CMUE Foam roller
Aluminum Breadboard 10" x 12" x 1/2", 1/4"-20 Taps Thorlabs MB1012 Aluminum breadboard
Amazon Basics HDMI to DVI Adapter Cable, Black, 6 Feet, 1-Pack Amazon HL-007347 Raspberry Pi3B+ to monitor cable
Arduino  Uno R3 Arduino A000066 Arduino Uno (microcontroller board)
Arduino Due Arduino ‎A000062 Arduino Due (microcontroller board)
Bench Power Supply, Single, Adjustable, 3 Output, 0 V, 24 V, 0 A, 2 A Tenma 72-8335A Power supply
Clear Scratch- and UV-Resistant Cast Acrylic Sheet, 12" x 24" x 1/8" McMaster Carr 8560K257 Acrylic sheet
CNC Stepper Motor Driver 1.0–4.2 A 20–50 V DC 1/128 Micro-Step Resolutions for Nema 17 and 23 Stepper Motor Stepper Online B06Y5VPSFN Stepper motor driver
Compact Compressed Air Regulator, Inline Relieving, Brass Housing, 1/4 NPT McMaster Carr 6763K13 Air source regulator
Cotton Swab Puritan 806-WC Cotton swab
Dell 1908FP 19" Flat Panel Monitor - 1908FPC Dell 1908FPC Computer monitor
Flex Cable for Raspberry Pi Camera Adafruit 2144 camera serial interface cable
High Torque Nema 17 Bipolar Stepper Motor 92 oz·in/65 N·cm 2.1 A Extruder Motor Stepper Online 17HS24-2104S Stepper motor
Isoflurane Henry Schein 66794001725 Isoflurane
Krazy Maximum Bond Permanent Glue, 0.18 oz. Krazy Glue KG483 Cyanoacrylate glue
Lidocaine HCl VetOne 510212 Lidocaine
Low-Strength Steel Hex Nut, Grade 2, Zinc-Plated, 1/4"-20 Thread Size McMaster Carr 90473A029 Nuts
M3 x 50 mm Partially Threaded Hex Key Socket Cap Head Screws 10 pcs Uxcell A16040100ux1380 M3 bolt
NEMA 17 Stepper Motor Mount ACTOBOTICS 555152 Stepper motor mount
Official Raspberry Pi Power Supply 5.1 V 3 A with USB C - 1.5 m long Adafruit 4298 Power supply for Raspberry Pi 4B
Optixcare Dog & Cat Eye Lube Lubricating Gel, 0.70-oz tube Optixcare 142422 Ophthalmic ointment
Precision Stainless Steel Ball Bearing, Shielded, Trade No. R188-2Z, 13000 rpm Maximum Speed McMaster-Carr 3759T57 Bearing
Premium Female/Female Jumper Wires - 40 x 6" Adafruit 266 Wires
Premium Female/Male 'Extension' Jumper Wires - 40 x 6" (150 mm) Adafruit 826 Wires
Premium Male/Male Jumper Wires - 40 x 6" Adafruit 758 Wires
Radiopaque L-Powder for C&B METABOND - 5 g Parkell S396 Dental cement powder
Raspberry Pi (3B+ or 4B) Adafruit 3775 or 4295 Raspberry Pi
Raspberry Pi NoIR Camera Module V2 - 8MP 1080P30 Raspberry Pi Foundation RPI3-NOIR-V2 Raspberry NoIR V2 camera
Right-Angle Bracket, 1/4" (M6) Counterbored Slot, 8-32 Taps Thorlabs AB90E Right-angle bracket
Right-Angle Clamp for Ø1/2" Posts, 3/16" Hex Thorlabs RA90 Right-angle optical post clamp
Right-Angle End Clamp for Ø1/2" Posts, 1/4"-20 Stud and 3/16" Hex Thorlabs RA180 Right-angle end clamp
RJ45 Cat-6 Ethernet Patch Internet Cable Amazon ‎CAT6-7FT-5P-BLUE Ethernet cable
Rotating Clamp for Ø1/2" Posts, 360° Continuously Adjustable, 3/16" Hex Thorlabs SWC Rotating optical post clamps
Spike & Hold Driver-0.1 TO 5 MS The Lee Co. IECX0501350A Solenoid valve driver
Swivel Base Adapter Thorlabs UPHA Post holder adapter
USB 2.0 A-Male to Micro B Cable, 6 feet Amazon ‎7T9MV4 USB2 type A to USB2 micro cable
USB 2.0 Printer Cable - A-Male to B-Male, 6 Feet (1.8 m) Amazon B072L34SZS USB2 type B to USB2 type A cable
VHS-M/SP-12 V The Lee Co. INKX0514900A Solenoid valve
Zinc-Plated Steel 1/4" washer, OD 1.000" McMaster Carr 91090A108 Washers


References

  1. McCormick, D. A., Lavond, D. G., Clark, G. A., Kettner, R. E., Rising, C. E., Thompson, R. F. The engram found? Role of the cerebellum in classical conditioning of nictitating membrane and eyelid responses. Bulletin of the Psychonomic Society. 18 (3), 103-105 (1981).
  2. McCormick, D. A., Clark, G. A., Lavond, D. G., Thompson, R. F. Initial localization of the memory trace for a basic form of learning. Proceedings of the National Academy of Sciences of the United States of America. 79 (8), 2731-2735 (1982).
  3. McCormick, D. A., Thompson, R. F. Cerebellum: essential involvement in the classically conditioned eyelid response. Science. 223 (4633), New York, N.Y. 296-299 (1984).
  4. Krupa, D. J., Thompson, J. K., Thompson, R. F. Localization of a memory trace in the mammalian brain. Science. 260 (5110), New York, N.Y. 989-991 (1993).
  5. Llinás, R., Sugimori, M. Electrophysiological properties of in vitro Purkinje cell dendrites in mammalian cerebellar slices. The Journal of Physiology. 305, 197-213 (1980).
  6. Mintz, M., Lavond, D. G., Zhang, A. A., Yun, Y., Thompson, R. F. Unilateral inferior olive NMDA lesion leads to unilateral deficit in acquisition and retention of eyelid classical conditioning. Behavioral and Neural Biology. 61 (3), 218-224 (1994).
  7. Welsh, J. P., Harvey, J. A. Cerebellar lesions and the nictitating membrane reflex: performance deficits of the conditioned and unconditioned response. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience. 9 (1), 299-311 (1989).
  8. Mauk, M. D., Steinmetz, J. E., Thompson, R. F. Classical conditioning using stimulation of the inferior olive as the unconditioned stimulus. Proceedings of the National Academy of Sciences of the United States of America. 83 (14), 5349-5353 (1986).
  9. Steinmetz, J. E., Lavond, D. G., Thompson, R. F. Classical conditioning in rabbits using pontine nucleus stimulation as a conditioned stimulus and inferior olive stimulation as an unconditioned stimulus. Synapse. 3 (3), New York, N.Y. 225-233 (1989).
  10. Chettih, S. N., McDougle, S. D., Ruffolo, L. I., Medina, J. F. Adaptive timing of motor output in the mouse: The role of movement oscillations in eyelid conditioning. Frontiers in Integrative Neuroscience. 5, 72 (2011).
  11. Heiney, S. A., Wohl, M. P., Chettih, S. N., Ruffolo, L. I., Medina, J. F. Cerebellar-dependent expression of motor learning during eyeblink conditioning in head-fixed mice. The Journal of Neuroscience. 34 (45), 14845-14853 (2014).
  12. Heiney, S. A., Kim, J., Augustine, G. J., Medina, J. F. Precise control of movement kinematics by optogenetic inhibition of purkinje cell activity. Journal of Neuroscience. 34 (6), 2321-2330 (2014).
  13. Ten Brinke, M. M., et al. Evolving models of pavlovian conditioning: Cerebellar cortical dynamics in awake behaving mice. Cell Reports. 13 (9), 1977-1988 (2015).
  14. Gao, Z., et al. Excitatory cerebellar nucleocortical circuit provides internal amplification during associative conditioning. Neuron. 89 (3), 645-657 (2016).
  15. Giovannucci, A., et al. Cerebellar granule cells acquire a widespread predictive feedback signal during motor learning. Nature Neuroscience. 20 (5), 727-734 (2017).
  16. Ten Brinke, M. M., et al. Dynamic modulation of activity in cerebellar nuclei neurons during pavlovian eyeblink conditioning in mice. eLife. 6, 28132 (2017).
  17. Wang, X., Yu, S., Ren, Z., De Zeeuw, C. I., Gao, Z. A FN-MdV pathway and its role in cerebellar multimodular control of sensorimotor behavior. Nature Communications. 11 (1), 6050 (2020).
  18. Albergaria, C., Silva, N. T., Pritchett, D. L., Carey, M. R. Locomotor activity modulates associative learning in mouse cerebellum. Nature Neuroscience. 21 (5), 725-735 (2018).
  19. Kim, O. A., Ohmae, S., Medina, J. F. A cerebello-olivary signal for negative prediction error is sufficient to cause extinction of associative motor learning. Nature Neuroscience. 23 (12), 1550-1554 (2020).
  20. Yamada, T., et al. Sensory experience remodels genome architecture in neural circuit to drive motor learning. Nature. 569 (7758), 708-713 (2019).
  21. Horlington, M. Startle response circadian rhythm in rats: lack of correlation with motor activity. Physiology & Behavior. 5 (1), 49-53 (1970).
  22. Yeomans, J. S., Li, L., Scott, B. W., Frankland, P. W. Tactile, acoustic and vestibular systems sum to elicit the startle reflex. Neuroscience and Biobehavioral Reviews. 26 (1), 1-11 (2002).
  23. Raspberry Pi operating system images. Available from: https://www.raspberrypi.com/software/operating-systems/ (2021).
  24. VNC Server. VNC® Connect. Available from: https://www.realvnc.com/en/connect/download/vnc/ (2021).
  25. Anaconda: The world's most popular data science platform. Available from: https://www.anaconda.com/ (2021).
  26. De Zeeuw, C. I., Ten Brinke, M. M. Motor learning and the cerebellum. Cold Spring Harbor Perspectives in Biology. 7 (9), 021683 (2015).
  27. Badura, A., et al. Normal cognitive and social development require posterior cerebellar activity. eLife. 7, 36401 (2018).
  28. Koekkoek, S. K. E., Den Ouden, W. L., Perry, G., Highstein, S. M., De Zeeuw, C. I. Monitoring kinetic and frequency-domain properties of eyelid responses in mice with magnetic distance measurement technique. Journal of Neurophysiology. 88 (4), 2124-2133 (2002).
  29. Kloth, A. D., et al. Cerebellar associative sensory learning defects in five mouse autism models. eLife. 4, 06085 (2015).
  30. Boele, H. -J., Koekkoek, S. K. E., De Zeeuw, C. I. Cerebellar and extracerebellar involvement in mouse eyeblink conditioning: the ACDC model. Frontiers in Cellular Neuroscience. 3, (2010).
  31. Lin, C., Disterhoft, J., Weiss, C. Whisker-signaled eyeblink classical conditioning in head-fixed Mice. Journal of Visualized Experiments: JoVE. (109), e53310 (2016).
  32. Pereira, T. D., et al. Fast animal pose estimation using deep neural networks. Nature Methods. 16 (1), 117-125 (2019).
  33. Mathis, A., et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience. 21 (9), 1281-1289 (2018).

Cite this Article


Broussard, G. J., Kislin, M., Jung, C., Wang, S. S. H. A Flexible Platform for Monitoring Cerebellum-Dependent Sensory Associative Learning. J. Vis. Exp. (179), e63205, doi:10.3791/63205 (2022).
