The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
Embolic Middle Cerebral Artery Occlusion (MCAO) for Ischemic Stroke with Homologous Blood Clots in Rats
Institutions: Louisiana State University Health Science Center, Shreveport.
Clinically, thrombolytic therapy with recombinant tissue plasminogen activator (tPA) remains the most effective treatment for acute ischemic stroke. However, the use of tPA is limited by its narrow therapeutic window and by an increased risk of hemorrhagic transformation. There is an urgent need for suitable stroke models in which to study new thrombolytic agents and strategies for treating ischemic stroke. At present, two major types of ischemic stroke model have been developed in rats and mice: intraluminal suture MCAO and embolic MCAO. Although suture MCAO models have been widely used in mechanism-driven stroke research, they do not mimic the clinical situation and are not suitable for thrombolytic studies. In contrast, the embolic MCAO model closely mimics human ischemic stroke and is suitable for preclinical investigation of thrombolytic therapy. The embolic model was first developed in rats by Overgaard et al.1 in 1992 and further characterized by Zhang et al. Although embolic MCAO has gained increasing attention, many laboratories face technical problems with it. To meet the increasing needs of thrombolytic research, we present a highly reproducible model of embolic MCAO in the rat that develops a predictable infarct volume within the MCA territory. In brief, a modified PE-50 tube is gently advanced from the external carotid artery (ECA) into the lumen of the internal carotid artery (ICA) until the tip of the catheter reaches the origin of the MCA. Through the catheter, a single homologous blood clot is placed at the origin of the MCA. To verify MCA occlusion, regional cerebral blood flow was monitored, and neurological deficits and infarct volumes were measured. The techniques presented in this paper should help investigators overcome the technical problems of establishing this model for stroke research.
Medicine, Issue 91, ischemic stroke, model, embolus, middle cerebral artery occlusion, thrombolytic therapy
Simultaneous Scalp Electroencephalography (EEG), Electromyography (EMG), and Whole-body Segmental Inertial Recording for Multi-modal Neural Decoding
Institutions: National Institutes of Health, University of Houston, University of Houston, University of Houston, University of Houston.
Recent studies support the involvement of supraspinal networks in the control of bipedal human walking. Part of this evidence encompasses studies, including our previous work, demonstrating that gait kinematics and limb coordination during treadmill walking can be inferred from the scalp electroencephalogram (EEG) with reasonably high decoding accuracy. These results provide impetus for the development of non-invasive brain-machine-interface (BMI) systems for use in restoration and/or augmentation of gait, a primary goal of rehabilitation research. To date, studies examining EEG decoding of activity during gait have been limited to treadmill walking in a controlled environment. However, to be practically viable, a BMI system must be applicable to everyday locomotor tasks such as over-ground walking and turning. Here, we present a novel protocol for non-invasive collection of brain activity (EEG), muscle activity (electromyography (EMG)), and whole-body kinematic data (head, torso, and limb trajectories) during both treadmill and over-ground walking tasks. By collecting these data in an uncontrolled environment, insight can be gained regarding the feasibility of decoding unconstrained gait and surface EMG from scalp EEG.
Behavior, Issue 77, Neuroscience, Neurobiology, Medicine, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Electroencephalography, EEG, Electromyography, EMG, electroencephalograph, gait, brain-computer interface, brain machine interface, neural decoding, over-ground walking, robotic gait, brain, imaging, clinical techniques
A Murine Model of Subarachnoid Hemorrhage
Institutions: University of Munich Medical Center.
In this video publication a standardized mouse model of subarachnoid hemorrhage (SAH) is presented. Bleeding is induced by endovascular Circle of Willis perforation (CWp) and confirmed by intracranial pressure (ICP) monitoring. This yields a homogeneous blood distribution in the subarachnoid spaces surrounding the arterial circulation and cerebellar fissures. Animal physiology is maintained by intubation, mechanical ventilation, and continuous on-line monitoring of physiological and cardiovascular parameters: body temperature, systemic blood pressure, heart rate, and hemoglobin saturation. As a result, cerebral perfusion pressure can be tightly monitored, producing a less variable volume of extravasated blood. This allows better standardization of endovascular filament perforation in mice and makes the whole model highly reproducible, so it is readily available for pharmacological and pathophysiological studies in wild-type and genetically altered mice.
Medicine, Issue 81, Nervous System Diseases, Subarachnoid hemorrhage (SAH), mouse model, filament perforation, intracranial pressure monitoring, blood distribution, surgical technique
Time Multiplexing Super Resolving Technique for Imaging from a Moving Platform
Institutions: Bar-Ilan University, Kfar Saba, Israel.
We propose a method for increasing the resolution of an object and overcoming the diffraction limit of an optical system installed on a moving imaging platform, such as an airborne platform or satellite. The resolution improvement is obtained in a two-step process. First, three low-resolution, differently defocused images are captured and the optical phase is retrieved using an improved iterative Gerchberg-Saxton-based algorithm. The retrieved phase allows the field to be numerically back-propagated to the aperture plane. Second, the imaging system is shifted and the first step is repeated. The optical fields obtained at the aperture plane are combined, generating a synthetically enlarged lens aperture along the direction of movement and yielding higher imaging resolution. The method resembles a well-known approach from the microwave regime, synthetic aperture radar (SAR), in which the antenna size is synthetically increased along the platform's direction of travel. The proposed method is demonstrated in a laboratory experiment.
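The phase-retrieval step described above can be sketched in code. The following is a minimal illustration of a multi-plane Gerchberg-Saxton iteration, not the authors' implementation; the angular-spectrum propagator, grid parameters, and function names are assumptions:

```python
import numpy as np

def angular_spectrum(field, dz, wavelength, dx):
    """Propagate a complex field by distance dz via the angular-spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.clip(arg, 0.0, None))
    # evanescent components (arg < 0) are simply dropped in this sketch
    H = np.where(arg > 0, np.exp(1j * kz * dz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_gs(intensities, dzs, wavelength, dx, n_iter=50):
    """Gerchberg-Saxton-style phase retrieval from defocused intensity images.

    intensities: list of measured intensity images, one per defocus plane.
    dzs: list of the corresponding defocus distances.
    Returns the retrieved complex field at the first plane.
    """
    field = np.sqrt(intensities[0]).astype(complex)  # start with zero phase
    for _ in range(n_iter):
        for i in range(1, len(intensities)):
            # propagate to the next plane, then enforce the measured amplitude
            field = angular_spectrum(field, dzs[i] - dzs[i - 1], wavelength, dx)
            field = np.sqrt(intensities[i]) * np.exp(1j * np.angle(field))
        # propagate back to the first plane and enforce its amplitude
        field = angular_spectrum(field, dzs[0] - dzs[-1], wavelength, dx)
        field = np.sqrt(intensities[0]) * np.exp(1j * np.angle(field))
    return field
```

The retrieved field at one platform position can then be back-propagated to the aperture plane and stitched with fields from other positions to form the synthetic aperture.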
Physics, Issue 84, Superresolution, Fourier optics, Remote Sensing and Sensors, Digital Image Processing, optics, resolution
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
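Before a DoE package prunes candidate runs down to an optimal subset, the candidate space is simply the cross-product of factor levels. A minimal sketch of that enumeration step; the factor names and levels here are invented for illustration and are not taken from the study:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every run of a full-factorial design.

    factors: dict mapping factor name -> list of levels.
    Returns a list of dicts, one per experimental run.
    """
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical factors loosely inspired by the abstract (names/levels invented):
design = full_factorial({
    "promoter": ["35S", "nos"],
    "incubation_temp_C": [22, 25, 28],
    "plant_age_weeks": [5, 6],
})
```

A DoE tool would then select a fraction of these runs (e.g. a D-optimal subset) rather than executing all of them, which is what makes the approach tractable for many factors.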
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Network Analysis of the Default Mode Network Using Functional Connectivity MRI in Temporal Lobe Epilepsy
Institutions: Baylor College of Medicine, Michael E. DeBakey VA Medical Center, University of California, Los Angeles, University of California, Los Angeles.
Functional connectivity MRI (fcMRI) is an fMRI method that examines the connectivity of different brain areas based on the correlation of BOLD signal fluctuations over time. Temporal Lobe Epilepsy (TLE) is the most common type of adult epilepsy and involves multiple brain networks. The default mode network (DMN) is involved in conscious, resting state cognition and is thought to be affected in TLE where seizures cause impairment of consciousness. The DMN in epilepsy was examined using seed based fcMRI. The anterior and posterior hubs of the DMN were used as seeds in this analysis. The results show a disconnection between the anterior and posterior hubs of the DMN in TLE during the basal state. In addition, increased DMN connectivity to other brain regions in left TLE along with decreased connectivity in right TLE is revealed. The analysis demonstrates how seed-based fcMRI can be used to probe cerebral networks in brain disorders such as TLE.
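The core computation of seed-based fcMRI, correlating a seed region's mean BOLD time series with every other voxel, can be sketched as follows. This is a simplified illustration; preprocessing steps such as motion correction and nuisance regression, which any real fcMRI pipeline requires, are omitted:

```python
import numpy as np

def seed_connectivity(bold, seed_mask):
    """Seed-based functional connectivity map.

    bold: (T, V) array of BOLD time series (T time points, V voxels).
    seed_mask: boolean (V,) array marking seed-region voxels.
    Returns the Fisher z-transformed Pearson correlation of each voxel
    with the mean seed time series.
    """
    seed_ts = bold[:, seed_mask].mean(axis=1)
    # demean, then compute Pearson r of the seed against every voxel at once
    bold_c = bold - bold.mean(axis=0)
    seed_c = seed_ts - seed_ts.mean()
    num = bold_c.T @ seed_c
    denom = np.sqrt((bold_c**2).sum(axis=0) * (seed_c**2).sum())
    r = num / denom
    # Fisher z-transform (clip to avoid infinities at |r| = 1)
    return np.arctanh(np.clip(r, -0.999999, 0.999999))
```

Running this once with an anterior-hub seed and once with a posterior-hub seed, then comparing the resulting maps between patient and control groups, is the essence of the analysis described above.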
Medicine, Issue 90, Default Mode Network (DMN), Temporal Lobe Epilepsy (TLE), fMRI, MRI, functional connectivity MRI (fcMRI), blood oxygenation level dependent (BOLD)
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.

In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
High-resolution Spatiotemporal Analysis of Receptor Dynamics by Single-molecule Fluorescence Microscopy
Institutions: University of Würzburg, Germany.
Single-molecule microscopy is emerging as a powerful approach to analyze the behavior of signaling molecules, in particular those aspects (e.g., kinetics, coexistence of different states and populations, transient interactions) that are typically hidden in ensemble measurements, such as those obtained with standard biochemical or microscopy methods. Thus, dynamic events, such as receptor-receptor interactions, can be followed in real time in a living cell with high spatiotemporal resolution. This protocol describes a method based on labeling with small, bright organic fluorophores and total internal reflection fluorescence (TIRF) microscopy to directly visualize single receptors on the surface of living cells. This approach allows one to precisely localize receptors, measure the size of receptor complexes, and capture dynamic events such as transient receptor-receptor interactions. The protocol provides a detailed description of how to perform a single-molecule experiment, including sample preparation, image acquisition, and image analysis. As an example, the application of this method to analyze two G-protein-coupled receptors, an adrenergic receptor and the γ-aminobutyric acid type B (GABAB) receptor, is reported. The protocol can be adapted to other membrane proteins and to different cell models, transfection methods, and labeling strategies.
Bioengineering, Issue 89, pharmacology, microscopy, receptor, live-cell imaging, single-molecule, total internal reflection fluorescence, tracking, dimerization, protein-protein interactions
Test Samples for Optimizing STORM Super-Resolution Microscopy
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way than normal, built up molecule by molecule, users face significant challenges in optimizing their image acquisition. To aid this process and to gain more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of common problems that result in poor image quality, such as lateral drift, in which the sample moves during image acquisition, and density-related problems that produce the 'mislocalization' phenomenon.
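Two quantities that such test samples help assess can be sketched in code: the theoretical localization precision (the widely used Thompson/Larson/Webb estimate) and the rendering of a localization list into a super-resolved image. The function names and parameter values below are illustrative assumptions; rainSTORM uses its own implementations:

```python
import numpy as np

def localization_precision(s, a, N, b):
    """Thompson/Larson/Webb estimate of localization precision (same
    length units as s and a).

    s: PSF standard deviation, a: pixel size, N: collected photons,
    b: background noise (photons per pixel).
    """
    return np.sqrt(s**2 / N
                   + a**2 / (12 * N)
                   + 8 * np.pi * s**4 * b**2 / (a**2 * N**2))

def render_storm(xs, ys, extent, pixel_nm=10):
    """Render a list of localizations as a 2-D histogram image.

    xs, ys: localization coordinates in nm; extent: field size in nm;
    pixel_nm: super-resolution pixel size in nm.
    """
    bins = int(extent / pixel_nm)
    img, _, _ = np.histogram2d(xs, ys, bins=bins,
                               range=[[0, extent], [0, extent]])
    return img
```

For typical single-molecule photon counts (hundreds to thousands of photons), the first term of the precision formula dominates, which is why dye brightness and buffer conditions matter so much for the final resolution.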
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
High-speed Particle Image Velocimetry Near Surfaces
Institutions: University of Michigan.
Multi-dimensional and transient flows play a key role in many areas of science, engineering, and health sciences but are often not well understood. The complex nature of these flows may be studied using particle image velocimetry (PIV), a laser-based imaging technique for optically accessible flows. Though many forms of PIV exist that extend the technique beyond the original planar two-component velocity measurement capabilities, the basic PIV system consists of a light source (laser), a camera, tracer particles, and analysis algorithms. The imaging and recording parameters, the light source, and the algorithms are adjusted to optimize the recording for the flow of interest and obtain valid velocity data.
Common PIV investigations measure two-component velocities in a plane at a few frames per second. However, recent developments in instrumentation have facilitated high-frame rate (> 1 kHz) measurements capable of resolving transient flows with high temporal resolution. Therefore, high-frame rate measurements have enabled investigations on the evolution of the structure and dynamics of highly transient flows. These investigations play a critical role in understanding the fundamental physics of complex flows.
A detailed description for performing high-resolution, high-speed planar PIV to study a transient flow near the surface of a flat plate is presented here. Details for adjusting the parameter constraints such as image and recording properties, the laser sheet properties, and processing algorithms to adapt PIV for any flow of interest are included.
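At the heart of the analysis algorithms mentioned above is a cross-correlation of paired interrogation windows: the location of the correlation peak gives the mean particle displacement, which becomes a velocity once divided by the inter-frame time. A minimal FFT-based sketch (integer-pixel peak only; real PIV codes add sub-pixel interpolation, window overlap, and outlier rejection):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the particle displacement between two interrogation
    windows via FFT-based cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # circular cross-correlation: peak location = mean shift of b vs a
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    n = win_a.shape[0]
    # wrap indices into signed displacements
    dy = peak[0] if peak[0] <= n // 2 else peak[0] - n
    dx = peak[1] if peak[1] <= n // 2 else peak[1] - n
    return dy, dx
```

Applying this to a grid of interrogation windows across the image pair yields the planar two-component velocity field described above.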
Physics, Issue 76, Mechanical Engineering, Fluid Mechanics, flow measurement, fluid heat transfer, internal flow in turbomachinery (applications), boundary layer flow (general), flow visualization (instrumentation), laser instruments (design and operation), Boundary layer, micro-PIV, optical laser diagnostics, internal combustion engines, flow, fluids, particle, velocimetry, visualization
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design, including the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Focal Cerebral Ischemia Model by Endovascular Suture Occlusion of the Middle Cerebral Artery in the Rat
Institutions: University of Wisconsin-Madison.
Stroke is the leading cause of disability and the third leading cause of death in adults worldwide1. In human stroke, there exists a highly variable clinical state; in the development of animal models of focal ischemia, however, achieving reproducibility of the experimentally induced infarct volume is essential. The rat is a widely used animal model for stroke due to its relatively low animal husbandry costs and the similarity of its cranial circulation to that of humans2,3. In humans, the middle cerebral artery (MCA) is most commonly affected in stroke syndromes, and multiple methods of MCA occlusion (MCAO) have been described to mimic this clinical syndrome in animal models. Because recanalization commonly occurs following an acute stroke in humans, reperfusion after a period of occlusion has been included in many of these models. In this video, we demonstrate the transient endovascular suture MCAO model in the spontaneously hypertensive rat (SHR). A filament with a silicone tip coating is placed intraluminally at the MCA origin for 60 minutes, followed by reperfusion. Note that the optimal occlusion period may vary in other rat strains, such as Wistar or Sprague-Dawley. Several behavioral indicators of stroke in the rat are shown. Focal ischemia is confirmed using T2-weighted magnetic resonance images and by staining brain sections with 2,3,5-triphenyltetrazolium chloride (TTC) 24 hours after MCAO.
Neuroscience, Issue 48, Stroke, cerebral ischemia, middle cerebral artery occlusion, intraluminal filament, rat, magnetic resonance imaging, surgery, neuroscience, brain
Intraluminal Middle Cerebral Artery Occlusion (MCAO) Model for Ischemic Stroke with Laser Doppler Flowmetry Guidance in Mice
Institutions: University of Florida, Shiraz University of Medical Sciences.
Stroke is the third leading cause of death and the leading cause of disability in the world, with an estimated cost of nearly $70 billion in the United States in 20091,2. The intraluminal middle cerebral artery occlusion (MCAO) model was developed by Koizumi4 in 1986 to simulate this impactful human pathology in the rat. A modification of the MCAO method was later presented by Longa3. Both techniques have been widely used to identify molecular mechanisms of brain injury resulting from ischemic stroke and potential therapeutic modalities5. This relatively noninvasive method in rats has been extended to mice to take advantage of transgenic and knockout strains6,7. To model focal cerebral ischemia, an intraluminal suture is advanced via the internal carotid artery to occlude the base of the MCA. Retracting the suture after a specified period of time mimics spontaneous reperfusion, but the suture can also be retained permanently. This video demonstrates the two major approaches for performing the intraluminal MCAO procedure in mice in a stepwise fashion and provides insights into potential drawbacks and pitfalls. The ischemic brain tissue is subsequently stained with 2,3,5-triphenyltetrazolium chloride (TTC) to evaluate the extent of cerebral infarction8.
Medicine, Issue 51, Cerebral ischemia, mouse, middle cerebral artery occlusion, intraluminal suture, Laser Doppler
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Institutions: Georgia Health Sciences University, Georgia Health Sciences University, Georgia Health Sciences University, Palo Alto Research Center, Palo Alto Research Center, University of Minnesota .
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2.

Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings.
First, shape variations are generally imposed by the experimenter5,9,10, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints.
Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases.
Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms.
Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint, and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper.
We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have.
Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Mouse Model of Intraluminal MCAO: Cerebral Infarct Evaluation by Cresyl Violet Staining
Institutions: Clinical Research Institute of Montreal, Laval University.
Stroke is the third leading cause of mortality and the leading cause of disability in the world. Ischemic stroke accounts for approximately 80% of all strokes, yet the thrombolytic tissue plasminogen activator (tPA) remains the only available treatment for acute ischemic stroke. This has led researchers to develop several ischemic stroke models in a variety of species. Two major types of rodent model have been developed: global cerebral ischemia and focal cerebral ischemia. To mimic ischemic stroke in patients, in whom approximately 80% of thrombotic or embolic strokes occur in the territory of the middle cerebral artery (MCA), the intraluminal middle cerebral artery occlusion (MCAO) model is quite relevant for stroke studies. This model was first developed in rats by Koizumi et al. in 1986 1. Because of the ease of genetic manipulation in mice, these models have also been developed in this species 2-3.

Herein, we present the transient MCA occlusion procedure in C57/Bl6 mice. Previous studies have reported that physical properties of the occluder, such as tip diameter, length, shape, and flexibility, are critical for the reproducibility of the infarct volume 4. Here, a commercial silicone-coated monofilament (Doccol Corporation) was used; a further advantage of this monofilament is that it reduces the risk of inducing subarachnoid hemorrhage. Using a Zeiss Stemi 2000 stereomicroscope, the silicone-coated monofilament was introduced into the internal carotid artery (ICA) via a cut in the external carotid artery (ECA) until it occluded the base of the MCA. Blood flow was restored 1 hour later by removal of the monofilament to mimic the restoration of blood flow after lysis of a thromboembolic clot in humans. The extent of cerebral infarction can be evaluated by a neurologic score and by measurement of the infarct volume. Ischemic mice were thus assessed for their neurologic score at different post-reperfusion times. To evaluate infarct volume, staining with 2,3,5-triphenyltetrazolium chloride (TTC) is usually performed; here we used cresyl violet staining, since it allows many critical markers to be tested by immunohistochemistry. In this video, we report the MCAO procedure, neurological scoring, and the evaluation of infarct volume by cresyl violet staining.
Medicine, Issue 69, Neuroscience, Biochemistry, Anatomy, Physiology, transient ischemic stroke, middle cerebral artery occlusion, intraluminal model, neuroscore, cresyl violet staining, mice, imaging
Bilateral Common Carotid Artery Occlusion as an Adequate Preconditioning Stimulus to Induce Early Ischemic Tolerance to Focal Cerebral Ischemia
Institutions: Charité - Universitätsmedizin Berlin, Germany.
There is accumulating evidence that ischemic preconditioning - a non-damaging ischemic challenge to the brain - confers transient protection against a subsequent damaging ischemic insult. We have established bilateral common carotid artery occlusion as a preconditioning stimulus to induce early ischemic tolerance to transient focal cerebral ischemia in C57Bl6/J mice. In this video, we demonstrate the methodology used for this study.
Medicine, Issue 75, Neurobiology, Anatomy, Physiology, Neuroscience, Immunology, Surgery, stroke, cerebral ischemia, ischemic preconditioning, ischemic tolerance, IT, ischemic stroke, middle cerebral artery occlusion, MCAO, bilateral common carotid artery occlusion, BCCAO, brain, ischemia, occlusion, reperfusion, mice, animal model, surgical techniques
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM) [1-4] is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data [2,5,6]. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors [7,8]. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects [5,6]. Cross-validation within the derivation set can be performed using bootstrap resampling techniques [9]. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets [10]. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation [11]. These standardized values can in turn be used to assist in differential diagnosis [12,13] and to assess disease progression and treatment effects at the network level [7,14-16]. We present an example of the application of this methodology to FDG PET data of Parkinson's disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
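The core SSM computation described above — logarithmic conversion, removal of subject and group means, then PCA of the residual profiles — can be sketched in a few lines of NumPy. This is an illustrative outline only (the function name `ssm_pca` and all defaults are ours, not the authors' in-house software):

```python
import numpy as np

def ssm_pca(images, n_components=5):
    """Sketch of the scaled subprofile model (SSM) PCA steps.

    images: (n_subjects, n_voxels) array of positive voxel values
            (e.g. masked FDG PET counts for patients and controls).
    Returns the spatial covariance patterns (GIS) and subject scores.
    """
    log_data = np.log(images)                          # logarithmic conversion
    log_data -= log_data.mean(axis=1, keepdims=True)   # remove subject global means
    srp = log_data - log_data.mean(axis=0)             # subtract group mean profile
    # PCA of the subject residual profiles via SVD
    u, s, vt = np.linalg.svd(srp, full_matrices=False)
    patterns = vt[:n_components]                       # spatial patterns (one per row)
    scores = u[:, :n_components] * s[:n_components]    # per-subject expression scores
    return patterns, scores
```

A prospective subject's expression of a derived pattern is then simply the projection of its centered log profile onto that pattern; a composite disease-related pattern would combine several score columns, e.g. via logistic regression.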
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns across brain diseases, especially neurodegenerative disorders, by applying different DTI analyses in comparison with matched controls.

DTI data are analyzed in several complementary ways: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and thereby define regional patterns of WM alterations. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving this quantitative and directional information during spatial normalization in group-level analyses. On this basis, FT techniques can be applied to group-averaged data in order to quantify the metrics defined by FT. Additionally, applying DTI methods longitudinally on an individual-subject basis, e.g. comparing FA maps after stereotaxic alignment, reveals information about the progression of neurological disorders. The quality of DTI-based results can be further improved during preprocessing by controlled elimination of gradient directions with high noise levels.

In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by combining whole-brain-based and tract-based DTI analysis.
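The FA metric compared voxelwise above has a standard closed form in terms of the three eigenvalues of the diffusion tensor. A minimal sketch (the function name is ours, for illustration):

```python
import numpy as np

def fractional_anisotropy(eigenvalues):
    """FA from the three diffusion-tensor eigenvalues.

    eigenvalues: array-like of shape (..., 3).
    Returns values in [0, 1]: 0 for fully isotropic diffusion,
    approaching 1 for strongly directional diffusion.
    """
    ev = np.asarray(eigenvalues, dtype=float)
    md = ev.mean(axis=-1, keepdims=True)      # mean diffusivity
    num = ((ev - md) ** 2).sum(axis=-1)       # variance about the mean
    den = (ev ** 2).sum(axis=-1)              # overall diffusion magnitude
    return np.sqrt(1.5 * num / den)
```

Equal eigenvalues give FA = 0; a single dominant eigenvalue drives FA toward 1, which is why FA along coherent WM tracts is sensitive to their degeneration.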
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Mouse Model of Middle Cerebral Artery Occlusion
Institutions: Ernest Gallo Clinic and Research Center, University of California, San Francisco, Kent State University.
Stroke is the most common fatal neurological disease in the United States [1]. The majority of strokes (88%) result from blockage of blood vessels in the brain (ischemic stroke) [2]. Since most ischemic strokes (~80%) occur in the territory of the middle cerebral artery (MCA) [3], many of the animal stroke models that have been developed focus on this artery. The intraluminal monofilament model of middle cerebral artery occlusion (MCAO) involves inserting a surgical filament into the external carotid artery and threading it forward into the internal carotid artery (ICA) until the tip occludes the origin of the MCA, resulting in a cessation of blood flow and subsequent brain infarction in the MCA territory [4]. The technique can be used to model permanent or transient occlusion [5]. If the suture is removed after a certain interval (30 min, 1 h, or 2 h), reperfusion is achieved (transient MCAO); if the filament is left in place (24 h), the procedure is suitable as a model of permanent MCAO. This technique does not require craniectomy, a neurosurgical procedure to remove a portion of the skull, which may affect intracranial pressure and temperature [6]. It has become the most frequently used method to mimic permanent and transient focal cerebral ischemia in rats and mice [7,8]. To evaluate the extent of cerebral infarction, we stain brain slices with 2,3,5-triphenyltetrazolium chloride (TTC) to identify ischemic brain tissue [9]. In this video, we demonstrate the MCAO method and the determination of infarct size by TTC staining.
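Infarct size from serial TTC-stained slices is commonly quantified by summing per-slice areas across the brain and applying an indirect (edema-corrected) calculation. The sketch below assumes the areas have already been traced in an image-analysis package; the function name and the indirect-correction choice are ours for illustration and are not necessarily the specific quantification demonstrated in the video:

```python
def infarct_volume(contra_areas_mm2, ipsi_healthy_areas_mm2, thickness_mm=2.0):
    """Edema-corrected (indirect) infarct volume from serial TTC slices.

    contra_areas_mm2: per-slice area of the contralateral hemisphere.
    ipsi_healthy_areas_mm2: per-slice area of non-infarcted (TTC-stained
        red) tissue in the ipsilateral hemisphere.
    Returns the infarct volume in mm^3 and as a percentage of the
    contralateral hemisphere volume.
    """
    contra_vol = sum(a * thickness_mm for a in contra_areas_mm2)
    healthy_vol = sum(a * thickness_mm for a in ipsi_healthy_areas_mm2)
    # Indirect correction: infarct = contralateral volume minus the
    # surviving ipsilateral volume, so ischemic edema does not inflate it.
    infarct = contra_vol - healthy_vol
    return infarct, 100.0 * infarct / contra_vol
```

Subtracting the surviving ipsilateral tissue from the contralateral hemisphere, rather than tracing the pale infarct directly, keeps edema-driven swelling of the ischemic hemisphere from inflating the estimate.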
Medicine, Issue 48, Neurology, Stroke, mice, ischemia
Quantitative Real-Time PCR using the Thermo Scientific Solaris qPCR Assay
Institutions: Thermo Scientific Solaris qPCR Products.
The Solaris qPCR Gene Expression Assay is a novel type of primer/probe set, designed to simplify the qPCR process while maintaining the sensitivity and accuracy of the assay. These primer/probe sets are pre-designed to cover >98% of the human and mouse genomes and feature significant improvements over previously available technologies, made possible by a novel design algorithm developed by Thermo Scientific bioinformatics experts.
Several convenient features have been incorporated into the Solaris qPCR Assay to streamline the process of performing quantitative real-time PCR. First, the protocol is similar to commonly employed alternatives, so the methods used during qPCR are likely to be familiar. Second, the master mix is blue, which makes the setup of qPCR reactions easier to track. Third, the thermal cycling conditions are the same for all assays (genes), making it possible to run many samples at a time and reducing the potential for error. Finally, the probe and primer sequence information is provided, simplifying the publication process.
Here, we demonstrate how to obtain the appropriate Solaris reagents using the GENEius product search feature found on the ordering web site (www.thermo.com/solaris) and how to use them to perform qPCR with the standard curve method.
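The standard curve method mentioned above reduces to a linear fit of Cq against log10 starting quantity over a dilution series, after which unknowns are interpolated from their Cq values. A minimal sketch with illustrative function names (not part of the Solaris workflow itself):

```python
import numpy as np

def standard_curve(log10_quantity, cq):
    """Fit Cq = slope * log10(quantity) + intercept over a dilution series.

    Returns slope, intercept, and amplification efficiency, where
    efficiency = 10**(-1/slope) - 1 (1.0 means perfect doubling per cycle,
    corresponding to a slope of about -3.32).
    """
    slope, intercept = np.polyfit(log10_quantity, cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Starting quantity of an unknown sample interpolated from its Cq."""
    return 10.0 ** ((cq - intercept) / slope)
```

Efficiencies well below 1.0 (or slopes far from -3.32) signal a problem with the primers or the dilution series before any unknowns are interpreted.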
Cellular Biology, Issue 40, qPCR, probe, real-time PCR, molecular biology, Solaris, primer, gene expression assays