JoVE Visualize
Pubmed Article
Automatic and reproducible positioning of phase-contrast MRI for the quantification of global cerebral blood flow.
PLoS ONE
PUBLISHED: 01-01-2014
Phase-Contrast MRI (PC-MRI) is a noninvasive technique to measure blood flow. In particular, global but highly quantitative cerebral blood flow (CBF) measurement using PC-MRI complements several other CBF mapping methods such as arterial spin labeling and dynamic susceptibility contrast MRI by providing a calibration factor. The ability to estimate blood supply in physiological units also lays a foundation for assessment of brain metabolic rate. However, a major obstacle to wider application of this method is that the slice positioning of the scan, ideally placed perpendicular to the feeding arteries, requires considerable expertise and can present a burden to the operator. In the present work, we proposed that the majority of PC-MRI scans can be positioned using an automatic algorithm, leaving only a small fraction of arteries requiring manual positioning. We implemented and evaluated an algorithm for this purpose based on feature extraction from a survey angiogram, which requires minimal operator input. In a comparative test-retest study with 7 subjects, the blood flow measurement using this algorithm showed an inter-session coefficient of variation (CoV) of 4.07 ± 3.03%. The Bland-Altman method showed that the automatic method differs from the manual method by between -8% and 11% for 95% of the CBF measurements. This is comparable to the variance in CBF measurement using manually positioned PC-MRI alone. In a further application of this algorithm to 157 consecutive subjects from typical clinical cohorts, the algorithm provided successful positioning in 89.7% of the arteries. In 79.6% of the subjects, all four arteries could be planned using the algorithm. Chi-square tests of independence showed that the success rate did not depend on age or gender, but patients showed a trend toward a lower success rate (p = 0.14) compared with healthy controls. In conclusion, this automatic positioning algorithm could improve the application of PC-MRI in CBF quantification.
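The reproducibility figures quoted above follow the standard definitions of the within-subject coefficient of variation and the Bland-Altman 95% limits of agreement. The short Python sketch below shows how such values can be computed from paired test-retest CBF measurements; it is a generic illustration with hypothetical array names, not the authors' analysis code.

    import numpy as np

    def test_retest_stats(cbf_session1, cbf_session2):
        # Paired CBF measurements (e.g. ml/100 g/min), one value per subject and session.
        a = np.asarray(cbf_session1, dtype=float)
        b = np.asarray(cbf_session2, dtype=float)
        # Within-subject CoV: SD of the two sessions divided by their mean, averaged over subjects.
        cov = (np.std([a, b], axis=0, ddof=1) / np.mean([a, b], axis=0)).mean()
        # Bland-Altman: mean difference (bias) and 95% limits of agreement (bias +/- 1.96 SD).
        diff = a - b
        bias = diff.mean()
        limits = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
        return cov, bias, limits

In the abstract, the Bland-Altman comparison is applied to automatic versus manual positioning with differences expressed as a percentage of CBF; the same formula applies to those percentages.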
Related JoVE Video
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Published: 06-26-2013
ABSTRACT
The scaled subprofile model (SSM) [1-4] is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data [2,5,6]. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors [7,8]. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects [5,6]. Cross-validation within the derivation set can be performed using bootstrap resampling techniques [9]. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets [10]. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation [11]. These standardized values can in turn be used to assist in differential diagnosis [12,13] and to assess disease progression and treatment effects at the network level [7,14-16]. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
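As a rough illustration of the core SSM steps described above (logarithmic conversion, removal of subject and group means, PCA of the residual profiles, and subject scores), the following Python sketch operates on a subjects-by-voxels matrix. It is a simplified outline with hypothetical variable names, not the authors' in-house software, and it omits masking and the logistic-regression combination of components.

    import numpy as np

    def ssm_pca(images, n_components=5):
        # images: (n_subjects, n_voxels) array of positive voxel values (e.g. FDG PET counts).
        log_data = np.log(images)                            # logarithmic conversion
        log_data -= log_data.mean(axis=1, keepdims=True)     # remove each subject's global mean
        srp = log_data - log_data.mean(axis=0)               # remove group mean (residual profiles)
        u, s, vt = np.linalg.svd(srp, full_matrices=False)   # PCA via singular value decomposition
        patterns = vt[:n_components]                         # candidate covariance patterns (GIS)
        scores = srp @ patterns.T                            # subject expression score per pattern
        variance_explained = s[:n_components] ** 2 / np.sum(s ** 2)
        return patterns, scores, variance_explained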
20 Related JoVE Articles!
Tilt Testing with Combined Lower Body Negative Pressure: a "Gold Standard" for Measuring Orthostatic Tolerance
Authors: Clare L. Protheroe, Henrike (Rianne) J.C. Ravensbergen, Jessica A. Inskip, Victoria E. Claydon.
Institutions: Simon Fraser University .
Orthostatic tolerance (OT) refers to the ability to maintain cardiovascular stability when upright, against the hydrostatic effects of gravity, and hence to maintain cerebral perfusion and prevent syncope (fainting). Various techniques are available to assess OT and the effects of gravitational stress upon the circulation, typically by reproducing a presyncopal event (near-fainting episode) in a controlled laboratory environment. The time and/or degree of stress required to provoke this response provides the measure of OT. Any technique used to determine OT should: enable distinction between patients with orthostatic intolerance (of various causes) and asymptomatic control subjects; be highly reproducible, enabling evaluation of therapeutic interventions; and avoid invasive procedures, which are known to impair OT [1]. In the late 1980s, head-upright tilt testing was first utilized for diagnosing syncope [2]. Since then it has been used to assess OT in patients with syncope of unknown cause, as well as in healthy subjects to study postural cardiovascular reflexes [2-6]. Tilting protocols comprise three categories: passive tilt; passive tilt accompanied by pharmacological provocation; and passive tilt with combined lower body negative pressure (LBNP). However, the effects of tilt testing (and other orthostatic stress testing modalities) are often poorly reproducible, with low sensitivity and specificity to diagnose orthostatic intolerance [7]. Typically, a passive tilt includes 20-60 min of orthostatic stress continued until the onset of presyncope in patients [2-6]. However, the main drawback of this procedure is its inability to invoke presyncope in all individuals undergoing the test, and the corresponding low sensitivity [8,9]. Thus, different methods were explored to increase the orthostatic stress and improve sensitivity. Pharmacological provocation has been used to increase the orthostatic challenge, for example using isoprenaline [4,7,10,11] or sublingual nitrate [12,13]. However, the main drawback of these approaches is an increase in sensitivity at the cost of an unacceptable decrease in specificity [10,14], with a high positive response rate immediately after administration [15]. Furthermore, invasive procedures associated with some pharmacological provocations greatly increase the false positive rate [1]. Another approach is to combine passive tilt testing with LBNP, providing a stronger orthostatic stress without invasive procedures or drug side effects, using the technique pioneered by Professor Roger Hainsworth in the 1990s [16-18]. This approach provokes presyncope in almost all subjects (allowing for symptom recognition in patients with syncope), while discriminating between patients with syncope and healthy controls, with a specificity of 92%, sensitivity of 85%, and repeatability of 1.1 ± 0.6 min [16,17]. This allows not only diagnosis and pathophysiological assessment [19-22], but also the evaluation of treatments for orthostatic intolerance, due to its high repeatability [23-30]. For these reasons, we argue this should be the "gold standard" for orthostatic stress testing, and accordingly this will be the method described in this paper.
Medicine, Issue 73, Anatomy, Physiology, Biomedical Engineering, Neurobiology, Kinesiology, Cardiology, tilt test, lower body negative pressure, orthostatic stress, syncope, orthostatic tolerance, fainting, gravitational stress, head upright, stroke, clinical techniques
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
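The "one reference compound" quantification mentioned above relies on the fact that an NMR signal area is proportional to concentration times the number of contributing protons. A minimal sketch of that arithmetic, assuming an internal standard of known concentration and fully relaxed spectra (hypothetical function and variable names):

    def metabolite_concentration(area_met, n_protons_met, area_ref, n_protons_ref, conc_ref_mM):
        # Per-proton peak area of the metabolite relative to that of the reference standard,
        # scaled by the known reference concentration (e.g. an internal standard at 1 mM).
        return (area_met / n_protons_met) / (area_ref / n_protons_ref) * conc_ref_mM

    # Example: metabolite_concentration(30.0, 3, 45.0, 9, 1.0) returns 2.0 (mM).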
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
Non-invasive Optical Measurement of Cerebral Metabolism and Hemodynamics in Infants
Authors: Pei-Yi Lin, Nadege Roche-Labarbe, Mathieu Dehaes, Stefan Carp, Angela Fenoglio, Beniamino Barbieri, Katherine Hagan, P. Ellen Grant, Maria Angela Franceschini.
Institutions: Massachusetts General Hospital, Harvard Medical School, Université de Caen Basse-Normandie, Boston Children's Hospital, Harvard Medical School, ISS, INC..
Perinatal brain injury remains a significant cause of infant mortality and morbidity, but there is not yet an effective bedside tool that can accurately screen for brain injury, monitor injury evolution, or assess response to therapy. The energy used by neurons is derived largely from tissue oxidative metabolism, and neural hyperactivity and cell death are reflected by corresponding changes in cerebral oxygen metabolism (CMRO2). Thus, measures of CMRO2 are reflective of neuronal viability and provide critical diagnostic information, making CMRO2 an ideal target for bedside measurement of brain health. Brain-imaging techniques such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT) yield measures of cerebral glucose and oxygen metabolism, but these techniques require the administration of radionuclides, so they are used in only the most acute cases. Continuous-wave near-infrared spectroscopy (CWNIRS) provides non-invasive, non-ionizing measures of hemoglobin oxygen saturation (SO2) as a surrogate for cerebral oxygen consumption. However, SO2 is less than ideal as a surrogate for cerebral oxygen metabolism as it is influenced by both oxygen delivery and consumption. Furthermore, measurements of SO2 are not sensitive enough to detect brain injury hours after the insult [1,2], because oxygen consumption and delivery reach equilibrium after acute transients [3]. We investigated the possibility of using more sophisticated NIRS optical methods to quantify cerebral oxygen metabolism at the bedside in healthy and brain-injured newborns. More specifically, we combined the frequency-domain NIRS (FDNIRS) measure of SO2 with the diffuse correlation spectroscopy (DCS) measure of blood flow index (CBFi) to yield an index of CMRO2 (CMRO2i) [4,5]. With the combined FDNIRS/DCS system we are able to quantify cerebral metabolism and hemodynamics. This represents an improvement over CWNIRS for assessing brain health, brain development, and response to therapy in neonates. Moreover, this method adheres to all neonatal intensive care unit (NICU) policies on infection control and institutional policies on laser safety. Future work will seek to integrate the two instruments to reduce acquisition time at the bedside and to implement real-time feedback on data quality to reduce the rate of data rejection.
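The CMRO2 index combines the two optical measurements through a Fick-type relation. The sketch below shows one common relative formulation, in which arterial saturation (SaO2) comes from pulse oximetry, tissue saturation (SO2) from FDNIRS, and the blood flow index (CBFi) from DCS; the calibration constants used by the authors are not reproduced here, so treat this as an assumption-laden illustration rather than their exact computation.

    def cmro2_index(cbf_i, sao2, so2):
        # Relative oxygen-metabolism index, CMRO2i proportional to CBFi * (SaO2 - SO2).
        # Assumes SO2 approximates a venous-weighted saturation; proportionality constants
        # (hemoglobin concentration, venous fraction) are omitted.
        return cbf_i * (sao2 - so2)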
Medicine, Issue 73, Developmental Biology, Neurobiology, Neuroscience, Biomedical Engineering, Anatomy, Physiology, Near infrared spectroscopy, diffuse correlation spectroscopy, cerebral hemodynamic, cerebral metabolism, brain injury screening, brain health, brain development, newborns, neonates, imaging, clinical techniques
Magnetic Resonance Imaging Quantification of Pulmonary Perfusion using Calibrated Arterial Spin Labeling
Authors: Tatsuya J. Arai, G. Kim Prisk, Sebastiaan Holverda, Rui Carlos Sá, Rebecca J. Theilmann, A. Cortney Henderson, Matthew V. Cronin, Richard B. Buxton, Susan R. Hopkins.
Institutions: University of California San Diego - UCSD, University of California San Diego - UCSD, University of California San Diego - UCSD.
This article demonstrates an MR imaging method to measure the spatial distribution of pulmonary blood flow in healthy subjects during normoxia (inspired O2 fraction (FIO2) = 0.21), hypoxia (FIO2 = 0.125), and hyperoxia (FIO2 = 1.00). In addition, the physiological responses of the subject are monitored in the MR scan environment. MR images were obtained on a 1.5 T GE MRI scanner during a breath hold from a sagittal slice in the right lung at functional residual capacity. An arterial spin labeling sequence (ASL-FAIRER) was used to measure the spatial distribution of pulmonary blood flow [1,2] and a multi-echo fast gradient echo (mGRE) sequence [3] was used to quantify the regional proton (i.e. H2O) density, allowing the quantification of density-normalized perfusion for each voxel (milliliters blood per minute per gram lung tissue). With a pneumatic switching valve and a facemask equipped with a 2-way non-rebreathing valve, different oxygen concentrations were introduced to the subject in the MR scanner through the inspired gas tubing. A metabolic cart collected expiratory gas via expiratory tubing. Mixed expiratory O2 and CO2 concentrations, oxygen consumption, carbon dioxide production, respiratory exchange ratio, respiratory frequency and tidal volume were measured. Heart rate and oxygen saturation were monitored using pulse oximetry. Data obtained from a normal subject showed that, as expected, heart rate was higher in hypoxia (60 bpm) than during normoxia (51) or hyperoxia (50), and arterial oxygen saturation (SpO2) was reduced during hypoxia to 86%. Mean ventilation was 8.31 L/min BTPS during hypoxia, 7.04 L/min during normoxia, and 6.64 L/min during hyperoxia. Tidal volume was 0.76 L during hypoxia, 0.69 L during normoxia, and 0.67 L during hyperoxia. Representative quantified ASL data showed that the mean density-normalized perfusion was 8.86 ml/min/g during hypoxia, 8.26 ml/min/g during normoxia, and 8.46 ml/min/g during hyperoxia. In this subject, the relative dispersion [4], an index of global heterogeneity, was increased in hypoxia (1.07 during hypoxia, 0.85 during normoxia, and 0.87 during hyperoxia) while the fractal dimension (Ds), another index of heterogeneity reflecting vascular branching structure, was unchanged (1.24 during hypoxia, 1.26 during normoxia, and 1.26 during hyperoxia). Overview: This protocol will demonstrate the acquisition of data to measure the distribution of pulmonary perfusion noninvasively under conditions of normoxia, hypoxia, and hyperoxia using a magnetic resonance imaging technique known as arterial spin labeling (ASL). Rationale: Measurement of pulmonary blood flow and lung proton density using MR techniques offers high-spatial-resolution images that can be quantified, as well as the ability to perform repeated measurements under several different physiological conditions. In human studies, PET, SPECT, and CT are commonly used as alternative techniques. However, these techniques involve exposure to ionizing radiation, and thus are not suitable for repeated measurements in human subjects.
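Two derived quantities in this protocol are simple to state explicitly: the density-normalized perfusion (voxelwise ASL perfusion divided by the mGRE-derived regional density) and the relative dispersion used as the global heterogeneity index (standard deviation over mean). The sketch below is a generic illustration with hypothetical array names, not the study's processing code.

    import numpy as np

    def density_normalized_perfusion(asl_perfusion, proton_density):
        # ml blood / min / g tissue: voxelwise perfusion divided by regional lung water density.
        return np.asarray(asl_perfusion, float) / np.asarray(proton_density, float)

    def relative_dispersion(perfusion_values):
        # Global heterogeneity index: standard deviation divided by the mean over lung voxels.
        v = np.asarray(perfusion_values, float)
        return v.std(ddof=1) / v.mean()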
Medicine, Issue 51, arterial spin labeling, lung proton density, functional lung imaging, hypoxic pulmonary vasoconstriction, oxygen consumption, ventilation, magnetic resonance imaging
Doppler Optical Coherence Tomography of Retinal Circulation
Authors: Ou Tan, Yimin Wang, Ranjith K. Konduru, Xinbo Zhang, SriniVas R. Sadda, David Huang.
Institutions: Oregon Health and Science University , University of Southern California.
Noncontact retinal blood flow measurements are performed with a Fourier-domain optical coherence tomography (OCT) system using a circumpapillary double circular scan (CDCS) that scans around the optic nerve head at 3.40 mm and 3.75 mm diameters. The double concentric circles are performed 6 times consecutively over 2 sec. The CDCS scan is saved with Doppler shift information from which flow can be calculated. The standard clinical protocol calls for 3 CDCS scans made with the OCT beam passing through the superonasal edge of the pupil and 3 CDCS scans through the inferonasal edge of the pupil. This double-angle protocol ensures that an acceptable Doppler angle is obtained on each retinal branch vessel in at least 1 scan. The CDCS scan data, a 3-dimensional volumetric OCT scan of the optic disc, and a color photograph of the optic disc are used together to obtain retinal blood flow measurement in an eye. We have developed blood flow measurement software called "Doppler optical coherence tomography of retinal circulation" (DOCTORC). This semi-automated software is used to measure total retinal blood flow, vessel cross-sectional area, and average blood velocity. The flow in each vessel is calculated from the Doppler shift within the vessel cross-section and the Doppler angle between the vessel and the OCT beam. Total retinal blood flow is obtained by summing the flow of the veins around the optic disc. The results obtained at our Doppler OCT reading center showed good reproducibility between graders and methods (<10%). Total retinal blood flow could be useful in the management of glaucoma and other retinal diseases. In glaucoma patients, OCT retinal blood flow measurement was highly correlated with visual field loss (R2 > 0.57 with visual field pattern deviation). Doppler OCT is a new method to perform rapid, noncontact, and repeatable measurement of total retinal blood flow using widely available Fourier-domain OCT instrumentation. This new technology may improve the practicality of making these measurements in clinical studies and routine clinical practice.
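The per-vessel flow calculation described above combines the measured Doppler frequency shift, the Doppler angle between the beam and the vessel, and the vessel cross-section. The following is a simplified sketch of that relation; the default wavelength, refractive index, and Doppler angle are nominal placeholders, and this is not the DOCTORC implementation.

    import numpy as np

    def doppler_velocity_m_s(delta_f_hz, wavelength_m=0.84e-6, n_tissue=1.38, doppler_angle_rad=1.40):
        # Flow velocity from the axial Doppler shift: v = lambda * df / (2 * n * cos(theta)).
        return wavelength_m * delta_f_hz / (2.0 * n_tissue * np.cos(doppler_angle_rad))

    def vessel_flow_ul_min(mean_velocity_m_s, cross_section_m2):
        # Volumetric flow = mean velocity * lumen area, converted from m^3/s to microliters/min.
        return mean_velocity_m_s * cross_section_m2 * 1e9 * 60.0

    # Total retinal blood flow is then the sum of vessel_flow_ul_min over the veins around the disc.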
Medicine, Issue 67, Ophthalmology, Physics, Doppler optical coherence tomography, total retinal blood flow, dual circular scan pattern, image analysis, semi-automated grading software, optic disc
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
A Comprehensive Protocol for Manual Segmentation of the Medial Temporal Lobe Structures
Authors: Matthew Moore, Yifan Hu, Sarah Woo, Dylan O'Hearn, Alexandru D. Iordan, Sanda Dolcos, Florin Dolcos.
Institutions: University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign, University of Illinois Urbana-Champaign.
The present paper describes a comprehensive protocol for manual tracing of the set of brain regions comprising the medial temporal lobe (MTL): amygdala, hippocampus, and the associated parahippocampal regions (perirhinal, entorhinal, and parahippocampal proper). Unlike most other tracing protocols available, typically focusing on certain MTL areas (e.g., amygdala and/or hippocampus), the integrative perspective adopted by the present tracing guidelines allows for clear localization of all MTL subregions. By integrating information from a variety of sources, including extant tracing protocols separately targeting various MTL structures, histological reports, and brain atlases, and with the complement of illustrative visual materials, the present protocol provides an accurate, intuitive, and convenient guide for understanding the MTL anatomy. The need for such tracing guidelines is also emphasized by illustrating possible differences between automatic and manual segmentation protocols. This knowledge can be applied toward research involving not only structural MRI investigations but also structural-functional colocalization and fMRI signal extraction from anatomically defined ROIs, in healthy and clinical groups alike.
Neuroscience, Issue 89, Anatomy, Segmentation, Medial Temporal Lobe, MRI, Manual Tracing, Amygdala, Hippocampus, Perirhinal Cortex, Entorhinal Cortex, Parahippocampal Cortex
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that cover the interior of blood vessels and provide both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator-dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
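The FMD outcome itself is a simple percent change in brachial artery diameter from baseline to the peak hyperemic measurement. A minimal sketch (hypothetical function name; diameters are typically averaged end-diastolic values from the B-mode images):

    def flow_mediated_dilation_percent(baseline_diameter_mm, peak_diameter_mm):
        # FMD (%) = (peak - baseline) / baseline * 100.
        return (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm * 100.0

    # Example: a baseline diameter of 4.00 mm and a peak diameter of 4.28 mm give an FMD of 7.0%.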
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
MRI and PET in Mouse Models of Myocardial Infarction
Authors: Guido Buonincontri, Carmen Methner, T. Adrian Carpenter, Robert C. Hawkes, Stephen J. Sawiak, Thomas Krieg.
Institutions: University of Cambridge, University of Cambridge, University of Cambridge.
Myocardial infarction is one of the leading causes of death in the Western world. The similarity of the mouse heart to the human heart has made it an ideal model for testing novel therapeutic strategies. In vivo magnetic resonance imaging (MRI) gives excellent views of the heart noninvasively with clear anatomical detail, which can be used for accurate functional assessment. Contrast agents can provide basic measures of tissue viability, but these are nonspecific. Positron emission tomography (PET) is a complementary technique that is highly specific for molecular imaging, but lacks the anatomical detail of MRI. Used together, these techniques offer a sensitive, specific and quantitative tool for the assessment of the heart in disease and recovery following treatment. In this paper we explain how these methods are carried out in mouse models of acute myocardial infarction. The procedures described here were designed for the assessment of putative protective drug treatments. We used MRI to measure systolic function and infarct size with late gadolinium enhancement, and PET with fluorodeoxyglucose (FDG) to assess metabolic function in the infarcted region. The paper focuses on practical aspects such as slice planning, accurate gating, drug delivery, segmentation of images, and multimodal coregistration. The methods presented here achieve good repeatability and accuracy while maintaining a high throughput.
Medicine, Issue 82, anatomy, Late Gadolinium Enhancement (LGE), MRI, FDG PET, MRI/PET imaging, myocardial infarction, mouse model, contrast agents, coregistration
Dual-phase Cone-beam Computed Tomography to See, Reach, and Treat Hepatocellular Carcinoma during Drug-eluting Beads Transarterial Chemo-embolization
Authors: Vania Tacher, MingDe Lin, Nikhil Bhagat, Nadine Abi Jaoudeh, Alessandro Radaelli, Niels Noordhoek, Bart Carelsen, Bradford J. Wood, Jean-François Geschwind.
Institutions: The Johns Hopkins Hospital, Philips Research North America, National Institutes of Health, Philips Healthcare.
The advent of cone-beam computed tomography (CBCT) in the angiography suite has been revolutionary in interventional radiology. CBCT offers 3-dimensional (3D) diagnostic imaging in the interventional suite and can enhance minimally invasive therapy beyond the limitations of 2D angiography alone. The role of CBCT has been recognized in transarterial chemo-embolization (TACE) treatment of hepatocellular carcinoma (HCC). The recent introduction of a CBCT technique, dual-phase CBCT (DP-CBCT), improves intra-arterial HCC treatment with drug-eluting beads (DEB-TACE). DP-CBCT can be used to localize liver tumors with the diagnostic accuracy of multi-phasic multidetector computed tomography (M-MDCT) and contrast-enhanced magnetic resonance imaging (CE-MRI) (see the tumor), to guide the guidewire and microcatheter intra-arterially to the desired location for selective therapy (reach the tumor), and to evaluate treatment success during the procedure (treat the tumor). The purpose of this manuscript is to illustrate how DP-CBCT is used in DEB-TACE to see, reach, and treat HCC.
Medicine, Issue 82, Carcinoma, Hepatocellular, Tomography, X-Ray Computed, Surgical Procedures, Minimally Invasive, Digestive System Diseases, Diagnosis, Therapeutics, Surgical Procedures, Operative, Equipment and Supplies, Transarterial chemo-embolization, Hepatocellular carcinoma, Dual-phase cone-beam computed tomography, 3D roadmap, Drug-Eluting Beads
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several ways, i.e. voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis at an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole-brain-based and tract-based DTI analysis.
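The central voxelwise metric named above, fractional anisotropy, is computed from the three eigenvalues of the diffusion tensor. The following is a minimal sketch of the standard definition, not code from the described analysis pipeline.

    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        # FA from the diffusion-tensor eigenvalues; ranges from 0 (isotropic) to 1 (highly anisotropic).
        ev = np.array([l1, l2, l3], dtype=float)
        md = ev.mean()                                   # mean diffusivity
        denom = np.sqrt((ev ** 2).sum())
        return np.sqrt(1.5) * np.sqrt(((ev - md) ** 2).sum()) / denom if denom > 0 else 0.0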
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
2-Vessel Occlusion/Hypotension: A Rat Model of Global Brain Ischemia
Authors: Thomas H. Sanderson, Joseph M. Wider.
Institutions: Wayne State University School of Medicine, Wayne State University School of Medicine, Wayne State University School of Medicine.
Cardiac arrest followed by resuscitation often results in dramatic brain damage caused by ischemia and subsequent reperfusion of the brain. Global brain ischemia produces damage to specific brain regions shown to be highly sensitive to ischemia [1]. Hippocampal neurons have higher sensitivity to ischemic insults compared to other cell populations, and specifically, the CA1 region of the hippocampus is particularly vulnerable to ischemia/reperfusion [2]. The design of therapeutic interventions, or study of mechanisms involved in cerebral damage, requires a model that produces damage similar to the clinical condition and in a reproducible manner. Bilateral carotid vessel occlusion with hypotension (2VOH) is a model that produces reversible forebrain ischemia, emulating the cerebral events that can occur during cardiac arrest and resuscitation. We describe a model modified from Smith et al. (1984) [2], as first presented in its current form in Sanderson et al. (2008) [3], which produces reproducible injury to selectively vulnerable brain regions [3-6]. The reliability of this model is dictated by precise control of systemic blood pressure during applied hypotension, the duration of ischemia, close temperature control, a specific anesthesia regimen, and diligent post-operative care. An 8-minute ischemic insult produces cell death of CA1 hippocampal neurons that progresses over the course of 6 to 24 hr of reperfusion, while less vulnerable brain regions are spared. This progressive cell death is easily quantified after 7-14 days of reperfusion, as a near complete loss of CA1 neurons is evident at this time. In addition to this brain injury model, we present a method for CA1 damage quantification using a simple, yet thorough, methodology. Importantly, quantification can be accomplished using a simple camera-mounted microscope and a free ImageJ (NIH) software plugin, obviating the need for cost-prohibitive stereology software programs and a motorized microscope stage for damage assessment.
Medicine, Issue 76, Biomedical Engineering, Neurobiology, Neuroscience, Immunology, Anatomy, Physiology, Cardiology, Brain Ischemia, ischemia, reperfusion, cardiac arrest, resuscitation, 2VOH, brain injury model, CA1 hippocampal neurons, brain, neuron, blood vessel, occlusion, hypotension, animal model
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues change dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
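Reconstructing cortical generators with an individual or age-matched head model amounts to solving a regularized linear inverse problem; the minimum-norm estimation named in the keywords has a closed form. The sketch below is a generic textbook formulation (leadfield G from the head model, noise covariance C, regularization from an assumed SNR), not the specific pipeline used at the London Baby Lab.

    import numpy as np

    def minimum_norm_inverse(leadfield, noise_cov, lambda2=1.0 / 9.0):
        # L2 minimum-norm inverse operator W = G.T (G G.T + lambda2 * C)^-1, with
        # lambda2 ~ 1/SNR^2 a typical choice for noise-whitened data.
        g = np.asarray(leadfield, dtype=float)           # (n_channels, n_sources)
        c = np.asarray(noise_cov, dtype=float)           # (n_channels, n_channels)
        return g.T @ np.linalg.inv(g @ g.T + lambda2 * c)

    # Source time courses are then estimated as minimum_norm_inverse(G, C) @ eeg_data.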
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
A Dual Tracer PET-MRI Protocol for the Quantitative Measure of Regional Brain Energy Substrates Uptake in the Rat
Authors: Maggie Roy, Scott Nugent, Sébastien Tremblay, Maxime Descoteaux, Jean-François Beaudoin, Luc Tremblay, Roger Lecomte, Stephen C Cunnane.
Institutions: Université de Sherbrooke, Université de Sherbrooke, Université de Sherbrooke, Université de Sherbrooke.
We present a method for comparing the uptake of the brain's two key energy substrates: glucose and ketones (acetoacetate [AcAc] in this case) in the rat. The developed method is a small-animal positron emission tomography (PET) protocol, in which 11C-AcAc and 18F-fluorodeoxyglucose (18F-FDG) are injected sequentially in each animal. This dual tracer PET acquisition is possible because of the short half-life of 11C (20.4 min). The rats also undergo a magnetic resonance imaging (MRI) acquisition seven days before the PET protocol. Prior to image analysis, PET and MRI images are coregistered to allow the measurement of regional cerebral uptake (cortex, hippocampus, striatum, and cerebellum). A quantitative measure of 11C-AcAc and 18F-FDG brain uptake (cerebral metabolic rate; μmol/100 g/min) is determined by kinetic modeling using the image-derived input function (IDIF) method. Our new dual tracer PET protocol is robust and flexible; the two tracers used can be replaced by different radiotracers to evaluate other processes in the brain. Moreover, our protocol is applicable to the study of brain fuel supply in multiple conditions such as normal aging and neurodegenerative pathologies such as Alzheimer's and Parkinson's diseases.
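The abstract does not spell out the kinetic model used with the image-derived input function. For 18F-FDG, one widely used option is Patlak graphical analysis, sketched below purely as an illustration (hypothetical variable names; the authors' exact modeling may differ).

    import numpy as np

    def patlak_ki(t_min, c_tissue, c_plasma, t_star_min=10.0):
        # Patlak plot: x = integral(Cp dt)/Cp, y = Ct/Cp; the slope of the late linear
        # portion (t >= t_star) is the net uptake rate constant Ki. Cp must be nonzero there.
        t = np.asarray(t_min, float)
        ct = np.asarray(c_tissue, float)
        cp = np.asarray(c_plasma, float)
        cum_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2.0)))
        x = cum_cp / cp
        y = ct / cp
        late = t >= t_star_min
        ki, _intercept = np.polyfit(x[late], y[late], 1)
        return ki

    # A cerebral metabolic rate of glucose then follows as Ki * plasma glucose / lumped constant.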
Neuroscience, Issue 82, positron emission tomography (PET), 18F-fluorodeoxyglucose, 11C-acetoacetate, magnetic resonance imaging (MRI), kinetic modeling, cerebral metabolic rate, rat
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests [1,2]. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false-positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
High-resolution Functional Magnetic Resonance Imaging Methods for Human Midbrain
Authors: Sucharit Katyal, Clint A. Greene, David Ress.
Institutions: The University of Texas at Austin.
Functional MRI (fMRI) is a widely used tool for non-invasively measuring correlates of human brain activity. However, its use has mostly been focused upon measuring activity on the surface of cerebral cortex rather than in subcortical regions such as the midbrain and brainstem. Subcortical fMRI must overcome two challenges: spatial resolution and physiological noise. Here we describe an optimized set of techniques developed to perform high-resolution fMRI in the human superior colliculus (SC), a structure on the dorsal surface of the midbrain; the methods can also be used to image other brainstem and subcortical structures. High-resolution (1.2 mm voxels) fMRI of the SC requires a non-conventional approach. The desired spatial sampling is obtained using a multi-shot (interleaved) spiral acquisition [1]. Since T2* of SC tissue is longer than in cortex, a correspondingly longer echo time (TE ~40 msec) is used to maximize functional contrast. To cover the full extent of the SC, 8-10 slices are obtained. For each session, a structural anatomical volume with the same slice prescription as the fMRI is also obtained, which is used to align the functional data to a high-resolution reference volume. In a separate session, for each subject, we create a high-resolution (0.7 mm sampling) reference volume using a T1-weighted sequence that gives good tissue contrast. In the reference volume, the midbrain region is segmented using the ITK-SNAP software application [2]. This segmentation is used to create a 3D surface representation of the midbrain that is both smooth and accurate [3]. The surface vertices and normals are used to create a map of depth from the midbrain surface within the tissue [4]. Functional data are transformed into the coordinate system of the segmented reference volume. Depth associations of the voxels enable the averaging of fMRI time-series data within specified depth ranges to improve signal quality. Data are rendered on the 3D surface for visualization. In our lab we use this technique for measuring topographic maps of visual stimulation and covert and overt visual attention within the SC [1]. As an example, we demonstrate the topographic representation of polar angle to visual stimulation in the SC.
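The depth-based averaging step (pooling voxel time series that fall within specified distances from the segmented midbrain surface to improve signal quality) can be summarized with a short sketch; the array names and bin edges are hypothetical.

    import numpy as np

    def average_by_depth(timeseries, depth_mm, depth_edges_mm=(0.0, 1.8, 3.6)):
        # timeseries: (n_voxels, n_timepoints); depth_mm: (n_voxels,) distance of each voxel
        # from the midbrain surface. Returns one mean time series per depth range.
        ts = np.asarray(timeseries, float)
        depth = np.asarray(depth_mm, float)
        bins = zip(depth_edges_mm[:-1], depth_edges_mm[1:])
        return np.array([ts[(depth >= lo) & (depth < hi)].mean(axis=0) for lo, hi in bins])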
Neuroscience, Issue 63, fMRI, midbrain, brainstem, colliculus, BOLD, brain, Magentic Resonance Imaging, MRI
Coherence between Brain Cortical Function and Neurocognitive Performance during Changed Gravity Conditions
Authors: Vera Brümmer, Stefan Schneider, Tobias Vogt, Heiko Strüder, Heather Carnahan, Christopher D. Askew, Roland Csuhaj.
Institutions: German Sport University Cologne, University of Toronto, Queensland University of Technology, Gilching, Germany.
Previous studies of cognitive, mental and/or motor processes during short-, medium- and long-term weightlessness have only been descriptive in nature, and focused on psychological aspects. Because objective observation of neurophysiological parameters has until now not been carried out - undoubtedly because the technical and methodological means have not been available - investigations into the neurophysiological effects of weightlessness are in their infancy (Schneider et al. 2008). While imaging techniques such as positron emission tomography (PET) and magnetic resonance imaging (MRI) would hardly be applicable in space, the non-invasive near-infrared spectroscopy (NIRS) technique represents a method of mapping hemodynamic processes in the brain in real time that is both relatively inexpensive and can be employed even under extreme conditions. The combination with electroencephalography (EEG) opens up the possibility of following electrocortical processes under changing gravity conditions with a finer temporal resolution as well as with deeper localization, for instance with electrotomography (LORETA). Previous studies showed an increase of beta frequency activity under normal gravity conditions and a decrease under weightlessness conditions during a parabolic flight (Schneider et al. 2008a+b). Tilt studies revealed different changes in brain function, which suggests that the changes observed in parabolic flight might reflect emotional processes rather than hemodynamic changes. However, it is still unclear whether these are effects of changed gravity or of hemodynamic changes within the brain. Combining EEG/LORETA and NIRS should for the first time make it possible to map the effect of weightlessness and reduced gravity on both hemodynamic and electrophysiological processes in the brain. Initially, this is to be done as part of a feasibility study during a parabolic flight. Afterwards, it is also planned to use both techniques during medium- and long-term space flight. It can be assumed that the long-term redistribution of blood volume and the associated increase in the supply of oxygen to the brain will lead to changes in the central nervous system that are also responsible for anaemic processes, and which can in turn reduce performance (De Santo et al. 2005), meaning that they could be crucial for the success and safety of a mission (Genik et al. 2005, Ellis 2000). Depending on these results, it will be necessary to develop and employ extensive countermeasures. Initial results from the MARS500 study suggest that, in addition to their significance in the context of the cardiovascular and locomotor systems, sport and physical activity can play a part in improving neurocognitive parameters. Before this can be fully established, however, it seems necessary to learn more about the influence of changing gravity conditions on neurophysiological processes and associated neurocognitive impairment.
Neuroscience, Issue 51, EEG, NIRS, electrotomography, parabolic flight, weightlessness, imaging, cognitive performance
Microsurgical Clip Obliteration of Middle Cerebral Aneurysm Using Intraoperative Flow Assessment
Authors: Bob S. Carter, Christopher Farrell, Christopher Owen.
Institutions: Harvard Medical School, Massachusetts General Hospital.
A cerebral aneurysm is an abnormal widening or ballooning of a localized segment of an intracranial blood vessel. Surgical clipping is an important treatment for aneurysms that attempts to exclude blood from flowing into the aneurysmal segment of the vessel while preserving normal blood flow. Improper clip placement may result in a residual aneurysm with the potential for subsequent rupture, or in partial or full occlusion of distal arteries, resulting in cerebral infarction. Here we describe the use of an ultrasonic flow probe to provide quantitative evaluation of arterial flow before and after microsurgical clip placement at the base of a middle cerebral artery aneurysm. This information helps ensure adequate aneurysm reconstruction with preservation of normal distal blood flow.
Medicine, Issue 31, Aneurysm, intraoperative, brain, surgery, surgical clipping, blood flow, aneurysmal segment, ultrasonic flow probe
Simultaneous fMRI and Electrophysiology in the Rodent Brain
Authors: Wen-ju Pan, Garth Thompson, Matthew Magnuson, Waqas Majeed, Dieter Jaeger, Shella Keilholz.
Institutions: Emory University, Georgia Institute of Technology, Emory University.
To examine the neural basis of the blood oxygenation level dependent (BOLD) magnetic resonance imaging (MRI) signal, we have developed a rodent model in which functional MRI data and in vivo intracortical recording can be performed simultaneously. The combination of MRI and electrical recording is technically challenging because the electrodes used for recording distort the MRI images and the MRI acquisition induces noise in the electrical recording. To minimize the mutual interference of the two modalities, glass microelectrodes were used rather than metal and a noise removal algorithm was implemented for the electrophysiology data. In our studies, two microelectrodes were separately implanted in bilateral primary somatosensory cortices (SI) of the rat and fixed in place. One coronal slice covering the electrode tips was selected for functional MRI. Electrode shafts and fixation positions were not included in the image slice to avoid imaging artifacts. The removed scalp was replaced with toothpaste to reduce susceptibility mismatch and prevent Gibbs ringing artifacts in the images. The artifact structure induced in the electrical recordings by the rapidly-switching magnetic fields during image acquisition was characterized by averaging all cycles of scans for each run. The noise structure during imaging was then subtracted from original recordings. The denoised time courses were then used for further analysis in combination with the fMRI data. As an example, the simultaneous acquisition was used to determine the relationship between spontaneous fMRI BOLD signals and band-limited intracortical electrical activity. Simultaneous fMRI and electrophysiological recording in the rodent will provide a platform for many exciting applications in neuroscience in addition to elucidating the relationship between the fMRI BOLD signal and neuronal activity.
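The gradient-artifact removal described above (characterizing the artifact by averaging recording segments locked to the image-acquisition cycle and then subtracting that template) can be sketched as follows. The segment length must correspond to exactly one scan cycle and the recording must be synchronized to the scanner; the names are hypothetical and this is not the authors' code.

    import numpy as np

    def remove_scanner_artifact(recording, samples_per_cycle):
        # Average-template subtraction of the MRI gradient artifact from an electrophysiology trace.
        x = np.asarray(recording, float)
        n_cycles = len(x) // samples_per_cycle
        segments = x[:n_cycles * samples_per_cycle].reshape(n_cycles, samples_per_cycle)
        template = segments.mean(axis=0)        # artifact estimate; neural signal averages out
        return (segments - template).ravel()    # denoised trace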
Neuroscience, Issue 42, fMRI, electrophysiology, rat, BOLD, brain, resting state

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching its content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In both situations, the algorithm still tries to display the most relevant videos available, which can sometimes result in matches that are only loosely related.
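JoVE does not describe its matching algorithm in detail, so the following is a generic illustration only: abstract-to-video matching of this kind is often done with bag-of-words similarity, for example TF-IDF vectors compared by cosine similarity.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def top_matches(pubmed_abstract, video_texts, k=20):
        # Rank video titles/abstracts by cosine similarity to a PubMed abstract.
        # Purely illustrative; this is not JoVE's actual matching algorithm.
        vectorizer = TfidfVectorizer(stop_words="english")
        tfidf = vectorizer.fit_transform([pubmed_abstract] + list(video_texts))
        similarities = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()
        return similarities.argsort()[::-1][:k]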