DTI is a technique that identifies white matter tracts (WMT) non-invasively, using diffusion measurements, in both healthy and diseased subjects. Like the visual pathways (VP), WMT are not visible on conventional MRI or intra-operatively under the microscope. DTI helps neurosurgeons avoid damaging the VP while removing lesions adjacent to these tracts. We performed DTI on fifty patients before and after surgery between March 2012 and January 2014. For navigation we used a 3D T1-weighted sequence; additionally, we acquired T2-weighted and DTI sequences. The parameters used were FOV: 200 x 200 mm, slice thickness: 2 mm, and acquisition matrix: 96 x 96, yielding nearly isotropic voxels of 2 x 2 x 2 mm. Axial MRI was carried out using 32 gradient directions and one b0 image. We used echo-planar imaging (EPI) and ASSET parallel imaging with an acceleration factor of 2 and a b-value of 800 s/mm². The scanning time was less than 9 min.
The DTI data obtained were processed using an FDA-approved surgical navigation system that uses a straightforward fiber-tracking approach known as fiber assignment by continuous tracking (FACT). This is based on the propagation of lines between regions of interest (ROIs) defined by a physician. A maximum angle of 50°, an FA start value of 0.10, and an ADC stop value of 0.20 mm²/s were the parameters used for tractography.
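The FACT propagation rule and stopping criteria above can be sketched in a few lines of code. The following is a toy illustration only (the field names, step size, and data layout are assumptions, not the navigation system's implementation): it steps along the principal eigenvector field and stops when FA falls below the threshold or the turning angle exceeds the maximum.

```python
import numpy as np

def fact_streamline(seed, principal_dir, fa, fa_thresh=0.10,
                    max_angle_deg=50.0, step=1.0, max_steps=2000):
    """Toy FACT propagation: follow the principal eigenvector field
    voxel by voxel, stopping at low FA or sharp turns.
    principal_dir is a (X, Y, Z, 3) unit-vector field, fa a (X, Y, Z) map."""
    cos_limit = np.cos(np.radians(max_angle_deg))
    pos = np.asarray(seed, dtype=float)
    track = [pos.copy()]
    prev = None
    for _ in range(max_steps):
        idx = tuple(np.round(pos).astype(int))
        if any(i < 0 or i >= s for i, s in zip(idx, fa.shape)):
            break                       # left the volume
        if fa[idx] < fa_thresh:
            break                       # anisotropy too low to trust
        d = principal_dir[idx]
        if prev is not None:
            if np.dot(d, prev) < 0:     # eigenvectors are sign-ambiguous
                d = -d
            if np.dot(d, prev) < cos_limit:
                break                   # turn sharper than the angle limit
        pos = pos + step * d
        track.append(pos.copy())
        prev = d
    return np.array(track)
```

Real tractography additionally interpolates the tensor field and applies the ADC stop criterion; this sketch keeps only the FA and angle tests for clarity.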
There are some limitations to this technique. The limited acquisition time enforces trade-offs in image quality. Another important point not to be neglected is brain shift during surgery; for the latter, intra-operative MRI might be helpful. Furthermore, the risk of false positive or false negative tracts needs to be taken into account, as it might compromise the final results.
Dual-phase Cone-beam Computed Tomography to See, Reach, and Treat Hepatocellular Carcinoma during Drug-eluting Beads Transarterial Chemo-embolization
Institutions: The Johns Hopkins Hospital, Philips Research North America, National Institutes of Health, Philips Healthcare.
The advent of cone-beam computed tomography (CBCT) in the angiography suite has been revolutionary in interventional radiology. CBCT offers three-dimensional (3D) diagnostic imaging in the interventional suite and can enhance minimally-invasive therapy beyond the limitations of 2D angiography alone. The role of CBCT has been recognized in transarterial chemo-embolization (TACE) treatment of hepatocellular carcinoma (HCC). The recent introduction of a CBCT technique, dual-phase CBCT (DP-CBCT), improves intra-arterial HCC treatment with drug-eluting beads (DEB-TACE). DP-CBCT can be used to localize liver tumors with the diagnostic accuracy of multi-phasic multidetector computed tomography (M-MDCT) and contrast-enhanced magnetic resonance imaging (CE-MRI) (see the tumor), to guide the guidewire and microcatheter intra-arterially to the desired location for selective therapy (reach the tumor), and to evaluate treatment success during the procedure (treat the tumor). The purpose of this manuscript is to illustrate how DP-CBCT is used in DEB-TACE to see, reach, and treat HCC.
Medicine, Issue 82, Carcinoma, Hepatocellular, Tomography, X-Ray Computed, Surgical Procedures, Minimally Invasive, Digestive System Diseases, Diagnosis, Therapeutics, Surgical Procedures, Operative, Equipment and Supplies, Transarterial chemo-embolization, Hepatocellular carcinoma, Dual-phase cone-beam computed tomography, 3D roadmap, Drug-Eluting Beads
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g., leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests1,2. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs.
LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors.
While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
A Practical Guide to Phylogenetics for Nonexperts
Institutions: The George Washington University.
Researchers across incredibly diverse fields are applying phylogenetics to their research questions. However, many are new to the topic, and this presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is not by any means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. We envision this article serving as a practical training tool for researchers embarking on phylogenetic studies as well as an educational resource that could be incorporated into a classroom or teaching lab.
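To make the tree-building step concrete, here is a minimal, self-contained sketch of distance-based clustering (UPGMA on p-distances). This is a teaching toy, not a substitute for the maximum likelihood and Bayesian software the protocol covers, and the sequence data shown in the test are invented:

```python
import itertools

def p_distance(a, b):
    """Proportion of differing sites between two equal-length aligned sequences."""
    return sum(1 for x, y in zip(a, b) if x != y) / len(a)

def upgma(names, seqs):
    """Toy UPGMA: repeatedly merge the closest clusters under
    size-weighted average linkage; returns a nested-tuple tree."""
    clusters = {n: (n,) for n in names}          # label -> member leaves
    nodes = {n: n for n in names}                # label -> subtree
    dist = {frozenset(p): p_distance(seqs[p[0]], seqs[p[1]])
            for p in itertools.combinations(names, 2)}
    while len(clusters) > 1:
        a, b = min(itertools.combinations(clusters, 2),
                   key=lambda p: dist[frozenset(p)])
        new = a + "+" + b
        size = len(clusters[a]) + len(clusters[b])
        for c in clusters:
            if c not in (a, b):
                # average distance, weighted by cluster sizes
                dist[frozenset((new, c))] = (
                    dist[frozenset((a, c))] * len(clusters[a]) +
                    dist[frozenset((b, c))] * len(clusters[b])) / size
        nodes[new] = (nodes[a], nodes[b])
        clusters[new] = clusters[a] + clusters[b]
        del clusters[a], clusters[b]
    (last,) = clusters
    return nodes[last]
```

Distance methods like this are fast but statistically cruder than the likelihood-based reconstructions described above; they are shown here only to illustrate what "building a tree from pairwise divergence" means.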
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
Engineering Platform and Experimental Protocol for Design and Evaluation of a Neurally-controlled Powered Transfemoral Prosthesis
Institutions: North Carolina State University & University of North Carolina at Chapel Hill, University of North Carolina School of Medicine, Atlantic Prosthetics & Orthotics, LLC.
To enable intuitive operation of powered artificial legs, an interface between user and prosthesis that can recognize the user's movement intent is desired. A novel neural-machine interface (NMI) based on neuromuscular-mechanical fusion developed in our previous study has demonstrated great potential to accurately identify the intended movement of transfemoral amputees. However, this interface has not yet been integrated with a powered prosthetic leg for true neural control. This study aimed to report (1) a flexible platform to implement and optimize neural control of a powered lower limb prosthesis and (2) an experimental setup and protocol to evaluate neural prosthesis control in patients with lower limb amputations. First, a platform based on a PC and a visual programming environment was developed to implement the prosthesis control algorithms, including the NMI training algorithm, the NMI online testing algorithm, and the intrinsic control algorithm. To demonstrate the function of this platform, the NMI based on neuromuscular-mechanical fusion was hierarchically integrated with intrinsic control of a prototypical transfemoral prosthesis. One patient with a unilateral transfemoral amputation was recruited to evaluate our implemented neural controller while performing activities such as standing, level-ground walking, ramp ascent, and ramp descent continuously in the laboratory. A novel experimental setup and protocol were developed in order to test the new prosthesis control safely and efficiently. The presented proof-of-concept platform, experimental setup, and protocol could aid the future development and application of neurally-controlled powered artificial legs.
Biomedical Engineering, Issue 89, neural control, powered transfemoral prosthesis, electromyography (EMG), neural-machine interface, experimental setup and protocol
Real-Time DC-dynamic Biasing Method for Switching Time Improvement in Severely Underdamped Fringing-field Electrostatic MEMS Actuators
Institutions: University of California, Davis, Texas Instruments, Purdue University.
Mechanically underdamped electrostatic fringing-field MEMS actuators are well known for their fast switching operation in response to a unit step input bias voltage. However, the tradeoff for the improved switching performance is a relatively long settling time to reach each gap height in response to various applied voltages. Transient applied bias waveforms are employed to facilitate reduced switching times for electrostatic fringing-field MEMS actuators with high mechanical quality factors. Removing the underlying substrate of the fringing-field actuator creates the low mechanical damping environment necessary to effectively test the concept. The removal of the underlying substrate also substantially improves the reliability of the device with regard to failure due to stiction. Although DC-dynamic biasing is useful in improving settling time, the required slew rates for typical MEMS devices may place aggressive requirements on the charge pumps for fully-integrated on-chip designs. Additionally, there may be challenges integrating the substrate removal step into back-end-of-line commercial CMOS processing. Experimental validation of fabricated actuators demonstrates a 50x improvement in switching time compared to conventional step biasing. The experimental results are in good agreement with theoretical calculations.
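The settling-time penalty of a high quality factor can be seen from the textbook second-order estimate t_s ≈ -ln(tol)/(ζ·ω_n), where ζ is the damping ratio (ζ = 1/(2Q)) and ω_n the natural frequency. The sketch below is an idealized model for intuition only, not a simulation of the fabricated actuators:

```python
import math

def q_to_zeta(q):
    """Damping ratio from mechanical quality factor: zeta = 1/(2Q)."""
    return 1.0 / (2.0 * q)

def settle_time(zeta, omega_n, tol=0.02):
    """Classic 2%-band settling-time estimate for an underdamped
    second-order system: t_s ~ -ln(tol) / (zeta * omega_n)."""
    return -math.log(tol) / (zeta * omega_n)
```

Because t_s scales with Q at fixed ω_n, a substrate-released, high-Q actuator rings for a long time after a step input, which is exactly the behavior the DC-dynamic biasing waveforms are designed to suppress.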
Physics, Issue 90, microelectromechanical systems, actuators, switching time, settling time, electrostatic devices, micromachining, thin film devices
Electric Cell-substrate Impedance Sensing for the Quantification of Endothelial Proliferation, Barrier Function, and Motility
Institutions: Institute for Cardiovascular Research, VU University Medical Center.
Electric Cell-substrate Impedance Sensing (ECIS) is an in vitro impedance measuring system to quantify the behavior of cells within adherent cell layers. To this end, cells are grown in special culture chambers on top of opposing, circular gold electrodes. A constant small alternating current is applied between the electrodes and the potential across them is measured. The insulating properties of the cell membrane create a resistance towards the electrical current flow, resulting in an increased electrical potential between the electrodes. Measuring cellular impedance in this manner allows the automated study of cell attachment, growth, morphology, function, and motility. Although the ECIS measurement itself is straightforward and easy to learn, the underlying theory is complex, and selection of the right settings and correct analysis and interpretation of the data are not self-evident. Yet, a clear protocol describing the individual steps from experimental design to preparation, realization, and analysis of the experiment is not available. In this article the basic measurement principle as well as possible applications, experimental considerations, advantages, and limitations of the ECIS system are discussed. A guide is provided for the study of cell attachment, spreading, and proliferation; quantification of cell behavior in a confluent layer with regard to barrier function, cell motility, and quality of cell-cell and cell-substrate adhesions; and quantification of wound healing and cellular responses to vasoactive stimuli. Representative results are discussed based on human microvascular endothelial cells (MVEC) and human umbilical vein endothelial cells (HUVEC), but are applicable to all adherently growing cells.
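For intuition, the frequency dependence of the measured impedance can be illustrated with a deliberately simplified equivalent circuit: a paracellular (barrier) resistance in parallel with a membrane capacitance. This toy model and its component values are assumptions for illustration, not the full ECIS cell-layer model:

```python
import numpy as np

def layer_impedance(freq_hz, r_barrier, c_membrane):
    """Magnitude of a toy cell-layer impedance: paracellular resistance
    R in parallel with membrane capacitance C, |Z| = |1 / (1/R + j*w*C)|."""
    omega = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
    z = 1.0 / (1.0 / r_barrier + 1j * omega * c_membrane)
    return np.abs(z)
```

At low frequency the magnitude approaches the barrier resistance, while at high frequency the capacitance shunts the current; this is why barrier function is typically read out from the low-frequency resistance.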
Bioengineering, Issue 85, ECIS, Impedance Spectroscopy, Resistance, TEER, Endothelial Barrier, Cell Adhesions, Focal Adhesions, Proliferation, Migration, Motility, Wound Healing
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
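As a minimal example of the fully automated end of this spectrum, the sketch below implements Otsu's classic histogram-based threshold selection in plain NumPy. Real EM segmentation pipelines are far more involved; this function is purely illustrative:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Otsu's method: pick the intensity threshold that maximizes the
    between-class variance of the histogram."""
    hist, edges = np.histogram(np.ravel(image), bins=bins)
    p = hist.astype(float) / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability
    mu = np.cumsum(p * np.arange(bins))       # class-0 cumulative mean
    mu_t = mu[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    k = np.nanargmax(sigma_b)                 # best split bin
    return edges[k + 1]                       # threshold between the classes
```

Thresholding of this kind works well for data with strong, bimodal contrast, and fails exactly in the crowded, low-contrast situations for which the manual and semi-automated approaches above remain necessary.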
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
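The source reconstruction step referred to here is commonly a regularized L2 minimum-norm inverse, sources ≈ Lᵀ(LLᵀ + λI)⁻¹y, where L is the head-model lead field and y the measured channel data. The sketch below shows only this core computation; the lead field, data, and regularization choice are placeholders, not the actual analysis pipeline:

```python
import numpy as np

def minimum_norm_estimate(leadfield, eeg, lam=0.1):
    """L2 minimum-norm inverse: sources = L^T (L L^T + reg)^-1 y.
    leadfield: (n_channels, n_sources); eeg: (n_channels,);
    lam scales a trace-normalized regularizer."""
    L = leadfield
    gram = L @ L.T
    reg = lam * np.trace(gram) / gram.shape[0] * np.eye(gram.shape[0])
    return L.T @ np.linalg.solve(gram + reg, eeg)
```

In practice the lead field is computed from the (individual or age-specific) head model discussed above, which is precisely why pediatric-appropriate models matter: the same measured y yields different source estimates under different L.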
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Institutions: University of Exeter.
A wide range of methods is currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to establish reliable experiments and analyze data effectively. Differential scanning fluorimetry (DSF) is increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, so suitable instrumentation is available in most institutions; an excellent range of protocols is already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
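As one way of making the estimation step concrete, the sketch below fits a single-site binding model, Tm = Tm0 + ΔTm·[L]/(Kd + [L]), to melting-temperature data by a simple grid search over Kd. The model form, parameter names, and concentration range are illustrative assumptions, not the exact equations of the article:

```python
import numpy as np

def tm_model(conc, tm0, dtm_max, kd):
    """Single-site model: melting temperature rises with fraction bound."""
    frac_bound = conc / (kd + conc)
    return tm0 + dtm_max * frac_bound

def fit_kd(conc, tm_obs, kd_grid=None):
    """Grid-search Kd; for each candidate Kd, Tm0 and dTm_max follow by
    linear least squares against the fraction-bound curve."""
    conc = np.asarray(conc, dtype=float)
    tm_obs = np.asarray(tm_obs, dtype=float)
    if kd_grid is None:
        kd_grid = np.logspace(-9, -2, 400)    # molar, assumed search range
    best = None
    for kd in kd_grid:
        f = conc / (kd + conc)
        A = np.column_stack([np.ones_like(f), f])
        coef, *_ = np.linalg.lstsq(A, tm_obs, rcond=None)
        sse = np.sum((A @ coef - tm_obs) ** 2)
        if best is None or sse < best[0]:
            best = (sse, coef[0], coef[1], kd)
    return {"tm0": best[1], "dtm_max": best[2], "kd": best[3]}
```

The grid search trades elegance for robustness: it avoids the nonlinear-optimization pitfalls (poor starting values, local minima) that make the published closed-form treatments mathematically demanding.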
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Human Skeletal Muscle Biopsy Procedures Using the Modified Bergström Technique
Institutions: Appalachian State University, Carolinas Medical Center NorthEast.
The percutaneous biopsy technique enables researchers and clinicians to collect skeletal muscle tissue samples. The technique is safe and highly effective. This video describes the percutaneous biopsy technique using a modified Bergström needle to obtain skeletal muscle tissue samples from the vastus lateralis of human subjects. The Bergström needle consists of an outer cannula with a small opening ('window') at the side of the tip and an inner trocar with a cutting blade at the distal end. Under local anesthesia and aseptic conditions, the needle is advanced into the skeletal muscle through an incision in the skin, subcutaneous tissue, and fascia. Next, suction is applied to the inner trocar, the inner trocar is pulled back, skeletal muscle tissue is drawn into the window of the outer cannula by the suction, and the inner trocar is rapidly closed, thus cutting or clipping the skeletal muscle tissue sample. The needle is rotated 90° and another cut is made. This process may be repeated three more times. This multiple-cut technique typically produces a sample of 100-200 mg or more in healthy subjects and can be performed immediately before, during, and after a bout of exercise or other intervention. Following post-biopsy dressing of the incision site, subjects typically resume their activities of daily living right away and can fully participate in vigorous physical activity within 48-72 hr. Subjects should avoid heavy resistance exercise for 48 hr to reduce the risk of herniation of the muscle through the incision in the fascia.
Medicine, Issue 91, percutaneous muscle biopsy, needle biopsy, suction-modified, metabolism, enzyme activity, mRNA, gene function, fiber type, histology, metabolomics, skeletal muscle function, humans
Using Eye Movements to Evaluate the Cognitive Processes Involved in Text Comprehension
Institutions: University of Illinois at Chicago.
The present article describes how to use eye tracking methodologies to study the cognitive processes involved in text comprehension. Measuring eye movements during reading is one of the most precise methods for measuring moment-by-moment (online) processing demands during text comprehension. Cognitive processing demands are reflected by several aspects of eye movement behavior, such as fixation duration, number of fixations, and number of regressions (returning to prior parts of a text). Important properties of eye tracking equipment that researchers need to consider are described, including how frequently the eye position is measured (sampling rate), accuracy of determining eye position, how much head movement is allowed, and ease of use. Also described are properties of stimuli that influence eye movements that need to be controlled in studies of text comprehension, such as the position, frequency, and length of target words. Procedural recommendations related to preparing the participant, setting up and calibrating the equipment, and running a study are given. Representative results are presented to illustrate how data can be evaluated. Although the methodology is described in terms of reading comprehension, much of the information presented can be applied to any study in which participants read verbal stimuli.
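Fixation measures such as duration and count are derived from raw gaze samples by an event-detection step. The sketch below implements a toy velocity-threshold (I-VT) detector, one common approach; the threshold and minimum duration used are assumptions, and commercial trackers ship their own detectors:

```python
import numpy as np

def detect_fixations(x, y, t, vel_thresh=30.0, min_dur=0.05):
    """Toy I-VT event detection: consecutive samples whose point-to-point
    velocity (deg/s) stays below vel_thresh form a fixation, kept if it
    lasts at least min_dur seconds. x, y in degrees of visual angle,
    t in seconds. Returns a list of (start_time, end_time) pairs."""
    x, y, t = map(np.asarray, (x, y, t))
    v = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    is_fix = np.concatenate([[True], v < vel_thresh])
    fixations, start = [], 0
    for i in range(1, len(is_fix) + 1):
        if i == len(is_fix) or is_fix[i] != is_fix[start]:
            if is_fix[start] and t[i - 1] - t[start] >= min_dur:
                fixations.append((t[start], t[i - 1]))
            start = i
    return fixations
```

The choice of threshold interacts directly with the equipment properties discussed above: a low sampling rate inflates apparent sample-to-sample velocities, and poor spatial accuracy adds noise that can fragment fixations.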
Behavior, Issue 83, Eye movements, Eye tracking, Text comprehension, Reading, Cognition
Prehospital Thrombolysis: A Manual from Berlin
Institutions: Charité - Universitätsmedizin Berlin, Universitätsklinikum Hamburg - Eppendorf, Berliner Feuerwehr, STEMO-Consortium.
In acute ischemic stroke, time from symptom onset to intervention is a decisive prognostic factor. In order to reduce this time, prehospital thrombolysis at the emergency site would be preferable. However, apart from neurological expertise and laboratory investigations, a computed tomography (CT) scan is necessary to exclude hemorrhagic stroke prior to thrombolysis. Therefore, a specialized ambulance equipped with a CT scanner and a point-of-care laboratory was designed and constructed. Further, a new stroke-identifying interview algorithm was developed and implemented in the Berlin emergency medical services. Since February 2011, the identification of suspected stroke in the dispatch center of the Berlin Fire Brigade prompts the deployment of this ambulance, the stroke emergency mobile (STEMO). On arrival, a neurologist experienced in stroke care, with additional training in emergency medicine, performs a neurological examination. If stroke is suspected, a CT scan excludes intracranial hemorrhage. The CT scans are telemetrically transmitted to the neuroradiologist on call. If the patient's coagulation status is normal and the medical history reveals no contraindication, prehospital thrombolysis is administered according to current guidelines (intravenous recombinant tissue plasminogen activator, iv rtPA, alteplase, Actilyse).
Thereafter, patients are transported to the nearest hospital with a certified stroke unit for further treatment and assessment of stroke aetiology. After a pilot phase, weeks were randomized into blocks either with or without STEMO care. The primary end-point of this study is time from alarm to the initiation of thrombolysis. We hypothesized that alarm-to-treatment time can be reduced by at least 20 min compared to regular care.
Medicine, Issue 81, Telemedicine, Emergency Medical Services, Stroke, Tomography, X-Ray Computed, Emergency Treatment, stroke, thrombolysis, prehospital, emergency medical services, ambulance
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
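The sequence selection stage can be caricatured as a stochastic search over sequence space that keeps energy-lowering mutations. In the sketch below the "energy" is a deliberately trivial mismatch count against a hypothetical ideal sequence, standing in for the physical potential that a real design method minimizes; every name and number here is invented for illustration:

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def sequence_search(energy, length, steps=5000, seed=0):
    """Toy stochastic sequence selection: propose single-point mutations
    and keep those that do not raise the energy function."""
    rng = random.Random(seed)
    seq = [rng.choice(AMINO_ACIDS) for _ in range(length)]
    best_e = energy(seq)
    for _ in range(steps):
        i = rng.randrange(length)
        old = seq[i]
        seq[i] = rng.choice(AMINO_ACIDS)
        e = energy(seq)
        if e <= best_e:
            best_e = e            # accept the downhill (or neutral) move
        else:
            seq[i] = old          # reject the uphill move
    return "".join(seq), best_e

# Stand-in energy: mismatches against a hypothetical "ideal" sequence.
IDEAL = "MKVLAAGHST"
def mismatch_energy(seq):
    return sum(1 for a, b in zip(seq, IDEAL) if a != b)
```

Real design energies couple all positions through the structure, so the search landscape is rugged rather than trivially separable as here; that ruggedness is what motivates the rank-ordered multi-stage pipeline described above.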
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in various ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
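The FA metric central to these analyses is computed per voxel from the eigenvalues (λ1, λ2, λ3) of the fitted diffusion tensor, FA = √(3/2)·‖λ − λ̄‖/‖λ‖. A minimal implementation of this standard formula:

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA from the three diffusion-tensor eigenvalues:
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||.
    Returns 0 for isotropic diffusion, 1 for a single nonzero eigenvalue."""
    lam = np.asarray(evals, dtype=float)
    md = lam.mean()                              # mean diffusivity
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0
```

Because FA is a ratio of eigenvalue dispersion to overall magnitude, it is dimensionless and directly comparable across subjects after spatial normalization, which is what makes the voxelwise and tractwise group statistics described above possible.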
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Staining Protocols for Human Pancreatic Islets
Institutions: University of Florida.
Estimates of islet area, number, and endocrine cell composition in the adult human pancreas vary widely; islet numbers range from several hundred thousand to several million, and beta cell mass ranges from 500 to 1500 mg1-3. With this known heterogeneity, a standard processing and staining procedure was developed so that pancreatic regions were clearly defined and islets characterized using rigorous histopathology and immunolocalization examinations.
Standardized procedures for processing human pancreas recovered from organ donors are described in part 1 of this series. The pancreas is processed into 3 main regions (head, body, tail) followed by transverse sections. Transverse sections from the pancreas head are further divided, as indicated based on size, and numbered alphabetically to denote subsections. This standardization allows for a complete cross-sectional analysis from the head region, including the uncinate region (which contains islets composed primarily of pancreatic polypeptide cells), to the tail region.
The current report comprises part 2 of this series and describes the procedures used for serial sectioning and histopathological characterization of the pancreatic paraffin sections, with an emphasis on islet endocrine cells, replication, and T-cell infiltrates. Pathology of pancreatic sections is intended to characterize the exocrine, ductular, and endocrine components. The exocrine compartment is evaluated for the presence of pancreatitis (active or chronic), atrophy, fibrosis, and fat, as well as the duct system, particularly in relationship to the presence of pancreatic intraductal neoplasia4. Islets are evaluated for morphology, size, and density, endocrine cells, inflammation, fibrosis, amyloid, and the presence of replicating or apoptotic cells using H&E and IHC stains.
The final component described in part 2 is the provision of the stained slides as digitized whole slide images. The digitized slides are organized by case and pancreas region in an online pathology database creating a virtual biobank. Access to this online collection is currently provided to over 200 clinicians and scientists involved in type 1 diabetes research. The online database provides a means for rapid and complete data sharing and for investigators to select blocks for paraffin or frozen serial sections.
Medicine, Issue 63, Physiology, type 1 diabetes, histology, H&E, immunohistochemistry, insulin, beta-cells, glucagon, alpha-cells, pancreatic polypeptide, islet, pancreas, spleen, organ donor
Mapping the After-effects of Theta Burst Stimulation on the Human Auditory Cortex with Functional Imaging
Institutions: McGill University.
Auditory cortex pertains to the processing of sound, which is at the basis of speech- or music-related processing1. However, despite considerable recent progress, the functional properties and lateralization of the human auditory cortex are far from being fully understood. Transcranial magnetic stimulation (TMS) is a non-invasive technique that can transiently or lastingly modulate cortical excitability via the application of localized magnetic field pulses, and represents a unique method of exploring plasticity and connectivity. It has only recently begun to be applied to understand auditory cortical function2.
An important issue in using TMS is that the physiological consequences of the stimulation are difficult to establish. Although many TMS studies make the implicit assumption that the area targeted by the coil is the area affected, this need not be the case, particularly for complex cognitive functions that depend on interactions across many brain regions3. One solution to this problem is to combine TMS with functional magnetic resonance imaging (fMRI). The idea here is that fMRI will provide an index of changes in brain activity associated with TMS. Thus, fMRI gives an independent means of assessing which areas are affected by TMS and how they are modulated4.
In addition, fMRI allows the assessment of functional connectivity, which represents a measure of the temporal coupling between distant regions. It can thus be useful not only for measuring the net activity modulation induced by TMS at given locations, but also for assessing the degree to which network properties are affected by TMS, via any observed changes in functional connectivity.
Different approaches exist to combine TMS and functional imaging, according to the temporal order of the methods. Functional MRI can be applied before, during, after, or both before and after TMS. Recently, some studies have interleaved TMS and fMRI in order to provide online mapping of the functional changes induced by TMS5-7. However, this online combination poses many technical problems, including the static artifacts resulting from the presence of the TMS coil in the scanner room and the effects of TMS pulses on the process of MR image formation. More importantly, the loud acoustic noise induced by TMS (increased compared with standard use because of the resonance of the scanner bore) and the increased TMS coil vibrations (caused by the strong mechanical forces due to the static magnetic field of the MR scanner) constitute a crucial problem when studying auditory processing.
This is one reason why fMRI was carried out before and after TMS in the present study. Similar approaches have been used to target the motor cortex8,9, premotor cortex10, primary somatosensory cortex11,12, and language-related areas13, but so far no combined TMS-fMRI study has investigated the auditory cortex. The purpose of this article is to provide details concerning the protocol and considerations necessary to successfully combine these two neuroscientific tools to investigate auditory processing.
Previously we showed that repetitive TMS (rTMS) at high and low frequencies (10 Hz and 1 Hz, respectively) applied over the auditory cortex modulated response time (RT) in a melody discrimination task2. We also showed that RT modulation was correlated with functional connectivity in the auditory network assessed using fMRI: the higher the functional connectivity between left and right auditory cortices during task performance, the greater the facilitatory effect (i.e., decreased RT) observed with rTMS. However, those findings were mainly correlational, as fMRI was performed before rTMS. Here, fMRI was carried out before and immediately after TMS to provide direct measures of the functional organization of the auditory cortex, and more specifically of the plastic reorganization of the auditory neural network occurring after the neural intervention provided by TMS.
Combined fMRI and TMS applied over the auditory cortex should enable a better understanding of brain mechanisms of auditory processing, providing physiological information about functional effects of TMS. This knowledge could be useful for many cognitive neuroscience applications, as well as for optimizing therapeutic applications of TMS, particularly in auditory-related disorders.
Neuroscience, Issue 67, Physiology, Physics, Theta burst stimulation, functional magnetic resonance imaging, MRI, auditory cortex, frameless stereotaxy, sound, transcranial magnetic stimulation
DNA Fingerprinting of Mycobacterium leprae Strains Using Variable Number Tandem Repeat (VNTR) - Fragment Length Analysis (FLA)
Institutions: Colorado State University.
The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO1, leprosy remains endemic in many countries, with approximately 250,000 new cases each year2.
The entire M. leprae genome has been mapped3,4, and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites)5. Clinical strains of M. leprae may vary in the number of tandemly repeated segments (short tandem repeats, STR) at many of these loci5,6,7. Variable number tandem repeat (VNTR)5 analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes in the same patient. While the variability of certain VNTRs has raised questions regarding their suitability for strain typing7,8,9, the emerging data suggest that analyzing multiple loci, which are diverse in their stability, can serve as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA)10 has been used to study leprosy evolution and transmission in several countries, including China11,12, the Philippines10,13, and Brazil14.
MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS)10. The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions10. The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo-passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner software, and fragment length is converted to number of VNTR copies (allele). Finally, the VNTR haplotypes are analyzed for patterns and, when combined with patient clinical data, can be used to track the distribution of strain types.
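The conversion from fragment length to allele is simple arithmetic once the locus geometry is known. The sketch below illustrates the idea; the repeat-unit size and flanking-sequence length are hypothetical placeholders, not the published M. leprae locus values.

```python
# Hypothetical sketch: converting a fragment length from capillary
# electrophoresis (FLA) into a VNTR repeat-copy number (allele).
# flanking_len and unit_len below are illustrative placeholders.

def fragment_to_copies(fragment_len, flanking_len, unit_len):
    """Estimate the number of tandem-repeat copies at a locus.

    fragment_len -- observed PCR product size in bp (from FLA)
    flanking_len -- total non-repeat sequence amplified (bp)
    unit_len     -- length of one repeat unit (bp), e.g. 2 for a
                    microsatellite, larger for a minisatellite
    """
    repeat_bp = fragment_len - flanking_len
    return round(repeat_bp / unit_len)

# Example: a hypothetical dinucleotide locus with 100 bp of flanking
# sequence; a 148 bp fragment implies (148 - 100) / 2 = 24 copies.
print(fragment_to_copies(148, 100, 2))  # -> 24
```

In practice the armadillo-passaged control DNA, with its known copy numbers, anchors this calibration for each locus.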
Immunology, Issue 53, Mycobacterium leprae, leprosy, biopsy, STR, VNTR, PCR, fragment length analysis
Genomic MRI - a Public Resource for Studying Sequence Patterns within Genomic DNA
Institutions: University of Toledo Health Science Campus.
Non-coding genomic regions in complex eukaryotes, including intergenic areas, introns, and untranslated segments of exons, are profoundly non-random in their nucleotide composition and consist of a complex mosaic of sequence patterns. These patterns include so-called Mid-Range Inhomogeneity (MRI) regions -- sequences 30-10000 nucleotides in length that are enriched by a particular base or combination of bases (e.g. (G+T)-rich, purine-rich, etc.). MRI regions are associated with unusual (non-B-form) DNA structures that are often involved in regulation of gene expression, recombination, and other genetic processes (Fedorova & Fedorov 2010). The existence of a strong fixation bias within MRI regions against mutations that tend to reduce their sequence inhomogeneity additionally supports the functionality and importance of these genomic sequences (Prakash et al.).
Here we demonstrate a freely available Internet resource -- the Genomic MRI program package -- designed for computational analysis of genomic sequences in order to find and characterize various MRI patterns within them (Bechtel et al. 2008). This package also allows generation of randomized sequences with various properties and levels of correspondence to the natural input DNA sequences. The main goal of this resource is to facilitate examination of the vast regions of non-coding DNA that are still scarcely investigated and await thorough exploration and recognition.
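The core idea of an MRI scan can be sketched in a few lines: slide a fixed window along a sequence and flag windows enriched for a chosen base combination. The window size and enrichment threshold below are illustrative, not the defaults of the Genomic MRI package.

```python
# Minimal sketch of a Mid-Range Inhomogeneity (MRI) scan: flag windows
# enriched for a chosen base combination, here (G+T). Window size and
# threshold are illustrative assumptions, not the tool's parameters.

def gt_rich_windows(seq, window=50, threshold=0.7):
    """Return (start, fraction) for each window whose G+T content
    meets or exceeds the threshold."""
    seq = seq.upper()
    hits = []
    for start in range(0, len(seq) - window + 1):
        chunk = seq[start:start + window]
        frac = (chunk.count("G") + chunk.count("T")) / window
        if frac >= threshold:
            hits.append((start, frac))
    return hits

# A 60-nt toy sequence whose first 50 nt are all G/T: the window at
# position 0 is flagged with a G+T fraction of 1.0.
toy = "GT" * 25 + "ACACACACAC"
print(gt_rich_windows(toy)[0])  # -> (0, 1.0)
```

A real scan would also merge overlapping flagged windows into MRI regions and assess significance against randomized sequences, as the package's randomization feature supports.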
Genetics, Issue 51, bioinformatics, computational biology, genomics, non-randomness, signals, gene regulation, DNA conformation
Enrichment of NK Cells from Human Blood with the RosetteSep Kit from StemCell Technologies
Institutions: University of California, Irvine (UCI).
Natural killer (NK) cells are large granular cytotoxic lymphocytes that belong to the innate immune system and play major roles in fighting against cancer and infections, but are also implicated in the early stages of pregnancy and transplant rejection. These cells are present in peripheral blood, from which they can be isolated. Cells can be isolated using either positive or negative selection. For positive selection we use antibodies directed to a surface marker present only on the cells of interest, whereas for negative selection we use cocktails of antibodies targeted to surface markers present on all cells but the cells of interest. This latter technique has the advantage of leaving the cells of interest free of antibodies, thereby reducing the risk of unwanted cell activation or differentiation. In this video-protocol we demonstrate how to separate NK cells from human blood by negative selection, using the RosetteSep kit from StemCell Technologies. The procedure involves obtaining human peripheral blood (under an institutional review board-approved protocol to protect the human subjects) and mixing it with a cocktail of antibodies that will bind to markers absent on NK cells, but present on all other mononuclear cells in peripheral blood (e.g., T lymphocytes, monocytes, etc.). The antibodies in the cocktail are conjugated to antibodies directed to glycophorin A on erythrocytes. All unwanted cells and red blood cells will therefore be trapped in complexes. The mix of blood and antibody cocktail is then diluted, overlaid on a Histopaque gradient, and centrifuged. NK cells (>80% pure) can be collected at the interface between the Histopaque and the diluted plasma. Similar cocktails are available for enrichment of other cell populations, such as human T lymphocytes.
Immunology, issue 8, blood, cell isolation, natural killer, lymphocyte, primary cells, negative selection, PBMC, Ficoll gradient, cell separation
Facilitating the Analysis of Immunological Data with Visual Analytic Techniques
Institutions: University of British Columbia.
Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrate the utility and flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analytical power to the hands of the analyst by enabling real-time data exploration. We selected the VA software Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, are demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.
Immunology, Issue 47, Visual analytics, flow cytometry, Luminex, Tableau, cytokine, innate immunity, single nucleotide polymorphism
Expired CO2 Measurement in Intubated or Spontaneously Breathing Patients from the Emergency Department
Institutions: Université Catholique de Louvain, Cliniques Universitaires Saint-Luc.
Carbon dioxide (CO2) and oxygen (O2) share the role of being the most important gases in the human body. The measurement of expired CO2 at the mouth has attracted growing clinical interest among physicians in the emergency department for various indications: (1) surveillance and monitoring of the intubated patient; (2) verification of the correct positioning of an endotracheal tube; (3) monitoring of a patient in cardiac arrest; (4) achieving normocapnia in intubated head trauma patients; and (5) monitoring ventilation during procedural sedation. The video allows physicians to familiarize themselves with the use of capnography, and the text offers a review of the theory and principles involved. In particular, the importance of CO2 for the organism, the relevance of measuring expired CO2, the differences between arterial and expired CO2, and the materials used in capnography, with their artifacts and pitfalls, will be reviewed. Since the main reluctance to use expired CO2 measurement stems from a lack of correct knowledge of CO2 physiopathology on the part of the physician, we hope that this explanation and the accompanying video sequences will help resolve this limitation.
Medicine, Issue 47, capnography, CO2, emergency medicine, end-tidal CO2
Using SCOPE to Identify Potential Regulatory Motifs in Coregulated Genes
Institutions: Dartmouth College.
SCOPE is an ensemble motif finder that uses three component algorithms in parallel to identify potential regulatory motifs by over-representation and motif position preference1. Each component algorithm is optimized to find a different kind of motif. By taking the best of these three approaches, SCOPE performs better than any single algorithm, even in the presence of noisy data1. In this article, we utilize a web version of SCOPE2 to examine genes that are involved in telomere maintenance. SCOPE has been incorporated into at least two other motif finding programs3,4 and has been used in other studies5-8.
The three algorithms that comprise SCOPE are BEAM9, which finds non-degenerate motifs (ACCGGT); PRISM10, which finds degenerate motifs (ASCGWT); and SPACER11, which finds longer bipartite motifs (ACCnnnnnnnnGGT). These three algorithms have been optimized to find their corresponding type of motif. Together, they allow SCOPE to perform extremely well.
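The three motif classes above can be illustrated as pattern-matching problems. The sketch below translates IUPAC-style motif strings into regular expressions; this is only an illustration of the motif classes, not SCOPE's actual scoring or search implementation, and the fixed 8-base gap in the bipartite example is an assumption for simplicity.

```python
# Illustrative sketch of the three motif classes SCOPE's component
# algorithms target, expressed as regular expressions. IUPAC ambiguity
# codes (S = G/C, W = A/T, n = any base) expand to character classes.
import re

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "S": "[GC]", "W": "[AT]", "N": "[ACGT]", "n": "[ACGT]"}

def motif_to_regex(motif):
    """Translate an IUPAC-style motif string into a compiled regex."""
    return re.compile("".join(IUPAC[c] for c in motif))

nondegenerate = motif_to_regex("ACCGGT")                 # BEAM-style
degenerate    = motif_to_regex("ASCGWT")                 # PRISM-style
bipartite     = motif_to_regex("ACC" + "n" * 8 + "GGT")  # SPACER-style

seq = "TTACCGGTAAACCTTTTTTTTGGTAA"
print(bool(nondegenerate.search(seq)))  # -> True (ACCGGT at position 2)
print(bool(bipartite.search(seq)))      # -> True (ACC...GGT at position 10)
```

SCOPE itself evaluates such candidate motifs by over-representation and positional preference across a gene set rather than by simple matching, but the pattern classes are as shown.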
Once a gene set has been analyzed and candidate motifs identified, SCOPE can look for other genes that contain the motif which, when added to the original set, will improve the motif score. This can occur through over-representation or motif position preference. Working with partial gene sets that have biologically verified transcription factor binding sites, SCOPE was able to identify most of the rest of the genes also regulated by the given transcription factor.
Output from SCOPE shows candidate motifs, their significance, and other information both as a table and as a graphical motif map. FAQs and video tutorials are available at the SCOPE web site which also includes a "Sample Search" button that allows the user to perform a trial run.
SCOPE has a very friendly user interface that enables novice users to access the algorithm's full power without having to become experts in the bioinformatics of motif finding. As input, SCOPE can take a list of genes or FASTA sequences. These can be entered in browser text fields or read from a file. The output from SCOPE contains a list of all identified motifs with their scores, number of occurrences, fraction of genes containing the motif, and the algorithm used to identify the motif. For each motif, result details include a consensus representation of the motif, a sequence logo, a position weight matrix, and a list of instances for every motif occurrence (with exact positions and "strand" indicated). Results are returned in a browser window and also optionally by email. Previous papers describe the SCOPE algorithms in detail1,2,9-11.
Genetics, Issue 51, gene regulation, computational biology, algorithm, promoter sequence motif
Counting Human Neural Stem Cells
Institutions: University of California, Irvine (UCI).
Knowledge of the exact number of viable cells in a given volume of a cell suspension is required for many routine tissue culture manipulations, such as plating cells for immunocytochemistry or for cell transfections. This protocol describes a straightforward and fast method for differentiating between live and dead cells and for quantifying the cell concentration and total cell number using a hemacytometer. The procedure first requires detaching cells from a growth surface and resuspending them in media. Next, the cells are diluted in a solution of Trypan blue (ideally to a concentration that will give 20-50 cells per quadrant) and placed in the hemacytometer. Finally, averaging the counts of viable cells in several randomly selected quadrants, dividing the average by the volume of one 1 mm2 quadrant (0.1 μl), and multiplying by the dilution factor gives the number of cells per μl. Multiplying this cell concentration by the total volume in μl gives the total cell number. This protocol describes counting human neural stem/precursor cells (hNSPCs), but can also be used for many other cell types.
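The counting arithmetic described above can be sketched in a few lines. The quadrant counts and dilution factor in the example are illustrative values, not data from the protocol.

```python
# Sketch of the hemacytometer arithmetic: each large quadrant covers
# 1 mm^2 with a 0.1 mm chamber depth, i.e. a volume of 0.1 ul.
# Example counts and dilution factor are illustrative only.

QUADRANT_VOLUME_UL = 0.1  # 1 mm^2 area x 0.1 mm depth

def cell_concentration(counts, dilution_factor):
    """Viable cells per microliter, from per-quadrant viable counts."""
    average = sum(counts) / len(counts)
    return (average / QUADRANT_VOLUME_UL) * dilution_factor

def total_cells(counts, dilution_factor, suspension_volume_ul):
    """Total viable cells in the whole suspension."""
    return cell_concentration(counts, dilution_factor) * suspension_volume_ul

# Four quadrants averaging 40 cells, a 1:2 dilution in Trypan blue,
# and a 500 ul suspension:
counts = [38, 42, 41, 39]                 # average = 40 cells/quadrant
print(cell_concentration(counts, 2))      # -> 800.0 cells/ul
print(total_cells(counts, 2, 500))        # -> 400000.0 cells total
```

Note that counts outside the recommended 20-50 cells per quadrant reduce the reliability of the estimate, which is why the dilution is chosen to land in that range.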
Issue 7, Basic Protocols, Stem Cells, Cell Culture, Cell Counting, Hemocytometer