Estrogen plays vital roles in mammary gland development and breast cancer progression. It mediates its function by binding to and activating the estrogen receptors (ERs), ERα and ERβ. ERα is frequently upregulated in breast cancer and drives the proliferation of breast cancer cells. The ERs function as transcription factors and regulate gene expression. Whereas ERα's regulation of protein-coding genes is well established, its regulation of noncoding microRNA (miRNA) is less explored. miRNAs play a major role in the post-transcriptional regulation of genes, inhibiting their translation or degrading their mRNA. miRNAs can function as oncogenes or tumor suppressors and are also promising biomarkers. Among the miRNA assays available, microarray and quantitative real-time polymerase chain reaction (qPCR) have been extensively used to detect and quantify miRNA levels. To identify miRNAs regulated by estrogen signaling in breast cancer, miRNA expression in ERα-positive breast cancer cell lines was compared before and after estrogen activation using both µParaflo microfluidic microarrays and Dual Labeled Probes low-density arrays. Results were validated using specific qPCR assays, applying both Cyanine dye-based and Dual Labeled Probes-based chemistry. Furthermore, a time-course assay was used to identify regulation over time. An advantage of the miRNA assay approach used in this study is that it enables fast screening of mature miRNA regulation in numerous samples, even with limited sample amounts. The layout, including the specific conditions for cell culture and estrogen treatment, biological and technical replicates, and large-scale screening followed by in-depth confirmation using separate techniques, ensures robust detection of miRNA regulation and eliminates false positives and other artifacts. However, mutated or unknown miRNAs, or regulation at the primary and precursor transcript level, will not be detected.
The method presented here represents a thorough investigation of estrogen-mediated miRNA regulation.
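The qPCR validation step above typically reports relative expression with the comparative Ct (2^-ΔΔCt) method. A minimal sketch of that calculation, with illustrative Ct values and a hypothetical stably expressed reference RNA; this is not the authors' code:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative quantification by the 2^-ddCt method.

    Ct values are qPCR cycle thresholds; the reference is a
    stably expressed small RNA used for normalization.
    """
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# A miRNA whose Ct drops by one cycle after estrogen treatment,
# with an unchanged reference, is ~2-fold upregulated.
print(fold_change(24.0, 18.0, 25.0, 18.0))  # 2.0
```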
Comparative in vivo Study of gp96 Adjuvanticity in the Frog Xenopus laevis
Institutions: University of Rochester.
We have developed in the amphibian Xenopus laevis a unique non-mammalian model to study the ability of certain heat shock proteins (hsps), such as gp96, to facilitate cross-presentation of chaperoned antigens and elicit innate and adaptive T cell responses. Xenopus skin graft rejection provides an excellent platform to study the ability of gp96 to elicit classical MHC class Ia (class Ia)-restricted T cell responses. Additionally, the Xenopus model system provides an attractive alternative to mice for exploring the ability of gp96 to generate responses against tumors that have down-regulated their class Ia molecules, thereby escaping immune surveillance. Recently, we have developed an adoptive cell transfer assay in Xenopus clones using peritoneal leukocytes as antigen-presenting cells (APCs), and shown that gp96 can prime CD8 T cell responses in vivo against minor histocompatibility skin antigens as well as against the Xenopus thymic tumor 15/0, which does not express class Ia molecules. We describe here the methodology involved in performing these assays, including the elicitation, pulsing, and adoptive transfer of peritoneal leukocytes, as well as the skin graft and tumor transplantation assays. Additionally, we describe the harvesting and separation of peripheral blood leukocytes used for flow cytometry and proliferation assays, which allow further characterization of the effector populations involved in skin rejection and anti-tumor responses.
Immunology, Issue 43, immunological properties, Xenopus, gp96
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. 
We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
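The "intrinsically quantitative nature of NMR" noted above means a metabolite concentration follows directly from integrated peak areas referenced to a single compound of known concentration. A minimal sketch of that arithmetic (proton counts and values below are illustrative, not from the protocol):

```python
def metabolite_concentration(i_met, n_met, i_ref, n_ref, c_ref):
    """Concentration from integrated NMR peak areas.

    i_*  : integrated peak area (arbitrary units)
    n_*  : number of equivalent protons contributing to the peak
    c_ref: known concentration of the reference compound
    """
    # Per-proton signal of the metabolite relative to the reference
    return (i_met / n_met) / (i_ref / n_ref) * c_ref

# A 3-proton metabolite peak with twice the area of a 9-proton
# reference peak at 1 mM corresponds to 6 mM.
print(metabolite_concentration(2.0, 3, 1.0, 9, 1.0))  # ~6.0 (mM)
```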
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in a multivariate fashion: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving quantitative and directional information during spatial normalization in group-level data analyses. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
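Fractional anisotropy, the voxelwise metric compared above, is computed from the three eigenvalues of the diffusion tensor. A minimal sketch of the standard formula (eigenvalues below are illustrative, in units of 10^-3 mm²/s):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion tensor eigenvalues: sqrt(3/2) times the
    normalized deviation of the eigenvalues from their mean."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

print(fractional_anisotropy(1.0, 1.0, 1.0))   # 0.0 (isotropic voxel)
print(fractional_anisotropy(1.7, 0.3, 0.3))   # ~0.8 (coherent WM tract)
```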
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Biochemical and High Throughput Microscopic Assessment of Fat Mass in Caenorhabditis elegans
Institutions: Massachusetts General Hospital and Harvard Medical School, Massachusetts Institute of Technology.
The nematode C. elegans has emerged as an important model for the study of conserved genetic pathways regulating fat metabolism as it relates to human obesity and its associated pathologies. Several previous methodologies developed for the visualization of C. elegans triglyceride-rich fat stores have proven to be erroneous, highlighting cellular compartments other than lipid droplets. Other methods require specialized equipment, are time-consuming, or yield inconsistent results. We introduce a rapid, reproducible, fixative-based Nile red staining method for the accurate and rapid detection of neutral lipid droplets in C. elegans. A short fixation step in 40% isopropanol makes animals completely permeable to Nile red, which is then used to stain them. The spectral properties of this lipophilic dye allow it to fluoresce strongly and selectively in the yellow-green spectrum only in a lipid-rich environment, but not in more polar environments. Thus, lipid droplets can be visualized on a fluorescence microscope equipped with simple GFP imaging capability after only a brief Nile red staining step in isopropanol. The speed, affordability, and reproducibility of this protocol make it ideally suited for high throughput screens. We also demonstrate a paired method for the biochemical determination of triglycerides and phospholipids using gas chromatography-mass spectrometry. This more rigorous protocol should be used to confirm results obtained with the Nile red microscopic lipid determination. We anticipate that these techniques will become new standards in the field of C. elegans research.
Genetics, Issue 73, Biochemistry, Cellular Biology, Molecular Biology, Developmental Biology, Physiology, Anatomy, Caenorhabditis elegans, Obesity, Energy Metabolism, Lipid Metabolism, C. elegans, fluorescent lipid staining, lipids, Nile red, fat, high throughput screening, obesity, gas chromatography, mass spectrometry, GC/MS, animal model
Using the Threat Probability Task to Assess Anxiety and Fear During Uncertain and Certain Threat
Institutions: University of Wisconsin-Madison.
Fear of certain threat and anxiety about uncertain threat are distinct emotions with unique behavioral, cognitive-attentional, and neuroanatomical components. Both anxiety and fear can be studied in the laboratory by measuring the potentiation of the startle reflex. The startle reflex is a defensive reflex that is potentiated when an organism is threatened and the need for defense is high. The startle reflex is assessed via electromyography (EMG) in the orbicularis oculi muscle, elicited by brief, intense bursts of acoustic white noise (i.e., "startle probes"). Startle potentiation is calculated as the increase in startle response magnitude during presentation of sets of visual threat cues that signal delivery of mild electric shock, relative to sets of matched cues that signal the absence of shock (no-threat cues). In the Threat Probability Task, fear is measured via startle potentiation to high-probability (100% cue-contingent shock; certain) threat cues, whereas anxiety is measured via startle potentiation to low-probability (20% cue-contingent shock; uncertain) threat cues. Measurement of startle potentiation during the Threat Probability Task provides an objective and easily implemented alternative to assessment of negative affect via self-report or other methods (e.g., neuroimaging) that may be inappropriate or impractical for some researchers. Startle potentiation has been studied rigorously in both animals (e.g., rodents, non-human primates) and humans, which facilitates animal-to-human translational research. Startle potentiation during certain and uncertain threat provides an objective measure of negative affect and distinct emotional states (fear, anxiety) for use in research on psychopathology, substance use/abuse, and affective science broadly. As such, it has been used extensively by clinical scientists interested in the etiology of psychopathology and by affective scientists interested in individual differences in emotion.
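The potentiation score described above is simply the threat-minus-no-threat difference in startle magnitude, computed separately for certain and uncertain threat cues. A minimal sketch with made-up EMG magnitudes (arbitrary units), not data from the task:

```python
def startle_potentiation(threat_mags, no_threat_mags):
    """Mean EMG startle magnitude to threat cues minus the mean
    magnitude to matched no-threat cues (arbitrary EMG units)."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(threat_mags) - mean(no_threat_mags)

# Certain (100% shock) and uncertain (20% shock) cues scored
# against the same no-threat trials; values are illustrative.
fear = startle_potentiation([80.0, 90.0], [40.0, 50.0])     # 40.0
anxiety = startle_potentiation([65.0, 55.0], [40.0, 50.0])  # 15.0
```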
Behavior, Issue 91, startle; electromyography; shock; addiction; uncertainty; fear; anxiety; humans; psychophysiology; translational
Proteomic Profiling of Macrophages by 2D Electrophoresis
Institutions: University Lille Nord de France.
The goal of the two-dimensional (2D) electrophoresis protocol described here is to show how to analyse the phenotype of human cultured macrophages. The key role of macrophages has been shown in various pathological disorders such as inflammatory, immunological, and infectious diseases. In this protocol, we use primary cultures of human monocyte-derived macrophages that can be differentiated into the M1 (pro-inflammatory) or the M2 (anti-inflammatory) phenotype. This in vitro model is reliable for studying the biological activities of M1 and M2 macrophages and also for a proteomic approach. Proteomic techniques are useful for comparing the phenotype and behaviour of M1 and M2 macrophages during host pathogenicity. 2D gel electrophoresis is a powerful proteomic technique for mapping large numbers of proteins or polypeptides simultaneously. We describe the protocol of 2D electrophoresis using fluorescent dyes, named 2D Difference Gel Electrophoresis (DIGE). The M1 and M2 macrophage proteins are labelled with cyanine dyes before separation by isoelectric focusing, according to their isoelectric point in the first dimension, and by their molecular mass in the second dimension. Separated protein or polypeptide spots are then used to detect differences in protein or polypeptide expression levels. The proteomic approach described here allows the investigation of macrophage protein changes associated with various disorders such as host pathogenicity or microbial toxins.
Immunology, Issue 93, Biology, Human, Buffy coat, Monocytes, Macrophages, Culture, Proteins, Proteome, 2D DIGE-electrophoresis, 2D software
Determining Optimal Cytotoxic Activity of Human Her2/neu-Specific CD8 T Cells by Comparing the 51Cr Release Assay to the xCELLigence System
Institutions: College of Medicine, Mayo Clinic.
Cytotoxic CD8 T cells constitute a subgroup of T cells that are capable of inducing the death of infected or malignant host cells [1]. These cells express a specialized receptor, called the T cell receptor (TCR), which can recognize a specific antigenic peptide bound to HLA class I molecules [2]. Engagement of infected cells or tumor cells through their HLA class I molecules results in the production of lytic molecules such as granzymes and perforin, resulting in target cell death. While it is useful to determine frequencies of antigen-specific CD8 T cells using assays such as the ELISpot or flow cytometry, it is also helpful to ascertain the strength of CD8 T cell responses using cytotoxicity assays [3]. The most recognizable assay for assessing cytotoxic function is the chromium release assay (CRA), which is considered a standard assay [4]. The CRA has several limitations, including exposure of cells to gamma radiation, lack of reproducibility, and a requirement for large numbers of cells. Over the past decade, there has been interest in adopting new strategies to overcome these limitations. Newer approaches include those that measure caspase activity, BLT esterase activity [5], and surface expression of CD107 [6]. The impedance-based assay, using the Roche xCELLigence system, was examined in the present paper for its potential as an alternative to the CRA. Impedance, or opposition to an electric current, occurs when adherent tumor cells bind to electrode plates. Tumor cells detach following killing, and the resulting reduction in electrical impedance can be measured by the xCELLigence system. The ability to adapt the impedance-based approach to assess cell-mediated killing rests on the observation that T cells do not adhere tightly to most surfaces and do not appear to have much impact on impedance, diminishing any concern of direct interference of the T cells with the measurement. Results show that the impedance-based assay can detect changes in the levels of antigen-specific cytotoxic CD8 T cells with increased sensitivity relative to the standard CRA. Based on these results, impedance-based approaches may be good alternatives to CRAs or other approaches that aim to measure cytotoxic CD8 T cell functionality.
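Both readouts reduce to simple normalizations: the CRA reports percent specific lysis from released radioactivity, while the impedance assay can express cytolysis as the fractional drop in cell index relative to targets cultured without effectors. A sketch with illustrative counts and cell-index values, not the instrument's own software:

```python
def percent_specific_lysis(experimental, spontaneous, maximum):
    """Standard 51Cr-release calculation from counts per minute."""
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

def percent_cytolysis(ci_targets_alone, ci_with_effectors):
    """Impedance readout: loss of cell index (CI) of adherent
    targets relative to targets cultured without T cells."""
    return 100.0 * (ci_targets_alone - ci_with_effectors) / ci_targets_alone

print(percent_specific_lysis(1500.0, 300.0, 2700.0))  # 50.0
print(percent_cytolysis(2.0, 0.5))                    # 75.0
```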
Immunology, Issue 66, Medicine, Cancer Biology, vaccine, immunity, adoptive T cell therapy, lymphocyte, CD8, T cells
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's uncanny valley hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
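As a toy illustration of the simplest automated route among the four approaches above, global thresholding of a volume followed by a first quantitative readout can be sketched as follows (on synthetic data, not a real EM volume):

```python
import numpy as np

# Toy 3D volume: noisy dark background plus one bright cubic "feature"
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 0.1, size=(8, 8, 8))
vol[2:5, 2:5, 2:5] += 1.0

# Global threshold halfway between background (0) and feature (1)
mask = vol > 0.5

# Volume fraction occupied by the segmented feature
fraction = mask.mean()  # 27 of 512 voxels, ~0.053
```

Real data sets with low signal-to-noise or crowded features need the semi-automated or custom approaches described in the article; a single global threshold is only adequate for high-contrast, well-separated features.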
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
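The oriented-texture analysis above rests on a bank of Gabor filters tuned to different orientations. A minimal sketch of how such a real-valued kernel can be constructed (parameter values are illustrative, not those of the published method):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    """Real-valued Gabor kernel: a Gaussian envelope modulating a
    cosine carrier; theta selects the texture orientation to which
    the filter responds most strongly."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    g *= np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()  # zero-mean, so flat regions give no response

# A bank over several orientations covers radiating tissue patterns
bank = [gabor_kernel(31, wavelength=8.0, theta=t, sigma=4.0)
        for t in np.linspace(0, np.pi, 8, endpoint=False)]
```

Convolving a mammogram with each kernel in the bank and taking the per-pixel maximum response yields the local orientation field from which node-like sites are then detected.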
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM) [1-4] is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data [2,5,6]. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors [7,8]. Using logistic regression analysis of subject scores (i.e., pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e., composite networks with improved discrimination of patients from healthy control subjects [5,6]. Cross-validation within the derivation set can be performed using bootstrap resampling techniques [9]. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets [10]. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation [11]. These standardized values can in turn be used to assist in differential diagnosis [12,13] and to assess disease progression and treatment effects at the network level [7,14-16]. We present an example of the application of this methodology to FDG PET data of Parkinson's disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
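The core SSM preprocessing described above (logarithmic conversion, removal of subject and group means, then PCA) can be outlined on a toy subjects-by-voxels matrix as follows; this is an illustrative sketch, not the authors' in-house software:

```python
import numpy as np

def ssm_patterns(images, n_patterns=2):
    """SSM core steps on a (subjects x voxels) matrix of positive
    image values: log transform, double mean centering, then PCA of
    the subject-by-subject covariance matrix."""
    logged = np.log(images)
    centered = logged - logged.mean(axis=1, keepdims=True)  # remove global subject means
    srp = centered - centered.mean(axis=0, keepdims=True)   # remove group mean image
    cov = srp @ srp.T                                       # subjects x subjects
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n_patterns]             # largest components first
    scores = vecs[:, order]                                 # subject scores
    patterns = srp.T @ scores                               # voxel patterns (GIS)
    return patterns, scores, vals[order]

rng = np.random.default_rng(1)
images = np.exp(rng.normal(0.0, 0.1, size=(10, 50)))  # toy positive image data
patterns, scores, eigvals = ssm_patterns(images)
```

Subject scores for a prospective scan are then obtained by projecting its log-transformed, mean-centered profile onto a fixed, previously derived pattern.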
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
An Investigation of the Effects of Sports-related Concussion in Youth Using Functional Magnetic Resonance Imaging and the Head Impact Telemetry System
Institutions: University of Toronto, University of Toronto, University of Toronto, Bloorview Kids Rehab, Toronto Rehab, Sunnybrook Health Sciences Centre, University of Toronto.
One of the most commonly reported injuries in children who participate in sports is concussion, or mild traumatic brain injury (mTBI) [1]. Children and youth involved in organized sports such as competitive hockey are nearly six times more likely to suffer a severe concussion compared to children involved in other leisure physical activities [2]. While the most common cognitive sequelae of mTBI appear similar for children and adults, the recovery profile and breadth of consequences in children remain largely unknown [2], as does the influence of pre-injury characteristics (e.g., gender) and injury details (e.g., magnitude and direction of impact) on long-term outcomes. Competitive sports, such as hockey, allow the rare opportunity to utilize a pre-post design: pre-injury data on youth characteristics and functioning are obtained before a concussion occurs and related to outcome following injury. Our primary goals are to refine pediatric concussion diagnosis and management based on research evidence that is specific to children and youth. To do this we use new, multi-modal, and integrative approaches that will:

1. Evaluate the immediate effects of head trauma in youth
2. Monitor the resolution of post-concussion symptoms (PCS) and cognitive performance during recovery
3. Utilize new methods to verify brain injury and recovery

To achieve our goals, we have implemented the Head Impact Telemetry (HIT) System (Simbex; Lebanon, NH, USA). This system equips commercially available Easton S9 hockey helmets (Easton-Bell Sports; Van Nuys, CA, USA) with single-axis accelerometers designed to measure real-time head accelerations during contact sport participation [3-5]. By using telemetric technology, the magnitude of acceleration and location of all head impacts during sport participation can be objectively detected and recorded. We also use functional magnetic resonance imaging (fMRI) to localize and assess changes in neural activity, specifically in the medial temporal and frontal lobes, during the performance of cognitive tasks, since those are the cerebral regions most sensitive to concussive head injury [6]. Finally, we are acquiring structural imaging data sensitive to damage in brain white matter.
Medicine, Issue 47, Mild traumatic brain injury, concussion, fMRI, youth, Head Impact Telemetry System
Hyperpolarized Xenon for NMR and MRI Applications
Institutions: Leibniz-Institut für Molekulare Pharmakologie.
Nuclear magnetic resonance (NMR) spectroscopy and imaging (MRI) suffer from intrinsically low sensitivity, because even strong external magnetic fields of ~10 T generate only a small detectable net magnetization of the sample at room temperature [1]. Hence, most NMR and MRI applications rely on the detection of molecules at relatively high concentration (e.g., water for imaging of biological tissue) or require excessive acquisition times. This limits our ability to exploit the very useful molecular specificity of NMR signals for many biochemical and medical applications. However, novel approaches have emerged in the past few years: manipulation of the detected spin species prior to detection inside the NMR/MRI magnet can dramatically increase the magnetization and therefore allows detection of molecules at much lower concentration [2]. Here, we present a method for polarization of a xenon gas mixture (2-5% Xe, 10% N2, balance He) by spin-exchange optical pumping (SEOP) in a compact setup, with a ca. 16,000-fold signal enhancement. Modern line-narrowed diode lasers allow efficient polarization [7] and immediate use of the gas mixture even if the noble gas is not separated from the other components. The SEOP apparatus is explained, and determination of the achieved spin polarization is demonstrated for performance control of the method.
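For context, the thermal 129Xe polarization that hyperpolarization improves upon can be estimated from the high-temperature approximation P ≈ ħγB/(2kT). A back-of-the-envelope sketch, in which the field, temperature, and the quoted ~16,000-fold enhancement are illustrative inputs:

```python
# Physical constants (SI); gamma is the magnitude of the 129Xe
# gyromagnetic ratio.
hbar = 1.0546e-34        # J*s
k_b = 1.3807e-23         # J/K
gamma_xe = 7.452e7       # rad s^-1 T^-1

def thermal_polarization(b_field_tesla, temp_kelvin):
    """High-temperature limit P ~ hbar*gamma*B / (2*k*T)."""
    return hbar * gamma_xe * b_field_tesla / (2 * k_b * temp_kelvin)

p_thermal = thermal_polarization(9.4, 298.0)  # ~9e-6, i.e. a few ppm
p_seop = 16000 * p_thermal                    # ~0.14, i.e. ~14%
```

The few-ppm thermal value explains why unenhanced gas-phase Xe NMR is so insensitive, and why a four-order-of-magnitude SEOP enhancement brings the polarization into the percent range.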
The hyperpolarized gas can be used for void space imaging, including gas flow imaging or diffusion studies at the interfaces with other materials [8,9]. Moreover, the Xe NMR signal is extremely sensitive to its molecular environment [6]. This allows its use as an NMR/MRI contrast agent when dissolved in aqueous solution with functionalized molecular hosts that temporarily trap the gas [10,11]. Direct detection and high-sensitivity indirect detection of such constructs are demonstrated in both spectroscopic and imaging mode.
Physics, Issue 67, NMR, MRI, hyperpolarization, optical pumping, SEOP, xenon, molecular imaging, biosensor
Fabrication of Uniform Nanoscale Cavities via Silicon Direct Wafer Bonding
Institutions: The State University of New York at Buffalo, University of Maryland, The National Institute of Standards and Technology, NASA Goddard Space Flight Center, HRL Laboratories.
Measurements of the heat capacity and superfluid fraction of confined ⁴He have been performed near the lambda transition using lithographically patterned and bonded silicon wafers. Unlike confinements in porous materials often used for these types of experiments [3], bonded wafers provide predesigned uniform spaces for confinement. The geometry of each cell is well known, which removes a large source of ambiguity in the interpretation of data.
Exceptionally flat, 5 cm diameter, 375 µm thick Si wafers with about 1 µm variation over the entire wafer can be obtained commercially (from Semiconductor Processing Company, for example). Thermal oxide is grown on the wafers to define the confinement dimension in the z-direction. A pattern is then etched in the oxide using lithographic techniques so as to create the desired enclosure upon bonding. A hole is drilled in one of the wafers (the top) to allow for the introduction of the liquid to be measured. The wafers are cleaned [2] in RCA solutions and then put in a microclean chamber, where they are rinsed with deionized water [4]. The wafers are bonded at room temperature and then annealed at ~1,100 °C. This forms a strong and permanent bond. This process can be used to make uniform enclosures for measuring thermal and hydrodynamic properties of confined liquids from the nanometer to the micrometer scale.
Physics, Issue 83, silicon direct wafer bonding, nanoscale, bonded wafers, silicon wafer, confined liquids, lithographic techniques
Tomato Analyzer: A Useful Software Application to Collect Accurate and Detailed Morphological and Colorimetric Data from Two-dimensional Objects
Institutions: The Ohio State University.
Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner [1,2]. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section, and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, which was designed to collect color measurements from scanned images and allow scanning devices to be calibrated using color standards [3].
TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported as individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or output that displays the attribute values for each object on the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as performing in-depth analyses of the effect of key fruit shape genes on plant morphology. Also, TA can be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species, as well as other plant organs such as leaves and seeds, can be evaluated with TA.
Plant Biology, Issue 37, morphology, color, image processing, quantitative trait loci, software
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: a midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in the evaluation. In the second component, additional features related to ICP are extracted from the CT scans, such as texture information and blood amount, and combined with other recorded features, such as age and injury severity score, to estimate the ICP. Machine learning techniques including feature selection and classification, such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step for physicians to make decisions, so as to recommend for or against invasive ICP monitoring.
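The core geometric idea, an ideal midline from skull symmetry versus an actual midline from the segmented ventricles, can be sketched in a few lines. This toy example is not the paper's method (which uses anatomical features and shape matching): the symmetry axis is crudely approximated by the skull mask's mean column, the actual midline by the ventricle centroid, and the pixel spacing of 0.5 mm/px is an assumed value.

```python
import numpy as np

def estimate_midline_shift(skull_mask, ventricle_mask, mm_per_px=0.5):
    """Toy sketch: ideal midline from skull symmetry (approximated by the
    skull mask's mean column), actual midline from the ventricle centroid;
    the shift is their horizontal offset converted to millimeters."""
    ideal_x = np.nonzero(skull_mask)[1].mean()
    actual_x = np.nonzero(ventricle_mask)[1].mean()
    return abs(actual_x - ideal_x) * mm_per_px

# Hypothetical 2D slice: a symmetric skull ring with ventricles pushed left
skull = np.zeros((100, 100), dtype=bool)
skull[10:90, 10:90] = True
skull[15:85, 15:85] = False          # hollow ring, symmetric about x = 49.5
ventricles = np.zeros((100, 100), dtype=bool)
ventricles[40:60, 34:50] = True      # centroid column = 41.5
print(estimate_midline_shift(skull, ventricles))  # (49.5 - 41.5) * 0.5 = 4.0 mm
```

The paper's system replaces both crude estimates with more robust ones (skull symmetry plus anatomical landmarks for the ideal midline, ventricle shape matching for the actual one), but the output quantity, a shift in millimeters, is the same.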
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques