The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, the method reduces an entire group image set to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e., pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e., composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16.
We present an example of the application of this methodology to FDG PET data from Parkinson's disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
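The core SSM steps described above (logarithmic conversion, removal of subject and group means, PCA to obtain patterns and subject scores) can be sketched as follows. This is a minimal illustration via SVD, not the authors' in-house software; the array sizes and component count are assumptions.

```python
import numpy as np

def ssm_pca(images, n_components=5):
    """Sketch of the SSM/PCA steps: log transform, double mean-centering,
    then PCA of the subject residual profiles."""
    log_data = np.log(images)                         # logarithmic conversion
    log_data -= log_data.mean(axis=1, keepdims=True)  # remove subject global means
    srp = log_data - log_data.mean(axis=0)            # remove group mean profile
    # PCA via SVD: rows of vt are orthogonal spatial covariance patterns (GIS)
    u, s, vt = np.linalg.svd(srp, full_matrices=False)
    patterns = vt[:n_components]
    scores = u[:, :n_components] * s[:n_components]   # per-subject pattern scores
    return patterns, scores

# Hypothetical data: 20 subjects x 1,000 voxels of positive image values
rng = np.random.default_rng(0)
images = rng.lognormal(mean=2.0, sigma=0.2, size=(20, 1000))
patterns, scores = ssm_pca(images)
```

Pattern expression in a prospective subject would then be obtained by projecting that subject's centered log profile onto a derived pattern, yielding the scalar score discussed above.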
High Throughput Sequential ELISA for Validation of Biomarkers of Acute Graft-Versus-Host Disease
Institutions: University of Michigan.
Unbiased discovery proteomics strategies have the potential to identify large numbers of novel biomarkers that can improve diagnostic and prognostic testing in a clinical setting and may help guide therapeutic interventions. When large numbers of candidate proteins are identified, it may be difficult to validate candidate biomarkers in a timely and efficient fashion from patient plasma samples that are event-driven, of finite volume and irreplaceable, such as at the onset of acute graft-versus-host disease (GVHD), a potentially life-threatening complication of allogeneic hematopoietic stem cell transplantation (HSCT).
Here we describe the process of performing commercially available ELISAs for six validated GVHD proteins: IL-2Rα5, and REG3α3 (also known as PAP1), in a sequential fashion to minimize freeze-thaw cycles, thawed plasma time and plasma usage. For this procedure we perform the ELISAs in sequential order as determined by sample dilution factor as established in our laboratory, using manufacturer ELISA kits and protocols with minor adjustments to facilitate optimal sequential ELISA performance. The resulting plasma biomarker concentrations can then be compiled and analyzed for significant findings within a patient cohort. While these biomarkers are currently for research purposes only, their incorporation into clinical care is being investigated in clinical trials.
This technique can be applied to perform ELISAs for multiple proteins/cytokines of interest on the same sample(s) provided the samples do not need to be mixed with other reagents. If ELISA kits do not come with pre-coated plates, 96-well half-well plates or 384-well plates can be used to further minimize use of samples/reagents.
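The sequential-ordering idea above can be sketched as a tiny planner. The assay names, dilution factors, and the rule of running the most dilute assays first are illustrative assumptions; the actual six GVHD proteins and their dilutions are established per-laboratory as described.

```python
# Hypothetical assay names and dilution factors (illustrative only)
dilution_factor = {"assay_A": 100, "assay_B": 25, "assay_C": 4, "assay_D": 2}

# One plausible rule: draw aliquots for the most dilute assays first from a
# single thaw, so each sample is thawed once and thawed-plasma time stays short.
run_order = sorted(dilution_factor, key=dilution_factor.get, reverse=True)
print(run_order)
```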
Medicine, Issue 68, ELISA, Sequential ELISA, Cytokine, Blood plasma, biomarkers, proteomics, graft-versus-host disease, Small sample, Quantification
A Sensitive and Specific Quantitation Method for Determination of Serum Cardiac Myosin Binding Protein-C by Electrochemiluminescence Immunoassay
Institutions: Loyola University Chicago.
Biomarkers are becoming increasingly more important in clinical decision-making, as well as basic science. Diagnosing myocardial infarction (MI) is largely driven by detecting cardiac-specific proteins in patients' serum or plasma as an indicator of myocardial injury. Having recently shown that cardiac myosin binding protein-C (cMyBP-C) is detectable in the serum after MI, we have proposed it as a potential biomarker for MI. Biomarkers are typically detected by traditional sandwich enzyme-linked immunosorbent assays. However, this technique requires a large sample volume, has a small dynamic range, and can measure only one protein at a time.
Here we show a multiplex immunoassay in which three cardiac proteins can be measured simultaneously with high sensitivity. Measuring cMyBP-C in uniplex or together with creatine kinase MB and cardiac troponin I showed comparable sensitivity. This technique uses the Meso Scale Discovery (MSD) method of multiplexing in a 96-well plate combined with electrochemiluminescence for detection. While only small sample volumes are required, high sensitivity and a large dynamic range are achieved. Using this technique, we measured cMyBP-C, creatine kinase MB, and cardiac troponin I levels in serum samples from 16 subjects with MI and compared the results with 16 control subjects. We were able to detect all three markers in these samples and found all three biomarkers to be increased after MI. This technique is, therefore, suitable for the sensitive detection of cardiac biomarkers in serum samples.
Molecular Biology, Issue 78, Cellular Biology, Biochemistry, Genetics, Biomedical Engineering, Medicine, Cardiology, Heart Diseases, Myocardial Ischemia, Myocardial Infarction, Cardiovascular Diseases, cardiovascular disease, immunoassay, cardiac myosin binding protein-C, cardiac troponin I, creatine kinase MB, electrochemiluminescence, multiplex biomarkers, ELISA, assay
Controlling Parkinson's Disease With Adaptive Deep Brain Stimulation
Institutions: University of Oxford, UCL Institute of Neurology.
Adaptive deep brain stimulation (aDBS) has the potential to improve the treatment of Parkinson's disease by optimizing stimulation in real time according to fluctuating disease and medication state. In the present realization of aDBS we record and stimulate from the DBS electrodes implanted in the subthalamic nucleus of patients with Parkinson's disease in the early post-operative period. Local field potentials are analogue filtered between 3 and 47 Hz before being passed to a data acquisition unit, where they are digitally filtered again around the patient-specific beta peak, rectified and smoothed to give an online reading of beta amplitude. A threshold for beta amplitude is set heuristically which, if crossed, passes a trigger signal to the stimulator. The stimulator then ramps stimulation up to a pre-determined clinically effective voltage over 250 msec and continues to stimulate until the beta amplitude falls back below threshold. Stimulation continues in this manner, with brief episodes of ramped DBS during periods of heightened beta power.
Clinical efficacy is assessed after a minimum period of stabilization (5 min) through unblinded and blinded video assessment of motor function using a selection of scores from the Unified Parkinson's Disease Rating Scale (UPDRS). Recent work has demonstrated a reduction in power consumption with aDBS as well as an improvement in clinical scores compared to conventional DBS. Chronic aDBS could now be trialed in parkinsonism.
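The online processing chain above (band-pass around the beta peak, rectify, smooth, threshold) can be sketched offline. This is an illustrative reconstruction, not the authors' implementation: the FFT band-pass, the 13-30 Hz band, the 0.1 s smoothing window, and the threshold value are all assumptions.

```python
import numpy as np

def beta_amplitude(lfp, fs, lo=13.0, hi=30.0, smooth_s=0.1):
    """Band-pass a local field potential around a beta range, rectify,
    and smooth to get an online-style beta amplitude reading."""
    freqs = np.fft.rfftfreq(len(lfp), 1.0 / fs)
    spec = np.fft.rfft(lfp)
    spec[(freqs < lo) | (freqs > hi)] = 0.0            # keep only the beta band
    filtered = np.fft.irfft(spec, len(lfp))
    rectified = np.abs(filtered)
    win = max(1, int(smooth_s * fs))
    return np.convolve(rectified, np.ones(win) / win, mode="same")

# Synthetic LFP: a 20 Hz beta burst between 0.8 s and 1.2 s
fs = 1000
t = np.arange(0, 2.0, 1.0 / fs)
lfp = np.where((t > 0.8) & (t < 1.2), np.sin(2 * np.pi * 20 * t), 0.0)
amp = beta_amplitude(lfp, fs)
trigger = amp > 0.3   # stimulate while smoothed beta amplitude exceeds threshold
```

In the clinical system the trigger would then ramp stimulation up over 250 msec and hold it until the amplitude falls back below threshold.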
Medicine, Issue 89, Parkinson's, deep brain stimulation, adaptive, closed loop
A Comparative Approach to Characterize the Landscape of Host-Pathogen Protein-Protein Interactions
Institutions: Institut Pasteur, Université Sorbonne Paris Cité, Dana Farber Cancer Institute.
Significant efforts have been devoted to generating large-scale comprehensive protein-protein interaction network maps. This is instrumental to understanding pathogen-host relationships and was essentially performed by genetic screening in yeast two-hybrid systems. The recent improvement of protein-protein interaction detection by a Gaussia luciferase-based fragment complementation assay now offers the opportunity to develop integrative comparative interactomic approaches necessary to rigorously compare the interaction profiles of proteins from different pathogen strain variants against a common set of cellular factors.
This paper specifically focuses on the utility of combining two orthogonal methods to generate protein-protein interaction datasets: yeast two-hybrid (Y2H) and a new assay, the high-throughput Gaussia princeps protein complementation assay (HT-GPCA), performed in mammalian cells.
A large-scale identification of the cellular partners of a pathogen protein is performed by mating-based yeast two-hybrid screening of cDNA libraries using multiple pathogen strain variants. A subset of interacting partners selected on a high-confidence statistical score is then validated in mammalian cells for pair-wise interactions with the whole set of pathogen variant proteins using HT-GPCA. This combination of two complementary methods improves the robustness of the interaction dataset and allows a stringent comparative interaction analysis. Such comparative interactomics constitutes a reliable and powerful strategy to decipher pathogen-host interplay.
Immunology, Issue 77, Genetics, Microbiology, Biochemistry, Molecular Biology, Cellular Biology, Biomedical Engineering, Infection, Cancer Biology, Virology, Medicine, Host-Pathogen Interactions, Protein-protein interaction, High-throughput screening, Luminescence, Yeast two-hybrid, HT-GPCA, Network, protein, yeast, cell, culture
Experimental Protocol for Manipulating Plant-induced Soil Heterogeneity
Institutions: Case Western Reserve University.
Coexistence theory has often treated environmental heterogeneity as being independent of the community composition; however, biotic feedbacks such as plant-soil feedbacks (PSF) have large effects on plant performance and create environmental heterogeneity that depends on the community composition. Understanding the importance of PSF for plant community assembly requires understanding the role of heterogeneity in PSF, in addition to mean PSF effects. Here, we describe a protocol for manipulating plant-induced soil heterogeneity. Two example experiments are presented: (1) a field experiment with a 6-patch grid of soils to measure plant population responses and (2) a greenhouse experiment with 2-patch soils to measure individual plant responses. Soils can be collected from the zone of root influence (soils from the rhizosphere and directly adjacent to the rhizosphere) of conspecific and heterospecific plant species in the field. Replicate collections are used to avoid pseudoreplicating soil samples. These soils are then placed into separate patches for heterogeneous treatments or mixed for a homogenized treatment. Care should be taken to ensure that heterogeneous and homogenized treatments experience the same degree of soil disturbance. Plants can then be placed in these soil treatments to determine the effect of plant-induced soil heterogeneity on plant performance. We demonstrate that plant-induced heterogeneity results in different outcomes than predicted by traditional coexistence models, perhaps because of the dynamic nature of these feedbacks. Theory that incorporates environmental heterogeneity influenced by the assembling community, together with additional empirical work, is needed to determine when heterogeneity intrinsic to the assembling community will result in different assembly outcomes compared with heterogeneity extrinsic to the community composition.
Environmental Sciences, Issue 85, Coexistence, community assembly, environmental drivers, plant-soil feedback, soil heterogeneity, soil microbial communities, soil patch
Free Radicals in Chemical Biology: from Chemical Behavior to Biomarker Development
Institutions: Consiglio Nazionale delle Ricerche.
The involvement of free radicals in the life sciences has constantly increased with time and has been connected to several physiological and pathological processes. This subject embraces diverse scientific areas, spanning from physical, biological and bioorganic chemistry to biology and medicine, with applications to the amelioration of quality of life, health and aging. Multidisciplinary skills are required for the full investigation of the many facets of radical processes in the biological environment, and chemical knowledge plays a crucial role in unveiling basic processes and mechanisms. We developed a chemical biology approach able to connect free radical chemical reactivity with biological processes, providing information on the mechanistic pathways and products. The core of this approach is the design of biomimetic models to study the behavior of biomolecules (lipids, nucleic acids and proteins) in aqueous systems, obtaining insights into the reaction pathways as well as building up molecular libraries of the free radical reaction products. This context can be successfully used for biomarker discovery, and examples are provided with two classes of compounds: mono-trans isomers of cholesteryl esters, which are synthesized and used as references for detection in human plasma, and purine 5',8-cyclo-2'-deoxyribonucleosides, prepared and used as references in the protocol for detecting such lesions in DNA samples after exposure to ionizing radiation or from different health conditions.
Chemistry, Issue 74, Biochemistry, Chemical Engineering, Chemical Biology, chemical analysis techniques, chemistry (general), life sciences, radiation effects (biological, animal and plant), biomarker, biomimetic chemistry, free radicals, trans lipids, cyclopurine lesions, DNA, chromatography, spectroscopy, synthesis
Quantitative, Real-time Analysis of Base Excision Repair Activity in Cell Lysates Utilizing Lesion-specific Molecular Beacons
Institutions: University of Pittsburgh School of Medicine, University of Pittsburgh Cancer Institute, The Netherlands Cancer Institute, University of Pittsburgh School of Public Health.
We describe a method for the quantitative, real-time measurement of DNA glycosylase and AP endonuclease activities in cell nuclear lysates using base excision repair (BER) molecular beacons. The substrate (beacon) is comprised of a deoxyoligonucleotide containing a single base lesion with a 6-carboxyfluorescein (6-FAM) moiety conjugated to the 5' end and a Dabcyl moiety conjugated to the 3' end of the oligonucleotide. The BER molecular beacon is 43 bases in length and the sequence is designed to promote the formation of a stem-loop structure with 13 nucleotides in the loop and 15 base pairs in the stem1,2. When folded in this configuration, the 6-FAM moiety is quenched by Dabcyl in a non-fluorescent manner via Förster resonance energy transfer (FRET)3,4. The lesion is positioned such that, following base lesion removal and strand scission, the remaining 5-base oligonucleotide containing the 6-FAM moiety is released from the stem. Release and detachment from the quencher (Dabcyl) results in an increase of fluorescence that is proportionate to the level of DNA repair. By collecting multiple reads of the fluorescence values, real-time assessment of BER activity is possible. The use of standard quantitative real-time PCR instruments allows the simultaneous analysis of numerous samples. The design of these BER molecular beacons, with a single base lesion, is amenable to kinetic analyses, BER quantification and inhibitor validation, and is adaptable for quantification of DNA repair activity in tissue and tumor cell lysates or with purified proteins. The analysis of BER activity in tumor lysates or tissue aspirates using these molecular beacons may be applicable to functional biomarker measurements. Further, the analysis of BER activity with purified proteins using this quantitative assay provides a rapid, high-throughput method for the discovery and validation of BER inhibitors.
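Since the fluorescence increase is proportionate to repair, the repeated plate reads can be reduced to a single per-sample activity estimate, e.g. the slope of a linear fit over the early reads. The fit below is an illustrative analysis with made-up read values, not the authors' pipeline.

```python
import numpy as np

def initial_rate(times, fluorescence):
    """Least-squares slope of fluorescence vs. time over the early, linear
    portion of the reaction (fluorescence units per minute)."""
    slope, _ = np.polyfit(times, fluorescence, 1)
    return slope

# Hypothetical reads (minutes, arbitrary fluorescence units)
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
active_lysate = np.array([100.0, 220.0, 335.0, 455.0, 570.0])    # active BER
no_repair_control = np.array([100.0, 103.0, 101.0, 104.0, 102.0])  # flat signal
```

Comparing the slope of a lysate against a no-lesion or heat-inactivated control distinguishes genuine glycosylase/AP endonuclease activity from baseline drift.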
Molecular Biology, Issue 66, Genetics, Cancer Biology, Base excision repair, DNA glycosylase, AP endonuclease, fluorescent, real-time, activity assay, molecular beacon, biomarker, DNA Damage, base lesion
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables the profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation, where inconsistent techniques generate unreliable results. This technique encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low abundance molecules with a resulting increase in sensitivity is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Medical College of Wisconsin, Hong Kong University, Johns Hopkins University School of Medicine, Medical College of Wisconsin.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
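A triage of this kind is essentially a decision rule mapping data-set characteristics to one of the four categorical approaches. The toy function below illustrates the idea only; its inputs, thresholds, and branch order are assumptions for illustration, not the published scheme.

```python
def choose_approach(high_snr, characteristic_shapes, small_volume_fraction):
    """Toy triage: map objective data-set characteristics to one of the
    four segmentation approaches listed above (illustrative logic only)."""
    if high_snr and characteristic_shapes:
        # crisp data with easily identified shapes suits automation
        return "automated custom-designed segmentation + quantitative analysis"
    if high_snr:
        return "semi-automated segmentation + surface rendering"
    if small_volume_fraction:
        # a small, well-localized region of interest is tractable by hand
        return "manual tracing + surface rendering"
    return "fully manual model building + visualization"
```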
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Chemically-blocked Antibody Microarray for Multiplexed High-throughput Profiling of Specific Protein Glycosylation in Complex Samples
Institutions: Institute for Hepatitis and Virus Research, Thomas Jefferson University, Drexel University College of Medicine, Van Andel Research Institute, Serome Biosciences Inc.
In this study, we describe an effective protocol for use in a multiplexed high-throughput antibody microarray with glycan-binding protein detection that allows for the glycosylation profiling of specific proteins. Glycosylation is the most prevalent post-translational modification found on proteins and leads to diverse modifications of their physical, chemical, and biological properties. Because the glycosylation machinery is particularly susceptible to disease progression and malignant transformation, aberrant glycosylation has been recognized as a source of early-detection biomarkers for cancer and other diseases. However, current methods to study protein glycosylation are typically too complicated or expensive for use in most normal laboratory or clinical settings, and a more practical method to study protein glycosylation is needed. The new protocol described in this study makes use of a chemically blocked antibody microarray with glycan-binding protein (GBP) detection and significantly reduces the time, cost, and lab equipment requirements needed to study protein glycosylation. In this method, multiple immobilized glycoprotein-specific antibodies are printed directly onto the microarray slides and the N-glycans on the antibodies are blocked. The blocked, immobilized glycoprotein-specific antibodies are able to capture and isolate glycoproteins from a complex sample that is applied directly onto the microarray slides. Glycan detection then can be performed by the application of biotinylated lectins and other GBPs to the microarray slide, while binding levels can be determined using Dylight 549-Streptavidin. Through the use of an antibody panel and probing with multiple biotinylated lectins, this method allows an effective glycosylation profile of the different proteins found in a given human or animal sample to be developed.
Glycosylation of proteins, the most ubiquitous post-translational modification on proteins, modifies the physical, chemical, and biological properties of a protein and plays a fundamental role in various biological processes1-6. Because the glycosylation machinery is particularly susceptible to disease progression and malignant transformation, aberrant glycosylation has been recognized as a source of early-detection biomarkers for cancer and other diseases7-12. In fact, most current cancer biomarkers, such as the L3 fraction of α-1 fetoprotein (AFP) for hepatocellular carcinoma13-15 and CA19-9 for pancreatic cancer16,17, are aberrant glycan moieties on glycoproteins. However, methods to study protein glycosylation have been complicated and are not suitable for routine laboratory and clinical settings. Chen et al. recently invented a chemically blocked antibody microarray with glycan-binding protein (GBP) detection for high-throughput, multiplexed glycosylation profiling of native glycoproteins in a complex sample18. In this affinity-based microarray method, multiple immobilized glycoprotein-specific antibodies capture and isolate glycoproteins from the complex mixture directly on the microarray slide, and the glycans on each individual captured protein are measured by GBPs. Because all normal antibodies contain N-glycans that could be recognized by most GBPs, the critical step of this method is to chemically block the glycans on the antibodies from binding to GBPs. In the procedure, the cis-diol groups of the glycans on the antibodies are first oxidized to aldehyde groups using NaIO4 in sodium acetate buffer, protected from light. The aldehyde groups are then conjugated to the hydrazide group of a cross-linker, 4-(4-N-maleimidophenyl)butyric acid hydrazide HCl (MPBH), followed by conjugation of a dipeptide, Cys-Gly, to the maleimide group of the MPBH. Thus, the cis-diol groups on the antibody glycans are converted into bulky non-hydroxyl groups, which hinder binding of lectins and other GBPs to the capture antibodies. This blocking procedure makes the GBPs and lectins bind only to the glycans of the captured proteins. After this chemical blocking, serum samples are incubated with the antibody microarray, followed by glycan detection using different biotinylated lectins and GBPs, visualized with Cy3-streptavidin. The parallel use of an antibody panel and multiple lectin probes provides discrete glycosylation profiles of multiple proteins in a given sample18-20. This method has been used successfully in multiple different labs1,7,13,19-31. However, the stability of MPBH and Cys-Gly, and the complicated, extended procedure, affect the reproducibility, effectiveness and efficiency of the method. In this new protocol, we replaced both MPBH and Cys-Gly with a single, much more stable reagent, glutamic acid hydrazide (Glu-hydrazide), which significantly improves the reproducibility of the method and simplifies and shortens the whole procedure so that it can be completed within one working day. We describe the detailed procedure, which can be readily adopted by ordinary labs for routine protein glycosylation studies, together with the techniques necessary to obtain reproducible and repeatable results.
Molecular Biology, Issue 63, Glycoproteins, glycan-binding protein, specific protein glycosylation, multiplexed high-throughput glycan blocked antibody microarray
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several complementary ways: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS), in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metric information as defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
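The FA metric referred to throughout has a standard per-voxel definition from the three eigenvalues of the diffusion tensor, FA = sqrt(3/2) · ||λ − λ̄|| / ||λ||, which the short sketch below computes (a generic illustration, independent of any particular DTI software).

```python
import numpy as np

def fractional_anisotropy(eigenvalues):
    """FA of a diffusion tensor from its three eigenvalues:
    sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||, ranging 0..1."""
    l = np.asarray(eigenvalues, dtype=float)
    md = l.mean()                         # mean diffusivity
    num = np.sqrt(((l - md) ** 2).sum())  # deviation from isotropy
    den = np.sqrt((l ** 2).sum())
    return np.sqrt(1.5) * num / den

print(fractional_anisotropy([1.0, 1.0, 1.0]))  # isotropic diffusion -> 0.0
print(fractional_anisotropy([1.0, 0.0, 0.0]))  # maximally anisotropic -> 1.0
```

Voxelwise group comparison then reduces to statistics over these scalar FA values after spatial normalization.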
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Hydrogel Nanoparticle Harvesting of Plasma or Urine for Detecting Low Abundance Proteins
Institutions: George Mason University, Ceres Nanosciences.
Novel biomarker discovery plays a crucial role in providing more sensitive and specific disease detection. Unfortunately, many low-abundance biomarkers that exist in biological fluids cannot be easily detected with mass spectrometry or immunoassays because they are present in very low concentration, are labile, and are often masked by high-abundance proteins such as albumin or immunoglobulin. Bait-containing poly(N-isopropylacrylamide) (NIPAm)-based nanoparticles are able to overcome these physiological barriers. In one step they capture, concentrate and preserve biomarkers from body fluids. Low-molecular weight analytes enter the core of the nanoparticle and are captured by different organic chemical dyes, which act as high affinity protein baits. The nanoparticles are able to concentrate the proteins of interest by several orders of magnitude. This concentration factor is sufficient to increase the protein level such that the proteins are within the detection limit of current mass spectrometers, western blotting, and immunoassays. Nanoparticles can be incubated with a plethora of biological fluids, and they are able to greatly enrich the concentration of low-molecular weight proteins and peptides while excluding albumin and other high-molecular weight proteins. Our data show that a 10,000-fold amplification in the concentration of a particular analyte can be achieved, enabling mass spectrometry and immunoassays to detect previously undetectable biomarkers.
Bioengineering, Issue 90, biomarker, hydrogel, low abundance, mass spectrometry, nanoparticle, plasma, protein, urine
Microarray-based Identification of Individual HERV Loci Expression: Application to Biomarker Discovery in Prostate Cancer
Institutions: Joint Unit Hospices de Lyon-bioMérieux, BioMérieux, Hospices Civils de Lyon, Lyon 1 University, BioMérieux, Hospices Civils de Lyon, Hospices Civils de Lyon.
The prostate-specific antigen (PSA) is the main diagnostic biomarker for prostate cancer in clinical use, but it lacks specificity and sensitivity, particularly at low dosage values1. 'How to use PSA' remains a current issue, whether for diagnosis, as the gray zone corresponding to a serum concentration of 2.5-10 ng/ml does not allow a clear differentiation to be made between cancer and noncancer2, or for patient follow-up, as analysis of post-operative PSA kinetic parameters can pose considerable challenges for practical application3,4. Alternatively, noncoding RNAs (ncRNAs) are emerging as key molecules in human cancer, with the potential to serve as novel markers of disease, e.g. PCA3 in prostate cancer5,6, and to reveal uncharacterized aspects of tumor biology. Moreover, data from the ENCODE project published in 2012 showed that different RNA types cover about 62% of the genome, and that the amount of transcriptional regulatory motifs is at least 4.5-fold higher than that corresponding to protein-coding exons. Thus, long terminal repeats (LTRs) of human endogenous retroviruses (HERVs) constitute a wide range of putative transcriptional regulatory sequences, as this is their primary function in infectious retroviruses. HERVs, which are spread throughout the human genome, originate from ancestral and independent infections within the germ line, followed by copy-paste propagation processes, leading to multicopy families occupying 8% of the human genome (by comparison, exons span 2% of our genome). Some HERV loci still express proteins that have been associated with several pathologies, including cancer7-10. We have designed a high-density microarray, in Affymetrix format, to optimally characterize the expression of individual HERV loci, in order to better understand whether they are active, whether they drive ncRNA transcription, and whether they modulate coding gene expression. This tool has been applied in the prostate cancer field (Figure 1).
Medicine, Issue 81, Cancer Biology, Genetics, Molecular Biology, Prostate, Retroviridae, Biomarkers, Pharmacological, Tumor Markers, Biological, Prostatectomy, Microarray Analysis, Gene Expression, Diagnosis, Human Endogenous Retroviruses, HERV, microarray, Transcriptome, prostate cancer, Affymetrix
Heterogeneity Mapping of Protein Expression in Tumors using Quantitative Immunofluorescence
Institutions: University of Edinburgh, HistoRx Inc.
Morphologic heterogeneity within an individual tumor is well-recognized by histopathologists in surgical practice. While this often takes the form of areas of distinct differentiation into recognized histological subtypes, or of different pathological grade, there are often more subtle differences in phenotype which defy accurate classification (Figure 1). Ultimately, since morphology is dictated by the underlying molecular phenotype, areas with visible differences are likely to be accompanied by differences in the expression of proteins which orchestrate cellular function and behavior, and therefore appearance. The significance of visible and invisible (molecular) heterogeneity for prognosis is unknown, but recent evidence suggests that, at least at the genetic level, heterogeneity exists in the primary tumor1,2, and that some of these sub-clones give rise to metastatic (and therefore lethal) disease.
Moreover, some proteins are measured as biomarkers because they are the targets of therapy (for instance, ER and HER2 for tamoxifen and trastuzumab (Herceptin), respectively). If these proteins show variable expression within a tumor, then therapeutic responses may also be variable. The widely used histopathologic scoring schemes for immunohistochemistry either ignore or numerically homogenize the quantification of protein expression. Similarly, in destructive techniques where the tumor samples are homogenized (such as gene expression profiling), quantitative information can be elucidated, but spatial information is lost. Genetic heterogeneity mapping approaches in pancreatic cancer have relied either on generation of a single-cell suspension3 or on macrodissection4. A recent study used quantum dots to map morphologic and molecular heterogeneity in prostate cancer tissue5, providing proof of principle that morphologic and molecular mapping is feasible, but falling short of quantifying the heterogeneity. Since immunohistochemistry is, at best, only semi-quantitative and subject to intra- and inter-observer bias, more sensitive and quantitative methodologies are required in order to accurately map and quantify tissue heterogeneity in situ.
We have developed and applied an experimental and statistical methodology to systematically quantify the heterogeneity of protein expression in whole tissue sections of tumors, based on the Automated QUantitative Analysis (AQUA) system6. Tissue sections are labeled with specific antibodies directed against cytokeratins and targets of interest, coupled to fluorophore-labeled secondary antibodies. Slides are imaged using a whole-slide fluorescence scanner. Images are subdivided into hundreds to thousands of tiles, and each tile is then assigned an AQUA score, which is a measure of protein concentration within the epithelial (tumor) component of the tissue. Heatmaps are generated to represent tissue expression of the proteins, and a heterogeneity score is assigned using a statistical measure of heterogeneity originally used in ecology, based on Simpson's biodiversity index7.
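The ecology-derived heterogeneity statistic can be illustrated with a minimal sketch. This is not the authors' implementation: the binning scheme and bin count here are arbitrary assumptions, and the AQUA-based protocol may weight or normalize tiles differently. Per-tile scores are binned into expression levels and Simpson's diversity index is computed over the bin proportions.

```python
import numpy as np

def simpson_heterogeneity(tile_scores, n_bins=10):
    """Heterogeneity of per-tile expression scores via Simpson's index.

    Tiles are binned by expression level; p_i is the fraction of tiles
    in bin i.  D = 1 - sum(p_i^2) is 0 for a perfectly homogeneous
    tumor and approaches 1 - 1/n_bins for a maximally heterogeneous one.
    """
    scores = np.asarray(tile_scores, dtype=float)
    counts, _ = np.histogram(scores, bins=n_bins)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

# Homogeneous expression -> heterogeneity score of 0
low = simpson_heterogeneity(np.full(500, 0.8))
# Tiles spread across many expression levels -> score near 0.9
high = simpson_heterogeneity(np.random.default_rng(0).uniform(0, 1, 500))
```

The same score can then be computed per protein (e.g. for ER and HER2 separately) and compared across tumors.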
To date there have been no attempts to systematically map and quantify this variability in tandem with protein expression, in histological preparations. Here, we illustrate the first use of the method applied to ER and HER2 biomarker expression in ovarian cancer. Using this method paves the way for analyzing heterogeneity as an independent variable in studies of biomarker expression in translational studies, in order to establish the significance of heterogeneity in prognosis and prediction of responses to therapy.
Medicine, Issue 56, quantitative immunofluorescence, heterogeneity, cancer, biomarker, targeted therapy, immunohistochemistry, proteomics, histopathology
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
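The idea of the energy-minimizing sequence selection stage can be shown in miniature. The sketch below is emphatically not the WISDOM formulation, which optimizes a physical potential over sequence and structural space: `toy_energy` is a hypothetical stand-in objective, and the search is a plain Metropolis Monte Carlo walk over single-residue mutations.

```python
import math
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def toy_energy(seq):
    # Hypothetical stand-in for a physical potential: lower energy when
    # even positions are hydrophobic and odd positions are polar.
    hydrophobic = set("AVILMFWC")
    return -sum(1 for i, aa in enumerate(seq)
                if (aa in hydrophobic) == (i % 2 == 0))

def select_sequence(native, steps=5000, temp=1.0, seed=0):
    """Sequence selection by Metropolis Monte Carlo: propose single-residue
    mutations, always accept improvements, accept worsening moves with
    probability exp(-dE/T), and track the best sequence seen."""
    rng = random.Random(seed)
    seq = list(native)
    e = toy_energy(seq)
    best, best_e = seq[:], e
    for _ in range(steps):
        i = rng.randrange(len(seq))
        old = seq[i]
        seq[i] = rng.choice(AMINO_ACIDS)
        e_new = toy_energy(seq)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temp):
            e = e_new
            if e < best_e:
                best, best_e = seq[:], e
        else:
            seq[i] = old  # reject: revert the mutation
    return "".join(best), best_e

designed, energy = select_sequence("GGGGGGGGGG")
```

The subsequent fold-specificity and binding-affinity stages then re-rank the sequences that survive this first filter.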
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary, University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
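The orientation-analysis step can be sketched as follows. This is a simplified illustration rather than the authors' implementation: the kernel size, wavelength, and bandwidth below are arbitrary assumptions, and the phase-portrait modeling, node-value, and fractal-dimension stages are omitted. A bank of quadrature Gabor filters is applied and each pixel is labeled with the orientation of the strongest response.

```python
import numpy as np

def gabor_kernel(theta, size=15, wavelength=6.0, sigma=3.0):
    """Complex (quadrature) Gabor kernel tuned to orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) \
        * np.exp(2j * np.pi * xr / wavelength)
    return g - g.mean()  # zero mean: no response to flat regions

def dominant_orientation(image, n_orient=8):
    """Per-pixel orientation of the strongest-responding Gabor filter."""
    thetas = np.arange(n_orient) * np.pi / n_orient
    kernels = np.stack([gabor_kernel(t) for t in thetas])
    size = kernels.shape[-1]
    windows = np.lib.stride_tricks.sliding_window_view(image, (size, size))
    responses = np.abs(np.einsum('ijkl,okl->oij', windows, kernels))
    return thetas[np.argmax(responses, axis=0)]

# Synthetic test pattern: vertical stripes of period 6 pixels, which the
# theta = 0 filter (varying along x) should win almost everywhere.
img = np.tile(np.cos(2 * np.pi * np.arange(64) / 6.0), (64, 1))
orient = dominant_orientation(img)
```

In the full method, node-like sites where these orientation fields radiate or intersect are then detected via phase portraits and characterized for classification.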
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans, or age-specific head models, to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
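As a sketch of the source-reconstruction step, the classical L2 minimum-norm estimate can be written in a few lines. This is a simplified illustration only: the leadfield here is a random toy matrix and the noise covariance is taken as the identity, whereas a real pipeline computes the leadfield from the individual or age-specific head model and uses a measured noise covariance.

```python
import numpy as np

def minimum_norm_estimate(leadfield, eeg, snr=3.0):
    """L2 minimum-norm inverse: source amplitudes j minimizing
    ||x - L j||^2 + lambda^2 ||j||^2, i.e.
    j = L^T (L L^T + lambda^2 I)^{-1} x,
    with the regularization set from an assumed SNR."""
    n_sensors = leadfield.shape[0]
    lam2 = np.trace(leadfield @ leadfield.T) / (n_sensors * snr**2)
    gram = leadfield @ leadfield.T + lam2 * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, eeg)

# Toy example: 32 hypothetical channels, 500 candidate cortical sources.
rng = np.random.default_rng(1)
L = rng.standard_normal((32, 500))
j_true = np.zeros(500)
j_true[42] = 1.0                                  # a single active source
x = L @ j_true + 0.01 * rng.standard_normal(32)   # simulated topography
j_hat = minimum_norm_estimate(L, x)
```

The estimated source distribution `j_hat` concentrates near the simulated generator, illustrating how channel-level topographies are mapped back onto candidate cortical locations.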
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
The Use of Reverse Phase Protein Arrays (RPPA) to Explore Protein Expression Variation within Individual Renal Cell Cancers
Institutions: University of Edinburgh, University of St Andrews, University of Edinburgh, University of Edinburgh, Western General Hospital, University of Edinburgh, Queen Mary University of London.
Currently there is no curative treatment for metastatic clear cell renal cell cancer, the commonest variant of the disease. A key factor in this treatment resistance is thought to be the molecular complexity of the disease1. Targeted therapies such as the tyrosine kinase inhibitor (TKI) sunitinib have been utilized, but only 40% of patients will respond, with the overwhelming majority of these patients relapsing within 1 year2. As such, the question of intrinsic and acquired resistance in renal cell cancer patients is highly relevant3.
In order to study resistance to TKIs, with the ultimate goal of developing effective, personalized treatments, sequential tissue after a specific period of targeted therapy is required, an approach which has proved successful in chronic myeloid leukaemia4. However, the application of such a strategy in renal cell carcinoma is complicated by the high level of both inter- and intratumoral heterogeneity, which is a feature of renal cell carcinoma5,6 as well as of other solid tumors7. Intertumoral heterogeneity due to transcriptomic and genetic differences is well established, even in patients with similar presentation, stage, and grade of tumor. In addition, it is clear that there is great morphological (intratumoral) heterogeneity in RCC, which is likely to represent even greater molecular heterogeneity. Detailed mapping and categorization of RCC tumors by combined morphological analysis and Fuhrman grading allows the selection of representative areas for proteomic analysis.
Protein-based analysis of RCC8 is attractive due to its widespread availability in pathology laboratories; however, its application can be problematic due to the limited availability of specific antibodies9. Due to the dot-blot nature of Reverse Phase Protein Arrays (RPPA), antibody specificity must be pre-validated; as such, strict quality control of the antibodies used is of paramount importance. Despite this limitation, the dot-blot format does allow assay miniaturization, permitting the printing of hundreds of samples onto a single nitrocellulose slide. Printed slides can then be analyzed in a similar fashion to Western analysis, with the use of target-specific primary antibodies and fluorescently labelled secondary antibodies, allowing for multiplexing. Differential protein expression across all the samples on a slide can then be analyzed simultaneously by comparing relative fluorescence levels in a more cost-effective and high-throughput manner.
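The final comparison step reduces to background subtraction and loading normalization, which can be sketched as follows. This is a simplified illustration with made-up fluorescence values; actual RPPA analysis typically also fits dilution series and applies per-slide normalization.

```python
import numpy as np

def rppa_relative_expression(target_signal, total_protein_signal, background):
    """Background-subtract each spot and normalize the target fluorescence
    by a total-protein stain, so samples printed in slightly different
    amounts remain comparable across the slide."""
    target = np.maximum(np.asarray(target_signal, float) - background, 0.0)
    loading = np.maximum(np.asarray(total_protein_signal, float) - background,
                         1e-9)  # guard against division by zero
    return target / loading

# Three hypothetical samples on one slide: raw fluorescence in arbitrary
# units, with an assumed slide background of 50 units.
rel = rppa_relative_expression([850, 450, 1250], [1050, 1050, 2050], 50)
```

Note how the third sample, despite the highest raw signal, ranks between the other two once its heavier loading is accounted for.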
Cancer Biology, Issue 71, Bioengineering, Medicine, Biomedical Engineering, Cellular Biology, Molecular Biology, Genetics, Pathology, Oncology, Proteins, Early Detection of Cancer, Translational Medical Research, RPPA, RCC, Heterogeneity, Proteomics, Tumor Grade, intertumoral, tumor, metastatic, carcinoma, renal cancer, clear cell renal cell cancer, cancer, assay
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: a midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimate of the ideal midline is first computed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, the ventricles are segmented from the CT scan and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians make decisions, such as recommending for or against invasive ICP monitoring.
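The symmetry-based ideal-midline idea can be sketched on a toy binary head mask. This is only an illustration of the symmetry search: the published system additionally uses anatomical landmarks and finds the actual (deformed) midline via ventricle segmentation and shape matching, none of which is reproduced here.

```python
import numpy as np

def ideal_midline(mask):
    """Estimate the ideal midline column of an axial slice by maximizing
    left-right mirror symmetry of the skull/head mask."""
    h, w = mask.shape
    best_col, best_score = w // 2, -1.0
    for col in range(w // 4, 3 * w // 4):   # search near the image center
        half = min(col, w - col)
        left = mask[:, col - half:col]
        right = mask[:, col:col + half][:, ::-1]   # mirrored right half
        score = (left == right).mean()             # fraction of matching pixels
        if score > best_score:
            best_col, best_score = col, score
    return best_col

def midline_shift(ideal_col, actual_col, mm_per_pixel):
    """Shift of the actual midline relative to the ideal one, in mm."""
    return (actual_col - ideal_col) * mm_per_pixel

# Synthetic head mask: an ellipse centered at column 70 of a 128-wide slice.
yy, xx = np.mgrid[0:128, 0:128]
mask = ((yy - 64)**2 / 50**2 + (xx - 70)**2 / 40**2) <= 1.0
est = ideal_midline(mask)
```

The estimated column lands on the ellipse's axis of symmetry; comparing it against an actual midline position then yields the shift in millimeters.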
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Analytical Techniques for Assaying Nitric Oxide Bioactivity
Institutions: University of Texas Health Science Center at Houston, Baylor College of Medicine.
Nitric oxide (NO) is a diatomic free radical that is extremely short-lived in biological systems (less than 1 second in circulating blood)1. NO may be considered one of the most important signaling molecules produced in our body, regulating essential functions including, but not limited to, regulation of blood pressure, immune response, and neural communication. Therefore, its accurate detection and quantification in biological matrices is critical to understanding the role of NO in health and disease. Given the short physiological half-life of NO, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of relevant NO metabolites in multiple biological compartments provides valuable information with regard to in vivo NO production, bioavailability, and metabolism. Simply sampling a single compartment, such as blood or plasma, may not always provide an accurate assessment of whole-body NO status, particularly in tissues. The ability to compare blood with select tissues in experimental animals will help bridge the gap between basic science and clinical medicine as far as the diagnostic and prognostic utility of NO biomarkers in health and disease is concerned. Extrapolation of plasma or blood NO status to specific tissues of interest is therefore no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The established paradigm of NO biochemistry, from production by NO synthases to activation of soluble guanylyl cyclase (sGC) to eventual oxidation to nitrite (NO2-) and nitrate (NO3-), may only represent part of NO's effects in vivo. The interaction of NO and NO-derived metabolites with protein thiols, secondary amines, and metals, to form S-nitrosothiols (RSNOs), N-nitrosamines (RNNOs), and nitrosyl-heme, respectively, represents cGMP-independent effects of NO that are likely just as important physiologically as activation of sGC by NO. A true understanding of NO in physiology is derived from in vivo experiments sampling multiple compartments simultaneously. NO methodology is a complex and often confusing science, and the focus of many debates and discussions concerning NO biochemistry. The elucidation of new mechanisms and signaling pathways involving NO hinges on our ability to specifically, selectively, and sensitively detect and quantify NO and all relevant NO products and metabolites in complex biological matrices. Here, we present a method for the rapid and sensitive analysis of nitrite and nitrate by HPLC, as well as detection of free NO in biological samples using in vitro ozone-based chemiluminescence with chemical derivatization to determine the molecular source of NO, and ex vivo detection with organ bath myography.
Medicine, Issue 64, Molecular Biology, Nitric oxide, nitrite, nitrate, endothelium derived relaxing factor, HPLC, chemiluminscence
Using a Pan-Viral Microarray Assay (Virochip) to Screen Clinical Samples for Viral Pathogens
Institutions: University of California, San Francisco, University of California, San Francisco.
The diagnosis of viral causes of many infectious diseases is difficult due to the inherent sequence diversity of viruses as well as the ongoing emergence of novel viral pathogens, such as SARS coronavirus and 2009 pandemic H1N1 influenza virus, that are not detectable by traditional methods. To address these challenges, we have previously developed and validated a pan-viral microarray platform called the Virochip, with the capacity to detect all known viruses as well as novel variants on the basis of conserved sequence homology1. Using the Virochip, we have identified the full spectrum of viruses associated with respiratory infections, including cases of unexplained critical illness in hospitalized patients, with a sensitivity equivalent to or superior to conventional clinical testing2-5. The Virochip has also been used to identify novel viruses, including the SARS coronavirus6,7, a novel rhinovirus clade5, XMRV (a retrovirus linked to prostate cancer)8, avian bornavirus (the cause of a wasting disease in parrots)9, and a novel cardiovirus in children with respiratory and diarrheal illness10. The current version of the Virochip has been ported to an Agilent microarray platform and consists of ~36,000 probes derived from ~1,500 viruses in GenBank as of December 2009. Here we demonstrate the steps involved in processing a Virochip assay from start to finish (~24 hour turnaround time), including sample nucleic acid extraction, PCR amplification using random primers, fluorescent dye incorporation, and microarray hybridization, scanning, and analysis.
Immunology, Issue 50, virus, microarray, Virochip, viral detection, genomics, clinical diagnostics, viral discovery, metagenomics, novel pathogen discovery
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention, as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate the correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also yield greater statistical power than univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques lend themselves much better to the prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, but with potentially greater statistical power and better reproducibility checks. Against these advantages stands the high barrier of entry to the use of multivariate approaches, which has prevented their more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination, and that researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction to multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
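The core covariance-based idea can be illustrated with PCA of a subjects-by-voxels data matrix. This is a sketch only: the dimensions and group structure below are synthetic, and a full SSM-style pipeline adds log transformation, double centering, and cross-validation of the derived patterns.

```python
import numpy as np

def pca_subject_scores(data, n_components=3):
    """PCA of a subjects-x-voxels matrix.  Rows of `patterns` are spatial
    covariance patterns; scores[s, k] measures how strongly subject s
    expresses pattern k."""
    centered = data - data.mean(axis=0)            # remove the mean image
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]
    patterns = vt[:n_components]
    return scores, patterns

# Toy dataset: 20 subjects x 1,000 voxels; the second half of the group
# ("patients") expresses a fixed spatial pattern more strongly.
rng = np.random.default_rng(0)
pattern = rng.standard_normal(1000)
expression = np.r_[np.zeros(10), np.ones(10) * 3.0]
data = np.outer(expression, pattern) + rng.standard_normal((20, 1000))
scores, patterns = pca_subject_scores(data)
```

The first-component subject scores separate the two groups cleanly, which is the property that pattern-expression scores exploit for diagnosis and for correlation with behavioral measures.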
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience