Maximum allowed solvent accessibilities of residues in proteins.
PLoS ONE
PUBLISHED: 01-01-2013
The relative solvent accessibility (RSA) of a residue in a protein measures the extent of burial or exposure of that residue in the 3D structure. RSA is frequently used to describe a protein's biophysical or evolutionary properties. To calculate RSA, a residue's solvent accessibility (ASA) needs to be normalized by a suitable reference value for the given amino acid; several normalization scales have previously been proposed. However, these scales do not provide tight upper bounds on ASA values frequently observed in empirical crystal structures. Instead, they underestimate the largest allowed ASA values by up to 20%. As a result, many empirical crystal structures contain residues that seem to have RSA values in excess of one. Here, we derive a new normalization scale that does provide a tight upper bound on observed ASA values. We pursue two complementary strategies, one based on extensive analysis of empirical structures and one based on systematic enumeration of biophysically allowed tripeptides. Both approaches yield congruent results that consistently exceed published values. We conclude that previously published ASA normalization values were too small, primarily because the conformations that maximize ASA had not been correctly identified. As an application of our results, we show that empirically derived hydrophobicity scales are sensitive to accurate RSA calculation, and we derive new hydrophobicity scales that show increased correlation with experimentally measured scales.
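The normalization described above reduces to RSA = ASA / max-ASA for the residue type. A minimal sketch follows; the reference values in the dictionary are illustrative placeholders, not the scale derived in the paper:

```python
# Illustrative max-ASA reference values in Å^2 (placeholders only;
# substitute the normalization scale from the paper in real use).
MAX_ASA = {
    "ALA": 129.0,
    "GLY": 104.0,
    "TRP": 285.0,
}

def relative_solvent_accessibility(asa, residue):
    """Normalize an absolute ASA (Å^2) by the residue's reference max ASA."""
    return asa / MAX_ASA[residue]
```

With a scale that underestimates max ASA, this ratio can exceed 1.0 for exposed residues in crystal structures, which is exactly the symptom the paper addresses.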
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Published: 11-03-2011
ABSTRACT
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All predictions are tagged with a confidence score that indicates how accurate the predictions are expected to be in the absence of experimental data. To accommodate special requests from end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. Such structural information can be collected by users from experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionalities. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Determination of the Gas-phase Acidities of Oligopeptides
Authors: Jianhua Ren, Ashish Sawhney, Yuan Tian, Bhupinder Padda, Patrick Batoon.
Institutions: University of the Pacific.
Amino acid residues located at different positions in folded proteins often exhibit different degrees of acidity. For example, a cysteine residue located at or near the N-terminus of a helix is often more acidic than one at or near the C-terminus 1-6. Although extensive experimental studies on the acid-base properties of peptides have been carried out in the condensed phase, in particular in aqueous solutions 6-8, the results are often complicated by solvent effects 7. In fact, most of the active sites in proteins are located near the interior region where solvent effects are minimized 9,10. In order to understand the intrinsic acid-base properties of peptides and proteins, it is important to perform the studies in a solvent-free environment. We present a method to measure the acidities of oligopeptides in the gas phase. We use a cysteine-containing oligopeptide, Ala3CysNH2 (A3CH), as the model compound. The measurements are based on the well-established extended Cooks' kinetic method (Figure 1) 11-16. The experiments are carried out using a triple-quadrupole mass spectrometer interfaced with an electrospray ionization (ESI) ion source (Figure 2). For each peptide sample, several reference acids are selected. The reference acids are structurally similar organic compounds with known gas-phase acidities. A solution of the mixture of the peptide and a reference acid is introduced into the mass spectrometer, and a gas-phase proton-bound anionic cluster of the peptide and the reference acid is formed. The proton-bound cluster is mass-isolated and subsequently fragmented via collision-induced dissociation (CID) experiments. The resulting fragment-ion abundances are analyzed using a relationship between the acidities and the cluster-ion dissociation kinetics. The gas-phase acidity of the peptide is then obtained by linear regression of the thermo-kinetic plots 17,18.
The method can be applied to a variety of molecular systems, including organic compounds, amino acids and their derivatives, oligonucleotides, and oligopeptides. By comparing the gas-phase acidities measured experimentally with those values calculated for different conformers, conformational effects on the acidities can be evaluated.
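The regression step at the heart of the kinetic method can be illustrated as below. A simplified, entropy-free form of the kinetic-method relationship is assumed for clarity: the log of the CID fragment-abundance ratio is taken to be linear in the reference-acid acidity, with slope 1/(R·T_eff). All names and numbers are hypothetical, not values from this work:

```python
R = 0.0019872  # gas constant, kcal/(mol·K)

def fit_acidity(ref_acidities, log_ratios):
    """Least-squares fit of y = ln(I_peptide / I_ref) vs ΔH_acid(ref).

    Assumed simplified model: y = (ΔH_ref - ΔH_peptide) / (R * T_eff),
    so slope = 1/(R*T_eff) and the peptide acidity = -intercept/slope.
    Returns (peptide acidity in kcal/mol, effective temperature in K).
    """
    n = len(ref_acidities)
    mx = sum(ref_acidities) / n
    my = sum(log_ratios) / n
    sxx = sum((x - mx) ** 2 for x in ref_acidities)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(ref_acidities, log_ratios))
    slope = sxy / sxx
    intercept = my - slope * mx
    return -intercept / slope, 1.0 / (R * slope)
```

On synthetic data generated from a known acidity and effective temperature, the fit recovers both parameters, which is the sanity check one would run before applying it to real branching ratios.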
Chemistry, Issue 76, Biochemistry, Molecular Biology, Oligopeptide, gas-phase acidity, kinetic method, collision-induced dissociation, triple-quadrupole mass spectrometry, oligopeptides, peptides, mass spectrometry, MS
Monitoring Equilibrium Changes in RNA Structure by 'Peroxidative' and 'Oxidative' Hydroxyl Radical Footprinting
Authors: Ravichandra Bachu, Frances-Camille S. Padlan, Sara Rouhanifard, Michael Brenowitz, Jörg C. Schlatterer.
Institutions: Hunter College , Albert Einstein College of Medicine.
RNA molecules play an essential role in biology. In addition to transmitting genetic information, RNA can fold into unique tertiary structures fulfilling a specific biological role as regulator, binder or catalyst. Information about tertiary contact formation is essential to understand the function of RNA molecules. Hydroxyl radicals (•OH) are unique probes of the structure of nucleic acids due to their high reactivity and small size.1 When used as a footprinting probe, hydroxyl radicals map the solvent-accessible surface of the phosphodiester backbone of DNA1 and RNA2 with up to single-nucleotide resolution. Hydroxyl radical footprinting can be used to identify the nucleotides within an intermolecular contact surface, e.g. in DNA-protein1 and RNA-protein complexes. Equilibrium3 and kinetic4 transitions can be determined by conducting hydroxyl radical footprinting as a function of a solution variable or of time, respectively. A key feature of footprinting is that limited exposure to the probe (e.g., 'single-hit kinetics') results in the uniform sampling of each nucleotide of the polymer.5 In this video article, we use the P4-P6 domain of the Tetrahymena ribozyme to illustrate RNA sample preparation and the determination of a Mg(II)-mediated folding isotherm. We describe the use of the well-known hydroxyl radical footprinting protocol that requires H2O2 (we call this the 'peroxidative' protocol) and a valuable, but not widely known, alternative that uses naturally dissolved O2 (we call this the 'oxidative' protocol). An overview of the data reduction, transformation and analysis procedures is presented.
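A folding isotherm of the kind mentioned above is commonly summarized by a Hill-type saturation curve: fractional protection at a footprinting site as a function of Mg(II) concentration. A minimal sketch with illustrative parameters (the fitting model actually used in the article may differ):

```python
def hill_fraction(mg_mM, k_half_mM, n):
    """Fraction of molecules folded (protected) at a given Mg(II)
    concentration, assuming a Hill-type transition with midpoint
    k_half_mM and cooperativity n. All units/parameters illustrative."""
    return mg_mM ** n / (k_half_mM ** n + mg_mM ** n)
```

Fitting k_half and n per nucleotide across a Mg(II) titration is how site-resolved folding midpoints are typically extracted from footprinting intensities.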
Molecular Biology, Issue 56, hydroxyl radical, footprinting, RNA, Fenton, equilibrium
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration. Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release. The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
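The release-rate computation behind such core incubations can be sketched as follows: the P mass in the overlying water is regressed against time, and the slope is normalized by the sediment surface area of the core. All values below are made up for illustration:

```python
def p_release_rate(times_d, conc_mg_L, volume_L, core_area_m2):
    """Return a sediment P flux (mg P m^-2 d^-1) from a water-column
    concentration time series measured during a core incubation."""
    masses = [c * volume_L for c in conc_mg_L]  # mg P in overlying water
    n = len(times_d)
    mt = sum(times_d) / n
    mm = sum(masses) / n
    # Least-squares slope of mass vs. time (mg P per day).
    slope = (sum((t - mt) * (m - mm) for t, m in zip(times_d, masses))
             / sum((t - mt) ** 2 for t in times_d))
    return slope / core_area_m2
```

A positive rate indicates net release from the sediment; in an alum-treatment assessment like the one described, the pre- vs. post-treatment rates would be compared.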
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages in comparison to traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
The Cell-based L-Glutathione Protection Assays to Study Endocytosis and Recycling of Plasma Membrane Proteins
Authors: Kristine M. Cihil, Agnieszka Swiatecka-Urban.
Institutions: Children's Hospital of Pittsburgh of UPMC, University of Pittsburgh School of Medicine.
Membrane trafficking involves transport of proteins from the plasma membrane to the cell interior (i.e. endocytosis) followed by trafficking to lysosomes for degradation or to the plasma membrane for recycling. The cell-based L-glutathione protection assays can be used to study endocytosis and recycling of protein receptors, channels, transporters, and adhesion molecules localized at the cell surface. The endocytic assay requires labeling of cell surface proteins with a cell-membrane-impermeable biotin reagent containing a disulfide bond and an N-hydroxysuccinimide (NHS) ester at 4 °C - a temperature at which membrane trafficking does not occur. Endocytosis of biotinylated plasma membrane proteins is induced by incubation at 37 °C. Next, the temperature is decreased again to 4 °C to stop endocytic trafficking, and the disulfide bond in biotin covalently attached to proteins that have remained at the plasma membrane is reduced with L-glutathione. At this point, only proteins that were endocytosed remain protected from L-glutathione and thus remain biotinylated. After cell lysis, biotinylated proteins are isolated with streptavidin agarose, eluted from agarose, and the biotinylated protein of interest is detected by western blotting. During the recycling assay, after biotinylation cells are incubated at 37 °C to load endocytic vesicles with biotinylated proteins, and the disulfide bond in biotin covalently attached to proteins remaining at the plasma membrane is reduced with L-glutathione at 4 °C as in the endocytic assay. Next, cells are incubated again at 37 °C to allow biotinylated proteins from endocytic vesicles to recycle to the plasma membrane. Cells are then incubated at 4 °C, and the disulfide bond in biotin attached to proteins that recycled to the plasma membrane is reduced with L-glutathione. The biotinylated proteins protected from L-glutathione are those that did not recycle to the plasma membrane.
Basic Protocol, Issue 82, Endocytosis, recycling, plasma membrane, cell surface, EZLink, Sulfo-NHS-SS-Biotin, L-Glutathione, GSH, thiol group, disulfide bond, epithelial cells, cell polarization
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties.
Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
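Functionalization efficiency from 1H-NMR is typically estimated by comparing per-proton integrals of the new methacrylate vinyl signals against a reference signal from the macromer. A hedged sketch with hypothetical integrals and proton counts (the exact reference peaks used in the protocol may differ):

```python
def percent_functionalization(vinyl_integral, vinyl_protons,
                              ref_integral, ref_protons):
    """Estimate % functionalization from 1H-NMR integrals: the
    per-proton integral of the methacrylate vinyl signal relative to
    the per-proton integral of a known reference signal. Inputs are
    illustrative; choose reference peaks appropriate to the macromer."""
    per_proton_vinyl = vinyl_integral / vinyl_protons
    per_proton_ref = ref_integral / ref_protons
    return 100.0 * per_proton_vinyl / per_proton_ref
```

For example, a vinyl integral of 1.9 for the two vinyl protons against a one-proton reference normalized to 1.0 would indicate roughly 95% functionalization.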
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. 
We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
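The single-reference quantification that makes NMR attractive here can be sketched as follows: each metabolite concentration follows from its integral relative to the reference compound, scaled by the number of protons contributing to each signal. Integrals and proton counts below are illustrative:

```python
def metabolite_conc(i_met, n_met, i_ref, n_ref, conc_ref):
    """Metabolite concentration from NMR peak integrals.

    i_met / i_ref: integrals of the metabolite and reference signals
    n_met / n_ref: number of protons contributing to each signal
    conc_ref:      known reference concentration (result in same units)
    """
    return conc_ref * (i_met / n_met) / (i_ref / n_ref)
```

Because the NMR response is linear, one reference compound suffices for every metabolite in the spectrum, which is the point made in the abstract.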
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the purity and amount appropriate for subsequent in-depth analysis by mass spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation , histone variants, chromatome, hPTMs cross-talks
Quantifying Glomerular Permeability of Fluorescent Macromolecules Using 2-Photon Microscopy in Munich Wistar Rats
Authors: Ruben M. Sandoval, Bruce A. Molitoris.
Institutions: Indiana University School of Medicine.
Kidney diseases involving urinary loss of large essential macromolecules, such as serum albumin, have long been thought to be caused by alterations in the permeability barrier composed of podocytes, vascular endothelial cells, and a basement membrane working in unison. Data from our laboratory using intravital 2-photon microscopy revealed a more permeable glomerular filtration barrier (GFB) than previously thought under physiologic conditions, with retrieval of filtered albumin occurring in an early subset of tubular cells, the proximal tubule cells (PTC)1,2,3. Previous techniques used to study renal filtration and establish the characteristics of the filtration barrier involved micropuncture of the lumen of these early tubular segments, with sampling and analysis of the fluid content4. These studies determined the albumin concentration in the luminal fluid to be virtually non-existent, corresponding closely to what is normally detected in the urine. However, characterization of dextran polymers with defined sizes by this technique revealed that those of a size similar to serum albumin had higher levels in the tubular lumen and urine, suggesting increased permeability5. Herein is a detailed outline of the technique used to directly visualize and quantify glomerular fluorescent albumin permeability in vivo. This method allows for detection of filtered albumin across the filtration barrier into Bowman's space (the initial chamber of urinary filtration); it also allows quantification of albumin reabsorption by proximal tubules and visualization of subsequent albumin transcytosis6. The absence of fluorescent albumin along later tubular segments en route to the bladder highlights the efficiency of the retrieval pathway in the earlier proximal tubule segments. Moreover, when this technique was applied to determine the permeability of dextrans having a size similar to albumin, virtually identical permeability values were reported2.
These observations directly support the need to expand the focus of many proteinuric renal diseases to include alterations in proximal tubule cell reclamation.
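The permeability readout here reduces to a glomerular sieving coefficient (GSC): the background-corrected fluorescence of the tracer in Bowman's space relative to that in the capillary plasma. A minimal sketch with arbitrary intensity units (not the exact image-analysis pipeline of the article):

```python
def sieving_coefficient(f_bowman, f_plasma, background=0.0):
    """Glomerular sieving coefficient from fluorescence intensities:
    tracer signal in Bowman's space over signal in capillary plasma,
    each corrected for the same background estimate."""
    return (f_bowman - background) / (f_plasma - background)
```

A GSC near zero indicates the tracer is retained by the filtration barrier; albumin-sized dextrans giving GSC values similar to albumin's is the observation the abstract highlights.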
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Cellular Biology, Anatomy, Physiology, Surgery, Nephrology, Kidney Diseases, Two-photon microscopy, Kidney, Glomerulus, Glomerular Sieving Coefficient (GSC), Permeability, Proximal Tubule, Proteinuria, macromolecules, 2 Photon, microscopy, intravital imaging, munich wistar rat, animal model
Measuring Cardiac Autonomic Nervous System (ANS) Activity in Children
Authors: Aimée E. van Dijk, René van Lien, Manon van Eijsden, Reinoud J. B. J. Gemke, Tanja G. M. Vrijkotte, Eco J. de Geus.
Institutions: Academic Medical Center - University of Amsterdam, Public Health Service of Amsterdam (GGD), VU University, VU University Medical Center, VU University, VU University Medical Center.
The autonomic nervous system (ANS) controls mainly the automatic bodily functions engaged in homeostasis, such as heart rate, digestion, respiratory rate, salivation, perspiration and renal function. The ANS has two main branches: the sympathetic nervous system, preparing the human body for action in times of danger and stress, and the parasympathetic nervous system, which regulates the resting state of the body. ANS activity can be measured invasively, for instance by radiotracer techniques or microelectrode recording from superficial nerves, or it can be measured non-invasively by using changes in an organ's response as a proxy for changes in ANS activity, for instance of the sweat glands or the heart. Invasive measurements have the highest validity but are very poorly feasible in large-scale samples, where non-invasive measures are the preferred approach. Autonomic effects on the heart can be reliably quantified by recording the electrocardiogram (ECG) in combination with the impedance cardiogram (ICG), which reflects the changes in thorax impedance in response to respiration and the ejection of blood from the ventricle into the aorta. From the respiration and ECG signals, respiratory sinus arrhythmia can be extracted as a measure of cardiac parasympathetic control. From the ECG and the left ventricular ejection signals, the pre-ejection period can be extracted as a measure of cardiac sympathetic control. ECG and ICG recording is mostly done in laboratory settings. However, having subjects report to a laboratory greatly reduces ecological validity, is not always feasible in large-scale epidemiological studies, and can be intimidating for young children. An ambulatory device for ECG and ICG simultaneously resolves these three problems.
Here, we present a study design for a minimally invasive and rapid assessment of cardiac autonomic control in children, using a validated ambulatory device 1-5, the VU University Ambulatory Monitoring System (VU-AMS, Amsterdam, the Netherlands, www.vu-ams.nl).
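As an illustration of the parasympathetic readout mentioned above, respiratory sinus arrhythmia is often quantified peak-to-valley: the difference between the longest and shortest interbeat interval within each breath, averaged across breaths. A sketch with made-up intervals, not the VU-AMS algorithm itself:

```python
def peak_valley_rsa(ibi_by_breath_ms):
    """Mean peak-to-valley respiratory sinus arrhythmia (ms):
    for each breath, the longest minus the shortest interbeat
    interval; averaged over all breaths in the recording segment."""
    diffs = [max(breath) - min(breath) for breath in ibi_by_breath_ms]
    return sum(diffs) / len(diffs)
```

Larger values indicate stronger respiratory modulation of heart rate and hence, under the usual interpretation, greater cardiac parasympathetic control.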
Medicine, Issue 74, Neurobiology, Neuroscience, Anatomy, Physiology, Pediatrics, Cardiology, Heart, Central Nervous System, stress (psychological effects, human), effects of stress (psychological, human), sympathetic nervous system, parasympathetic nervous system, autonomic nervous system, ANS, childhood, ambulatory monitoring system, electrocardiogram, ECG, clinical techniques
Quantitative Autonomic Testing
Authors: Peter Novak.
Institutions: University of Massachusetts Medical School.
Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for the evaluation of these disorders, both in the clinic and in research. There are a number of autonomic tests; however, only a few have been validated clinically or are quantitative. Here, a fully quantitative and clinically validated protocol for testing of autonomic functions is presented. As a bare minimum, the clinical autonomic laboratory should have a tilt table, an ECG monitor, a continuous noninvasive blood pressure monitor, a respiratory monitor, and a means for evaluating the sudomotor domain. Software for recording and evaluation of autonomic tests is critical for correct evaluation of the data. The presented protocol evaluates 3 major autonomic domains: cardiovagal, adrenergic and sudomotor. The tests include deep breathing, the Valsalva maneuver, head-up tilt, and the quantitative sudomotor axon reflex test (QSART). The severity and distribution of dysautonomia are quantified using the Composite Autonomic Severity Score (CASS). A detailed protocol is provided highlighting essential aspects of testing, with emphasis on proper data acquisition, obtaining the relevant parameters, and unbiased evaluation of autonomic signals. Normative data and the CASS algorithm for interpretation of results are provided as well.
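Two of the standard cardiovagal indices from these tests reduce to simple quantities; a sketch with illustrative heart rates in bpm (the clinical protocol applies additional acquisition and artifact-rejection rules):

```python
def valsalva_ratio(hr_max_bpm, hr_min_bpm):
    """Valsalva ratio: maximum over minimum heart rate around the
    Valsalva maneuver; higher ratios indicate better vagal response."""
    return hr_max_bpm / hr_min_bpm

def deep_breathing_hr_range(hr_bpm):
    """Heart-rate range (bpm) during paced deep breathing; the
    inspiration-expiration difference shrinks with vagal impairment."""
    return max(hr_bpm) - min(hr_bpm)
```

These raw indices are then compared against age-stratified normative data before contributing to severity scoring such as CASS.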
Medicine, Issue 53, Deep breathing, Valsalva maneuver, tilt test, sudomotor testing, Composite Autonomic Severity Score, CASS
Structure and Coordination Determination of Peptide-metal Complexes Using 1D and 2D 1H NMR
Authors: Michal S. Shoshan, Edit Y. Tshuva, Deborah E. Shalev.
Institutions: The Hebrew University of Jerusalem, The Hebrew University of Jerusalem.
Copper (I) binding by metallochaperone transport proteins prevents copper oxidation and the release of toxic ions that may participate in harmful redox reactions. The Cu (I) complex of the peptide model of a Cu (I) binding metallochaperone protein, which includes the sequence MTCSGCSRPG (underlined is conserved), was determined in solution under inert conditions by NMR spectroscopy. NMR is a widely accepted technique for the determination of solution structures of proteins and peptides. Because it is difficult to obtain single crystals suitable for X-ray crystallography, the NMR technique is extremely valuable, especially as it provides information on the solution state rather than the solid state. Herein we describe all steps that are required for full three-dimensional structure determination by NMR. The protocol includes sample preparation in an NMR tube, 1D and 2D data collection and processing, peak assignment and integration, molecular mechanics calculations, and structure analysis. Importantly, the analysis was first conducted without any preset metal-ligand bonds, to assure a reliable structure determination in an unbiased manner.
Chemistry, Issue 82, solution structure determination, NMR, peptide models, copper-binding proteins, copper complexes
Analyzing Protein Dynamics Using Hydrogen Exchange Mass Spectrometry
Authors: Nikolai Hentze, Matthias P. Mayer.
Institutions: University of Heidelberg.
All cellular processes depend on the functionality of proteins. Although the functionality of a given protein is the direct consequence of its unique amino acid sequence, it is only realized by folding of the polypeptide chain into a single defined three-dimensional arrangement or, more commonly, into an ensemble of interconverting conformations. Investigating the connection between protein conformation and function is therefore essential for a complete understanding of how proteins fulfill their great variety of tasks. One way to study the conformational changes a protein undergoes while progressing through its functional cycle is hydrogen (1H/2H) exchange in combination with high-resolution mass spectrometry (HX-MS). HX-MS is a versatile and robust method that adds a new dimension to the structural information obtained by, for example, crystallography. It is used to study protein folding and unfolding, binding of small-molecule ligands, protein-protein interactions, conformational changes linked to enzyme catalysis, and allostery. In addition, HX-MS is often used when the amount of protein is very limited or crystallization of the protein is not feasible. Here we provide a general protocol for studying protein dynamics with HX-MS and describe, as an example, how to reveal the interaction interface of two proteins in a complex.
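The core HX-MS readout behind such studies is the deuterium uptake of each peptide over time, computed from mass-spectral centroids. A minimal sketch with illustrative masses, omitting the back-exchange correction (via a fully deuterated control) that a real analysis would include:

```python
# Minimal sketch of the core HX-MS readout: relative deuterium uptake of
# a peptide from centroid masses. Values are illustrative; real analyses
# also correct for back-exchange using a fully deuterated control.

def relative_uptake(m_t, m_undeut, m_fulldeut):
    """Fraction of exchangeable amides deuterated at time t."""
    return (m_t - m_undeut) / (m_fulldeut - m_undeut)

# Centroid masses (Da) of one peptide: at time t, undeuterated, fully deuterated.
frac = relative_uptake(m_t=1253.4, m_undeut=1250.6, m_fulldeut=1258.2)
print(round(frac, 3))
```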
Chemistry, Issue 81, Molecular Chaperones, mass spectrometers, Amino Acids, Peptides, Proteins, Enzymes, Coenzymes, Protein dynamics, conformational changes, allostery, protein folding, secondary structure, mass spectrometry
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super-resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample, with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. With this approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. The data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization that have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need to optimize the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We describe the use of PAFP and PSFP expression to image two protein species in fixed cells, and the extension of the technique to living cells.
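The ~10-30 nm localization precision quoted above is commonly estimated with the Thompson-Larson-Webb formula for a fitted 2D Gaussian spot. A hedged sketch; the PSF width, pixel size, photon count, and background level below are illustrative assumptions, not values from this protocol:

```python
import math

# Hedged sketch: the widely used Thompson-Larson-Webb estimate of 2D
# single-molecule localization precision. s: PSF standard deviation (nm),
# a: pixel size (nm), N: detected photons, b: background noise (photons).
# Parameter values below are illustrative, not from the protocol.

def localization_precision(s, a, N, b):
    """Approximate standard error of a single-molecule position (nm)."""
    var = (s**2 + a**2 / 12.0) / N + 8.0 * math.pi * s**4 * b**2 / (a**2 * N**2)
    return math.sqrt(var)

# ~300 detected photons, 130 nm PSF sigma, 100 nm pixels, 3 photons background
print(round(localization_precision(s=130.0, a=100.0, N=300, b=3.0), 1))
```

Note how the precision tightens as the photon count N grows, which is why brighter, more photostable labels directly improve resolution.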
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns between different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving this quantitative and directional information during spatial normalization in group-level data analyses. On this basis, FT techniques can be applied to group-averaged data in order to quantify the metrics defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole-brain-based and tract-based DTI analysis.
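The voxelwise FA metric at the center of these comparisons is computed from the three eigenvalues of the diffusion tensor; a minimal sketch (the eigenvalues in the example are typical magnitudes in mm²/s, chosen for illustration):

```python
import numpy as np

# Minimal sketch: fractional anisotropy (FA) from the three eigenvalues
# of a diffusion tensor -- the voxelwise metric compared across groups.
# FA ranges from 0 (isotropic diffusion) to 1 (diffusion along one axis).

def fractional_anisotropy(evals):
    l1, l2, l3 = evals
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return np.sqrt(0.5 * num / den)

print(fractional_anisotropy((1.7e-3, 0.3e-3, 0.3e-3)))  # white-matter-like
print(fractional_anisotropy((1.0e-3, 1.0e-3, 1.0e-3)))  # isotropic
```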
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes, a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) are significantly different and much less well understood. Accordingly, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite for a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we combine primary cell isolation of mouse VMs and/or AMs with critical quality-control steps and direct membrane-staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derivation of complex physiological properties of TATS membrane networks in living myocytes, with high throughput and open-access software tools.
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
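As one hedged illustration of the binarization step in such workflows, the sketch below thresholds a membrane-stained intensity image and reports the TATS area fraction inside a cell mask. The "background mean + 2 SD" threshold rule and the toy image are illustrative assumptions; the published workflow uses optimized confocal/superresolution processing and skeletonization:

```python
import numpy as np

# Hedged sketch of a binarization step: threshold a membrane-stained
# intensity image against background statistics and report the fraction
# of the cell area occupied by TATS-like signal. The threshold rule
# (background mean + 2*SD) is an illustrative choice.

def tats_area_fraction(image, cell_mask, background_mask):
    bg = image[background_mask]
    threshold = bg.mean() + 2.0 * bg.std()
    binary = (image > threshold) & cell_mask
    return binary.sum() / cell_mask.sum()

# Toy example: a 10x10 image with a 6x6 cell interior and three bright
# "tubule" pixels; everything outside the cell is background.
img = np.zeros((10, 10))
cell = np.zeros((10, 10), dtype=bool)
cell[2:8, 2:8] = True        # cell interior (36 pixels)
img[3, 3:6] = 10.0           # three bright tubule pixels
frac = tats_area_fraction(img, cell, ~cell)
print(frac)
```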
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
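Approach (4), automated segmentation, can be reduced to its two simplest ingredients for illustration: global thresholding followed by connected-component labeling. This is a hedged toy sketch, not the authors' custom algorithms; the tiny synthetic volume and the 0.5 threshold are arbitrary, and real EM volumes require the data-set-specific tuning discussed above:

```python
import numpy as np
from scipy import ndimage

# Hedged toy sketch of automated segmentation: threshold a volume, then
# label connected components and measure their sizes. Real EM data needs
# noise-aware, data-set-specific methods.

volume = np.zeros((1, 8, 8))       # a tiny synthetic 'tomogram'
volume[0, 1:3, 1:3] = 1.0          # feature 1 (2x2 voxels)
volume[0, 5:7, 4:7] = 1.0          # feature 2 (2x3 voxels)

binary = volume > 0.5                        # thresholding step
labels, n_features = ndimage.label(binary)   # group touching voxels
sizes = ndimage.sum(binary, labels, range(1, n_features + 1))
print(n_features, sizes)
```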
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
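As a hedged illustration of the kind of fit involved, the sketch below estimates a dissociation constant from melting temperatures using a simple single-site saturation model for the Tm shift. This is a deliberate simplification of the thermodynamic models discussed in the text, and the ligand concentrations and Tm values are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: fit a dissociation constant from DSF melting
# temperatures using a simple single-site saturation model for the Tm
# shift (a simplification; see text for rigorous models). Data synthetic.

def tm_model(L, tm0, dtm_max, kd):
    """Tm as a function of ligand concentration for one binding site."""
    return tm0 + dtm_max * L / (kd + L)

L = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0, 250.0])  # ligand, uM
tm = tm_model(L, 55.0, 6.0, 20.0)                          # noise-free demo

(tm0, dtm_max, kd), _ = curve_fit(tm_model, L, tm, p0=(50.0, 5.0, 10.0))
print(round(kd, 1))
```

With real (noisy) data, the covariance matrix returned by `curve_fit` gives a first estimate of the uncertainty on Kd.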
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Concentration Determination of Nucleic Acids and Proteins Using the Micro-volume Bio-spec Nano Spectrophotometer
Authors: Suja Sukumaran.
Institutions: Scientific Instruments.
Nucleic acid quantitation procedures have advanced significantly in the last three decades. More and more, molecular biologists require consistent small-volume analysis of nucleic acid samples for their experiments. The BioSpec-nano offers a potential solution to the inaccurate, non-reproducible results inherent in current DNA quantitation methods, via specialized optics and a sensitive PDA detector. The BioSpec-nano also automates mounting, measurement, and cleaning, thereby eliminating tedious, repetitive, and inconsistent placement of the fiber optic element and manual cleaning. In this study, data are presented on the quantification of DNA and protein, as well as on measurement reproducibility and accuracy. Automated sample contact and rapid scanning allow measurement in three seconds, resulting in excellent throughput. Data analysis is carried out using the built-in features of the software. The formula used for calculating DNA concentration is: Sample Concentration = DF · (OD260 − OD320) · NACF, where DF = sample dilution factor and NACF = nucleic acid concentration factor; the NACF is set according to the analyte selected. Protein concentration results can be expressed in μg/mL or in mol/L by entering the e280 and molecular weight values, respectively. When residue counts for Trp, Tyr, and cysteine (S-S bond) are entered in the e280Calc tab, the extinction coefficient is calculated as e280 = 5500 × (Trp residues) + 1490 × (Tyr residues) + 125 × (cysteine S-S bonds); this e280 value is used by the software for the concentration calculation. In addition to concentration determination of nucleic acids and proteins, the BioSpec-nano can be used as an ultra-micro-volume spectrophotometer for many other analytes, or as a standard spectrophotometer using 5 mm pathlength cells.
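The two formulas above transcribe directly into code, which is handy for spot-checking instrument results. The NACF of 50 (ng/μL per absorbance unit, the conventional dsDNA value) used in the example is an assumption here, since the instrument sets the NACF per analyte:

```python
# Direct transcription of the two formulas given above, for checking
# instrument results by hand.

def dna_concentration(od260, od320, df, nacf):
    """Sample concentration = DF * (OD260 - OD320) * NACF."""
    return df * (od260 - od320) * nacf

def e280(trp, tyr, ss_bonds):
    """Molar extinction coefficient at 280 nm from residue counts."""
    return 5500 * trp + 1490 * tyr + 125 * ss_bonds

# Example: dsDNA with NACF = 50 (ng/uL per absorbance unit, assumed)
print(dna_concentration(od260=0.5, od320=0.02, df=1.0, nacf=50))
print(e280(trp=2, tyr=3, ss_bonds=1))
```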
Molecular Biology, Issue 48, Nucleic acid quantitation, protein quantitation, micro-volume analysis, label quantitation
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching its content to a JoVE video difficult. In other cases, there is no content in our video library relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.