Pubmed Article
Effect sizes for 2×2 contingency tables.
PUBLISHED: 02-06-2013
Sample size calculations are an important part of research to balance the use of resources and to avoid undue harm to participants. Effect sizes are an integral part of these calculations and meaningful values are often unknown to the researcher. General recommendations for effect sizes have been proposed for several commonly used statistical procedures. For the analysis of 2×2 tables, recommendations have been given for the correlation coefficient φ for binary data; however, it is well known that φ suffers from poor statistical properties. The odds ratio is not problematic, although recommendations based on objective reasoning do not exist. This paper proposes odds ratio recommendations that are anchored to φ for fixed marginal probabilities. It will further be demonstrated that the marginal assumptions can be relaxed, resulting in more general results.
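As a quick numerical illustration of the two effect sizes discussed above, both φ and the odds ratio can be computed directly from the four cell counts of a 2×2 table (the counts below are hypothetical):

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi correlation for a 2x2 table [[a, b], [c, d]]."""
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den

def odds_ratio(a, b, c, d):
    """Odds ratio ad/bc for the same 2x2 table."""
    return (a * d) / (b * c)

table = (20, 10, 5, 25)   # hypothetical cell counts a, b, c, d
print(round(phi_coefficient(*table), 3), round(odds_ratio(*table), 2))  # 0.507 10.0
```

For this example table φ ≈ 0.51 while the odds ratio is 10, which illustrates how differently the two measures scale on the same data and why a principled mapping between them is useful.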
Authors: Mark Burke, Shahin Zangenehpour, Peter R. Mouton, Maurice Ptito.
Published: 05-14-2009
The non-human primate is an important translational species for understanding the normal function and disease processes of the human brain. Unbiased stereology, the method accepted as state-of-the-art for quantification of biological objects in tissue sections [2], generates reliable structural data for biological features in the mammalian brain [3]. The key components of the approach are unbiased (systematic-random) sampling of anatomically defined structures (reference spaces), combined with quantification of cell numbers and size, fiber and capillary lengths, surface areas, regional volumes and spatial distributions of biological objects within the reference space [4]. Among the advantages of these stereological approaches over previous methods is the avoidance of all known sources of systematic (non-random) error arising from faulty assumptions and non-verifiable models. This study documents a biological application of computerized stereology to estimate the total neuronal population in the frontal cortex of the vervet monkey brain (Chlorocebus aethiops sabeus), with assistance from two commercially available stereology programs, BioQuant Life Sciences and Stereologer (Figure 1). In addition to contrasting and comparing results from the BioQuant and Stereologer systems, this study provides a detailed protocol for the Stereologer system.
27 Related JoVE Articles!
Using Continuous Data Tracking Technology to Study Exercise Adherence in Pulmonary Rehabilitation
Authors: Amanda K. Rizk, Rima Wardini, Emilie Chan-Thim, Barbara Trutschnigg, Amélie Forget, Véronique Pepin.
Institutions: Concordia University, Hôpital du Sacré-Coeur de Montréal.
Pulmonary rehabilitation (PR) is an important component in the management of respiratory diseases. The effectiveness of PR is dependent upon adherence to exercise training recommendations. The study of exercise adherence is thus a key step towards the optimization of PR programs. To date, mostly indirect measures, such as rates of participation, completion, and attendance, have been used to determine adherence to PR. The purpose of the present protocol is to describe how continuous data tracking technology can be used to measure adherence to a prescribed aerobic training intensity on a second-by-second basis. In our investigations, adherence has been defined as the percent time spent within a specified target heart rate range. As such, using a combination of hardware and software, heart rate is measured, tracked, and recorded during cycling second-by-second for each participant, for each exercise session. Using statistical software, the data are subsequently extracted and analyzed. The same protocol can be applied to determine adherence to other measures of exercise intensity, such as time spent at a specified wattage, level, or speed on the cycle ergometer. Furthermore, the hardware and software are also available to measure adherence to other modes of training, such as the treadmill, elliptical, stepper, and arm ergometer. The present protocol, therefore, is broadly applicable for directly measuring adherence to aerobic exercise.
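The adherence metric described above (percent time within a target heart rate range) reduces to a simple count over the second-by-second trace; a minimal sketch, with a hypothetical trace and target zone:

```python
def adherence_percent(hr_series, low, high):
    """Percent of second-by-second heart-rate samples inside [low, high]."""
    in_zone = sum(1 for hr in hr_series if low <= hr <= high)
    return 100.0 * in_zone / len(hr_series)

# hypothetical 10-second trace (bpm), target zone 110-130 bpm
trace = [105, 112, 118, 125, 131, 128, 122, 115, 109, 120]
print(adherence_percent(trace, 110, 130))  # 70.0
```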
Medicine, Issue 81, Data tracking, exercise, rehabilitation, adherence, patient compliance, health behavior, user-computer interface.
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
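The linearity claim above (signal proportional to cells plated) is typically verified with an ordinary least-squares fit of readout against plating density; a self-contained sketch with hypothetical readout values:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

cells  = [0, 5000, 10000, 20000, 40000]   # cells plated per well
signal = [120, 980, 1900, 3850, 7700]     # hypothetical assay readout
slope, intercept = linear_fit(cells, signal)
```

A near-zero intercept and a stable slope across the plating range support the assumption that signal tracks cell number; a large intercept would flag background signal that should be subtracted before comparing treatments.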
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Dependence of Laser-induced Breakdown Spectroscopy Results on Pulse Energies and Timing Parameters Using Soil Simulants
Authors: Lauren Kurek, Maya L. Najarian, David A. Cremers, Rosemarie C. Chinni.
Institutions: Alvernia University, Applied Research Associates (ARA), Inc.
The dependence of some LIBS detection capabilities on lower pulse energies (<100 mJ) and timing parameters was examined using synthetic silicate samples. These samples were used as simulants for soil and contained minor and trace elements commonly found in soil at a wide range of concentrations. For this study, over 100 calibration curves were prepared using different pulse energies and timing parameters; detection limits and sensitivities were determined from the calibration curves. Plasma temperatures were also measured using Boltzmann plots for the various energies and the timing parameters tested. The electron density of the plasma was calculated using the full-width half maximum (FWHM) of the hydrogen line at 656.5 nm over the energies tested. Overall, the results indicate that the use of lower pulse energies and non-gated detection do not seriously compromise the analytical results. These results are very relevant to the design of field- and person-portable LIBS instruments.
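The detection limits mentioned above are conventionally derived from a linear calibration curve as three times the blank standard deviation divided by the curve's slope (sensitivity); a sketch with hypothetical numbers:

```python
def detection_limit(blank_sd, slope):
    """3-sigma limit of detection from a linear calibration curve.
    blank_sd: standard deviation of the blank signal (counts);
    slope: calibration sensitivity (counts per concentration unit)."""
    return 3.0 * blank_sd / slope

# hypothetical: blank noise of 50 counts, sensitivity of 2000 counts per (mg/kg)
lod = detection_limit(50.0, 2000.0)   # 0.075 mg/kg
```

Because both the blank noise and the slope change with pulse energy and gating, recomputing this ratio for each of the calibration curves is what lets the study compare detection capability across parameter settings.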
Chemistry, Issue 79, analytical chemistry, laser research, atomic physics, LIBS, laser-induced breakdown spectroscopy, gated and non-gated detection, energy study
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer), is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions; thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light; so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. 
Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
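Converting the measured fluorescence increase into a potential activity rate is commonly done with a conversion of the following general shape; the argument values below are purely hypothetical, and the exact volume and emission-coefficient terms depend on the specific plate layout and standards used:

```python
def potential_activity(net_fluorescence, buffer_vol_ml, emission_coef,
                       assay_vol_ml, incubation_h, soil_dry_g):
    """Potential enzyme activity (nmol g^-1 h^-1) from net fluorescence.
    emission_coef: fluorescence units per nmol of free fluorophore,
    taken from a standard curve. Treat this as a template; real
    assays add quench and homogenate-volume corrections."""
    return (net_fluorescence * buffer_vol_ml) / (
        emission_coef * assay_vol_ml * incubation_h * soil_dry_g)

# hypothetical numbers purely for illustration
rate = potential_activity(1000, 100, 10, 1, 2, 5)  # 1000.0 nmol g^-1 h^-1
```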
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
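The oscillation technique above rests on compound-pendulum physics: the period of small swings about a pivot gives the moment of inertia about that pivot, and the parallel-axis theorem then yields the value about the center of mass. A sketch with hypothetical prosthesis values:

```python
import math

def inertia_from_oscillation(mass_kg, dist_to_cm_m, period_s, g=9.81):
    """Moment of inertia about the center of mass from the swing period
    of a compound pendulum: I_pivot = m*g*d*T^2 / (4*pi^2), then
    subtract m*d^2 (parallel-axis theorem)."""
    i_pivot = mass_kg * g * dist_to_cm_m * period_s ** 2 / (4 * math.pi ** 2)
    return i_pivot - mass_kg * dist_to_cm_m ** 2

# hypothetical prosthesis: 1.5 kg, CM 0.20 m below the pivot, period 1.1 s
i_cm = inertia_from_oscillation(1.5, 0.20, 1.1)  # ~0.030 kg*m^2 about the CM
```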
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
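The software-guided setup of experiment combinations mentioned above can be illustrated with a simple full-factorial enumeration; the factor names and levels below are hypothetical, and real DoE software would typically generate a fractional or augmented design rather than the full grid:

```python
from itertools import product

# hypothetical two-level factors for a transient-expression screen
factors = {
    "promoter": ["35S", "nos"],
    "leaf_age": ["young", "old"],
    "temp_C":   [22, 25],
}
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(design))  # 8 runs for a 2^3 full factorial
```

Splitting a large problem into modules, as the abstract describes, keeps the run count manageable: three two-level factors need 8 runs, whereas ten would need 1,024, which is why fractional designs and step-wise augmentation matter.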
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Authors: Marc N. Coutanche, Sharon L. Thompson-Schill.
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. fMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions of interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
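The core of IC — correlating per-timepoint MVP discriminability between two regions — can be sketched as follows; the discriminability time courses are hypothetical, and in practice they would come from a pattern classifier applied at each timepoint:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# hypothetical per-timepoint MVP discriminability (e.g., classifier
# evidence for the correct condition) in two regions
region_a = [0.2, 0.8, 0.5, 0.9, 0.1, 0.7]
region_b = [0.3, 0.7, 0.6, 0.8, 0.2, 0.6]
ic = pearson_r(region_a, region_b)  # high positive value -> informational connectivity
```

The contrast with FC is that the correlated series here are discriminability values rather than raw BOLD time courses, so two regions can show strong IC even when their mean activation time courses are uncorrelated.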
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
Preparation of DNA-crosslinked Polyacrylamide Hydrogels
Authors: Michelle L. Previtera, Noshir A. Langrana.
Institutions: JFK Medical Center, Rutgers University.
Mechanobiology is an emerging scientific area that addresses the critical role of physical cues in directing cell morphology and function. For example, the effect of tissue elasticity on cell function is a major area of mechanobiology research because tissue stiffness modulates with disease, development, and injury. Static tissue-mimicking materials, or materials that cannot alter stiffness once cells are plated, are predominantly used to investigate the effects of tissue stiffness on cell functions. While information gathered from static studies is valuable, these studies are not indicative of the dynamic nature of the cellular microenvironment in vivo. To better address the effects of dynamic stiffness on cell function, we developed a DNA-crosslinked polyacrylamide hydrogel system (DNA gels). Unlike other dynamic substrates, DNA gels have the ability to decrease or increase in stiffness after fabrication without stimuli. DNA gels consist of DNA crosslinks that are polymerized into a polyacrylamide backbone. Adding and removing crosslinks via delivery of single-stranded DNA allows temporal, spatial, and reversible control of gel elasticity. We have shown in previous reports that dynamic modulation of DNA gel elasticity influences fibroblast and neuron behavior. In this report and video, we provide a schematic that describes the DNA gel crosslinking mechanisms and step-by-step instructions on the preparation of DNA gels.
Bioengineering, Issue 90, bioengineering (general), Elastic, viscoelastic, bis-acrylamide, substrate, stiffness, dynamic, static, neuron, fibroblast, compliance, ECM, mechanobiology, tunable
Development of a Virtual Reality Assessment of Everyday Living Skills
Authors: Stacy A. Ruse, Vicki G. Davis, Alexandra S. Atkins, K. Ranga R. Krishnan, Kolleen H. Fox, Philip D. Harvey, Richard S.E. Keefe.
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia, and these impairments predict poor long-term psychosocial outcomes. Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real-world functioning. Current data do not support the recommendation of any single instrument for measurement of functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming-based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve the validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
Nanomanipulation of Single RNA Molecules by Optical Tweezers
Authors: William Stephenson, Gorby Wan, Scott A. Tenenbaum, Pan T. X. Li.
Institutions: University at Albany, State University of New York.
A large portion of the human genome is transcribed but not translated. In this post-genomic era, regulatory functions of RNA have been shown to be increasingly important. As RNA function often depends on its ability to adopt alternative structures, it is difficult to predict RNA three-dimensional structures directly from sequence. Single-molecule approaches show potential to solve the problem of RNA structural polymorphism by monitoring molecular structures one molecule at a time. This work presents a method to precisely manipulate the folding and structure of single RNA molecules using optical tweezers. First, methods to synthesize molecules suitable for single-molecule mechanical work are described. Next, various calibration procedures to ensure the proper operation of the optical tweezers are discussed. Various experiments are then explained. To demonstrate the utility of the technique, results of mechanically unfolding RNA hairpins and a single RNA kissing complex are presented as examples. In these examples, the nanomanipulation technique was used to study the folding of each structural domain, including secondary and tertiary, independently. Lastly, the limitations and future applications of the method are discussed.
Bioengineering, Issue 90, RNA folding, single-molecule, optical tweezers, nanomanipulation, RNA secondary structure, RNA tertiary structure
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of stained, resin-embedded samples, and focused ion beam and serial block-face scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
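At its simplest, the semi-automated end of the spectrum described above reduces to intensity thresholding; a toy sketch on a hypothetical 2D slice (real pipelines operate on 3D volumes with far more sophisticated criteria, but the foreground/background decision has this shape):

```python
def threshold_segment(image, thresh):
    """Binary segmentation: pixels at or above `thresh` become foreground (1)."""
    return [[1 if v >= thresh else 0 for v in row] for row in image]

# hypothetical 4x4 grayscale slice (bright blob in the upper right)
slice_2d = [
    [10, 12, 200, 210],
    [11, 13, 205, 220],
    [ 9, 10,  15,  14],
    [ 8, 11,  12,  10],
]
mask = threshold_segment(slice_2d, 128)
foreground = sum(map(sum, mask))  # 4 foreground pixels
```

Whether such a simple criterion suffices is exactly what the triage scheme above weighs: high contrast and low crowdedness favor thresholding, while noisy or heterogeneous data push the analysis toward manual tracing or custom algorithms.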
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Measuring Frailty in HIV-infected Individuals. Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Authors: Hilary C. Rees, Voichita Ianas, Patricia McCracken, Shannon Smith, Anca Georgescu, Tirdad Zangeneh, Jane Mohler, Stephen A. Klotz.
Institutions: University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may pursue a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail according to the Fried phenotype, using stringent criteria developed for the elderly [1,2]. In HIV infection the syndrome occurs at a younger age. HIV patients were checked for: 1) unintentional weight loss; 2) slowness, as determined by walking speed; 3) weakness, as measured by a grip dynamometer; 4) exhaustion, by responses to a depression scale; and 5) low physical activity, as determined by assessing kilocalories expended in a week's time. Pre-frailty was present with any two of the five criteria and frailty was present if any three of the five criteria were abnormal. The tests take approximately 10-15 min to complete and they can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient can allow the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
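The scoring rule described above — any three abnormal criteria indicating frailty and any two indicating pre-frailty — can be expressed compactly:

```python
def classify_frailty(criteria):
    """criteria: five booleans (weight loss, slowness, weakness,
    exhaustion, low activity), True if abnormal. Thresholds follow
    the protocol above: three or more -> frail, two -> pre-frail."""
    n = sum(criteria)
    if n >= 3:
        return "frail"
    if n == 2:
        return "pre-frail"
    return "not frail"

print(classify_frailty([True, True, False, False, False]))  # pre-frail
```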
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Authors: Mosmi Surati, Matthew Robinson, Suvobroto Nandi, Leonardo Faoro, Carley Demchuk, Rajani Kanteti, Benjamin Ferguson, Tara Gangadhar, Thomas Hensing, Rifat Hasina, Aliya Husain, Mark Ferguson, Theodore Karrison, Ravi Salgia.
Institutions: University of Chicago, Northshore University Health Systems.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using a protocol for prospective tissue banking and another protocol for retrospective banking, tumor and normal tissue samples from patients who consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
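The relational linking described above can be illustrated with an in-memory SQL join; SQLite is used here purely as a stand-in for the Access relationships function, and the table and column names are hypothetical:

```python
import sqlite3

# in-memory analogue of linking clinical and laboratory tables
# by a shared patient identifier
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE clinical (patient_id TEXT, stage TEXT)")
con.execute("CREATE TABLE proteomics (patient_id TEXT, marker REAL)")
con.execute("INSERT INTO clinical VALUES ('P01', 'II'), ('P02', 'III')")
con.execute("INSERT INTO proteomics VALUES ('P01', 4.2), ('P02', 7.9)")

# join clinical and laboratory information, as an Access query would
rows = con.execute(
    "SELECT c.patient_id, c.stage, p.marker "
    "FROM clinical c JOIN proteomics p ON c.patient_id = p.patient_id"
).fetchall()
print(rows)  # [('P01', 'II', 4.2), ('P02', 'III', 7.9)]
```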
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
Aseptic Laboratory Techniques: Plating Methods
Authors: Erin R. Sanders.
Institutions: University of California, Los Angeles.
Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, i.e., procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, these manipulations must take place in a biosafety cabinet. Consult the most current edition of Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as the Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification and the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
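The dilution arithmetic behind pour- and spread-plate enumeration can be sketched in a few lines; the function name and example values below are ours, not part of the protocol:

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    """Viable count: CFU/mL = colonies / (dilution factor x volume plated).

    dilution_factor is the total dilution of the plated sample,
    e.g. 1e-6 for the sixth tube of a ten-fold dilution series.
    """
    return colonies / (dilution_factor * volume_plated_ml)

# 150 colonies on a plate spread with 0.1 mL of a 10^-6 dilution:
titer = cfu_per_ml(150, 1e-6, 0.1)
print(f"{titer:.2e} CFU/mL")  # 1.50e+09 CFU/mL
```

Plates with roughly 30-300 colonies are conventionally considered countable; counts outside that range should come from a different dilution.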
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Investigating the Neural Mechanisms of Aware and Unaware Fear Memory with fMRI
Authors: David C. Knight, Kimberly H. Wood.
Institutions: University of Alabama at Birmingham.
Pavlovian fear conditioning is often used in combination with functional magnetic resonance imaging (fMRI) in humans to investigate the neural substrates of associative learning 1-5. In these studies, it is important to provide behavioral evidence of conditioning to verify that differences in brain activity are learning-related and correlated with human behavior. Fear conditioning studies often monitor autonomic responses (e.g. skin conductance response; SCR) as an index of learning and memory 6-8. In addition, other behavioral measures can provide valuable information about the learning process and/or other cognitive functions that influence conditioning. For example, the impact unconditioned stimulus (UCS) expectancies have on the expression of the conditioned response (CR) and unconditioned response (UCR) has been a topic of interest in several recent studies 9-14. SCR and UCS expectancy measures have recently been used in conjunction with fMRI to investigate the neural substrates of aware and unaware fear learning and memory processes 15. Although these cognitive processes can be evaluated to some degree following the conditioning session, post-conditioning assessments cannot measure expectations on a trial-to-trial basis and are susceptible to interference and forgetting, as well as other factors that may distort results 16,17 . Monitoring autonomic and behavioral responses simultaneously with fMRI provides a mechanism by which the neural substrates that mediate complex relationships between cognitive processes and behavioral/autonomic responses can be assessed. However, monitoring autonomic and behavioral responses in the MRI environment poses a number of practical problems. 
Specifically, 1) standard behavioral and physiological monitoring equipment is constructed of ferrous material that cannot be safely used near the MRI scanner, 2) when this equipment is placed outside of the MRI scanning chamber, the cables projecting to the subject can carry RF noise that produces artifacts in brain images, 3) artifacts can be produced within the skin conductance signal by switching gradients during scanning, 4) the fMRI signal produced by the motor demands of behavioral responses may need to be distinguished from activity related to the cognitive processes of interest. Each of these issues can be resolved with modifications to the setup of physiological monitoring equipment and additional data analysis procedures. Here we present a methodology to simultaneously monitor autonomic and behavioral responses during fMRI, and demonstrate the use of these methods to investigate aware and unaware memory processes during fear conditioning.
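The gradient-switching artifact in the skin conductance signal (issue 3 above) is often mitigated by low-pass filtering. The moving-average sketch below is a generic illustration of that idea, not the study's actual artifact-removal procedure; the window length and sample values are invented:

```python
def moving_average(signal, window=5):
    """Simple low-pass filter: average each sample with its neighbors.

    Edges use a shrinking window so output length equals input length.
    """
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Brief high-amplitude artifact riding on a slow SCR-like signal is attenuated:
noisy = [0.0, 0.1, 5.0, 0.3, 0.4, 0.5, 6.0, 0.7]
print(moving_average(noisy))
```

In practice the filter cutoff must be chosen so that the slow SCR waveform itself (which evolves over seconds) passes through unattenuated.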
Neuroscience, Issue 56, fMRI, conditioning, learning, memory, fear, contingency awareness, neuroscience, skin conductance
Eye Tracking Young Children with Autism
Authors: Noah J. Sasson, Jed T. Elison.
Institutions: University of Texas at Dallas, University of North Carolina at Chapel Hill.
The rise of accessible commercial eye-tracking systems has fueled a rapid increase in their use in psychological and psychiatric research. By providing a direct, detailed and objective measure of gaze behavior, eye-tracking has become a valuable tool for examining abnormal perceptual strategies in clinical populations and has been used to identify disorder-specific characteristics1, promote early identification2, and inform treatment3. In particular, investigators of autism spectrum disorders (ASD) have benefited from integrating eye-tracking into their research paradigms4-7. Eye-tracking has largely been used in these studies to reveal mechanisms underlying impaired task performance8 and abnormal brain functioning9, particularly during the processing of social information1,10-11. While older children and adults with ASD comprise the preponderance of research in this area, eye-tracking may be especially useful for studying young children with the disorder as it offers a non-invasive tool for assessing and quantifying early-emerging developmental abnormalities2,12-13. Implementing eye-tracking with young children with ASD, however, is associated with a number of unique challenges, including issues with compliant behavior resulting from specific task demands and disorder-related psychosocial considerations. In this protocol, we detail methodological considerations for optimizing research design, data acquisition and psychometric analysis while eye-tracking young children with ASD. The provided recommendations are also designed to be more broadly applicable for eye-tracking children with other developmental disabilities. By offering guidelines for best practices in these areas based upon lessons derived from our own work, we hope to help other investigators make sound research design and analysis choices while avoiding common pitfalls that can compromise data acquisition while eye-tracking young children with ASD or other developmental difficulties.
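A common psychometric quantity derived from such gaze data is the proportion of valid samples falling within an area of interest (AOI), with track-loss samples excluded, which matters especially for less compliant young participants. A minimal sketch with invented coordinates (function and variable names are ours):

```python
def aoi_proportion(samples, aoi):
    """Fraction of valid gaze samples inside a rectangular AOI.

    samples: list of (x, y) tuples or None (None = track loss, excluded).
    aoi: (x_min, y_min, x_max, y_max) in the same screen coordinates.
    """
    x0, y0, x1, y1 = aoi
    valid = [s for s in samples if s is not None]
    if not valid:
        return 0.0
    inside = sum(1 for (x, y) in valid if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(valid)

gaze = [(100, 100), (410, 310), None, (450, 350), (900, 900)]
print(aoi_proportion(gaze, (400, 300, 500, 400)))  # 0.5
```

Reporting the proportion of valid samples alongside such measures helps detect the compliance-related data loss discussed above.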
Medicine, Issue 61, eye tracking, autism, neurodevelopmental disorders, toddlers, perception, attention, social cognition
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
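Once binarized and skeletonized images are generated, one elementary network descriptor is tubule density, e.g. skeleton length per unit area. The toy sketch below operates on a hand-made binary grid purely to illustrate the arithmetic; the actual protocol uses dedicated confocal/superresolution image-processing software, and the pixel size is an invented value:

```python
def skeleton_density(binary_grid, pixel_size_um=0.1):
    """Approximate tubule density from a skeletonized binary image.

    binary_grid: list of rows of 0/1, where 1 marks a skeleton pixel.
    Returns (skeleton length in um, density in um per um^2), treating
    each skeleton pixel as one pixel-length of tubule.
    """
    n_pixels = sum(sum(row) for row in binary_grid)
    area_um2 = len(binary_grid) * len(binary_grid[0]) * pixel_size_um ** 2
    length_um = n_pixels * pixel_size_um
    return length_um, length_um / area_um2

grid = [
    [0, 1, 0, 0],
    [0, 1, 1, 1],
    [0, 0, 0, 1],
]
length, density = skeleton_density(grid)
print(length, density)
```

Comparable per-cell densities computed this way would support the kind of quantitative VM-vs-AM comparisons the protocol targets.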
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degrees of those aversions vary substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it, and it alone, were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject.
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
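A common parametric form in this literature models subjective value as a power utility with a linear ambiguity penalty. The sketch below uses that standard form as an illustration; the symbols α (risk attitude) and β (ambiguity attitude) and the numbers are ours, not necessarily the exact model or values of this study:

```python
def subjective_value(amount, win_prob, ambiguity, alpha, beta):
    """SV = (p - beta * A / 2) * v**alpha.

    win_prob: stated winning probability (0.5 for fully ambiguous lotteries),
    ambiguity: fraction of the probability scale occluded (0 = purely risky),
    alpha < 1: risk aversion; beta > 0: ambiguity aversion.
    """
    effective_p = win_prob - beta * ambiguity / 2.0
    return effective_p * amount ** alpha

# A risk- and ambiguity-averse subject values an ambiguous lottery
# below an otherwise equivalent risky one:
risky = subjective_value(20, 0.5, 0.0, alpha=0.8, beta=0.6)
ambiguous = subjective_value(20, 0.5, 0.5, alpha=0.8, beta=0.6)
print(risky > ambiguous)  # True
```

Fitting α and β to each subject's choices yields the trial-by-trial subjective values that are then regressed against BOLD activation.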
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment.
● Understand the function of various reaction components and their overall effect on a PCR experiment.
● Design and optimize a PCR experiment for any DNA template.
● Troubleshoot failed PCR experiments.
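One optimization step, estimating primer melting temperature, can be sketched with the basic Wallace rule, Tm = 2(A+T) + 4(G+C). This is only a rough rule of thumb for short primers; real primer design uses nearest-neighbor thermodynamic methods. The example primer sequence is invented:

```python
def wallace_tm(primer):
    """Rough Tm estimate (deg C) for a short primer: 2*(A+T) + 4*(G+C)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def gc_content(primer):
    """Fraction of G and C bases, typically kept near 0.4-0.6 in primer design."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

primer = "AGCGGATAACAATTTCACACAGGA"
print(wallace_tm(primer), round(gc_content(primer), 2))  # 68 0.42
```

Keeping the two primers of a pair within a few degrees of each other is one of the simplest ways to avoid the non-specific products described above.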
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Sampling Human Indigenous Saliva Peptidome Using a Lollipop-Like Ultrafiltration Probe: Simplify and Enhance Peptide Detection for Clinical Mass Spectrometry
Authors: Wenhong Zhu, Richard L. Gallo, Chun-Ming Huang.
Institutions: Sanford-Burnham Medical Research Institute, University of California, San Diego, VA San Diego Healthcare Center.
Although the human saliva proteome and peptidome have been revealed1-2, they were identified mostly from tryptic digests of saliva proteins. Identification of the indigenous peptidome of human saliva without prior digestion with exogenous enzymes has become imperative, since native peptides in human saliva have potential value for diagnosing disease, predicting disease progression, and monitoring therapeutic efficacy. Appropriate sampling is a critical step for enhancing identification of the human indigenous saliva peptidome. Traditional methods of sampling human saliva, which involve centrifugation to remove debris3-4, may be too time-consuming to be applicable for clinical use. Furthermore, debris removal by centrifugation may fail to clear most infectious pathogens and to remove the high-abundance proteins that often hinder identification of the low-abundance peptidome. Conventional proteomic approaches that primarily utilize two-dimensional gel electrophoresis (2-DE) gels in conjunction with in-gel digestion are capable of identifying many saliva proteins5-6. However, this approach is generally not sufficiently sensitive to detect low-abundance peptides/proteins. Liquid chromatography-mass spectrometry (LC-MS) based proteomics is an alternative that can identify proteins without prior 2-DE separation. Although this approach provides higher sensitivity, it generally requires prior sample pre-fractionation7 and pre-digestion with trypsin, which makes it difficult for clinical use. To circumvent the hindrance in mass spectrometry due to sample preparation, we have developed a technique called capillary ultrafiltration (CUF) probes8-11. Data from our laboratory demonstrated that the CUF probes are capable of capturing proteins in vivo from various microenvironments in animals in a dynamic and minimally invasive manner8-11. No centrifugation is needed, since a negative pressure is created simply by syringe withdrawal during sample collection.
The CUF probes combined with LC-MS have successfully identified tryptic-digested proteins 8-11. In this study, we upgraded the ultrafiltration sampling technique by creating a lollipop-like ultrafiltration (LLUF) probe that can easily fit in the human oral cavity. The direct analysis by LC-MS without trypsin digestion showed that human saliva indigenously contains many peptide fragments derived from various proteins. Sampling saliva with LLUF probes avoided centrifugation but effectively removed many larger and high abundance proteins. Our mass spectrometric results illustrated that many low abundance peptides became detectable after filtering out larger proteins with LLUF probes. Detection of low abundance saliva peptides was independent of multiple-step sample separation with chromatography. For clinical application, the LLUF probes incorporated with LC-MS could potentially be used in the future to monitor disease progression from saliva.
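Matching indigenous peptides in such LC-MS data starts from their theoretical masses. The sketch below computes a peptide's neutral monoisotopic mass and the m/z of its protonated ion; the residue masses are the standard monoisotopic values (only a subset of amino acids is listed for brevity), and the example sequence is invented:

```python
# Monoisotopic residue masses (Da) for a subset of amino acids.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "L": 113.08406, "K": 128.09496, "R": 156.10111,
    "D": 115.02694, "E": 129.04259, "F": 147.06841,
}
WATER = 18.01056  # mass of H2O added for the peptide termini

def monoisotopic_mass(sequence):
    """Neutral monoisotopic mass of a peptide = sum of residues + H2O."""
    return sum(RESIDUE_MASS[aa] for aa in sequence.upper()) + WATER

def mz(mass, charge, proton=1.00728):
    """m/z of a protonated peptide ion: (M + z*H+) / z."""
    return (mass + charge * proton) / charge

m = monoisotopic_mass("GASP")
print(round(m, 3), round(mz(m, 2), 3))  # 330.154 166.084
```

Comparing such theoretical m/z values against observed peaks is the first step in assigning the native peptide fragments detected after LLUF filtration.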
Medicine, Issue 66, Molecular Biology, Genetics, Sampling, Saliva, Peptidome, Ultrafiltration, Mass spectrometry
Combined Immunofluorescence and DNA FISH on 3D-preserved Interphase Nuclei to Study Changes in 3D Nuclear Organization
Authors: Julie Chaumeil, Mariann Micsinai, Jane A. Skok.
Institutions: New York University School of Medicine, New York University Center for Health Informatics and Bioinformatics, NYU Cancer Institute, Yale University School of Medicine.
Fluorescent in situ hybridization using DNA probes on 3-dimensionally preserved nuclei followed by 3D confocal microscopy (3D DNA FISH) represents the most direct way to visualize the location of gene loci, chromosomal sub-regions or entire territories in individual cells. This type of analysis provides insight into the global architecture of the nucleus as well as the behavior of specific genomic loci and regions within the nuclear space. Immunofluorescence, on the other hand, permits the detection of nuclear proteins (modified histones, histone variants and modifiers, transcription machinery and factors, nuclear sub-compartments, etc). The major challenge in combining immunofluorescence and 3D DNA FISH is, on the one hand to preserve the epitope detected by the antibody as well as the 3D architecture of the nucleus, and on the other hand, to allow the penetration of the DNA probe to detect gene loci or chromosome territories 1-5. Here we provide a protocol that combines visualization of chromatin modifications with genomic loci in 3D preserved nuclei.
Genetics, Issue 72, Molecular Biology, Bioinformatics, Cancer Biology, Pathology, Biomedical Engineering, Immunology, Intranuclear Space, Nuclear Matrix, Fluorescence in situ Hybridization, FISH, 3D DNA FISH, DNA, immunofluorescence, immuno-FISH, 3D microscopy, Nuclear organization, interphase nuclei, chromatin modifications
Large Scale Non-targeted Metabolomic Profiling of Serum by Ultra Performance Liquid Chromatography-Mass Spectrometry (UPLC-MS)
Authors: Corey D. Broeckling, Adam L. Heuberger, Jessica E. Prenni.
Institutions: Colorado State University.
Non-targeted metabolite profiling by ultra performance liquid chromatography coupled with mass spectrometry (UPLC-MS) is a powerful technique to investigate metabolism. The approach offers an unbiased and in-depth analysis that can enable the development of diagnostic tests, novel therapies, and further our understanding of disease processes. The inherent chemical diversity of the metabolome creates significant analytical challenges and there is no single experimental approach that can detect all metabolites. Additionally, the biological variation in individual metabolism and the dependence of metabolism on environmental factors necessitates large sample numbers to achieve the appropriate statistical power required for meaningful biological interpretation. To address these challenges, this tutorial outlines an analytical workflow for large scale non-targeted metabolite profiling of serum by UPLC-MS. The procedure includes guidelines for sample organization and preparation, data acquisition, quality control, and metabolite identification and will enable reliable acquisition of data for large experiments and provide a starting point for laboratories new to non-targeted metabolite profiling by UPLC-MS.
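A standard quality-control check in such workflows is the relative standard deviation (RSD, or CV%) of feature intensities across pooled QC injections, with unstable features flagged above some threshold (values around 30% are common in the literature, but the cutoff is a choice, not part of this protocol). A sketch with invented data:

```python
import statistics

def rsd_percent(intensities):
    """Relative standard deviation (CV%) of a feature across QC injections."""
    mean = statistics.mean(intensities)
    if mean == 0:
        return 0.0
    return 100.0 * statistics.stdev(intensities) / mean

def flag_unstable_features(feature_table, threshold=30.0):
    """Return names of features whose QC RSD exceeds the threshold."""
    return [name for name, vals in feature_table.items()
            if rsd_percent(vals) > threshold]

qc = {
    "m/z 180.063": [1.00e5, 1.05e5, 0.98e5, 1.02e5],   # stable feature
    "m/z 512.301": [2.0e4, 6.5e4, 1.1e4, 4.8e4],       # unstable feature
}
print(flag_unstable_features(qc))  # ['m/z 512.301']
```

Running this filter on pooled-QC injections interleaved through a large batch is one way to keep the statistical power of the experiment from being eroded by analytical drift.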
Chemistry, Issue 73, Biochemistry, Genetics, Molecular Biology, Physiology, Genomics, Proteins, Proteomics, Metabolomics, Metabolite Profiling, Non-targeted metabolite profiling, mass spectrometry, Ultra Performance Liquid Chromatography, UPLC-MS, serum
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Authors: Adrienne R. Niederriter, Erica E. Davis, Christelle Golzio, Edwin C. Oh, I-Chun Tsai, Nicholas Katsanis.
Institutions: Duke University Medical Center, Duke University, Duke University Medical Center.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated.1 These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease, and in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain of function alleles, or utilization of morpholino (MO) antisense oligonucleotides to suppress genes to mimic loss of function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
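Scoring such rescue experiments reduces to comparing the proportion of affected embryos between injection groups. A minimal Pearson chi-square sketch for a 2×2 count table is shown below; the group labels and counts are invented for illustration, and the statistic is simply compared against the 5% critical value for 1 degree of freedom rather than converted to a p-value:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 count table [[a, b], [c, d]].

    Rows are treatment groups (e.g. MO alone vs. MO + mRNA);
    columns are affected vs. unaffected embryo counts.
    """
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Invented counts: MO alone, 40/50 embryos affected; MO + wild-type
# human mRNA, 12/50 affected.
stat = chi_square_2x2(40, 10, 12, 38)
print(stat > 3.84)  # True: exceeds the 5% critical value for 1 df
```

A mutant mRNA that fails to shift the affected proportion back toward the wild-type rescue level would be interpreted as deleterious under this framework.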
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
T-wave Ion Mobility-mass Spectrometry: Basic Experimental Procedures for Protein Complex Analysis
Authors: Izhak Michaelevski, Noam Kirshenbaum, Michal Sharon.
Institutions: Weizmann Institute of Science.
Ion mobility (IM) is a method that measures the time taken for an ion to travel through a pressurized cell under the influence of a weak electric field. The speed at which the ions traverse the drift region depends on their size: large ions will experience a greater number of collisions with the background inert gas (usually N2) and thus travel more slowly through the IM device than ions that present a smaller cross-section. In general, the time it takes for the ions to migrate through the dense gas phase separates them according to their collision cross-section (Ω). Recently, IM spectrometry was coupled with mass spectrometry and a traveling-wave (T-wave) Synapt ion mobility mass spectrometer (IM-MS) was released. Integrating mass spectrometry with ion mobility enables an extra dimension of sample separation and definition, yielding a three-dimensional spectrum (mass to charge, intensity, and drift time). This separation technique reduces spectral overlap and enables resolution of heterogeneous complexes with very similar masses, or mass-to-charge ratios, but different drift times. Moreover, the drift time measurements provide an important layer of structural information, as Ω is related to the overall shape and topology of the ion. The correlation between measured drift time values and Ω is calculated using a calibration curve generated from calibrant proteins with defined cross-sections1. The power of the IM-MS approach lies in its ability to define the subunit packing and overall shape of protein assemblies at micromolar concentrations and near-physiological conditions1. Several recent IM studies of both individual proteins2,3 and non-covalent protein complexes4-9 successfully demonstrated that protein quaternary structure is maintained in the gas phase, and highlighted the potential of this approach in the study of protein assemblies of unknown geometry.
Here, we provide a detailed description of IM-MS analysis of protein complexes using the Synapt (Quadrupole-Ion Mobility-Time-of-Flight) HDMS instrument (Waters Ltd.; the only commercial IM-MS instrument currently available)10. We describe the basic optimization steps, the calibration of collision cross-sections, and methods for data processing and interpretation. The final step of the protocol discusses methods for calculating theoretical Ω values. Overall, the protocol does not attempt to cover every aspect of IM-MS characterization of protein assemblies; rather, its goal is to introduce the practical aspects of the method to new researchers in the field.
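On T-wave instruments the Ω calibration is commonly described as a power-law relation between corrected drift time and cross-section, Ω' = A·t_d'^B, i.e. a straight line in log-log space. The sketch below fits that relation by ordinary least squares; the calibrant drift times and cross-sections are invented values that follow the power law exactly, purely to illustrate the fit:

```python
import math

def fit_power_law(drift_times, omegas):
    """Fit omega = A * td**B by linear least squares in log-log space.

    Inputs are corrected drift times and the known cross-sections of
    calibrant ions; returns the fitted (A, B).
    """
    xs = [math.log(t) for t in drift_times]
    ys = [math.log(o) for o in omegas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Invented calibrant data following omega = 500 * td**0.5 exactly:
td = [1.0, 4.0, 9.0, 16.0]
omega = [500 * t ** 0.5 for t in td]
A, B = fit_power_law(td, omega)
print(round(A, 3), round(B, 3))  # 500.0 0.5
```

With A and B in hand, the measured drift time of an unknown complex can be converted to an experimental Ω for comparison with theoretical values.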
Cellular Biology, Issue 41, mass spectrometry, ion-mobility, protein complexes, non-covalent interactions, structural biology
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) recognized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of a paper-based IV insulin protocol, analysis of data from its use at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system.
By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, in the first five months after implementation of the computerized glucose management system, there was a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL). The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the computerized glucose management system was well under 1%.
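The outcome metrics quoted above reduce to simple proportions over all blood glucose readings. A minimal sketch using the same cutoffs as the abstract (the readings themselves are invented for illustration):

```python
def glucose_metrics(readings, target=(70, 180)):
    """Fractions of readings in range, hypoglycemic (<70 mg/dL),
    severely hypoglycemic (<40 mg/dL), and hyperglycemic (>180 mg/dL)."""
    n = len(readings)
    lo, hi = target
    return {
        "in_range": sum(lo <= r <= hi for r in readings) / n,
        "hypo": sum(r < lo for r in readings) / n,
        "severe_hypo": sum(r < 40 for r in readings) / n,
        "hyper": sum(r > hi for r in readings) / n,
    }

readings = [95, 140, 182, 68, 110, 150, 39, 125, 160, 100]
m = glucose_metrics(readings)
print(m)  # {'in_range': 0.7, 'hypo': 0.2, 'severe_hypo': 0.1, 'hyper': 0.1}
```

Tracking these fractions per month is enough to reproduce the kind of before/after comparison the study reports.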
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply has no content relevant to the topic of a given abstract. In these cases, our algorithms display the closest matches available, which can sometimes result in videos with only a slight relation to the abstract.